Export Compliance Daily is a service of Warren Communications News.
‘Big Bags of Water’

FCC to Examine Wireless Bottlenecks Inside the Home

Broadband speed to the home is only one part of the story. What happens inside the home can have a huge impact on the speeds a user experiences, so for its next study, the FCC broadband measurement group plans to identify wireless bottlenecks inside the home. “Users always complain about their performance, but often it’s their own fault,” said Nick Feamster, an associate professor of computer science at Georgia Tech. At Thursday’s broadband measurement meeting, he presented findings from a small study that could be used to inform future FCC measurements. Commission officials hope to monitor in-home interference using special software on the white boxes already used for its standard Measuring Broadband America test.

The cost of a service call to an ISP can be up to $25, but most of the time, the problem has nothing to do with the ISP, Feamster said. “The user could have its access point in some ridiculous spot, or it could be outdated equipment, or interference from neighbors, or too many devices,” he said. “We just want to focus on the specific question of is it a home wireless problem, or is it an access problem, or is it elsewhere?”

Contention from other devices, cross-traffic on the wireless network, or non-Wi-Fi interference from baby monitors could all be the culprit. That’s the focus of Feamster’s research. Wireless traffic analysis is “really tricky because the performance is so variable,” with results that could change if someone closes a door or walks by, he said. “We’re basically big bags of water.”

“This is not a benchmarking study,” said Chief Walter Johnston of the FCC’s Office of Engineering and Technology’s Electromagnetic Compatibility Division. The agency isn’t proposing to compare ISPs, he said: The goal of this project is to gain “general knowledge on the state of the Internet.”

Once downstream throughput exceeds about 10 Mbps, “the bottlenecks are almost always the user’s fault,” said Feamster, summarizing the results of a recent study his team conducted on about 30 homes. “I want to do this not in 30 homes, but in 4,000 homes,” he said. The plan would be to determine whether the access link is the bottleneck, and then “look for pathologies” in the wireless network and in the wide area, he said. To identify wireless bottlenecks, the commission could use a “collection of heuristics,” he said: A high bitrate variation, for instance, often indicates poor wireless channel quality. A high Transmission Control Protocol round-trip time between the client and the access point indicates possible contention among wireless devices on the network, he said. Heuristics are experience-based rules of thumb, useful for diagnosing problems where a huge sample size is impractical.
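The triage Feamster describes -- first rule out the access link, then apply wireless heuristics -- can be sketched in a few lines. The thresholds, field names and function below are illustrative assumptions, not values from the study:

```python
# Hypothetical sketch of the bottleneck-triage heuristics described above.
# All thresholds are assumptions for illustration, not the study's values.

def diagnose_bottleneck(access_throughput_mbps, advertised_mbps,
                        bitrate_cv, lan_rtt_ms):
    """Guess where a slowdown originates.

    access_throughput_mbps: measured speed on the access link
    advertised_mbps: the ISP's advertised speed
    bitrate_cv: coefficient of variation of the wireless bitrate
    lan_rtt_ms: TCP round-trip time between client and access point
    """
    # Step 1: is the access link itself the bottleneck?
    if access_throughput_mbps < 0.8 * advertised_mbps:
        return "access link"
    # Step 2: high bitrate variation suggests poor wireless channel quality
    if bitrate_cv > 0.5:
        return "wireless channel quality"
    # Step 3: high client-to-AP RTT suggests contention among wireless devices
    if lan_rtt_ms > 20:
        return "wireless contention"
    return "elsewhere"

print(diagnose_bottleneck(48, 50, 0.7, 5))  # prints "wireless channel quality"
```

In practice a white-box client would feed measured values into checks like these; the point is only that each heuristic maps one observable symptom to one likely cause.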

The FCC wants feedback from members of the broadband measurement group about how the study could proceed as a special study in the next quarter. The data would offer “a lot of new insights” and be extremely valuable for consumers, said OET senior attorney-adviser James Miller. “This looks like an interesting and worthwhile area of study,” said David Young, Verizon vice president-federal regulatory affairs.

Also at Thursday’s meeting, a team from North Carolina State University presented results of a study validating the measurements the FCC has presented in its broadband speed reports. The group analyzed data from April 2012, with a focus on download speed. “From a validation perspective … we got numbers that were very close to those presented in the report,” said Neha Rawal, team lead on the project that analyzed 18 gigabytes of data.

The NCSU group also set out to create a better method for consumers to make sense of the speed measurement reports. Results that the FCC presents -- average speeds at a national level -- are less useful than localized information that accounts for the variation users get between ISPs or locations, Rawal said. She proposed a “consistent speed” percentile, which would demonstrate when a majority of households will get a majority of the advertised speed for a majority of the time. Rawal presented a new set of bar charts to better account for variation in delivered speeds. “An average doesn’t mean that everybody gets this performance,” or that people get it all the time, she said. It might not be a good idea to present charts with error bars and standard deviation lines to a consumer “who may not understand what one standard deviation is,” she said. Rawal demonstrated a shaded bar chart that shows at a glance what percentage of consumers get what percentage of the advertised speed.
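A metric in the spirit of Rawal's "consistent speed" percentile can be computed directly from a household's speed samples: the share of measurements that reach a given fraction of the advertised speed. The function, sample data and thresholds below are assumptions for illustration, not the NCSU team's actual method:

```python
# Illustrative sketch of a "consistent speed" style metric: what share of
# speed samples meet a given fraction of the advertised speed.
# Data and the 80% threshold are hypothetical.

def consistent_speed_fraction(samples_mbps, advertised_mbps, fraction=0.8):
    """Fraction of samples that achieve at least `fraction` of advertised speed."""
    threshold = fraction * advertised_mbps
    hits = sum(1 for s in samples_mbps if s >= threshold)
    return hits / len(samples_mbps)

# A hypothetical household advertised 25 Mbps, with periodic measurements:
samples = [24.1, 23.8, 10.2, 22.5, 24.9, 9.8, 23.0, 24.0]
share = consistent_speed_fraction(samples, 25, fraction=0.8)
print(f"{share:.0%} of samples reached 80% of the advertised speed")  # 75%
```

Reported per ISP or per locality, a number like this conveys the variation an average hides -- it says how often users actually get close to what was advertised, without asking consumers to interpret standard deviations.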

Presenting the data has been “something we’ve struggled with,” Miller said. One challenge has been to “create a chart that isn’t completely incomprehensible” while still presenting the data in a way that is meaningful, he said. Rawal’s analysis has “given us some food for thought,” Johnston said. The commission will “cherry pick what we can from this” to improve the report, he said.