Export Compliance Daily is a service of Warren Communications News.
‘Good Data’ on ‘Bad Peering’

Anomalies Detected in FCC’s Broadband Speed Tests at Cogent Peering Points

Anomalies were detected in some of the speed tests done as part of the FCC broadband measurement project, officials said at Thursday’s meeting of industry stakeholders. The problem is limited to connections between Cogent and various ISPs, officials said. Measurement Lab (M-Lab) representative Meredith Whittaker, of Google, identified the ISPs involved as Verizon and Time Warner Cable. The identification prompted concern from group members, who said the data was supposed to remain private. FCC officials emphasized that this wasn’t necessarily a problem with the ISPs themselves, but rather with connections between M-Lab and Cogent.



Standardized tests run through three different types of servers operated by M-Lab, Level 3 and various ISPs should generally agree, officials said. That’s not happening: Discrepancies are being seen at nodes in Los Angeles, New York, Dallas and Seattle. “We're at the very early stages of this,” said Alex Salter, CEO of SamKnows, the U.K.-based company assisting the FCC with measurement. He described “discrepancies” in the data. “The initial analysis would point to a peering issue,” said Whittaker. The access link is provisioned, but then there are issues at some peering points, she said. “We're only seeing this at servers with a Cogent uplink.”

Cogent has “disputes with Verizon and the major cable players over upgrading connections to accommodate the tremendous growth in the Internet,” said Chief Legal Officer Robert Beury. “We can’t get them to upgrade as fast or as much as the Internet requires to keep operating smoothly.” Beury said he couldn’t say for sure whether the company’s peering disputes are related to the current discrepancies. The company’s engineers plan to look into the issue to see what may have happened, he said. Cogent and Verizon last month highlighted a peering dispute between the companies (CD June 21 p1). Cogent CEO Dave Schaeffer told us the dispute has led to a slowdown in some traffic over Verizon’s networks.

Officials expressed concern after Whittaker publicly revealed the ISPs involved, telling Whittaker during the meeting that she wasn’t allowed to do that. Salter said he specifically didn’t mention company names. “We all signed nondisclosure agreements” to not release results of the testing, protested Vice President David Young of Verizon. “And here we are talking about test results that [a Communications Daily reporter] may decide to write about tomorrow,” published in advance of the official FCC report without proper vetting or investigation, he said. “I'm concerned about our ability to function in an environment like this.” Whittaker declined to comment on the allegation that she may have violated an NDA. An M-Lab representative declined to elaborate on the ISP data it presented at the meeting. “We'd prefer to speak when the analysis is further along as under the best of circumstances, these peering arrangements are complex,” said M-Lab engineer Thomas Gideon, technical director of the Open Technology Institute. Verizon and TWC both declined to comment.

Young proposed putting together a “small working group” to examine the anomalies, and potentially to augment the measurement platforms currently in place. A new class of M-Lab servers, with upgraded architecture, could test connectivity to the Internet from multiple paths rather than through a single provider’s network, he said: That “will deliver the type of testing that this program was designed for."

“It’s important to identify that what we have is not an error in measurement,” said Whittaker: What the group has is “good data” about “bad peering.” That impacts networks every day, she said, but it might not be how the group wants to represent “the totality” of an ISP.

“Having data on bad peering is important,” and this is just one more way these measurements can add data to the FCC’s report, Whittaker said. The data reflect a path that traffic crosses on the Internet, and being able to compare this data with data from the unaffected servers “illuminates” the workings of the Internet “in a way that having one or the other doesn’t,” she said.

Young hopes to have measurement supplementation in place by September, in time for the next round of testing. “My concern is that if we just move forward and use the results from the existing servers with the existing known problem, then instead of being a test of an ISP’s broadband performance generally, it’s a test of the ISP’s connectivity with another backbone provider,” Young said. That’s “interesting data, but it’s not really the purpose of the Measuring Broadband America program,” he said.

“We don’t have enough information yet to actually discuss what fixing it looks like,” said Salter. “We may not want to fix it because we may end up impacting the environment we're actually trying to test. There may be things there that we want to see."

The data currently collected were never intended to show up in the measurement report, FCC officials said. Since there’s time, the group should look at the anomalies as an opportunity to improve the overall data collection process, said Walter Johnston, chief of the Office of Engineering and Technology’s Electromagnetic Compatibility Division. The group has some flexibility, and could also move the planned September data collection to October “if we have to,” he said. “Let’s evaluate what our options are first, before we declare a crisis.”