‘Religious’ Fervor of Neutrality Backers Ignores Net Architecture, ITIF Told
Open-minded agnosticism is the ideal approach toward Internet architecture and network management, engineers central to the Internet’s development and academics told the Information Technology and Innovation Foundation (ITIF) Friday. Several times they dismissed as “religious” the views of net-neutrality supporters who claim that the so-called end-to-end principle was central to the Internet’s development. But Christopher Yoo, director of the Center for Technology, Innovation and Competition at the University of Pennsylvania Law School, credited FCC Chairman Julius Genachowski’s speech last week (CD Sept 22 p1) with the most “nuanced” view of network management from an official to date.
Network architect Richard Bennett wasn’t so kind to Genachowski, calling false his claim that the Internet doesn’t inherently privilege certain kinds of traffic over others -- namely content over communications. Bennett, co-leader of the IEEE group that created the original Ethernet hub standard, said neutrality backers were promoting policies that “feel and sound good” but ignore the central premise of the Internet -- “design for change,” also the title of Bennett’s new report for the foundation. Engineers started with the idea of the “dumb network core,” today advocated by neutrality backers, because they figured some functions were better performed at the “edges.” But “nobody knew where to draw that line” between network and edge, and the best place for the line may be different today than it was 20 years ago, he said.
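The “dumb core” rationale is easiest to see in miniature. Below is a minimal Python sketch -- an illustration of the classic end-to-end argument, not anything presented at the event -- showing why integrity checking belongs at the edges: only the communicating endpoints can confirm a transfer actually succeeded, whatever reliability features the core offers.

```python
# Minimal sketch of the end-to-end argument: the network core can stay
# "dumb" because only the endpoints can verify a transfer end to end.
import hashlib


def send(payload: bytes) -> tuple[bytes, str]:
    """Sender computes a checksum at its edge before transmission."""
    return payload, hashlib.sha256(payload).hexdigest()


def receive(payload: bytes, digest: str) -> bytes:
    """Receiver re-verifies at the other edge; in-network reliability
    features would supplement, not replace, this endpoint check."""
    if hashlib.sha256(payload).hexdigest() != digest:
        raise ValueError("end-to-end integrity check failed; retransmit")
    return payload


data, checksum = send(b"packet payload")
receive(data, checksum)  # succeeds only if the payload arrived intact
```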
The end-to-end principle has been “secondary” at best in engineers’ work on network designs, Bennett said. The general principle of neutrality is fine, but it’s hard even for engineers to agree on where it’s appropriate to “deviate,” he said, comparing network discrimination to affirmative action. “There are some classes of packets that need a little help” for Internet innovation to proceed, he said. He called researchers associated with Harvard’s Berkman Center, generally known for favoring neutrality, an “amazing source of amusement” to him for what he described as their denial that network features such as low latency are inherently costly. Without a native routing architecture, the Internet has been in “crisis” for years, and it will lose its “dynamism” under top-down controls on network design, Bennett said.
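Bennett’s “packets that need a little help” maps onto existing mechanisms such as DiffServ marking. The sketch below is a hedged illustration rather than anything from his report: it tags a UDP socket’s traffic with the Expedited Forwarding class, and whether routers honor that marking -- and at what cost to other traffic -- is exactly the tradeoff the panelists described. IP_TOS is Linux-specific, and the address is a documentation placeholder.

```python
# Marking a UDP socket's packets with the DiffServ Expedited Forwarding
# class, so routers that honor DSCP can queue them ahead of bulk traffic.
# IP_TOS is available on Linux; honoring the mark is up to each network.
import socket

EF_DSCP = 46               # Expedited Forwarding code point (RFC 3246)
TOS_VALUE = EF_DSCP << 2   # DSCP occupies the top six bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)
sock.sendto(b"low-latency traffic", ("192.0.2.1", 5004))  # test address
```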
“I've never understood the infatuation” with the end-to-end principle, said John Day, professor of computer science at Boston University Metropolitan College. The Internet has “careened from crisis to crisis,” and thanks to the “religious fervor” over neutrality that started among some engineers in the 1980s, it still resembles the old-line DOS-type operating system more than the Unix OS that’s common in servers and workstations today, he said. Internet architecture has suffered from too much “practice” and too little “theory,” Day said. “We've been living on Band-Aids and Moore’s Law for 30 years,” he said, crediting sharp gains in computing power with masking the Internet’s alleged deficiencies.
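The scale of the Moore’s Law subsidy Day invoked is easy to work out. The arithmetic below assumes the textbook statement of the law -- a doubling roughly every two years -- which is our gloss, not Day’s:

```python
# Rough arithmetic behind "living on Band-Aids and Moore's Law for 30
# years": doubling every two years compounds to about a 32,000-fold
# gain in computing power over three decades.
years = 30
doubling_period = 2  # years per doubling, per the usual form of the law
gain = 2 ** (years / doubling_period)
print(f"~{gain:,.0f}x computing power over {years} years")  # ~32,768x
```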
The end-to-end principle is a “conversation ender,” Yoo said, one that closes off discussion of the “tradeoffs” inherent in engineering. Neutrality backers view the Internet as a 1990s “hierarchical” network where incumbents can exert great control over end users, and they ignore current public-policy violations of the end-to-end principle -- Canada, for example, won’t let traffic between British Columbia and Ontario pass through U.S. networks because of U.S. surveillance laws, he said. Yoo said he’s most concerned that engineers aren’t experimenting enough with network designs.
But William Lehr of MIT’s Computer Science and Artificial Intelligence Lab said end-to-end “remains to this day very relevant as a touch point for discussion.” There’s no consensus in the technical community on whether edge functions should necessarily move to the core over time, he said. The real debate is “who has the right to change” functions. The FCC won’t “legislate” neutrality because doing so would “quickly break everything” -- and embolden neutrality critics, Lehr said. Dave Farber of Carnegie Mellon University, whose research led to the first distributed computer system, said the Internet fell victim to a “very difficult research problem” -- maintaining security and robustness in an environment where “you didn’t expect friends to get so nasty.” The Internet has now reached speeds that, Farber said, he predicted in the 1980s would challenge the integrity of the TCP/IP system. “Religious discussions” on neutrality, and the likelihood that regulators will retroactively frown on needed changes, make it hard to design a network where “experimentation is possible,” he said. The market isn’t always right, but “I'd rather take my chances” there than with regulators, Farber said.
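One concrete way speed strains the original TCP/IP design -- offered here as an illustration, since Farber named no specific mechanism -- is TCP’s 32-bit sequence number space: at modern line rates it wraps within seconds, letting stale segments masquerade as fresh ones, which is why RFC 7323 added timestamps to disambiguate them. A quick Python calculation:

```python
# How fast TCP's 32-bit sequence number space wraps at various line
# rates; old packets can be mistaken for new ones within one wrap time.
SEQ_SPACE_BYTES = 2 ** 32  # one full cycle of TCP sequence numbers

for rate_bps in (10e6, 1e9, 10e9):  # 10 Mbps, 1 Gbps, 10 Gbps
    wrap_seconds = SEQ_SPACE_BYTES * 8 / rate_bps
    print(f"{rate_bps / 1e9:g} Gbps: wraps in {wrap_seconds:,.1f} s")
# 10 Mbps gives roughly an hour; 10 Gbps gives about 3.4 seconds.
```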
The divergent paths Verizon and AT&T took with video show the uncertainty of business decisions and should give regulators pause before judging which is more viable, Yoo said. Verizon took a pounding on Wall Street for its expensive fiber buildout, while AT&T won applause for rejiggering its existing infrastructure to handle video -- reserving much of its bandwidth for a single application. Only recently has Wall Street warmed to Verizon’s strategy, he said. The AOL-Time Warner merger, initially thought to presage a larger crash, simply wiped out investment in one company: “We find out in the real world what’s right.”
The problem with assuming that incumbents are out to disadvantage competitors and must be stopped through neutrality mandates is that such mandates would also foreclose changes helpful to consumers, Bennett said. Yoo pointed to content delivery networks that cache content apart from the public Internet as an “ingenious architecture” that wouldn’t be viable if subjected to the same neutrality scrutiny as the Internet. Case-by-case antitrust review can address many problems better than neutrality mandates can, he said. Asked by a professor whether control over certain network functions could be pushed to “higher layers” as a way to prevent discrimination, Farber said “getting a QoS that actually works” has remained elusive -- “best-effort” quality of service is an oxymoron.
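The caching architecture Yoo praised reduces, in essence, to serving popular objects from a replica near the user rather than hauling every request across the public Internet to the origin. A toy Python sketch, with hypothetical names standing in for any real CDN’s machinery:

```python
# Toy edge cache: the first request for an object travels to the origin;
# later requests are served locally, off the public Internet path.
ORIGIN = {"/video.mp4": b"...bytes from the distant origin server..."}
edge_cache: dict[str, bytes] = {}


def fetch(path: str) -> bytes:
    """Return content from the edge if cached, else fetch and cache it."""
    if path not in edge_cache:    # cache miss: one trip to the origin
        edge_cache[path] = ORIGIN[path]
    return edge_cache[path]       # cache hit: served from the edge


fetch("/video.mp4")  # first request populates the edge replica
fetch("/video.mp4")  # subsequent requests never leave the edge
```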
A public-relations executive in the audience faulted speakers for a “patronizing view” of neutrality backers that wouldn’t sway public opinion. Bennett agreed he was openly “dismissive” of Larry Lessig, the Stanford professor whose free-culture movement has heavily influenced neutrality supporters. Yoo said the harsh rhetoric stemmed from skeptics’ own marginalization at the hands of neutrality backers, adding that there’s a better balance now. “It’s clear” that Internet access pricing models will become more complex, and “we shouldn’t retard those things,” he said.