Blown To Bits

The Crisis in Internet Measurement

Sunday, June 15th, 2008 by Hal Abelson

Two dozen leading Internet researchers met last week at a workshop hosted by Google in Mountain View to discuss the growing crisis in network measurement.

The crisis is this: despite the critical importance of the Internet infrastructure, no one really knows how well the Net is working and how well it will hold up under increasing use. The measurement data that would help network researchers analyze network performance isn’t being collected; perhaps it can’t be.

As a consumer, you can tell when your cable TV channels are on the fritz, or when your cell phone reception is poor. But when your Web response is sluggish or you can’t connect to a favorite site, you’re pretty much in the dark. It might be that the site you’re trying to connect to is down or overloaded, but it might also be a loose connection in your house, or a problem with your computer’s settings, or a program bug.
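One way a curious user might narrow things down, sketched here in Python with placeholder addresses, is to time connections to hosts at increasing distance: the home router, a well-known public server, and the site itself. If the router answers quickly but everything beyond it is slow, the problem probably isn’t in the house.

    # Rough connection triage: time TCP connections to hosts at
    # increasing "distance" to guess where a slowdown lies.
    # The addresses below are placeholders, not recommendations.
    import socket
    import time

    PROBES = [
        ("192.168.1.1", 80),      # assumed home-router address
        ("8.8.8.8", 53),          # a well-known public DNS resolver
        ("www.example.com", 80),  # stand-in for the site that seems slow
    ]

    def connect_time(host, port, timeout=3.0):
        """Return seconds to open a TCP connection, or None on failure."""
        start = time.time()
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return time.time() - start
        except OSError:
            return None

    for host, port in PROBES:
        t = connect_time(host, port)
        label = "%s:%d" % (host, port)
        print("%-22s %s" % (label,
              "unreachable" if t is None else "%.0f ms" % (t * 1000)))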

It might also be that your Internet service provider is intentionally slowing down your connection for certain applications, a practice known as traffic shaping or, more pejoratively, as data discrimination. University network services and some other ISPs often do this to throttle peer-to-peer sharing of music and video files. Or an ISP might actively disrupt the flow of packets by injecting spurious reset commands into TCP streams, one of the techniques used in the “Great Firewall of China” to block connections to politically undesirable sites. And not only China. In 2007 rumors circulated around the Net that Comcast was actively interfering with peer-to-peer file-sharing traffic. Comcast denied it, but measurements performed last October by the Associated Press revealed that Comcast was in fact disrupting those connections.
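Detecting this kind of interference is itself a measurement problem. As a rough illustration only, the following Python sketch (using the scapy packet library, and assuming capture privileges) logs every TCP reset seen on the wire; actually distinguishing an injected reset from a legitimate one is a much harder job, requiring comparison of TTLs, sequence numbers, and timing against the rest of the connection.

    # Sketch of reset monitoring with the scapy packet library
    # (assumes scapy is installed and the script has capture rights).
    # Logging resets is only a first step: telling an injected RST
    # from a legitimate one means comparing TTLs, sequence numbers,
    # and timing against the rest of the connection.
    from scapy.all import IP, TCP, sniff

    def log_reset(pkt):
        ip, tcp = pkt[IP], pkt[TCP]
        print("RST %s:%d -> %s:%d  ttl=%d seq=%d" %
              (ip.src, tcp.sport, ip.dst, tcp.dport, ip.ttl, tcp.seq))

    # BPF filter: capture only TCP segments with the RST flag set.
    sniff(filter="tcp[tcpflags] & tcp-rst != 0", prn=log_reset, store=False)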

For the average person, there’s almost no information available on which to base informed decisions about the service provided by an ISP. The factors affecting performance are too complex, and the data is simply unavailable.

And the situation isn’t much better for top network experts. Even for the Internet’s core, there are no good public sources of performance data; indeed, measurement data is often kept secret, since it’s considered to be valuable proprietary information by service providers. Researchers simply don’t know, for example, which segments of the Internet are particularly vulnerable to congestion, or what fraction of Internet traffic is due to viruses and spam.

The experts in Mountain View last week, many of whom conduct their own measurement experiments, discussed ways of sharing data and methods for getting better results. They also considered creating tools that non-experts could use to get information on the performance of Internet connections. The Electronic Frontier Foundation provides links to some tools at http://www.eff.org/testyourisp, but these are limited and hard to use, and the results are difficult to interpret.
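Even a crude self-test shows what such a tool might do. Here is a minimal sketch in Python, with a hypothetical test URL, that times a single HTTP download and reports the effective throughput; a real tool would need multiple servers, repeated trials, and careful interpretation.

    # Bare-bones throughput probe: time one fixed-size HTTP download
    # and report the effective rate. The URL is hypothetical; a real
    # test needs several servers and repeated trials, since a single
    # transfer says little about the path as a whole.
    import time
    import urllib.request

    TEST_URL = "http://speedtest.example.net/10MB.bin"  # placeholder

    start = time.time()
    with urllib.request.urlopen(TEST_URL, timeout=30) as resp:
        nbytes = len(resp.read())
    elapsed = time.time() - start

    print("%d bytes in %.1f s = %.2f Mbit/s" %
          (nbytes, elapsed, nbytes * 8 / elapsed / 1e6))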

There were ideas for improving testing, but privacy is a real quandary: effective testing requires combining measurements from multiple sources. A public database of detailed network performance measurements would be a boon to research, but the same database could be mined for details about who was using the Internet when, and for what. The dilemma is like the privacy tradeoffs for epidemiological studies, between the needs of public-health experts and the desire to preserve privacy of individual medical records.
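One partial remedy, sketched below under the assumption that the data holder keeps a secret key, is to pseudonymize addresses with a keyed hash before sharing. The sketch’s comments note why this falls short of real anonymity.

    # One common compromise: replace each address in published data
    # with a keyed hash (HMAC), so records from the same host can be
    # correlated but the address can't be recovered without the key.
    # This is weaker than it looks: the IPv4 space is small enough to
    # brute-force, and traffic patterns alone can re-identify users,
    # which is exactly the dilemma described above.
    import hashlib
    import hmac

    SECRET_KEY = b"replace-with-a-long-random-key"  # held by the data owner

    def pseudonymize(ip_address):
        """Map an address to a stable token, unlinkable without the key."""
        digest = hmac.new(SECRET_KEY, ip_address.encode(), hashlib.sha256)
        return digest.hexdigest()[:16]

    print(pseudonymize("203.0.113.42"))  # same input always yields the same token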

For such critical infrastructure as the Internet, the ignorance of consumers and experts alike is troubling and potentially dangerous. In the words of K Claffy of the Cooperative Association for Internet Data Analysis (CAIDA):

While the core of the Internet continues its relentless evolution, scientific measurement and modeling of its systemic characteristics has largely stalled. What little measurement is occurring reveals some disturbing realities about the ability of the Internet’s architecture to serve society’s needs and expectations.

One Response to “The Crisis in Internet Measurement”

  1. Gary Matthews Says:

    A tip sheet for choosing an ISP: http://www.networkoptimizationnews.com/top11things.html