Blown To Bits

The Crisis in Internet Measurement

Sunday, June 15th, 2008 by Hal Abelson

Two dozen leading Internet researchers met last week at a workshop hosted by Google in Mountain View to discuss the growing crisis in network measurement.

The crisis is this: despite the critical importance of the Internet infrastructure, no one really knows how well the Net is working and how well it will hold up under increasing use. The measurement data that would help network researchers analyze network performance isn’t being collected; perhaps it can’t be.

As a consumer, you can tell when your cable TV channels are on the fritz, or when your cell phone reception is poor. But when your Web response is sluggish or you can’t connect to a favorite site, you’re pretty much in the dark. It might be that the site you’re trying to connect to is down or overloaded, but it might also be a loose connection in your house, or a problem with your computer’s settings, or a program bug.
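To make those layers concrete, here is a minimal diagnostic sketch in Python (an illustration, not a tool mentioned in this post; example.com is a placeholder hostname). It checks DNS resolution, then the TCP connection, then an HTTP fetch, so a failure at each stage points to a different culprit:

    import socket
    import urllib.request

    HOST = "example.com"  # placeholder; substitute the site you cannot reach

    # Stage 1: DNS lookup -- failure suggests resolver or local settings.
    try:
        addr = socket.gethostbyname(HOST)
        print(f"DNS ok: {HOST} -> {addr}")
    except socket.gaierror as err:
        raise SystemExit(f"DNS failed: {err} (check settings or ISP resolver)")

    # Stage 2: TCP connection -- failure suggests a network-path problem.
    try:
        socket.create_connection((addr, 80), timeout=5).close()
        print("TCP connect ok")
    except OSError as err:
        raise SystemExit(f"TCP failed: {err} (path to the host may be broken)")

    # Stage 3: HTTP request -- failure here points at the server itself.
    try:
        with urllib.request.urlopen(f"http://{HOST}/", timeout=10) as resp:
            print(f"HTTP ok: status {resp.status}")
    except Exception as err:
        print(f"HTTP failed: {err} (site may be down or overloaded)")

Even this much only localizes the failure for one site at one moment; a loose connection in the house, for instance, would fail stage 2 for every host, not just one.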

It might also be that your Internet service provider is intentionally slowing down your connection for various applications, a practice known as traffic shaping or, more pejoratively, as data discrimination. University network services and some other ISPs often do this to slow down response on peer-to-peer sharing for music or video files. Or an ISP might actively disrupt the flow of packets by injecting spurious reset commands into TCP streams, one of the techniques used in the “Great Firewall of China” to block connections to politically undesirable sites. And not only China. In 2007 rumors circulated around the Net that Comcast was actively interfering with peer-to-peer file-sharing traffic. Comcast denied it, but measurements performed last October by the Associated Press revealed that Comcast was in fact disrupting those connections.
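The reset trick works because a forged RST packet looks, to each endpoint, like the other side hanging up. One can watch for suspicious reset bursts with a packet sniffer; here is a rough sketch using the third-party scapy library (an assumed dependency, and packet capture generally requires administrator privileges). A burst of resets mid-transfer is a hint, not proof, since servers also send legitimate resets:

    # Requires the scapy package; run with root/administrator privileges.
    from scapy.all import IP, TCP, sniff

    def log_reset(pkt):
        # The RST flag is bit 0x04 of the TCP flags field.
        if pkt.haslayer(IP) and pkt.haslayer(TCP) and (pkt[TCP].flags & 0x04):
            print(f"RST {pkt[IP].src}:{pkt[TCP].sport} -> "
                  f"{pkt[IP].dst}:{pkt[TCP].dport}")

    # Watch the next 200 TCP packets; repeated resets in the middle of an
    # active transfer are the kind of signature such measurements look for.
    sniff(filter="tcp", prn=log_reset, count=200)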

For the average person, there’s almost no information available on which to base informed decisions about the service provided by an ISP. The factors affecting performance are too complex, and the data is simply unavailable.

And the situation isn’t much better for top network experts. Even for the Internet’s core, there are no good public sources of performance data; indeed, measurement data is often kept secret, since it’s considered to be valuable proprietary information by service providers. Researchers simply don’t know, for example, which segments of the Internet are particularly vulnerable to congestion, or what fraction of Internet traffic is due to viruses and spam.
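Collecting even crude longitudinal data is straightforward in principle; the hard part is doing it broadly and sharing the results. For illustration only (the target hosts below are placeholders), here is a sketch that logs TCP handshake times, whose growth or variance over the day is one rough symptom of congestion on a path:

    import csv
    import socket
    import time

    TARGETS = ["example.com", "example.net", "example.org"]  # placeholders

    def handshake_time(host, port=80, timeout=5):
        """Seconds to complete a TCP handshake, or None on failure."""
        start = time.monotonic()
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return time.monotonic() - start
        except OSError:
            return None

    # Append one timestamped sample per target; run periodically (e.g. from
    # cron) to build a record of how the paths behave over time.
    with open("latency_log.csv", "a", newline="") as log:
        writer = csv.writer(log)
        for host in TARGETS:
            t = handshake_time(host)
            writer.writerow([int(time.time()), host,
                             "fail" if t is None else f"{t:.4f}"])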

The experts in Mountain View last week, many of whom conduct their own measurement experiments, discussed ways of sharing data and methods for getting better results. They also considered creating tools that non-experts could use to get information on the performance of Internet connections. The Electronic Frontier Foundation provides links to some tools at http://www.eff.org/testyourisp, but these are limited and hard to use, and the results are difficult to interpret.
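The core of such a tool is simple enough to sketch: time the download of a sizable file and report the achieved rate. The URL below is a placeholder; a usable test needs a well-provisioned server so that the bottleneck being measured is the user’s access link rather than the far end:

    import time
    import urllib.request

    TEST_URL = "http://example.com/testfile.bin"  # placeholder test file

    start = time.monotonic()
    total = 0
    with urllib.request.urlopen(TEST_URL, timeout=30) as resp:
        # Read in chunks so a large file does not sit in memory at once.
        while chunk := resp.read(64 * 1024):
            total += len(chunk)
    elapsed = time.monotonic() - start

    print(f"{total} bytes in {elapsed:.2f} s = "
          f"{total * 8 / elapsed / 1e6:.2f} Mbit/s")

Even this toy shows why results are hard to interpret: the single number blends the server’s load, the route, and the access link.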

There were ideas for improving testing, but privacy is a real quandary: effective testing requires combining measurements from multiple sources. A public database of detailed network performance measurements would be a boon to research, but the same database could be mined for details about who was using the Internet when, and for what. The dilemma resembles the privacy tradeoff in epidemiological studies, between the needs of public-health researchers and the desire to preserve the privacy of individual medical records.
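One standard compromise, offered here only as an illustration rather than anything proposed at the workshop, is to pseudonymize addresses before measurements leave the collector: replace each IP with a keyed hash, so records from different sources can still be correlated without exposing the raw address. The key must stay secret, and even then traffic patterns can sometimes re-identify users, so this is a mitigation, not a solution:

    import hashlib
    import hmac

    SECRET_KEY = b"replace-with-a-long-random-key"  # held only by the collector

    def pseudonymize_ip(ip: str) -> str:
        """Map an address to a stable pseudonym for a shared dataset."""
        digest = hmac.new(SECRET_KEY, ip.encode(), hashlib.sha256).hexdigest()
        return digest[:16]  # truncated for readability

    # The same address always maps to the same pseudonym, so researchers can
    # still count flows per host; without the key, the whole IPv4 space
    # cannot simply be dictionary-hashed to reverse the mapping.
    print(pseudonymize_ip("192.0.2.1"))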

For such critical infrastructure as the Internet, the ignorance of consumers and experts alike is troubling and potentially dangerous. In the words of K Claffy of the Cooperative Association for Internet Data Analysis (CAIDA):

While the core of the Internet continues its relentless evolution, scientific measurement and modeling of its systemic characteristics has largely stalled. What little measurement is occurring reveals some disturbing realities about the ability of the Internet’s architecture to serve society’s needs and expectations.

One Response to “The Crisis in Internet Measurement”

  1. Gary Matthews Says:

    A tip sheet for choosing an ISP: http://www.networkoptimizationnews.com/top11things.html