Blown To Bits

The Crisis in Internet Measurement

Sunday, June 15th, 2008 by Hal Abelson

Two dozen leading Internet researchers met last week at a workshop hosted by Google in Mountain View to discuss the growing crisis in network measurement.

The crisis is this: despite the critical importance of the Internet infrastructure, no one really knows how well the Net is working and how well it will hold up under increasing use. The measurement data that would help network researchers analyze network performance isn’t being collected; perhaps it can’t be.

As a consumer, you can tell when your cable TV channels are on the fritz, or when your cell phone reception is poor. But when your Web response is sluggish or you can’t connect to a favorite site, you’re pretty much in the dark. It might be that the site you’re trying to connect to is down or overloaded, but it might also be a loose connection in your house, or a problem with your computer’s settings, or a program bug.
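
To make the guesswork concrete, here is a small Python sketch (the host name and timeout are placeholders, and this is an illustration rather than any real diagnostic tool) that separates just three of those failure modes: the name won’t resolve, the connection won’t open, or the connection opens but slowly.

    import socket
    import time

    HOST = "example.com"   # hypothetical site to test
    PORT = 80
    TIMEOUT = 5.0          # seconds; an arbitrary threshold

    def diagnose(host: str, port: int) -> None:
        # Step 1: can we resolve the name at all? Failure here points
        # at DNS or local settings, not at the remote site.
        try:
            addr = socket.gethostbyname(host)
        except socket.gaierror:
            print(f"DNS lookup failed for {host}: check resolver settings")
            return

        # Step 2: can we open a TCP connection? Failure here suggests
        # the site is down, unreachable, or blocked along the path.
        start = time.monotonic()
        try:
            with socket.create_connection((addr, port), timeout=TIMEOUT):
                elapsed = time.monotonic() - start
        except OSError:
            print(f"Could not connect to {addr}:{port}: host down or path blocked")
            return

        # Step 3: a slow but successful connection hints at congestion
        # or an overloaded server rather than an outright failure.
        print(f"Connected to {addr}:{port} in {elapsed * 1000:.1f} ms")

    diagnose(HOST, PORT)

Even this toy example shows why the average user is in the dark: each step can fail for reasons entirely outside the user’s view.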

It might also be that your Internet service provider is intentionally slowing down your connection for particular applications, a practice known as traffic shaping or, more pejoratively, as data discrimination. University network services and some other ISPs often do this to throttle peer-to-peer sharing of music and video files. Or an ISP might actively disrupt the flow of packets by injecting spurious reset commands into TCP streams, one of the techniques used in the “Great Firewall of China” to block connections to politically undesirable sites. And not only in China. In 2007, rumors circulated around the Net that Comcast was actively interfering with peer-to-peer file-sharing traffic. Comcast denied it, but measurements performed last October by the Associated Press revealed that Comcast was in fact disrupting those connections.
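
The AP’s actual methodology was more involved, but the telltale signal is simple to state: reset (RST) segments arriving mid-transfer that neither endpoint sent. A rough sketch of a reset counter, written here with the third-party scapy packet library purely as an illustration (the packet count is arbitrary, and this is a detection heuristic, not the AP’s test):

    from scapy.all import sniff, IP, TCP

    def note_reset(pkt):
        # The RST flag is bit 0x04 of the TCP flags field. A surge of
        # resets on otherwise healthy connections is the fingerprint
        # of the injection technique described above.
        if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt[TCP].flags & 0x04:
            print(f"RST {pkt[IP].src}:{pkt[TCP].sport} -> "
                  f"{pkt[IP].dst}:{pkt[TCP].dport}")

    # Requires root privileges to sniff; inspects 200 TCP segments.
    sniff(filter="tcp", prn=note_reset, count=200)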

For the average person, there’s almost no information available on which to base informed decisions about the service provided by an ISP. The factors affecting performance are too complex, and the data is simply unavailable.

And the situation isn’t much better for top network experts. Even for the Internet’s core, there are no good public sources of performance data; indeed, measurement data is often kept secret, since it’s considered to be valuable proprietary information by service providers. Researchers simply don’t know, for example, which segments of the Internet are particularly vulnerable to congestion, or what fraction of Internet traffic is due to viruses and spam.

The experts in Mountain View last week, many of whom conduct their own measurement experiments, discussed ways of sharing data and methods for getting better results. They also considered creating tools that non-experts could use to get information on the performance of Internet connections. The Electronic Frontier Foundation provides links to some tools at http://www.eff.org/testyourisp, but these are limited and hard to use, and the results are difficult to interpret.
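
At their core, such tools do something quite simple. Here is a bare-bones Python sketch (the test URL is a placeholder, and a single fetch proves little by itself) that times one download and reports latency and rough throughput:

    import time
    import urllib.request

    TEST_URL = "http://example.com/"   # hypothetical test object

    start = time.monotonic()
    with urllib.request.urlopen(TEST_URL, timeout=10) as resp:
        # urlopen returns once headers arrive, so this approximates
        # time-to-first-byte.
        first_byte = time.monotonic() - start
        body = resp.read()
    total = time.monotonic() - start

    print(f"time to first byte: {first_byte * 1000:.0f} ms")
    print(f"downloaded {len(body)} bytes in {total:.2f} s "
          f"({8 * len(body) / total / 1e6:.2f} Mbit/s)")

Real measurement tools repeat this against many servers and at different times of day, which is exactly where the interpretation gets hard for non-experts.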

There were ideas for improving testing, but privacy is a real quandary: effective testing requires combining measurements from multiple sources. A public database of detailed network performance measurements would be a boon to research, but the same database could be mined for details about who was using the Internet when, and for what. The dilemma parallels the privacy tradeoff in epidemiological studies, between the needs of public-health experts and the desire to preserve the privacy of individual medical records.
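
One compromise that comes up in discussions like this, sketched below only as an illustration and not as a description of any proposed database, is to coarsen records before sharing them: truncate addresses to a network prefix and round timestamps, trading research precision for privacy.

    import ipaddress

    def anonymize(record: dict) -> dict:
        # Truncate the IPv4 address to its /24 prefix and round the
        # timestamp to the hour; the measurement itself is kept. The
        # record fields here are hypothetical.
        net = ipaddress.ip_network(record["src_ip"] + "/24", strict=False)
        return {
            "src_prefix": str(net),   # e.g. "192.0.2.0/24"
            "hour": record["timestamp"] - record["timestamp"] % 3600,
            "rtt_ms": record["rtt_ms"],
        }

    sample = {"src_ip": "192.0.2.57", "timestamp": 1213500000, "rtt_ms": 42.5}
    print(anonymize(sample))

The coarser the records, the safer they are to publish and the less useful they become to researchers, which is the epidemiology tradeoff in miniature.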

For such critical infrastructure as the Internet, the ignorance of consumers and experts alike is troubling and potentially dangerous. In the words of K Claffy of the Cooperative Association for Internet Data Analysis (CAIDA):

While the core of the Internet continues its relentless evolution, scientific measurement and modeling of its systemic characteristics has largely stalled. What little measurement is occurring reveals some disturbing realities about the ability of the Internet’s architecture to serve society’s needs and expectations.
