Blown To Bits

Archive for the ‘The Internet and the Web’ Category

The Crisis in Internet Measurement

Sunday, June 15th, 2008 by Hal Abelson

Two dozen leading Internet researchers met last week at a workshop hosted by Google in Mountain View to discuss the growing crisis in network measurement.

The crisis is this: despite the critical importance of the Internet infrastructure, no one really knows how well the Net is working and how well it will hold up under increasing use. The measurement data that would help network researchers analyze network performance isn’t being collected; perhaps it can’t be.

As a consumer, you can tell when your cable TV channels are on the fritz, or when your cell phone reception is poor. But when your Web response is sluggish or you can’t connect to a favorite site, you’re pretty much in the dark. It might be that the site you’re trying to connect to is down or overloaded, but it might also be a loose connection in your house, or a problem with your computer’s settings, or a program bug.

It might also be that your Internet service provider is intentionally slowing down your connection for various applications, a practice known as traffic shaping or, more pejoratively, as data discrimination. University network services and some other ISPs often do this to slow down response on peer-to-peer sharing for music or video files. Or an ISP might actively disrupt the flow of packets by injecting spurious reset commands into TCP streams, one of the techniques used in the “Great Firewall of China” to block connections to politically undesirable sites. And not only China. In 2007 rumors circulated around the Net that Comcast was actively interfering with peer-to-peer file-sharing traffic. Comcast denied it, but measurements performed last October by the Associated Press revealed that Comcast was in fact disrupting those connections.
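One way to look for this kind of interference is to watch for TCP segments with the reset (RST) flag arriving in the middle of otherwise healthy transfers. Here is a minimal sketch in Python, using the third-party scapy library, that simply counts the RST segments seen on the local interface; the packet count is an arbitrary assumption, sniffing requires administrator privileges, and a real test would also need to determine whether either endpoint actually sent the resets.

# Minimal sketch: count TCP reset (RST) segments on the local interface.
# Requires the third-party scapy library and administrator privileges.
# The packet count below is an arbitrary illustrative choice.
from scapy.all import sniff, TCP

def count_resets(packet_count=1000):
    resets = 0
    def check(pkt):
        nonlocal resets
        if pkt.haslayer(TCP) and pkt[TCP].flags & 0x04:  # 0x04 is the RST bit
            resets += 1
    sniff(prn=check, count=packet_count, store=False)
    return resets

if __name__ == "__main__":
    print("TCP resets seen:", count_resets())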

For the average person, there’s almost no information available on which to base informed decisions about the service provided by an ISP. The factors affecting performance are too complex, and the data is simply unavailable.

And the situation isn’t much better for top network experts. Even for the Internet’s core, there are no good public sources of performance data; indeed, measurement data is often kept secret, since it’s considered to be valuable proprietary information by service providers. Researchers simply don’t know, for example, which segments of the Internet are particularly vulnerable to congestion, or what fraction of Internet traffic is due to viruses and spam.

The experts in Mountain View last week, many of whom conduct their own measurement experiments, discussed ways of sharing data and methods for getting better results. They also considered creating tools that non-experts could use to get information on the performance of Internet connections. The Electronic Frontier Foundation provides links to some tools at
http://www.eff.org/testyourisp, but these are limited and hard to use, and the results are difficult to interpret.
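As a rough illustration of the kind of measurement such a tool might automate, the Python sketch below times how long a plain TCP connection takes to a few well-known web servers. The host list, port, and timeout are illustrative assumptions, and connection time is only one crude indicator of how well a link is performing.

# Minimal sketch: measure TCP connection time to a few hosts.
# The hosts, port, and timeout are illustrative assumptions.
import socket
import time

HOSTS = ["www.google.com", "www.nytimes.com", "www.eff.org"]

def connect_time(host, port=80, timeout=5.0):
    """Return seconds to open a TCP connection, or None on failure."""
    start = time.time()
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        return time.time() - start
    except OSError:
        return None

if __name__ == "__main__":
    for host in HOSTS:
        t = connect_time(host)
        print(host, "unreachable" if t is None else "%.1f ms" % (t * 1000))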

There were ideas for improving testing, but privacy is a real quandary: effective testing requires combining measurements from multiple sources. A public database of detailed network performance measurements would be a boon to research, but the same database could be mined for details about who was using the Internet when, and for what. The dilemma is like the privacy tradeoffs for epidemiological studies, between the needs of public-health experts and the desire to preserve privacy of individual medical records.

For such critical infrastructure as the Internet, the ignorance of consumers and experts alike is troubling and potentially dangerous. In the words of K Claffy of the Cooperative Association for Internet Data Analysis (CAIDA):

While the core of the Internet continues its relentless evolution, scientific measurement and modeling of its systemic characteristics has largely stalled. What little measurement is occurring reveals some disturbing realities about the ability of the Internet’s architecture to serve society’s needs and expectations.

End of the Internet?

Sunday, June 8th, 2008 by Harry Lewis

This site claims to have inside information from Internet Service Providers (ISPs) that they are planning to go to the Cable TV model for the Web — basic service buys access to a list of web sites they stipulate, but if you want to wander off that territory, you’d have to pay them extra. $19.95 to get to Google, say, but $29.95 if you want to access the New York Times web site (for which the New York Times itself charges nothing). That would be the end of the Internet as we know it, where anyone can put information up on a site and anyone else, for nothing more than their base ISP connection fee, can go see it.

“Net neutrality” is an important basic principle. I like to think of Internet connectivity as the US thought about rural electrification in the last century — something that might not be cost-efficient for private providers in the short run, but would yield enormous social and economic benefits to the US in the long run. If this report is true, imagine a world in which the electric company might supply you with electricity so you could run stoves and refrigerators on its approved list, but would charge you extra if you plugged in an appliance not approved by the electric company itself.

This is a complicated topic, but the fundamental problem is that there are not enough competing suppliers of Internet services. A quarter of the US still has only dialup; half has two suppliers, usually cable and telephone DSL; and a quarter has only one. The percentage of US households that have more than two choices for broadband connectivity is negligible. Under such conditions, the suppliers can contemplate tiered pricing schemes, which make absolutely no sense in terms of resources required — it costs no more to deliver packets from a billion different sources than from only one.

Keeping the Internet Open, Innovative, and Free

Sunday, June 8th, 2008 by Hal Abelson

On June 4, the Center for Democracy and Technology published The Internet in Transition: A Platform to Keep the Internet Open, Innovative and Free. This 25-page report summarizes CDT’s recommendations on Internet policy for the next Administration and Congress.

Readers of Blown to Bits will find the issues here familiar: preserving free speech while protecting children online; strengthening consumer privacy and restoring protections against government surveillance; using the power of the Internet to promote freedom and democracy on a global scale; protecting innovation by resisting attempts to undercut the Internet’s open architecture; and capitalizing upon the Internet as a force to encourage open government.

In the words of the report:

In recent years, policymakers seem to have forgotten what makes the Internet special. Increasingly, policy proposals treat the Internet as a problem to be solved rather than a valuable resource that must be supported. Debates over objectionable content online, protecting intellectual property, preventing terrorism, or restructuring telecommunications policy seem to have lost sight of the Internet’s history and its architecture.

This version of the report is a first draft. CDT has launched a web site for readers to comment and suggest additional policy initiatives for incorporation into later versions of the report.

There are many detailed proposals and links to other CDT policy reviews. This is a great reference on Internet policy, and well worth reading and commenting on, regardless of where you stand on the issues.

The site is at http://www.cdt.org/election2008/ and the report itself is available at http://cdt.org/election2008/election2008.pdf.

Entwistle’s alias

Monday, June 2nd, 2008 by Harry Lewis

An alias is literally just ‘another’ — another name someone uses, or another identity. An alibi (alias ubi) is ‘another place’ where a suspect in a criminal case claims he was at the time the crime was committed.

The term ‘alias’ has been adopted into tech talk to describe what happens when information is lost in the course of capturing it as bits. When you see the pixellation of a low-resolution image, or the staircase effect on what is supposed to be a straight, smooth line, you are seeing an aliasing phenomenon. The staircase is as close to a straight line as can be drawn using only a few pixels, but if what you were depicting really was a staircase, you’d get exactly the same representation. Different realities, when reduced to bits, wind up as the same representation, and there is no way to know from those bits alone which reality they came from.

Information is always discarded when anything continuous is represented as bits. The question is not whether such data loss happens, but whether it matters. And whether it matters depends on how the representation is going to be used. The author photo on this site is a good representation of us, but not if you wanted to recognize us from behind. In a digital audio file, it may not matter if very high frequencies are discarded, since most people over the age of 20 couldn’t hear them anyway.
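The same loss shows up in sampled signals, and a tiny numerical example makes the point. In the Python sketch below (using NumPy; the frequencies and sample rate are arbitrary illustrative choices), two different sine waves become exactly the same bits once they are sampled too coarsely — given only the samples, there is no way to tell which tone produced them.

# Minimal sketch of aliasing: two different signals, identical samples.
# The sample rate and frequencies are arbitrary illustrative choices.
import numpy as np

sample_rate = 8                       # samples per second
t = np.arange(16) / sample_rate       # two seconds of sample times

low = np.sin(2 * np.pi * 1 * t)       # a 1 Hz tone
high = np.sin(2 * np.pi * 9 * t)      # a 9 Hz tone, too fast for this rate

# Reduced to these samples, the two realities are indistinguishable.
print(np.allclose(low, high))         # prints True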

What does this have to do with Mr. Entwistle, who is standing trial on charges of murdering his wife and child? We noted earlier that his computer gave up some bits that the prosecution planned to use against him: the URLs of some adult-oriented web sites he had visited. Apparently the prosecution will argue that these bits are relevant because the URLs gave a glimpse of Mr. Entwistle’s sexual dissatisfaction, thus helping establish a motive for the murder. Not so fast: the defense doesn’t deny that those sites were visited, but offers another interpretation of the same bits. As the Boston Herald explains,

Attorney Elliot Weinstein argued turning to steamy online porn sites is not necessarily an indication of a joyless sex life; it could also mean a couple was looking to spice up their marriage.

“It might improve sexual activity . . . it might be a curiosity,” Weinstein said during the final pretrial arguments in Middlesex Superior Court in Woburn.

Searching for porn may just be for “interest,” or “excitement” or to “expand knowledge,” Weinstein added in his appeal to strike any online sex surfing as evidence of prior “bad acts.”

The judge will decide whether these bits are relevant, and if they are, the jury will get to decide whose interpretation of them is more plausible. But the defense’s basic point is sound: decontextualized bits can represent more than one reality, and our digital fingerprints, while revealing, are an imperfect representation of who we really are.

More on the Lori Drew case

Tuesday, May 20th, 2008 by Harry Lewis

I wrote a few days ago about the overreaching federal prosecution in this sad case. Blogger Susan Crawford has a good explanation today of just how great the stretch is, and how far the same principle could be taken by ambitious prosecutors to criminalize speech acts never meant to be prohibited by any existing law.

Blogs Are Great, but Is Anyone Reading Them?

Sunday, April 20th, 2008 by Harry Lewis

The New York Times reports this morning, in “When the Ex Blogs, the Dirtiest Laundry Is Aired,” that divorced people are using their personal blogs to let the world know what creeps their former spouses are.

There is nothing really surprising about this. For years people have been worried about the mean, nasty stuff young people say about each other on Facebook, on MySpace, and on blogs. Adults are just catching up to youth culture. It’s also true that teenagers were walking around with MP3 players and earbuds a few years before middle-aged men with briefcases were doing it. One of the women quoted isn’t worried about the impact on her children for exactly that reason. As the Times reports, “It is a generational issue …. We think it will be a big deal, but it won’t be to them. By the time they are old enough to read it, they will have spent their entire life online. It will be like, ‘Oh yeah, I expected that.’ ”

Yet I find the article interesting in several ways, beyond the head-shaking instinct. Why is it apparently mostly women doing this? Is it really a healthy form of catharsis, as a number of those posting comments have suggested?

But perhaps most surprising is the statement that 10% of adult Internet users have created their own blogs. I tracked down that number, and it is understated: the actual percentage, from this table, is 12%. Is that level sustainable? The same report says that only 39% of adult Internet users read other people’s blogs! One imagines a strange world in which millions of people are writing blogs about intimate personal matters, and almost no one is reading most of them.