Viewsdesk - chasing the global public sphere

December 2004

Digital Culture | 02:54, December 27, 2004

In a daring move, the anonymous creators of this flash movie follow in the footsteps of writer Jon Katz and his 1994 prediction that newspapers would be dead within ten years. We’re presented with a possible vision of the future ahead of us, where the news organizations have perished; where everyone contributes to the creation of news; where the internet contains all the information ever known to mankind; and where the New York Times has gone offline.

What would such a world be like? Dare I say, it’s not all utopia? Holiday fun for the whole family.

Digital Culture and Technologies | 12:47, December 23, 2004

I’ve previously discussed, with Mark for example, that internet connectivity is so important that it should be treated as a public utility rather than just a matter for private interests. Apparently, businesses disagree with me. (Why am I not surprised?)

Not only did we recently lose an important battle in Philadelphia, where Verizon Communications made sure the state would not subsidize internet access, but the legislation has now been converted into a fill-in-the-blanks DIY kit, complete with dotted lines for the state name.

The legislation, dreamed up by the American Legislative Exchange Council (ALEC), is called the Municipal Telecommunications Private Industry Safeguards Act (word file).

Read more, and be sure to check out the list of representatives on the Private Enterprise Board of ALEC; it might give you a hint of where this is coming from.

Internet Governance | 19:41, December 15, 2004

News stories on Internet Governance are few and very far between. After I wrote a short posting the other week about the first meeting of the UN Working Group on Internet Governance, I’ve kept my eyes open for any follow-ups in the mainstream media. At least to my knowledge, there have been none.

Rebecca MacKinnon writes an excellent post on this black hole, and reminds readers that another important meeting, part two of WSIS, will be held in Tunis less than a year from now. Will it be the bloggers who cover it, or will mainstream media be there too?

The question is why these matters have such low priority in the news output. The problem, I believe, is that they are of only academic interest to us in the West. At some level, we can rely on our societies to provide us with access to the Internet and guarantee our freedom of speech – so we sleep well at night knowing that someone is looking out for us. This is of course not the case globally.

Internet Governance | 02:12, December 10, 2004

The Berkman Center at Harvard Law School is hosting a conference called Internet + Society 2004: Votes, Bits and Bytes between the 9th and 11th of December. (I know I’m a day late posting this – but is the glass half-empty or half-full?) The conference will take a sceptical look at the Internet’s effect on politics and how it might influence society.

Amongst other things, on Friday morning there is a panel discussion on citizenship that looks particularly interesting, and will deal with issues around technologies’ impact on participatory democracy and global citizenship. One of the panellists will be Hossein Derakhshan (aka Hoder), an Iranian blogger.

Other areas that will be discussed in the coming two days are business and – of course – what impact the Internet had on the 2004 US presidential election.

And no, I’m afraid I’m not there either – but I will surely follow the conference on their webcast. …Unless somebody’s got a plane ticket to spare, that is?

Internet Governance and Technologies | 03:09, December 7, 2004

In several articles I’ve briefly commented on the fact that there is a problem – from a democratic point of view – with how Internet access is being sold to consumers today. The technology used does not encourage what I believe is the fundamental uniqueness of computer-mediated distribution. This article is an attempt to elaborate on the subject – hopefully so that even non-geeks can grasp at least some of it.

Is There a Problem With Technology?
A truism of network-building is that the most difficult part of offering high-speed internet is not building the backbone, but transporting the signals the last two kilometers from the provider into the homes of consumers.

Today, broadband Internet connectivity is sold to consumers in a few basic flavors: ADSL, cable and fiber. Fiber is rare, and globally the real competition is between ADSL and cable – which method is most common depends on the country you look at. In the US, for example, cable access is more common than ADSL, while in Sweden it’s the other way around.

ADSL, as the most common variant of the xDSL technologies is called, is in theory a superb way of hooking people up to the Internet. It allows telcos to connect their telephone exchanges to the backbone, and then use the existing copper wire to reach into people’s homes. This means no modifications to the building, nor to each and every apartment. It’s simply plug-and-play from the consumer’s point of view. (Today, ADSL also comes in two modified, and thus faster, versions: ADSL2 and ADSL2+. These will be treated synonymously with ADSL in this article.)

Cable is also cheap because the infrastructure is (mostly) already there. The medium used is the coax cables that ordinarily carry television broadcasts.

Fiber, on the other hand, is extremely fast and future-proof, but seldom a realistic possibility. Each apartment building needs to be directly hooked up with dedicated fiber cables, routers have to be installed in the basement, and ethernet cables installed in every apartment. As a result, very few households have fiber access in comparison to ADSL and cable.

Now to the design flaw: ADSL and cable are asymmetric, a fancy word for saying that the speed at which you can send data and the speed at which you can receive data are not the same. For example, I personally have a connection called 8/1 – in other words, I can receive at 8 megabits per second (mbps) and transmit at 1 mbps.

In Sweden, the country I’m for obvious reasons most acquainted with, the ISPs are frantically competing to offer high speeds at very low cost using ADSL technology. Last I checked, 24 mbps cost SEK 399/month (about US$60/month) from Bredbandsbolaget. So, what’s bad about that? Well, nothing – unless you consider that the connection they’re talking about is a 24/1 mbps service, offering only 1 mbps of upload.
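To make the asymmetry concrete, here is a back-of-the-envelope calculation, sketched in Python using the 24/1 figures above. (The file size is a made-up example; real-world throughput would be somewhat lower due to protocol overhead and line quality.)

```python
# Rough transfer times on an asymmetric 24/1 mbps ADSL line.
# Illustrative only: actual speeds are lower in practice.

FILE_MB = 700  # e.g. a CD-sized video file (hypothetical example)

def transfer_seconds(size_mb: float, speed_mbps: float) -> float:
    """Seconds needed to move size_mb megabytes at speed_mbps megabits/s."""
    return size_mb * 8 / speed_mbps

down = transfer_seconds(FILE_MB, 24)  # receiving at 24 mbps
up = transfer_seconds(FILE_MB, 1)     # sending at 1 mbps

print(f"download: {down / 60:.0f} min")  # about 4 minutes
print(f"upload:   {up / 3600:.1f} h")    # about 1.6 hours
```

The same file that arrives in minutes takes hours to send back out – a factor of 24, exactly the ratio of the two line speeds.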

Why Is Up-Speed Important?
What separates the Internet from other media is that the audiences actively participate as senders – they are not just passive recipients. Radio, television and print media have all been ways for consumers to receive, but there was no way to transmit or respond to the messages. Networking changed all that, and I dare say this alone is the reason the Internet is so extremely popular. All the “killer apps” we’ve seen exist because of a feedback channel: e-mail, Internet banking – even simple web browsing is interactive. But what also changed was that everyone could become a sender in a more revolutionary way – setting up homepages and transmitting material in a way that resembled newspapers more than anything else. (Remember the homepage craze in 1995? Everyone had one, though nobody admits it today.)

Without the decentralized structure of the Internet – that anybody could publish content anywhere – there’d be no Internet. Not by definition, not in practice. The closed computer networks that attracted consumers 15 years ago (most notably America Online, CompuServe and Prodigy) died or were forced to become open systems within about five seconds of the Internet gaining momentum. There was simply no way a single company could envision or create such a diversity of services as the Internet offers. (Some wise guy will comment that it had to do with the availability of pornography – but even so, that does not contradict my theory. It still had to do with people being able to broadcast material, whether it’s pictures of last year’s Christmas celebration with the family or pictures of people with absolutely no costumes on.)

So, I argue that the reason for the Internet’s success is that it allowed people to express themselves. And for that, an uplink channel is needed. A concrete example of what I mean is this very blog you are reading now. It’s hosted on a crappy second-hand computer I bought for SEK 1000 (about US$140), standing in a closet in my apartment. Using the Internet and its inherent channel to transmit information, I’ve transformed a closet into a broadcasting device with a (potentially) global audience.

My fear, however, is that the possibility of transmitting is being effectively removed by current standards, endorsed by the owners of the networks. Why, you ask? Because of a flawed technology, coupled with the ISPs’ greed to make money from content services.

Is There a Problem With the Companies?
For a capitalist, the ultimate goal is consumer addiction. This is a wet dream for them, because nothing is as good as a customer who returns again and again – without money having to be spent on persuading him to do so. Short of getting people addicted of their own free will, they can at least take every precaution that people don’t have a choice.

Thus, companies strive – in one way or another – toward creating virtual monopolies. They want an exclusive channel into your brain without the hassle of competition. For this reason, many media companies have merged in the last decades to create huge conglomerates that, for example, control the whole chain of production, distribution and advertising of a movie. It makes sense to them. Very much sense, actually. Not only do they produce the movie, but they can also make absolutely sure you see it, whether you want to or not.

So what on earth has this got to do with broadband Internet? Well, Time Warner owns Road Runner, one of the biggest providers in the US, for example. All Internet service providers, trying to maximize income, see huge possibilities in controlling content – not in setting it free.

The asymmetric model built into the technology fits the companies perfectly, since they want passive consumers who are in need of content. Even though there are alternative technologies, such as VDSL, capable of offering symmetric bandwidth, ISPs often choose not to pursue them, and where such offerings existed, they have been discontinued – at least in Sweden.

A recent Swedish example can illustrate my point. As the bandwidth available to the consumer increases, so do the opportunities to send services their way. IP telephony (or VoIP) has been around for a while, and for consumers it is a fantastic way to lower the cost of national and international telephony. Since they already have the bandwidth, they can just sign up with any service and make really cheap phone calls. Well, for customers of Bredbandsbolaget it’s not quite that easy. Since Bredbandsbolaget offers its own VoIP service and does not wish to have competition, it has simply blocked the use of other services – thus limiting the bits allowed to pass freely through the line the consumer is paying for.

The same pattern is repeated with the broadband television launched by Bredbandsbolaget in cooperation with Viasat – Sweden’s largest satellite-TV owner. At the cost of 5 Mbps of your bandwidth and a mere SEK 299/month, you get 27 TV channels blasted right into your TV. (What they fail to mention is that it’s just as expensive as their non-broadband package, where you also get to keep your 5 Mbps – so I can’t really see the point just yet.)

Instead of separating the infrastructure from the content, the two are becoming increasingly intertwined. The promise of broadband distribution is that you should be able to buy television, in the form of zeroes and ones, from just about anybody – not just have an old service repackaged digitally and sold to you again. What this leaves us with is the monopolies the cable companies created in the ’80s, effectively controlling what was broadcast to the homes. They owned the cables and operated the service. Where is the competition, and where did my free choice go?

The Future of the Network
The problem with this approach is that it’s counter-productive for the prosperity of the Internet as a growing community. Again, consider the lesson the early ’90s taught us: closed networks perish because they are not really networks of independent nodes, but controlled and centralized environments. I repeat what I said above: without the decentralized structure of the Internet – that anybody could publish content anywhere – there’d be no Internet. Not by definition, not in practice.

I believe that in order to satisfy increasing bandwidth demands, as services get more complex and more mission-critical, P2P (peer-to-peer) technology will become a vital part of the distribution chain. And I’m not talking about Napster or eDonkey, but about a transparent system that distributes content by letting the computers organize themselves, minimizing strain on the network and enabling lightning-fast transfers. The business models of copyright holders and content providers will have to adjust to such a system in order for the companies to survive (…but that’s another article).

Already today there are systems, based on P2P, that allow people to download media content released under less restrictive licenses. Torrentocracy, which I mentioned in an article just the other day, is one of them.

Even The Economist – usually a pretty business-conservative paper – admits (behind a subscription wall) that utilizing P2P technology is a natural development that would enable faster, better and more fault-tolerant services for consumers. The hub-and-spoke networks largely being built today are not ideal, they say.

The problem is that P2P will only work if the nodes in the network are symmetrically hooked up. If users consume (receive) faster than they produce (send), the scheme will fail, since P2P is mathematically a zero-sum game: by definition, every bit you download has to be uploaded by someone else.
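The arithmetic behind that zero-sum claim is easy to sketch. In a swarm with no external seed, every downloaded bit must be uploaded by some peer, so the best-case average download rate per peer equals the average upload rate. A minimal Python illustration, with made-up link speeds:

```python
# Steady-state throughput in a P2P swarm with no external seed:
# total download rate == total upload rate, so the average
# download rate per peer is capped by the average upload rate.

def avg_download_mbps(upload_caps_mbps: list[float]) -> float:
    """Best-case average download rate per peer, assuming peers
    exchange data only among themselves."""
    return sum(upload_caps_mbps) / len(upload_caps_mbps)

symmetric = [10.0] * 100   # 100 peers on symmetric 10/10 links
asymmetric = [1.0] * 100   # 100 peers on 24/1 ADSL links

print(avg_download_mbps(symmetric))   # 10.0 mbps each
print(avg_download_mbps(asymmetric))  # 1.0 mbps each
```

Note the second case: each peer could receive at 24 mbps, but because everyone can only send at 1 mbps, the swarm as a whole never delivers more than 1 mbps per peer. The extra download capacity is simply wasted.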

Effects for Democracy
To speak bluntly: most companies that sell Internet capacity to you today do not want you to distribute content – they’d much rather sell it to you themselves. They want you to be passive, because that gives them the opportunity to sell you things while keeping their tried-and-tested business model unchanged. Content owners need to thoroughly revise their business models for a digital age, where scarcity no longer determines the value of commodities. Bits are not atoms.

I fear that continuing on this path of turning the Internet into yet another place where you passively consume (televised) messages will indeed hurt the Internet’s inherently democratic effect of allowing its participants to have a voice. P2P could enable ordinary citizens to distribute their own content, much like what bloggers all over the world are doing with ideas right now. Participation is the cornerstone of a healthy democracy, and the mainstream media have failed to be inclusive enough – by any standard.

The question is, of course, whether governments should regulate Internet connectivity and treat it like a public commodity. Critics of such an idea usually claim that the Internet’s success is largely the story of entrepreneurs, and that it would therefore be wrong to regulate. However, one should also keep in mind that the Internet is the result of millions of dollars spent on research – funded by the US government. So maybe we should cut the governments some slack? Besides, nobody in their right mind is talking about expropriating the Internet, just some guidelines for what is acceptable behavior – like what the FCC in the US (and PTS in Sweden) does for telecommunications.

Somebody said that we often overestimate the progress that will be made in two years, but underestimate what will happen in ten. To me, that is both good and bad news given this scenario. For one, the technologies we see today are not what will still be here in ten years. Bandwidth demand will continue to increase, and I’d expect it to level out at about 1 Gbps per household (I’m not kidding). However, one should still be mindful of the things I’ve explored in this article. Structures and patterns of behavior are determined at an early stage; even if asymmetric technologies get replaced, we might have gotten used to just downloading the last 40 seasons of Friends from our old buddy the cable company.

(Thanks to Mark for a great discussion on the matter.)

Digital Culture | 23:58, December 4, 2004

The French newspaper Le Monde has not only launched reader blogs under its own brand, but has also allowed reader and journalist blogs to be presented at the same level, French blogger Loïc Le Meur reports. After two days of readers voting among the blogs to create a ranking, four reader blogs are in the top ten.

This kind of competition is not new within the blogosphere – it appears in more or less every blogroll on the planet, as well as on Technorati, for example – but to my knowledge this is the first time it’s been done under the brand of a traditional newspaper. And that is indeed important news.

It’s definitely a bold move by the paper, and I really hope the outcome is successful for all parties. It could very well be a way for the public sphere to become more inclusive, opening up for amateurs and professionals to work together toward a less hierarchical news production apparatus.

Digital Culture and Technologies | 14:27, December 1, 2004

The guy behind Torrentocracy, Gary Lerhaupt (blog), had an idea: what if you combined RSS, P2P and your television set? It would, he says, be a system that would let you, for the first time, enjoy the true power of television.

While space in traditional media is (by comparison) scarce and controlled by few, the promise of the new technology is that information and ideas will spread freely and that citizens will gain access to a plethora of opinions and voices. Besides being a technical platform, combining some hardware and Linux software, the project is also negotiating deals with content owners to release material to the public sphere under more permissive licences, such as Creative Commons.

The system enables you to select, download and watch material with your remote control. Of course you need a geek in the house to set the system up, but it sounds simple enough. To quote Gary himself: “If you ever wondered how and when your computer, the internet and your television would merge into one seemless device with access to anything and everything, then at this very moment […] Also Sprach Zarathustra should be resounding through your head.”
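The RSS half of such a system is simple enough to sketch: poll a feed, look for BitTorrent enclosures, and hand the URLs to a torrent client. A minimal illustration in Python – the feed below is a made-up example, not Torrentocracy’s actual format:

```python
# Sketch of the feed-scanning step in an RSS + P2P setup: find the
# torrent enclosures in an RSS 2.0 feed. The feed is a hypothetical
# inline example; a real system would fetch it over HTTP.
import xml.etree.ElementTree as ET

FEED = """<rss version="2.0"><channel>
  <item>
    <title>Debate, part 1</title>
    <enclosure url="http://example.org/debate1.torrent"
               type="application/x-bittorrent" length="12345"/>
  </item>
</channel></rss>"""

def torrent_urls(feed_xml: str) -> list[str]:
    """Return the URLs of all BitTorrent enclosures in an RSS feed."""
    root = ET.fromstring(feed_xml)
    return [enc.get("url")
            for enc in root.iter("enclosure")
            if enc.get("type") == "application/x-bittorrent"]

print(torrent_urls(FEED))  # ['http://example.org/debate1.torrent']
```

The interesting work, of course, happens after this step, when the BitTorrent client downloads the content and the television renders it – but the plumbing that makes the whole thing feel automatic is really just this kind of feed polling.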

While the system available today is clearly too complicated for the average user, I believe Gary has a point. P2P distribution technologies are a great way of getting content to the masses. (Provided our upload speed is not, in effect, taken away from us. More on that matter some other time. Watch this space.) Some material has already been released, most notably sequences from the documentary Outfoxed. Torrentocracy also carried the presidential debates, hopefully enabling a few more people to listen to the most powerful man on earth and his “opponent” battling it out.

The point is that even though the MPAA is whining about copyright infringement, and reports say that BitTorrent alone uses 35% of all available bandwidth on the planet, one should remember that there are legitimate uses for the technology.

So, would content owners really be interested in this? Well, what is clearly missing is a stream of money back to the studios or TV channels; without it, the bigger studios will not be interested. However, remember that Michael Moore actually encouraged people to download and spread his documentary Fahrenheit 9/11 online, to maximize the number of people who saw it before the American election. So, while it may not be the preferred system for distributing the latest Star Wars flick, it might well prove interesting for idealistic New Documentary directors, for whom money is not necessarily the greatest reward.