Saturday, April 22, 2006 

Putting P2P in perspective...

Before the WWW, the internet operated essentially as an imperfect, decentralized P2P network: early university-based computers making direct connections between US campuses. Around 1990 the DNS/WWW paradigm began to be overlaid on top of this network. This enabled the browser and named paths to file servers hosting the ‘web page’ interface, and widespread adoption began. Web, e-mail and database servers grew into ‘server farms’ and eventually ‘server clouds’ as the system was scaled up to cope with exploding demand.
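
The name-to-address indirection that DNS layered over the raw network can be seen in miniature with a standard-library lookup. This is just an illustrative sketch in Python (not part of the original post); the resolver call is the real `socket` API, and `localhost` is used so it works without a network connection.

```python
import socket

def resolve(hostname: str) -> str:
    """Map a human-readable name to an IP address, as DNS does for the web."""
    return socket.gethostbyname(hostname)

# The same indirection that lets a browser find a web server by name:
print(resolve("localhost"))  # typically 127.0.0.1
```

It is this mapping of memorable names onto machine addresses that made the overlay navigable for ordinary users.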

However, as demand went through the roof, no amount of money could guarantee QoS; even the biggest systems on the planet, with thousands of interlinked servers, routinely failed. All the while, people grew used to the notion that the normal web user was, by default, a ‘receiver’ or buyer of information, not a seller or ‘provider’. Yet applications where users could interact directly with each other (Hotmail, ICQ, Messenger) and publish and sell to each other (blogging, eBay) became among the most popular on the web. As the interlinked server/client networks grew to unprecedented proportions, spam, e-mail-borne viruses, trojan horses, identity spoofing, ‘man-in-the-middle’ attacks, ‘phishing’ and identity theft became uncontrollable and began choking and attacking the very fabric of the web. By March 2004, 60% of all e-mail traffic was illegal, unsolicited e-mail from strangers (spam).

“I get a lot of spam, probably as much as anybody in the room.
My e-mail address is well-known… The Hotmail® servers that we run, which are the free and subscription e-mail capabilities we offer, today over half of what goes through there is actually mail that's spam that people are not interested in receiving.”

- Bill Gates (quote taken from the Microsoft website)

So Hotmail is effectively a giant spam machine, and even Bill Gates both suffers from the problem and cannot solve it. Products like Qurb, eSafe, MX Tunnel and MailFrontier Gateway add yet more layers of complexity and/or servers onto the internet to try to block spam, passing the problem along rather than dealing with its cause. The WWW client/server paradigm has achieved massive scale and overwhelming success, producing countless valuable services, but there are now notable flaws and vulnerabilities in the model. In the meantime, P2P file-sharing has become an unstoppable phenomenon, with unprecedented levels of adoption.

On these new systems, people experienced the constant availability of a truly massive amount of shared digital content, and tens of millions of them began illegally exchanging hundreds of millions of music, software and video files with total strangers. The web had become a giant, impersonal shopping mall littered with illegal activity, where users were largely disconnected and insulated from the impact one user can have on another. In this unregulated playground there is an intangible (though sometimes very real) cost to flouting copyright law. Some negative effects of file-sharing are as follows:

• Users run a risk of litigation, or of their service being closed down as a result of litigation.
• ‘Free riders’ download files but contribute none to the network.
• People with slow connections degrade the performance of file exchange.
• Networks are poisoned with corrupted files to discourage users.
• Customer churn (and thereby peer unavailability) can cause inconsistent service.
• Anonymous file exchange can deliver mislabelled files or viruses.
• Highly illegal material is mixed in with typical music and video file lists.

Already more than 12,000 file-sharers have been sued by the Recording Industry Association of America (RIAA) and the Motion Picture Association of America (MPAA). Some record companies have been accused of poisoning P2P networks with bogus files to discourage users, and in 2002 a company called OVERPEER released a product that floods P2P networks with fake files in an attempt to stop the trading of "unauthorized" mp3s.

Since Napster, peer-to-peer applications have evolved steadily and in diverse ways, with new forms appearing as hybrids of web and pure-network technologies. The DNS and client/server systems have given P2P applications easy access to the web’s 600+ million users, and thereby an unprecedented forum from which to usurp and deal in almost any kind of digital file, whether previously sold on the web or not. However, it has been the lawsuits targeting applications with server-based file lists, and the quest for user anonymity against the current wave of litigation aimed at individual users, that have been the main drivers of product innovation, pushing many new P2P systems back toward forms of decentralization. The illicit nature of user activity on these applications has in many ways interfered with their unfettered commercial growth, producing no clear market leader, an over-supply of vendors, and a fragmented (and to some degree stigmatized) user base.

At the same time, some of the most successful legal P2P variants, like Skype, and a new crowd of ‘legal’ music-sharing applications, like Weedshare, Bitmunk and PeerImpact, still use servers (and WWW infrastructure) for functions like coordinating and locating peers. BitTorrent and sites like Suprnova and LokiTorrent (which recently closed their service due to legal pressure) also require centralized, web-based ‘trackers’ (a form of server that coordinates the transfer of metadata across a BitTorrent network) to function effectively.
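
The tracker dependency mentioned above is easy to see in the protocol itself: a BitTorrent client announces itself to a tracker over plain HTTP before any peer-to-peer transfer happens. The sketch below (an illustration, not code from any real client; the tracker URL and identifiers are made up) composes such an announce request using the parameters defined in the original BitTorrent specification.

```python
import urllib.parse

def build_announce_url(tracker: str, info_hash: bytes, peer_id: bytes,
                       port: int, left: int) -> str:
    """Compose the HTTP GET request a BitTorrent client sends to a tracker
    to register itself and ask for a list of peers."""
    params = urllib.parse.urlencode({
        "info_hash": info_hash,   # SHA-1 hash of the torrent's info dictionary
        "peer_id": peer_id,       # 20-byte identifier chosen by the client
        "port": port,             # port on which this peer accepts connections
        "uploaded": 0,
        "downloaded": 0,
        "left": left,             # bytes still needed to complete the download
        "event": "started",
    })
    return f"{tracker}?{params}"

# Hypothetical tracker and identifiers, for illustration only:
url = build_announce_url("http://tracker.example.org/announce",
                         b"\x12" * 20, b"-XX0001-" + b"0" * 12,
                         6881, 1048576)
print(url)
```

The point for the argument here is that this single, well-known HTTP endpoint is exactly the centralized component that legal pressure can target: take the tracker offline and the swarm can no longer coordinate.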

Although new technical innovations in P2P have often been driven by the need to avoid prosecution for illegal file-sharing, web protocols have remained very ‘sticky’ for developers because of the functional assistance they provide at various levels, so commercial developers have been reluctant to explore and develop pure forms of P2P. For all the web’s vulnerabilities to attack and corruption, there is considerable ‘lock-in’ to WWW legacy systems, the marketplace having built up a history of relative trust in, tolerance of, and familiarity with its flawed processes.