Today is a most auspicious anniversary – the 25th anniversary of the World Wide Web, that world-changing technology that rivals the Gutenberg press in the way it has revolutionised our means of communication and sharing information.
Or rather, it's one of the 25th anniversaries of the World Wide Web. There is a touch of arbitrariness to the choice of today – it was in March 1989 when a British computer scientist, Tim Berners-Lee, wrote a proposal to develop an information management system at the famed Cern laboratory in Switzerland where he worked, using hypertext to create a network of interlinked documents. Berners-Lee didn't post the first website until August 1991, and it didn't go live for anyone to access until later that month.
But arbitrary as today’s celebration may be, it hardly reduces the occasion’s significance – like most major landmarks in technology and science, there was not one moment of genesis, but rather a long, complex process of development and iteration. Even so, the web is remarkable for having a single identifiable inventor and a clear timeline of landmark moments in its development.
The internet itself, the communications infrastructure upon which the World Wide Web was built, has an altogether more diffuse genesis, with all sorts of moments, figures and incidents that can be credited with its creation. As a result, the likes of Vint Cerf, Robert Kahn and Leonard Kleinrock, the men who pioneered the early Darpa research in the late 1960s and 1970s, are all plausibly described as "fathers of the internet".
Web vs internet
A good indication of the significance of Berners-Lee's creation is that it essentially became synonymous with the internet, with most people assuming they were the same thing, even though the web was just one of numerous applications, such as email or Usenet groups, that ran over the internet. Only when the web arrived, along with web browsers to view it and search engines to navigate it, did the internet itself become mainstream.
However, on March 12th, 1989, it was far from clear Berners-Lee had created anything revolutionary at all. His proposal, initially limited to managing information about accelerators and experiments at Cern, was deemed "vague but exciting" in the famously lapidary response of his supervisor, Mike Sendall.
The following year, with the help of colleague Robert Cailliau, Berners-Lee expanded on the idea, realising it could extend to the whole internet rather than just the Cern network. They had an uphill battle persuading people, and yet their persistence meant that, in years to come, Cern is likely to be remembered more as the birthplace of the World Wide Web than for all its particle accelerators and the discovery of the Higgs boson.
One of the little vignettes that illustrates the struggle Berners-Lee had to get recognition for his invention features an unlikely cameo from Steve Jobs. Berners-Lee had written the code for the World Wide Web on a computer made by NeXT, the firm Jobs founded after leaving Apple in the mid-1980s. When Berners-Lee discovered Jobs would be attending a NeXT developers conference in Paris, he brought his computer and software along to demonstrate the new technology to the great man in person. At the time, Berners-Lee was just another developer, and had to wait at the end of a line of programmers queuing to demonstrate their creations to Jobs. Jobs was getting close to Berners-Lee when an aide cut in to say he had to leave to make his flight. Jobs hurried away, never making it as far as Berners-Lee's demonstration.
The two men who left the most indelible mark on the progress of computing in the second half of the 20th century were deprived of a historic meeting that day – and in the end they never did meet.
Historical juncture
A quarter of a century on, the World Wide Web is at a critical juncture, in large part down to more of Steve Jobs's creations – the iPhone, iPad and, above all, the App Store have changed the way we use the internet. Thanks to the rise of native apps on smartphones and tablets, the days when the web and the internet were synonymous are over. Given that most of those going online for the first time in the next decade will be doing so on mobile devices using native apps, the web's position as the "front end" to the internet may soon be lost.
It is telling that, where once all ads and billboards included a web address, they now tend to use a Facebook page or Twitter handle instead. While both Facebook and Twitter have web front ends, they are closed, corporate, for-profit platforms.
Rumours of demise
Indeed, as far back as 2010, Wired magazine ran a provocative cover story declaring that "The Web is Dead", in which Chris Anderson pointed out that the web's share of US internet traffic had been declining since the early 2000s. That's a trend that will only have accelerated since, especially when half of the US tries to stream House of Cards or True Detective at the same time. "Today the internet hosts countless closed gardens; in a sense, the web is an exception, not the rule," Anderson concluded.
That highlights one of the most remarkable aspects of the World Wide Web – Berners-Lee gave it away, claiming no rights or ownership over it, in the true spirit of academic learning. It would never happen today, where success is measured in multibillion dollar acquisitions. What we celebrate today, then, is not just the 25th anniversary of an idea, but the 25th anniversary of a great gift to humanity.
Net result: five ways the web changed the world
1. Media and communication
From music to movies, newspapers to TV programmes, everything is now online, thanks to the likes of Netflix, iTunes and Spotify.
The web has revolutionised communication to the point where most people can publish and disseminate information within seconds. This unhindered ability to communicate has also fuelled powerful citizen-driven movements, such as the Arab Spring in 2010.
2. E-Commerce
You can now borrow, buy, lend and invest from the comfort of your couch. No longer do you have to go into a shop to purchase something or queue in the bank to make a deposit.
The web has given consumers everything at their fingertips and businesses access to a global marketplace. It has also reduced overheads, as online retailers can offer products without the need for a bricks-and-mortar store, and it has changed the way businesses advertise and communicate with employees. The web didn't just give businesses access to a new market, it gave the public one too. Thanks to auction sites such as eBay, there is no need to sell unwanted gifts and old goods at car boot sales.
3. Work
No need to actually go into the office any more – the web has allowed people to work from home. You can hold conference calls, interface with customers, send documents and collaborate with colleagues online. You can also work with people anywhere in the world, with teams distributed across different time zones all coming together on the same project. The web has also created new kinds of jobs, with many web companies employing thousands of people.
4. Education
Thanks to the web, you don't even have to enter a university building to do a college course these days. And rather than go to the doctor, people consult the web for diagnoses. Researching an unfamiliar subject no longer requires a trip to the library or consultation with the Encyclopaedia Britannica.
We're all wiser thanks to Wikipedia – or perhaps dumber – and we don't have to spend as much time memorising facts any more. From a poetry quote to a medieval recipe, someone has created a website in its honour. Just about anything can be found at the click of a mouse.
5. Privacy
People now broadcast every detail of their lives across the web through video, text, audio and photo. But where do we draw the line? As we pour our hearts and souls into the public domain via Twitter, Facebook, YouTube, WordPress and LinkedIn, the concept of privacy has changed.
Even governments aren’t all that innocent, with revelations of spying on the public’s online activities.
PAMELA NEWENHAM