This World Wide Web you’re looking at right now wasn’t always something most people considered worth a second glance — let alone hours, days, weeks, and years of nonstop staring. In fact, even some of the big info-nerds of the day ignored or dismissed it early on.
One of the earliest public demonstrations of the Web came back in 1991, when a man named Tim Berners-Lee sat at a table with a computer in a Texas hotel conference room, willing to give anyone with a few minutes to spare a personal introduction to his invention — a concept and a structure that would soon spark a worldwide information revolution.
Every person in the room that day likely came to depend on this invention by the close of the 1990s, if not sooner. But when first confronted with the Web in that hotel, most simply said whatever the equivalent of “meh” was at the time, and went in search of a drink.
“It was quite a warm December evening in San Antonio,” recalled Professor Wendy Hall of the UK’s University of Southampton, who was in that room for the 1991 Hypertext Conference where Berners-Lee had been denied a speaking spot to show off the most important human creation of a generation or three. “In the courtyard outside the demo room was a tequila fountain and everybody was outside drinking free margaritas, so nobody was inside. This was the first demo of the World Wide Web in America.”
Keep in mind, this wasn’t a conference of Luddites. It was a gathering of people interested in hypertext before anyone really knew the phrase (most people still don’t know it, because it was quickly supplanted in our culture by “the Web”). This was a crowd of information nerds, and yet, they still were unimpressed by the World Wide Web on day one. And it wasn’t really even day one, either. Berners-Lee had actually first submitted a paper detailing his invention almost three years earlier, on March 12, 1989.
And this is the date I mean to commemorate over the course of the next week — the actual birth date of the World Wide Web, a creation Mr. Berners-Lee and his colleagues gave to the world completely free of charge on that day 25 years ago. In this post and three that follow, I’ll look at the first two and a half decades of the Web, from its awkward infancy, to those crazy boom and bust years, the lull that followed (I like to call them the “lost in the wilderness years”), and finally, the countless bits and pings of the lovable monstrosity that is today’s mobile and social Web.
But I’m less interested in the many technical milestones of the Web between then and now (I’ll still cover plenty of them, don’t worry) than in the infinite ways it’s changed the way we as individuals, and as a society, live and think. As I sit in my favorite chair right now, laptop in lap, my 6-year-old daughter diligently weaves tiny rubber bands together into the form of a toy horse according to instructions delivered by a friendly voice on the YouTube video streaming on a tablet sitting on the coffee table.
The sight of a 6-year-old girl deep in the act of creation while tapping and swiping the screen in front of her triggers a flood of images — a massively compressed zip file of emotionally charged moments from the past two decades unpacked from my memory as I see myself growing up online — from an awkward teenager getting Web design tips from CNET in 1994 to a husband and father writing these words for CNET in 2014.
So let’s dive right in. Like most journeys that pass through the 1990s, this might get a little weird at points.
The Big Bang
It’s not hyperbole to say that I owe much to the Web, and I think that notion can be extended almost to my entire generation — and really all generations that have interacted with the Web. Of course, during the years that Berners-Lee and company were pitching their project to uninterested conference crowds, I myself was too busy watching “Fresh Prince of Bel-Air” and imprinting rude gestures on my Hypercolor T-shirt to pay any attention to what was coming out of CERN — let alone learn what CERN is.
Still, like many others on the bubble between Generation X and the so-called Millennials (I was 9 when the Web was born in 1989), I was old enough to appreciate on some level how big and exciting the coming changes would be, yet young enough to be open to something radical and non-linear — a new medium with the potential to not just shift paradigms, but to extract the whole damn paradigm transmission and leave it in a flaming heap by the side of the road.
Back in those days, I was spending lots of time escaping typical preteen traumas at a keyboard, first an Apple IIe and a Commodore 64, then an IBM PC XT clone. Then one day, my family added a modem to our PC clone setup and I soon understood the concept of the Big Bang. Not because I finally had access to the online Grolier encyclopedia, but because the connections I struggled to make at school, with family, and elsewhere in the world were suddenly possible from behind that keyboard.
I was no longer socially handicapped by my unwieldy afro, Hubble-scale specs, and strong tendency to be a pushover; instead I was part of a judgment-free universe (yes, the Internet was friendly once upon a time) that was expanding exponentially.
Within a few years, I had even grown bold enough to carry my early digital interactions into the real world. I went to meetups hosted by the Toad the Wet Sprocket listserv admin and purchased PC parts from a strange guy in a Denver apartment that had no furniture but did have more than 1,000 used monitors. These encounters were awkward, but hey, progress isn’t always pretty.
Keep in mind, this was still just the early ’90s and I was just a tween in the Denver suburbs logging onto BBSes, Prodigy, or America Online mostly for the purpose of trading stamps, coins, football cards, games, and software (although one screen name on my mid-’90s AOL buddy list also turned into my humiliating first kiss in the real world — it was so bad I remember sighing and actually saying “damnit” out loud, and I never heard from the girl again.)
But even at that early stage, and using services that were mostly walled gardens cut off from each other, it seemed like a bottomless well of possibility and potential. I was also aware of the cool kids over at The WELL and the fanatics playing Neverwinter Nights. Disparate online worlds had formed, and unbeknownst to most people in 1992, the system to unite them all was about to be commercialized, but first its access point needed a makeover.
A ‘Mosaic’ of early adopters
If you really boil it down, the problem with the early Web that probably led Berners-Lee to play second fiddle to the tequila fountain in San Antonio in 1991 was its lack of animated GIFs.
The first Web browsers were either text-only affairs like Lynx or used unwieldy means of processing images, like popping them up in separate windows. It was not quite yet the short-attention-span multimedia extravaganza that would soon come to define modern pop culture.
Enter Marc Andreessen, the National Center for Supercomputing Applications, and Mosaic in 1993. While it initially did not even have a back button, its simple installation on Windows systems, intuitive interface, and integration of graphics made it the first widely used Web browser. As the software was spreading, so was the infrastructure for the young Web. During 1993, the number of Web servers worldwide quickly grew from just dozens to several hundred.
By 1994, the roads for the information superhighway had been laid down — to borrow an ancient digital metaphor — and the vehicles had also been manufactured in the form of software like Mosaic. All that remained was to recruit some Web drivers, or uh, surfers, or whatever. Unfortunately, many of the folks in the mainstream media at the time weren’t yet hip enough to the Internet’s crazy new symbology to do much of that proselytizing. Case in point: this clip of NBC’s “Today Show” hosts bantering off-air during a commercial break and finally flat-out asking co-workers “what is internet?”
When the Pew Center started its research into the Internet and American Life in 1995, it found 14 percent of the country was already online. These were the early addicts like myself who tied up our families’ phone lines and ran up big bills with online services — and then with long-distance charges.
One of the first major purchases I ever had to save up to pay for with my own money was $220 worth of long-distance calls (explanatory link meant to be sarcastic — if it doesn’t read that way, you must be younger than me) incurred during a single week in the mid-’90s when America Online’s Denver dial-in numbers were chronically busy, “forcing” me to get online by dialing in via a Cheyenne, Wyo., number instead.
I had to mow many a Colorado lawn to pay off that tab, but I don’t ever recall thinking it wasn’t worth the time and effort.
This early addiction to the exciting and limitless online world would eventually lead very smart people to think it was not only possible to nuke our aforementioned paradigm transmission, but to challenge the very foundations of economics, leading to a remarkable boom and bust that… but now I’m getting ahead of myself.
The master protocol
Fairly quickly after the introduction of Mosaic, the Web’s Hypertext Transfer Protocol (HTTP) would become the preferred means of sharing information for public consumption over the Internet’s pipes. Newsgroups could not support the critical masses of the information-addicted like myself who were beginning to coalesce in those early days. Neither could other predecessor and parallel protocols like Gopher, WAIS, Telnet, or even the mighty FTP, all of which saw their status degraded as the Web rose to prominence over the next two decades.
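Part of HTTP’s winning formula was sheer simplicity: a browser’s side of the conversation was just a few lines of plain text. Here’s a rough sketch of what an early-style GET request looked like — the host and path are illustrative, borrowed from the address of the first website at CERN:

```python
def build_request(host: str, path: str) -> str:
    """Compose a bare-bones HTTP/1.0 GET request, roughly as an
    early browser would before sending it down the socket."""
    return (
        f"GET {path} HTTP/1.0\r\n"
        f"Host: {host}\r\n"
        "\r\n"  # a blank line marks the end of the headers
    )

# The server would answer with a status line ("HTTP/1.0 200 OK"),
# its own headers, a blank line, and then the HTML itself.
print(build_request("info.cern.ch", "/hypertext/WWW/TheProject.html"))
```

That human-readable request/response shape made HTTP easy to implement, easy to debug, and easy for protocols like Gopher and WAIS to lose to.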
It was when AOL, Prodigy, and others began to take down their walls and add a browser to their subscription offerings in early 1995 that I began to fully understand the gravity of this insane, intangible, indomitable force that is the Web.
There were early indications of the weird places this unprecedented access to information would take us. The project that would begin as the Cardiff Internet Movie Database before becoming just the Internet Movie Database and eventually IMDb mimicked the old brick-and-mortar library model, but with the digital twist of crowdsourcing a wealth of information on a popular topic to create a whole new reference resource.
In the offline world, that probably would have been the end of the story.
But in this bold new frontier, college students and film buffs didn’t just use the Internet Movie Database to settle bets and research film history midterms. They also mined it to waste countless hours playing Six Degrees of Kevin Bacon. The Web dropped a chaos bomb, and many of us didn’t just embrace it, we set up shop to begin mining it for gems of pure awesomeness, exploiting its riches to enable unbridled creativity and achieve new levels of procrastination.
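For the record, Six Degrees of Kevin Bacon is really just a shortest-path problem in disguise: actors are nodes, shared films are edges, and a “Bacon number” is the fewest hops back to Kevin Bacon. A quick sketch with a made-up co-star graph (the names and connections here are invented, not real IMDb data):

```python
from collections import deque

# Toy co-star graph; names and edges are illustrative only.
COSTARS = {
    "Kevin Bacon": ["Actor A", "Actor B"],
    "Actor A": ["Kevin Bacon", "Actor C"],
    "Actor B": ["Kevin Bacon"],
    "Actor C": ["Actor A", "Actor D"],
    "Actor D": ["Actor C"],
}

def bacon_number(actor: str, graph=COSTARS) -> int:
    """Breadth-first search outward from Kevin Bacon; returns the
    shortest number of co-star links, or -1 if unconnected."""
    if actor == "Kevin Bacon":
        return 0
    seen = {"Kevin Bacon"}
    queue = deque([("Kevin Bacon", 0)])
    while queue:
        name, dist = queue.popleft()
        for costar in graph.get(name, []):
            if costar == actor:
                return dist + 1
            if costar not in seen:
                seen.add(costar)
                queue.append((costar, dist + 1))
    return -1

print(bacon_number("Actor D"))  # Actor D -> Actor C -> Actor A -> Kevin Bacon
```

Mid-’90s film buffs were running this search in their heads against a crowdsourced database; it would take the Web a few more years to start running it for them.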
Failure of imagination
At this point, I was 15 years old and had been online in some capacity for four or five years already, yet I didn’t really get it until I started spending time on the completely unfettered Web. Even after countless hours over the previous few years in America Online forums, newsgroups, and chat rooms, I still thought the most likely career for me might be at a local radio station, or maybe enjoying a quiet life as a librarian. Talk about a failure of imagination — I was like a kid peering through a window at the wonders of Henry Ford’s assembly line in the 1920s while still pondering a career making ox carts. If there had been a tequila fountain nearby, I might have found myself shelving books today instead of writing this.
The Web was the ultimate killer app because it made the potential of the wider Internet so obvious. It was the conquest of time and geography in digital form.
Columnist James Coates put it a little more eloquently back in May of 1995, writing about the Web’s penetration of the biggest online dial-up services: “Right now, only a handful are tasting the wonders to come. But it won’t be long before these humming handfuls give way to the howling hordes and browsing the Web becomes as common as surfing the cable channels or twisting the radio dial.”
Coates nailed it all the way back then, even though I still don’t understand how to dial someone with a radio.
But what few people saw at the time was the steep trajectory the Web would take over the next few years as it launched itself into our collective consciousness.
The wild success of the Web that would define the final years of the century would also determine the course of my life, at least until things eventually got derailed. But again, I’m getting ahead of myself. More on that in the next installment when I take a look at the boom and bust years of the late 1990s.
Please share your memories, and parts of early Web history I’ve missed, on Twitter at @crave and @ericcmack.