Posts Tagged ‘The Story’

Metal Gear Solid V: Ground Zeroes Review

Price: £29.99
Developer: Kojima Productions
Publisher: Konami
Platforms: X360, Xbox One, PS3, PS4
Version Reviewed: Xbox 360

Whatever you may think of Kojima Productions’ decision to split off Ground Zeroes from the rest of Metal Gear Solid V and release it as a full game, there’s no denying that it is a remarkable creation. In terms of its politics, its technology, its systems, and its artistic direction, Ground Zeroes is absolutely fascinating. It departs radically from many of the conventions the series has established over the years, while at the same time it is truer to the motto of “Tactical Espionage” than any of its predecessors.

Ground Zeroes is set in 1975, a year after the events witnessed in Metal Gear Solid: Peace Walker, and casts you as Big Boss on a mission to infiltrate a heavily guarded detention camp in order to rescue two prisoners. Prior to the game’s start, there’s a brief summary of events leading to the Ground Zeroes mission, and a short cut-scene that introduces “Skull Face”, the leader of the mysterious XOF organisation which opposes Big Boss’ FOX unit.

It’s a refreshingly terse opening to a Metal Gear Solid game, and makes it immediately apparent that Ground Zeroes strives to be different. Kojima’s writing has grown increasingly indulgent since the release of the first MGS, his games burdened by exhaustive cut-scenes and rambling dialogues. Ground Zeroes, on the other hand, is nearly all about play, only removing you from control during a couple of key moments while you’re on mission.

In fact, Ground Zeroes is a very restrained game in general. Aside from the much-discussed running time, the weirder elements of the Metal Gear Solid universe have been dialled back, with only the appearance of Skull Face acting as a nod to the series’ penchant for science fiction and the supernatural. Similarly, Ground Zeroes’ approach to stealth is very straightforward – stay low, stay shadowed, stay quiet. The most advanced gadgets in Big Boss’ arsenal are an “iDroid” that gives a real-time updated map of the detention centre, and a pair of binoculars that can mark guard positions on a map.

What most definitely isn’t dialled back is the technology that powers the game. Ground Zeroes looks, sounds, and feels superb. Even on the Xbox 360, visually it’s a cut above most other games. This is because the FOX engine’s approach to graphical fidelity has nothing to do with resolutions or anti-aliasing or post-processing effects or any other technical gimmickry. Rather, it’s about attention to detail. FOX’s physically-based rendering techniques are based on vast amounts of research into how different types of light interact with different types of surfaces in different conditions, and on replicating the results in a virtual environment.

It’s tempting to say the results are spectacular, but that would be to miss the point. FOX isn’t about spectacle, it’s about creating a convincing environment, and Ground Zeroes’ Camp Omega is very convincing indeed.

The reason we bring this up is because Ground Zeroes’ pinpoint production values feed into the design intent for the rest of the game. Ground Zeroes is entirely about attention to detail. Navigating your way through the maze of tents and fences and rocky coastline without being spotted by a patrol or a searchlight requires careful planning and speedy execution.

Deciphering the story behind Camp Omega involves searching every corner of the Black Site to collect audio logs, listening in on guard conversations, and interrogating guards for information. There’s a particularly brilliant section where you have to find a specific location within the camp by figuring out the route taken there from the ambient sounds on an audio cassette. It’s all geared toward making you feel like a spy, the way you collect snippets of information and piece them together to form a plan.

Scuba diving trumps surfing on Saturn’s Titan moon

These would be considered rough waters on Titan.

(NASA/Steven Hobbs)

There was a lot of hubbub this week among space geeks about the first spotting of waves on the freaky methane lakes that cover much of Titan, perhaps the most Earth-like spot outside of the real deal in our solar system. But it’s still waaay premature to pack up your space wetsuit and start nagging NASA or Elon Musk to hitch a ride beyond the asteroid belt.

Saturn’s spooky moon has a planet-like atmosphere and liquid covering much of its surface, making it one of the most likely nearby places to harbor (probably very weird) alien life. But while Titan shares a number of Earth-like characteristics such as its craggy peaks, running rivers, and even thunderstorms, it doesn’t appear to have strong enough winds to whip up methane waves on its large lakes.

At least, we haven’t been able to see them during the time we’ve been looking closer with the Cassini spacecraft, which has been cruising around above Saturn and Titan for years now. But as we learned last year, things could be shifting on Titan as the longer seasonal cycle on the moon is finally bringing summer to its lake-filled northern half for the first time since we’ve been watching closely.

Some astronomers think winds and surf season could be in full effect by 2017, so there was plenty of excitement earlier this week at the Lunar and Planetary Science Conference outside of Dallas where researchers discussed measurements of Titan’s surface that seem to hint at the presence of waves, according to Nature.

At least that’s the way the headlines put it this week.

Read further on, however, and the story is that the images taken by Cassini between 2012 and 2013 showed something abnormal on the surface of Punga Mare that could be waves or, more accurately, ripples, given that the disturbances were calculated to be no more than a few centimeters high.

“Titan may be beginning to stir,” Ralph Lorenz, a planetary scientist at the Johns Hopkins University Applied Physics Laboratory, told the conference. “Oceanography is no longer just an Earth science.”

If the stirrings continue to increase, we could get to witness some very interesting activity on Titan, hopefully before Cassini is scheduled to hurl itself into Saturn’s atmosphere in 2017.

In the meantime, however, Titan still has the smoothest sailing in the solar system. The latest radar measurements, published earlier this month online in Geophysical Research Letters, find that Titan’s second-largest lake, Ligeia Mare, “possesses a mirror-like smoothness.”

“If you could look out on this sea, it would be really still. It would just be a totally glassy surface,” Howard Zebker, professor of geophysics and electrical engineering at Stanford, said in a release.

Zebker also suggests that the lack of motion in Titan’s ocean could be due to something else, like a more viscous topping on the lake surface.

“For example, on Earth, if you put oil on top of a sea, you suppress a lot of small waves,” he said.

His team’s research also determined the depth of Ligeia Mare, which it found to be nearly 500 feet deep in at least one spot.

So maybe it makes more sense to plan a scuba diving vacation on Titan than a surfing excursion. Either way, you can get a feel for the exotic locale in this modeled fly-over:

Titanfall Review

Price: £45
Developer: Respawn Entertainment
Publisher: EA
Platforms: PC, Xbox One
Version Reviewed: PC

Titanfall crams more action into ten minutes than most games do in ten hours. There’s plenty to like about Respawn Entertainment’s first attempt to revolutionise the multiplayer FPS, but what is perhaps most admirable is how mindful Titanfall is of the commitment it asks of you. This is a game which slots into a lunchtime like a Lightning connector into an Apple device. And not a second of your attention on the game is wasted. Even during the ninety-second breaks between matches, you’re tinkering with your custom pilot and Titan classes, and debating which of the game’s brilliant Burn Cards to carry with you into the next round.

In a genre which has failed to get better for years, and so has opted to get bigger instead, Titanfall is a reinvigorating antidote to the bloated bombast of Battlefield. It is lean, svelte, fit and fast. It’s also brimming with ideas of varying sizes, and when it comes down to it, is simply enormous fun to play. But it also doesn’t quite deliver on all of its revolutionary promises, and neglects to include some pretty basic features anyone would expect in a PC multiplayer game.

Titanfall’s drive for innovation is clear from the off, through the presence of a campaign mode in an entirely multiplayer game. This campaign uses nine of Titanfall’s fifteen maps to tell the story of an intergalactic war between the GOOD rebel Militia and the EVIL IMC. It’s easy to tell who is good and evil because the Militia have American accents whereas the IMC have British and South African accents, and everyone knows that British and South African people are all horrible bastards who eat live kittens for breakfast and grow moustaches for the specific purpose of twirling them, the monsters.

The story is told through brief cut-scenes at the beginning of each mission, and a small dialogue window in the top-right of the screen that explains the goings on during missions. Frankly, it’s total rubbish; abysmally written bilge that somehow manages to make a straightforward war between two factions seem labyrinthine in its complexity through extended references to characters you hardly spent any time with, and endless plot contrivances designed so that the story remains the same regardless of the outcome of the players’ battle.

The multiplayer campaign is Titanfall’s biggest disappointment. Fortunately, the speedy nature of the game’s matches means it is also short. You can play through both sides of the campaign in under four hours, which unlocks two additional Titan chassis for you to use. The tale is also told in such a way that you can largely ignore it, and focus on the far more interesting personal story of your giant gun-toting robot versus everyone else’s giant gun-toting robots.

Before we get to the main event, it’s worth pointing out that even without its eponymous mechs, Titanfall would still be a highly enjoyable FPS. This is thanks to the wonderful acrobatics of the Pilots. The combination of wall-running and double-jumping is transformative, making Titanfall a truly three-dimensional shooter. It also proves a somewhat lateral solution to the problem of bunny-hoppers spoiling the feel of the game. Now hopping around feels like a perfectly natural evasion tactic, as does pinging off the walls like a pinball and leaping through a second-floor window before dashing around the corner to clip your opponent as they gleefully follow you in.

Man unknowingly becomes ‘More Sexy N Intelligent Than Spock’ (in part)

Sexier than this?

(Guttural Truth/YouTube; screenshot by Chris Matyszczyk/CNET)

Being drunk can carry with it difficult consequences.

These might be summarized by the phrase “doing something incredibly stupid.”

It’s easy to forget in the morning, you might imagine. But some consequences only hit you years later. When you apply for a passport, for example.

A man in Dunedin, New Zealand, has just discovered that losing a bet four years ago has become more real than he thought.

As the New Zealand Herald reports, the 22-year-old man is now Full Metal Havok More Sexy N Intelligent Than Spock And All The Superheroes Combined With Frostnova.

It doesn’t fit easily on a dating profile, does it?

However, the story goes that after he lost at poker he was forced to change his name to something just one character under the legal limit in New Zealand.

How these particular 99 characters came to pass is a mystery. One can only assume that he is himself something of a Trekkie or at least a technophile. Or that the name was forced upon him by technophiles who won the bet.

What’s quite startling is that this name was accepted at all.

Names that have recently been rejected by New Zealand authorities include Majesty, King, Knight, Princess, Justice, Anal, V8, 89, Mafia No Fear, Lucifer, full stop and *.

Honestly, who would want to call themselves Anal? Someone proud of their OCD aspects?

It’s not clear whether the man will change his name back to something more edible.

I wonder, though, how he currently introduces himself. “Hi, I’m Full,” might incite strange reactions.

On the other hand, “Hi, I’m More Sexy N Intelligent Than Spock And All The Superheroes” would surely go down well on a first date.

Next Angry Birds game a turn-based RPG

If you wanted more bird-flinging, it looks like you might be in for a wait: Developer Rovio has revealed that the next title in the Angry Birds series is a turn-based RPG.

Our heroes are, of course, the birds, with each bird being a different character class; the red bird, for example, is the knight (obviously; see the teaser trailer below), and the yellow bird is the wizard. You’ll lead them into battle across a fantasy-themed Piggy Island, defeating the pigs, apparently, on their home turf — presumably the story will reveal the whys and wherefores.

It also seems that crafting will feature quite heavily. You’ll be able to build your own weapons; according to AngryBirdsNest, these can be items such as a wooden sword, frying pan, or “stick thingy with a sponge on top.” All characters, gear, and potions will be upgradable, too.

The game is due to soft-launch either March 13 or 14 in Australia and Canada, arriving shortly thereafter for the rest of the world, and for Android later this year.

This soft launch has roughly similar timing to the Australia and Canada soft launch of Rovio’s “Angry Birds for girls,” Angry Birds Stella, about which few details are known other than its ridiculous gender specificity.

(Source: CNET Australia)

The Web at 25: Out of the ashes and onto the Friendster

Resting on the Yukon River in 2003. I went a little further than most when I fled the fallout of the dot-com bust.

(Johanna DeBiase)

In part 1 of “The Web at 25,” I recalled the early days of the Web and how it exposed young, emerging nerds like myself to whole new worlds online. In part 2, the story continued as I came of age alongside the Web during the era of the dot-com boom and bust. Today, on the actual 25th birthday of Tim Berners-Lee submitting the concept that became the World Wide Web, I’ll revisit the long, painful hangover (it was a literal hangover, in my case) that followed until the eventual emergence of Web 2.0 that laid the foundation for today’s social and mobile Renaissance.

In 2002 in San Francisco, the South of Market district that once bustled with startups paying high rents had become a wasteland of empty offices. There was a mass exodus of tens of thousands from the Bay Area, including me. After years of being a teenage Web monkey and writer in high demand for just my most basic skills, I took the only job offered me, sight unseen, in Galena, Alaska, at an AM radio station where I was one of two full-time employees.

After a decade of living the digital revolution, I had gone all analog. And unlike in Silicon Valley, where loyalty was only as strong as the next best offer and hopping from startup to startup was common, I had signed a two-year contract. The penalty for breaking that contract was to pay back the thousands of dollars in moving expenses it took to relocate someone from the lower 48 to a tiny fly-in village that’s closer to Siberia than to the state capitol in Juneau. I was already in plenty of debt thanks to some paychecks that never arrived from now-bankrupt startups, so I wasn’t about to leave. I was locked in until at least 2004.

My Web design skills were still sub-par in 2002. (Click to enlarge.)

(Eric Mack/CNET)

For a while, I tried to juggle the responsibilities of running the only radio station within 250 miles in all directions — news, weather, and country music in the morning, more news and classic rock in the afternoons — with trying not to freeze to death and still keeping a toe in the digital waters from a distance.

I gathered up other dot-com refugees from my Ironminds days and edited an online concern called Nine Planets that probably looked way too much like McSweeney’s in retrospect. It only lasted a few months before the site was demoted to dwell in the forgotten spaces, much like the no-longer-ninth planet its name references. In mid-2002, I posted this excuse for the demise of the site, which would also come to serve as my final goodbye to the Web 1.0 world:

The straight and strange truth is that Nine Planets currently lives in a remote rural village on the Yukon River in Western Alaska to which no roads lead. This makes it particularly difficult for Nine Planets to get a decent computer system and/or Internet connection. 24 hours of sunlight during the summer made it particularly difficult to continue spending extra hours at work where the only digital devices reside. When Nine Planets did find a decent computer on eBay, it took a month to be shipped from Maine and appears to have gotten royally f****d up in the process. But more darkness and equipment are on their way, and hence more depression and content will surely follow.

More content did not follow. At least not from me.

A cyborg no more
After that, I spent the next few years living life as a “normal” person. I was no longer an early adopter; I did not own a cell phone or have access to broadband; I did not use the word “content” in an online context. After spending most of my life plugged in, I was now less wired than my mother, who was just beginning to use email at the time.

Strangely, I didn’t miss it much. Perhaps because there was plenty to distract me. Besides the fascinating landscape, people, culture, weather, and Northern Lights, the local bar charged just three bucks for any drink you wanted. ANY drink. During evenings and weekends, I was generally a bit tipsy. But during working hours I was getting more intimate with tech that I used to take for granted. I could service and maintain a 12,000-watt AM transmitter on my own, and helped the village set up an ad hoc cell phone network run off of our station’s tower. It was more empowering than being able to design Web sites poorly, and by the time my two years was up, I was also starting to sober up (thanks in no small part to my future wife). More on what all that was like here.

Even largely removed from the Web, I watched Web 2.0 slowly emerge. If something caught on in my tiny village so far removed from civilization, it was going to take off. Strangely, I found that this made Galena a better barometer for the direction the Web would take than the words of supposed gurus in the Silicon Valley hype zone.

Village teenagers and other young adults went absolutely nuts for Friendster early on in the life of the pioneering social network, and of course MySpace followed, and even Facebook was talked about in 2004, before it was available to those beyond Ivy League schools. This kind of chatter was rarely heard at my own high school just seven years prior. The Web had completed the transition from the fringes of youth culture to becoming the bedrock of its mainstream foundation.

Slightly older transplants like myself were also hip to nascent social networks, and started to flock to early blogging platforms like Blogger and LiveJournal to chronicle our great Alaskan adventures. The emergence of Google as a superpower and its many successes in organizing the Web were also evident and undeniable as far away as the Yukon as it filed for its IPO in 2004.

Out of the wild
When I finally left Alaska in 2005, having successfully survived the ordeal and scored a brilliant wife in the process, there were more than 8 billion Web pages online, more than one for each person on the planet. Broadband had become much more commonplace, opening the door for the success of YouTube, Skype, iTunes, and even wackier digital environments like Second Life.

This period may be the second dot-com boom that nobody noticed. Or they did, but didn’t want to say anything and jinx it all, given what happened last time. By 2006, Google had indexed more than 25 billion Web pages, or almost four for every person on Earth, along with 1.3 billion images, and the search engine was processing 400 million queries per day.

During the Web’s years in the virtual wilderness of sorts, and while I was living in the literal wild, the next-generation www was quietly being built and seemed to emerge all at once around that time. Digg and others helped introduce us to the power of sharing and viral content; Flickr and YouTube enabled a more visual Web worth sharing; and a crowdsourced free encyclopedia popularized the term “wiki” as it became the biggest reference source online, with more than 750,000 articles by 2005.

With this arrival of Web 2.0 came the maturing of Web culture, and the creation of a new generation of celebrities created, nurtured, and exploded by social media. (Where have you gone, Amanda Congdon?)

Before returning to the lower 48 to reconnect with the digital world and sun in the winter, I spent half a year in Asia and witnessed the other dimension of the reinvigorated Web that was soon to crash on these shores. In China, due to the high cost of cellular voice calls, everyone was texting, all the time. Like, even more than we do now. It was already all mobile, all the time over there, and it was easy to see why. Young middle-class Chinese breezed through their days, dashing off brief communiques 10 at a time to lay out and adjust the day’s agenda on the go.

Seeing this helped me to understand the success of Twitter that would soon follow in the United States, even as microblogging bewildered many people who simply could not understand the point of communicating in short bursts. Even then, before Facebook finally overtook MySpace, before the iPhone, it was clear that the world was becoming more social and more mobile.

Texting has been big in China for some time. Like, really big.


By 2006 I was married with a child and a mortgage on the way and settled back in the lower 48, although still far from the once-again-bustling San Francisco Bay Area. New Mexico still seems like a happy compromise between the total isolation of the Alaskan bush and the more crowded coastal tech hubs. Also, remember that pitch we heard back at the beginning of the Web about a future filled with telecommuting masses? Well, it turns out to be kind of awesome, especially for a new dad.

The Web and I came of age at the same time and had to be separated for a bit to get through our growing pains independently, but by 2007 we were both fully embracing our adulthood. The cool thing about coming into your own is that it allows you to focus on just creating and building amazing things. In the next and final installment of this series I’ll wrap it up with a look at today’s golden age of the fully grown Web.

Digging for Atari’s ‘corporate shame,’ the buried E.T. games

An original E.T. game cartridge, signed by the lead designer. Millions were made, and most of them were buried in a New Mexico landfill after the game was deemed one of the worst ever.

(Daniel Terdiman/CNET)

AUSTIN, Texas — E.T. wants to go home. But first there will have to be a massive excavation of a city’s garbage dump.

As is part of video-game industry lore, in 1983 Atari ran screaming from its ill-advised E.T. game and hastily and quietly buried millions of cartridges. Somewhere. No one was quite sure where.

It turns out the where was Alamogordo, N.M., and almost certainly deep in the giant city garbage dump. Last year, a team of filmmakers announced they’re working on a documentary about the infamous E.T. game disaster — which cost Atari $500 million and drove it into financial ruin. And at South by Southwest this week, they talked at length about the project and their plans to excavate the games and make their movie.

The filmmakers behind the movie about the excavation of the infamous E.T. game cartridges make a point about why they were buried.

(Daniel Terdiman/CNET)

The history of Atari’s disaster is pretty well known. In 1983, on the heels of the unbelievable success of Steven Spielberg’s “E.T.,” the suits at Atari ordered a game version. Pronto. They wanted it to hit shelves in six weeks. In an industry where quality mainstream games usually took months, this was a tall order. The result? An effort generally thought to be one of the worst games in history — shallow, ugly, boring. It sold 1.5 million units immediately because of the movie’s success, but then sales ground to a full-stop halt.

Flash forward 30 years and the folks at Lightbox and Fuel Entertainment got together to make a movie about this legend. They got Microsoft on board to distribute it as part of its Xbox film series, and they were off for New Mexico.

At SXSW, Johnathan Chinn, co-president and producer at Lightbox, and Mike Burns, CEO of Fuel Entertainment, explained where they’re at with things. For one, they’re almost certain the games are buried in the city dump in Alamogordo. But even if that turns out to be true, it’s a massive facility, and it will not be a simple matter of digging one hole and declaring victory. The dig could take time, they said, and they’ll want help. They hope that people will show up to assist and, perhaps, cheer them on.

Most likely, they’ll start the excavation — with the permission of the dump, of course — sometime this spring. Perhaps as early as April. Assuming they find the cartridges in short order, the dig could be over quickly. But it could also drag on. Regardless, the movie itself is about much more than just the creation of the video game and its subsequent tarnished history. Instead, the filmmakers said, they realized there was an opportunity to wrap that story around a larger tale of Atari’s rise and fall: from being a company founded by the larger-than-life Nolan Bushnell, which hired the young Steve Jobs and Steve Wozniak, to one whose name and intellectual property have been sold and bought and dispensed with and rescued time and time again.

At the heart of that roller-coaster ride, though, is the misguided attempt to cash in on Spielberg’s theatrical triumph. “My sense is that this is a story of corporate shame,” Chinn said. “They just wanted it to go away, but here we are making a film about it. The moral of the story is: Don’t just bury your mistakes.”


While there will be a lot about Atari in the film, the real drawing card will be the hunt for the buried games. A good bit of that could be the film production team’s many trips to and time spent in and around Alamogordo. “I did speak to a bunch of witnesses in Alamogordo, who were kids in 1983,” Chinn recalled, “who claimed that they snuck into the landfill and stole cartridges that were totally playable. Other people, including people at Atari, claimed that there wasn’t anything interesting there.”

The Web at 25: I was a teenage dial-up addict

Beautiful since 1995. (Click to enlarge.)


This World Wide Web you’re looking at right now wasn’t always something most people considered worth a second glance — let alone hours, days, weeks, years of nonstop staring. In fact, even some of the big info-nerds of the day ignored or dismissed it early on.

One of the earliest public demonstrations of the Web came back in 1991, when a man named Tim Berners-Lee sat at a table with a computer in a Texas hotel conference room, willing to give anyone with a few minutes to spare a personal introduction to his invention — a concept and a structure that would soon spark a worldwide information revolution.

Every person in the room that day likely came to depend on this invention by the close of the 1990s, if not sooner. But when first confronted with the Web in that hotel, most simply said whatever the equivalent of “meh” was at the time, and went in search of a drink.

“It was quite a warm December evening in San Antonio,” recalled Professor Wendy Hall of the UK’s University of Southampton, who was in that room for the 1991 Hypertext Conference where Berners-Lee had been denied a speaking spot to show off the most important human creation of a generation or three. “In the courtyard outside the demo room was a tequila fountain and everybody was outside drinking free margaritas, so nobody was inside. This was the first demo of the World Wide Web in America.”

The first public demo of the Web in the United States in 1991. (Click to enlarge.)


Keep in mind, this wasn’t a conference of Luddites. It was a gathering of people interested in hypertext before anyone really knew the phrase (most people still don’t know it, because it was quickly supplanted in our culture by “the Web”). This was a crowd of information nerds, and yet, they still were unimpressed by the World Wide Web on day one. And it wasn’t really even day one, either. Berners-Lee had actually first submitted a paper detailing his invention almost three years earlier, on March 12, 1989.

And this is the date I mean to commemorate over the course of the next week — the actual birth date of the World Wide Web, a creation Mr. Berners-Lee and his colleagues gave to the world completely free of charge on that day 25 years ago. In this post and three that follow, I’ll look at the first two and a half decades of the Web, from its awkward infancy, to those crazy boom and bust years, the lull that followed (I like to call them the “lost in the wilderness years”), and finally, the countless bits and pings of the lovable monstrosity that is today’s mobile and social Web.

But I’m less interested in the many technical milestones of the Web between then and now (I’ll still cover plenty of them, don’t worry), as I am in the infinite ways it’s changed the way we as individuals, and as a society, live and think. As I sit in my favorite chair right now, laptop in lap, my 6-year-old daughter diligently weaves tiny rubber bands together into the form of a toy horse according to instructions delivered by a friendly voice on the YouTube video streaming on a tablet sitting on the coffee table.

The sight of a 6-year-old girl deep in the act of creation while tapping and swiping the screen in front of her triggers a flood of images — a massively compressed zip file of emotionally charged moments from the past two decades unpacked from my memory as I see myself growing up online — from an awkward teenager getting Web design tips from CNET in 1994 to a husband and father writing these words for CNET in 2014.

So let’s dive right in. Like most journeys that pass through the 1990s, this might get a little weird at points.

The Big Bang
It’s not hyperbole to say that I owe much to the Web, and I think that notion can be extended almost to my entire generation — and really all generations that have interacted with the Web. Of course, during the years that Berners-Lee and company were pitching their project to uninterested conference crowds, I myself was too busy watching “Fresh Prince of Bel-Air” and imprinting rude gestures on my Hypercolor T-shirt to pay any attention to what was coming out of CERN — let alone learn what CERN is.

Still, like many others on the bubble between Generation X and the so-called Millennials (I was 9 when the Web was born in 1989), I was old enough to appreciate on some level how big and exciting the coming changes would be, yet young enough to be open to something radical and non-linear — a new medium with the potential to not just shift paradigms, but to extract the whole damn paradigm transmission and leave it in a flaming heap by the side of the road.

The author looking lost in awesome specs around the time the Web was born. (Please don’t click to enlarge.)

(Facebook/Bethany Watts Therrien)

Back in those days, I was spending lots of time escaping typical preteen traumas at a keyboard, first an Apple IIe and a Commodore 64, then an IBM PC XT clone. Then one day, my family added a modem to our PC clone setup and I soon understood the concept of the Big Bang. Not because I finally had access to the online Grolier encyclopedia, but because the connections I struggled to make at school, with family, and elsewhere in the world were suddenly possible from behind that keyboard.

I was no longer socially handicapped by my unwieldy afro, Hubble-scale specs, and strong tendency to be a pushover; instead I was part of a judgment-free universe (yes, the Internet was friendly once upon a time) that was expanding exponentially.

Within a few years, I had even grown bold enough to carry my early digital interactions into the real world. I went to meetups hosted by the Toad the Wet Sprocket listserv admin and purchased PC parts from a strange guy in a Denver apartment that had no furniture but did have more than 1,000 used monitors. These encounters were awkward, but hey, progress isn’t always pretty.

Keep in mind, this was still just the early ’90s and I was just a tween in the Denver suburbs logging onto BBSes, Prodigy, or America Online, mostly for the purpose of trading stamps, coins, football cards, games, and software (although one screen name on my mid-’90s AOL buddy list also turned into my humiliating first kiss in the real world — it was so bad I remember sighing and actually saying “damnit” out loud, and I never heard from the girl again).

But even at that early stage, and using services that were mostly walled gardens cut off from each other, it seemed like a bottomless well of possibility and potential. I was also aware of the cool kids over at The WELL and the fanatics playing Neverwinter Nights. Disparate online worlds had formed, and unbeknownst to most people in 1992, the system to unite them all was about to be commercialized, but first its access point needed a makeover.

A ‘Mosaic’ of early adopters
If you really boil it down, the problem with the early Web that probably led Berners-Lee to play second fiddle to the tequila fountain in San Antonio in 1991 was its lack of animated GIFs.

The first Web browsers were either text-only affairs like Lynx or used unwieldy means of processing images, like popping them up in separate windows. It was not quite yet the short-attention-span multimedia extravaganza that would soon come to define modern pop culture.

The Mosaic browser.

(National Center for Supercomputing Applications/University of Illinois Board of Trustees)

Enter Marc Andreessen, the National Center for Supercomputing Applications, and Mosaic in 1993. While it initially did not even have a back button, its simple installation on Windows systems, intuitive interface, and integration of graphics made it the first widely used Web browser. As the software was spreading, so was the infrastructure for the young Web. During 1993, the number of Web servers worldwide quickly grew from just dozens to several hundred.

By 1994, the roads for the information superhighway had been laid down — to borrow an ancient digital metaphor — and the vehicles had also been manufactured in the form of software like Mosaic. All that remained was to recruit some Web drivers, or uh, surfers, or whatever. Unfortunately, many of the folks in the mainstream media at the time weren’t hip enough to the Internet’s crazy new symbology to do much of that proselytizing. Case in point: this clip of NBC’s “Today Show” hosts bantering off-air during a commercial break and finally flat-out asking co-workers “what is internet?”

When the Pew Center started its research into the Internet and American Life in 1995, it found 14 percent of the country was already online. These were the early addicts like myself who tied up our families’ phone lines and ran up big bills with online services — and then with long-distance charges.

One of the first major purchases I ever had to save up to pay for with my own money was $220 worth of long-distance calls (explanatory link meant to be sarcastic — if it doesn’t read that way, you must be younger than me) incurred during a single week in the mid-’90s when America Online’s Denver dial-in numbers were chronically busy, “forcing” me to get online by dialing in via a Cheyenne, Wyo., number instead.

I had to mow many a Colorado lawn to pay off that tab, but I don’t ever recall thinking it wasn’t worth the time and effort.

This early addiction to the exciting and limitless online world would eventually lead very smart people to think it was not only possible to nuke our aforementioned paradigm transmission, but to challenge the very foundations of economics, leading to a remarkable boom and bust that… but now I’m getting ahead of myself.

The master protocol
Fairly quickly after the introduction of Mosaic, the Web’s hypertext transfer protocol (HTTP) would become the preferred means of sharing information for public consumption over the Internet’s pipes. Newsgroups could not support the critical masses of the information-addicted like myself that were beginning to coalesce in those early days. Neither could other predecessor/parallel protocols to the Web like Gopher, WAIS, Telnet, or even the mighty FTP, which all saw their status degraded as the Web rose to prominence over the next two decades.

It was when AOL, Prodigy, and others began to take down the walls and add a browser to their subscription offerings in early 1995 that I began to fully understand the gravity of this insane, intangible, indomitable force that is the Web.

There were early indications of the weird places this unprecedented access to information would take us. The project that would begin as the Cardiff Internet Movie Database before becoming just the Internet Movie Database and eventually IMDb mimicked the old brick-and-mortar library model, but with the digital twist of crowdsourcing a wealth of information on a popular topic to create a whole new reference resource.

In the offline world, that probably would have been the end of the story.

But in this bold new frontier, college students and film buffs didn’t just use the Internet Movie Database to settle bets and research film history midterms. They also mined it to waste countless hours playing Six Degrees of Kevin Bacon. The Web dropped a chaos bomb, and many of us didn’t just embrace it, we set up shop to begin mining it for gems of pure awesomeness, exploiting its riches to enable unbridled creativity and achieve new levels of procrastination.

Failure of imagination
At this point, I was 15 years old and had been online in some capacity for four or five years already, yet I didn’t really get it until I started spending time on the completely unfettered Web. Even after spending countless hours over the previous few years in America Online forums, newsgroups, and chat rooms, I still thought the most likely career for me might be at a local radio station, or maybe enjoying a quiet life as a librarian. Talk about a failure of imagination — I was like a kid peering through a window at the wonders of Henry Ford’s assembly line in the 1920s while still pondering a career making ox carts. If there had been a tequila fountain nearby, I might have found myself shelving books today instead of writing this.

The Web was the ultimate killer app because it made the potential of the wider Internet so obvious. It was the conquest of time and geography in digital form.

Columnist James Coates put it a little more eloquently back in May of 1995, writing about the Web’s penetration of the biggest online dial-up services: “Right now, only a handful are tasting the wonders to come. But it won’t be long before these humming handfuls give way to the howling hordes and browsing the Web becomes as common as surfing the cable channels or twisting the radio dial.”

Coates nailed it all the way back then, even though I still don’t understand how to dial someone with a radio.

But what few people saw at the time was the steep trajectory the Web would take over the next few years as it launched itself into our collective consciousness.

The wild success of the Web that would define the final years of the century would also determine the course of my life, at least until things eventually got derailed. But again, I’m getting ahead of myself. More on that in the next installment when I take a look at the boom and bust years of the late 1990s.

CNET comments are currently down for maintenance, and should be back soon. In the meantime, please share your memories, and parts of early Web history I’ve missed, on Twitter at @crave and @ericcmack.


Newspeg puts journalistic spin on Pinterest-style bookmarking

(Screenshot by Kelsey Adams/CNET)

Pinterest can be a great way to bookmark and organize all your favorite celebrity photos, recipes, DIY inspirations, home-decorating tips, and videos of famous Internet cats. But if you want to save and organize more serious articles, Newspeg may be for you. Created by journalists, for journalists, Newspeg skips the fluff pieces and goes straight to hard-hitting headlines.

Formerly a journalist at major newspapers including The Washington Post, the Chicago Tribune, and the San Francisco Examiner, Newspeg CEO and founder Mark Potts has a long history with digital news and is currently very active in the business of digital media. His latest endeavor is designed to be a way of sharing and saving news in a way that’s useful to readers as well as publishers and journalists.

Borrowing from the look and feel of Pinterest, Newspeg is intended to be “a site where people can really easily share and save news stories, in a visual kind of way, in a way that picks up graphics from the story but also lets people know where it came from,” Potts said. The sources of stories are shown, and stories link back to the original publication site, so publishers get credit.

Newspeg works via a browser plug-in. To share a story you run across on the Web, you click the “Peg It” button in your browser toolbar, which adds the headline and a link to the story to one of the “topics” you’ve created. You’ll be given a chance to pick the accompanying photo, add a comment, and even change the headline.

In addition to letting readers gather and organize the latest stories on their favorite topics, Newspeg could also be a great way for writers to organize their own articles by specific topics, or to collect research for future work.

Potts said he hopes news organizations will consider using Newspeg as a content management system, and he is interested in working with such groups to build custom, possibly branded, versions of the platform. But he encourages everyone to try it out and give feedback. So if you want to try sorting and viewing your news sources in an attractive, visual way, check out Newspeg!


The 404 1,437: Where we need to use your can (podcast)


Leaked from today’s 404 episode:

– One man’s Twitter struggle to tell the story of his Subway cockroach sandwich.

– An app that listens for danger when you’re not paying attention.

– Toilet-sharing service Airpnp makes your home a public restroom.

– Keurig coffee makers go DRM.

Episode 1,437

