
Sony announces The Last of Us Remastered for PS4

The Last of Us Remastered for the PS4 includes revamped Full HD visuals, commentary for cinematics, and bundled DLC.


Sony has announced plans to release a remastered edition of hit PlayStation 3 title The Last of Us for the PS4, promising revamped Full HD visuals at 1080p.

Recently the subject of a film deal with Ghost House Pictures and Sam Raimi, The Last of Us follows the exploits of the player-character Joel and Ellen Page-inspired companion Ellie as they work together to survive in a post-apocalyptic world devastated by a mind-controlling fungus. Its gripping storyline has led to numerous awards, with many critics proclaiming it a must-have title for all PS3 gamers.

For those who have made the jump to the non-backwards-compatible PS4, though, Sony has promised a rerelease. Dubbed The Last of Us Remastered, the new version of the game will include higher-resolution character models, improved shadows and lighting, upgraded textures and other visual tweaks – all, developer Naughty Dog has promised, running at a targeted 60 frames per second in Full HD.

As well as the improved graphics, the Remastered edition will include commentary for all cinematics from creative director and writer Neil Druckmann alongside voice actors Troy Baker and Ashley Johnson, who play Joel and Ellie respectively. The PS4 rerelease will also come bundled with the Left Behind single-player expansion, the Abandoned Territories multiplayer map pack, and an as-yet unreleased map pack dubbed Reclaimed Territories.

Sony has raised eyebrows with its promised pre-order bonuses, however. Those buying the game from selected retailers can receive extra Supply Points for Factions mode along with boosted abilities – increased healing and crafting speeds, increased reloading speeds and ammunition capacities – for the single-player campaign, leaving those who prefer to buy their games at the time of release at a disadvantage.

A formal launch date has yet to be announced, with Sony aiming for a summer release.

Article source: http://feedproxy.google.com/~r/bit-tech/news/~3/c9oCGXSxYtM/1


Tour the Milky Way in 20 billion pixels

Milky Way
(Credit: Screenshot by Michelle Starr/CNET Australia)

Most of us will never leave the Earth — but that doesn’t stop us dreaming of the stars. There are a few tools that let you explore, though — and NASA has just launched a killer.

Created from the Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (Glimpse) project, it’s the most comprehensive visual map of the Milky Way Galaxy released to date — and yet it shows just over half of the galaxy’s stars. Stitched together from more than 2 million images taken by the Spitzer Space Telescope over the course of a decade, the zoomable, 360-degree image comes in at 20 gigapixels. Since its launch in 2003, Spitzer has spent a total of 4,142 hours taking pictures of the Milky Way in infrared light.

“If we actually printed this out, we’d need a billboard as big as the Rose Bowl Stadium to display it,” Spitzer Space Science Center imaging specialist Robert Hurt said in a statement. “Instead we’ve created a digital viewer that anyone, even astronomers, can use.”

When viewed in the visual spectrum, sections of the Milky Way — a flat spiral disc — are occluded by dust. By taking images in the infrared spectrum, in which starlight passes through that dust, Spitzer gives us a more complete picture of our galaxy, allowing astronomers to map the spiral arms and determine the galaxy’s edges.

With Glimpse data, astronomers have been able to create the most accurate map of our galaxy’s center to date, and see star formation and faint stars in the outer, darker regions that, prior to Spitzer, were unexplored territory.

“There are a whole lot more lower-mass stars seen now with Spitzer on a large scale, allowing for a grand study,” said Barbara Whitney of the University of Wisconsin, Madison, co-leader of the Glimpse team. “Spitzer is sensitive enough to pick these up and light up the entire ‘countryside’ with star formation.”

There are two ways to view the mosaic: using Microsoft’s WorldWide Telescope platform, which includes context and cross-fade to visual light; and CDS Aladin Lite, which doesn’t show the entire mosaic, but instead offers shortcuts to regions of interest, such as nebulae, and image exports.

The Glimpse data is also being used as part of a NASA citizen scientist project. People can visit the Milky Way Project Web site and help NASA catalogue areas of interest, such as bubbles, clusters, and galaxies.

You can visit the interactive image here, and download it in full resolution here.

(Source: Crave Australia via NASA Jet Propulsion Laboratory)

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/ey5iT7etbL8/

Christian Bale is top choice to play Steve Jobs, says report

Christian Bale and Steve Jobs

Don’t expect any Batsuits in the new Steve Jobs movie.


(Credit: Warner Bros and Apple)

Move over, Ashton Kutcher, there may soon be a new Steve Jobs in town. An as-yet-untitled movie for Sony Pictures centering on the Apple co-founder has a script by Aaron Sorkin (“The Social Network”), but it doesn’t yet have a star attached. Director David Fincher (“The Girl with the Dragon Tattoo”) is said to be the likeliest person to helm the production. According to a report from TheWrap, Fincher said he would only do the film if Christian Bale signed on as the lead.

The source is a classic unnamed “individual familiar with the project.” Whether that person is a Sony exec or a coffee fetcher, we don’t know. Fincher and Sorkin teamed up previously for “The Social Network,” a geek-flavored film about the rise of Facebook. It would seem natural to turn them loose together on the Steve Jobs story.

Many fans strongly associate Bale with his Batman role, but he does have a notable physical resemblance to Jobs. Back when news first came out about Ashton Kutcher getting his Apple on, CNET was among those rooting for Christian Bale instead, calling him a “strong contender.”

Right now, we’re running on speculation and rumor. Bale would have to ditch his gruff Bat-growl, but he has the acting chops to pull off the complex role. Perhaps he’ll learn from his predecessor’s mistake and avoid the fruitarian diet that landed Kutcher in the hospital.

As long as we’re at it, can we please cast Zach Galifianakis as Steve Wozniak? Share your thoughts in the comments. Would you cast Christian Bale in the role of Steve Jobs?

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/BL6wbq0QrCE/

Where should CNET Road Trip go in Arkansas, Oklahoma, Kansas?

America’s so-called Doomsday plane, which can keep top military leaders airborne in the event of a major crisis.


(Credit: Daniel Terdiman/CNET)

The days are warm and sunny here in Northern California, and though it’s only the middle of March, it already feels like summer is just around the corner.

One reason is that I’ve started planning in earnest for Road Trip 2014, my ninth annual journey to highlight some of the best destinations around for technology, military, aviation, architecture, science, nature, and so on.

From Doomsday plane to Frank Lloyd Wright: The best of Road Trip 2013 (pictures)

For seven of the past eight years, CNET Road Trip has taken me all around the roads of the United States, giving me the opportunity to visit the Pacific Northwest, the Southwest, the Southeast, the Rocky Mountain region, the Northeast, and the West Coast. In 2011, I crossed the pond and covered seven countries in Europe, and last summer, I criss-crossed much of the Midwest, traveling through Illinois, Indiana, Ohio, Michigan, Wisconsin, Nebraska, Iowa, and Missouri.

This year, I’m working on covering some of the last major areas in the Continental United States that I’ve never visited on Road Trip. While the exact itinerary is still very much unclear, I know I’ll be spending a good chunk of time in Texas, and then making my way into Arkansas, Oklahoma, and Kansas.

Thanks to my own research and the helpful suggestions of readers, I’ve already got a list of a few potential destinations, but I’m turning to you again, fine readers, for ideas for can’t-miss places I need to include in the project.

This map, which CNET reporter Daniel Terdiman has used each year since 2006 to record Road Trip routes, reveals a couple of big holes in the country that signify places that he has yet to visit.


(Credit: Daniel Terdiman/CNET)

So, if you have an idea for a Road Trip stop in Arkansas, Oklahoma, or Kansas, please send it to daniel–dot–terdiman–at–cnet–dot–com. Here’s what I’m looking for: a place in any of those states that would appeal to a national audience, that has a heavy tech or geek element, and that is highly visual, lending itself to a big photo gallery.

Some things that might work are manufacturing facilities for iconic brands, famous monuments, large-scale works of art or architecture, and famous or important military or aviation facilities. Past examples of Road Trip items include a behind-the-scenes look at America’s Doomsday plane, New York’s Grand Central Terminal, a look inside NORAD’s former home at Cheyenne Mountain, behind-the-scenes at Frank Lloyd Wright’s Fallingwater, the high-tech gear aboard the most advanced submarine on Earth, and so on.

I’d like to reward readers who come up with a great idea. So while I do have a list of potential destinations, if you send me a suggestion I haven’t already thought of myself, and that I end up adding to my itinerary, I’ll send you a small gift in exchange.

I hope to hear from you, as I know that many of you have extensive experience traveling, and I’d love to be able to benefit from that experience — and share the wealth with my readers. I look forward to hearing from you.

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/srbwGXPFkNs/

Google Maps hack turns any Street View into an urban jungle

Times Square as a jungle

My, Times Square is looking particularly lush today.


(Credit: Screenshot by Amanda Kooser/CNET)

If you want to see New York as a wilderness area, you can watch the CGI makeover in “I Am Legend,” or turn to the Urban Jungle Street View site. Urban Jungle takes advantage of a little-known part of Street View called depth data. This allows the positioning of objects in the correct 3D space, so it really looks like a tree is growing out of the middle of Times Square.
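
Google doesn’t document that depth feed, but the underlying idea is ordinary spherical projection: each panorama pixel corresponds to a viewing direction, and the depth value says how far away the nearest surface lies along that ray. A minimal sketch of the math (written in Kotlin purely for illustration; the site itself runs in the browser, and the names here are mine):

import kotlin.math.cos
import kotlin.math.sin

// A point's offset from the panorama camera, in meters.
data class Vec3(val x: Double, val y: Double, val z: Double)

// Convert a viewing direction (yaw/pitch, in radians) plus the depth
// sampled there into a 3D position -- i.e., where to "plant" a tree.
fun project(yawRad: Double, pitchRad: Double, depthMeters: Double): Vec3 {
    val x = depthMeters * cos(pitchRad) * sin(yawRad)
    val y = depthMeters * sin(pitchRad)
    val z = depthMeters * cos(pitchRad) * cos(yawRad)
    return Vec3(x, y, z)
}

With a position like that for every patch of pavement or rooftop, greenery can be drawn at the right scale and occluded correctly as you look around.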

It can be hard to navigate around once you’re in the Urban Jungle map because the usual Street View directional cues are absent. Also, everything that might look familiar is covered in vegetation. This is really more about the novelty of slathering Street View locations with greenery. It works best in locations with tall buildings, but feel free to try it out on your own house.

The Urban Jungle experiment may not be around for long. Einar Öberg, the site’s creator, confesses on Twitter that he’s “breaking terms of use like it’s no tomorrow.” That means you had better get in on the jungle-making fun while you still can.

I tried to use the Urban Jungle Street View to navigate into special Street View locations like the Large Hadron Collider and the Earl’s Court Tardis, but wasn’t able to get it to work. This would be a nifty feature if there is a way to enable it. If there is, and I just missed it, then set me right in the comments. I really want to see the Tardis console draped with vines.

Bizarre roadside views from Google Street View (pictures)

(Via Boing Boing)

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/cSC6YYMd1WY/

Oculus Rift Dev Kit 2 now on sale for $350

Oculus’ vision of virtual-reality gaming took another step closer to reality on Wednesday with the public availability of its second Rift development kit (DK2).

Game makers can purchase the headset for $350 today from the Oculus Web site, with an expected ship date sometime in July.

Oculus said many of the headset’s features are ready for the average gamer. These include “key technical breakthroughs” such as “low-persistence, high-definition display and precise, low-latency positional head tracking.”

The DK2 takes cues from Oculus’ Crystal Cove prototype, including a low-persistence OLED to tamp down on simulator sickness and improve the potential for presence. It has a high-definition 960×1,080 per-eye display for improved clarity, color, and contrast.

The external camera’s improved low-latency positional head tracking will allow players to peer around corners or lean in to examine virtual objects more closely, Oculus said, while more precise positional tracking will better carry real-world movement into the game.

Oculus Rift Development Kit 2 is on sale now for $350, with an expected ship date sometime in July.


(Credit: Oculus)

Other DK2 improvements include better orientation tracking, built-in latency testing, an on-headset USB port, better optics, a redesigned Software Development Kit, and optimization to integrate Oculus Rift with the Unity and Unreal Engine 4 game engines. Oculus also has eliminated the “infamous” control box.

Despite the improvements, Oculus said the “overall experience” is still lacking, and not yet ready for gamers.

“DK2 is not the Holodeck yet, but it’s a major step in the right direction,” the company said.

Developers now have access to Oculus Rift Dev Kit 2 (pictures)

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/GID93fg9-jI/

Voice is not enough: Motion is key to Android Wear

Forget poetry, the future is wearables in motion.


(Credit: Motorola)

Google and Motorola rolled out their joint vision of Android Wear, the Moto 360, and the future of wearables on Tuesday. (LG also gave us a taste of its upcoming G Watch.) Based on the few videos and all the information released for developers, it appears that Google’s wearable platform is a fancy port of Google Now “cards” and voice control in a pretty spiffy, new form factor.

While this is the focus of the developer preview out this week, don’t be fooled. Android Wear will be much more than just some full-faced watches that respond to speech, taps, and swipes. For the past few years now, Google has been telegraphing that it is much more interested in how we move our entire bodies, not just our index fingers and vocal cords.

Last August, I went to New York to get my hands on the much-hyped Moto X. I spent a few weeks with a review unit and then sent it back and moved on to demo the other anticipated Android phones of the season — like the Nexus 5. But when it came time for me to put my money where my mouth was and buy my next daily use device a few months later, I went with the already slightly aged and less powerful Moto X.

What sold me on the Moto X was its integration of a few features that are almost certainly heading for the Moto 360 and likely other Android Wear devices — touchless control and activity recognition, and the seamless marriage of voice control and contextual awareness that still is not really offered on any other device.

Normally, my Moto X has an “active display” function that pulses on and off to show me the time and any new notifications. I can touch the screen to get more details on new notifications. That is, unless the phone is face down or in my pocket — then it doesn’t pulse on at all to conserve battery life. So, flipping my phone down and then back up is a very easy way to see new notifications with a flip of the wrist.
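
Motorola hasn’t published how that gating works, but face-down detection is easy to approximate with Android’s stock gravity sensor. A rough sketch (FlipDetector and its threshold are my own illustration, not Motorola’s code):

import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Fires a callback whenever the phone flips face down or back up.
class FlipDetector(
    private val sensorManager: SensorManager,
    private val onFlip: (faceDown: Boolean) -> Unit
) : SensorEventListener {

    private var faceDown = false

    fun start() {
        val gravity = sensorManager.getDefaultSensor(Sensor.TYPE_GRAVITY)
        sensorManager.registerListener(this, gravity, SensorManager.SENSOR_DELAY_NORMAL)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // Gravity's z component is roughly +9.8 with the screen up, -9.8 face down.
        val nowFaceDown = event.values[2] < -7.0f
        if (nowFaceDown != faceDown) {
            faceDown = nowFaceDown
            onFlip(faceDown)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}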

Hmmm. What other form factor might benefit from responding to such motion?

Get a move on
The Moto X also was among the first phones to take advantage of a new activity-recognition feature that lives in Location Services in Android and can discern if a user is walking, driving, or standing still, among other states. The Android Wear developer preview encourages programmers to become familiar with using activity detection and even geofencing to trigger contextual notifications on wearables. For example, if your phone detects that you’re riding a bike, apps could automatically forward all notifications to the Wear-powered device on your wrist.
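
For the curious, consuming those updates looks roughly like this with Google Play services’ activity-recognition API (the notification-forwarding helper is hypothetical; how an app actually routes alerts to the watch is up to the developer):

import android.app.IntentService
import android.content.Intent
import com.google.android.gms.location.ActivityRecognitionResult
import com.google.android.gms.location.DetectedActivity

// Target of the PendingIntent registered with requestActivityUpdates().
// When the user appears to be cycling, route notifications to the wrist.
class ActivityService : IntentService("ActivityService") {

    override fun onHandleIntent(intent: Intent?) {
        if (intent == null || !ActivityRecognitionResult.hasResult(intent)) return

        val activity = ActivityRecognitionResult.extractResult(intent)
            .mostProbableActivity

        if (activity.type == DetectedActivity.ON_BICYCLE && activity.confidence > 75) {
            forwardNotificationsToWearable() // hypothetical helper
        }
    }

    private fun forwardNotificationsToWearable() {
        // App-specific: e.g., set a flag your notification listener checks.
    }
}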

If you still don’t think Android Wear is about motion and gestures as much as talking and tapping, take another look at Google’s own introductory video. There’s a rather comical scene in which a woman sprints to catch a plane, and her smartwatch detects the activity and automatically estimates how many calories she just burned; or the woman whose watch detects that she’s dancing and offers to look up the song that’s playing.

This last one in particular took me back to the floors of CES in Las Vegas this year where wearables abounded. Some of the more impressive devices were those that made use of programmable gestures. A small device called Kiwi demonstrated how it can be programmed to perform the same Shazam-like action when the user draws a musical note in the air — this is perhaps a little more intuitive than having to get jiggy with it anytime you’re curious about the title of a song.

And Google has clearly demonstrated that it is interested in merging gestures with contextual awareness as much as it is in getting us to speak to it no matter where we are.

In addition to its work on activity recognition in Android and with Motorola, Google recently bought a small Swiss app developer called Bitspin that is best known for making Timely, an Android app that is really a fancy alarm clock and makes use of — you guessed it — motion detection and gestures. What a, uh, “timely” acquisition that was for Google to make in the months leading up to the reveal of Android Wear.

Android Wear unveiled: LG G Watch and Moto 360 (pictures)

Expect Android Wear to ultimately go even further than simply responding to the flick of a wrist and figuring out if you’re walking or biking. In the full SDK, Google plans to introduce the ability to gather more sensor data. Android APIs currently include support for not just harvesting data from a phone’s accelerometer, but also from a gyroscope, and sensors for temperature, light, pressure, proximity, humidity, rotation, linear acceleration, and even magnetic fields.
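
To get a sense of that breadth on today’s hardware, enumerating a handset’s sensors takes only a few lines against the stock SensorManager API (nothing Wear-specific is assumed here):

import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorManager

// Prints every hardware sensor the device exposes: accelerometer,
// gyroscope, temperature, light, pressure, proximity, humidity,
// rotation, linear acceleration, magnetic field, and more.
fun listSensors(context: Context) {
    val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    for (sensor in sensorManager.getSensorList(Sensor.TYPE_ALL)) {
        println("${sensor.name}: type=${sensor.type} vendor=${sensor.vendor}")
    }
}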

That’s a whole lot of context that would be all the more powerful when paired with an arsenal of gestures.

Dick Tracy had part of the equation right — a good wearable needs to be able to be spoken to, but to be truly smart, understanding body language is just as important.

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/EcGo3vimkO4/

‘Star Wars: Episode VII’ tells tale 30 years after ‘Jedi’

Chewbacca? Han Solo? Luke Skywalker? The upcoming “Star Wars: Episode VII” will take place after events from “Return of the Jedi.” Here’s hoping for an elderly Ewok battle!


(Credit: Lucasfilm)

Some news for fans excited to find out what life is like post-Ewok battles and that metal bikini: Lucasfilm has announced a start date for shooting “Star Wars: Episode VII.” Director J.J. Abrams and his team will begin principal photography in May at London’s historic Pinewood Studios.

Lucasfilm also announced that “Star Wars: Episode VII” will take place about 30 years after the events of “Star Wars: Episode VI Return of the Jedi,” and will star “a trio of new young leads along with some very familiar faces.”

Whether those faces are Oscar-winner Lupita Nyong’o, “Girls” actor Adam Driver, or whether actors John Boyega, Matthew James Thomas, Ray Fisher, Jesse Plemons, and Ed Speleers are indeed officially vetted for “Star Wars” roles by director J.J. Abrams is anyone’s guess. “No further details on casting or plot are available at this time,” Lucasfilm said.

Familiar faces to appear in the new “Star Wars” film may include original trilogy stars Carrie Fisher, Harrison Ford, and Mark Hamill, though they are keeping an air of mystery around the project and their rumored roles. In a Reddit AMA in January, Hamill stated, “The only character I know for sure is returning is my friend R2-D2. He hasn’t stopped beeping about it.”

While casting rumors continue to invade the Web, the best bet for fans wanting to know the truth is to check regularly for announcements like these at the official Lucasfilm site, StarWars.com.

“Star Wars: Episode VII” will hit theaters on December 18, 2015.

Actor’s vintage ‘Star Wars’ shots revealed (pictures)

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/hVF5VoPUhE0/

James Bond’s lethal vehicles

James Bond’s iconic Aston Martin.

No, Mr. Bond, I expect you to drive…. Britain’s deadliest secret agent needs suitably lethal wheels fully stocked with extras, from rocket launchers to ejector seats. An exhibition opening this month in London lets you spy on 50 years of James Bond’s rides, from 007’s iconic 1960s Aston Martin to the miniature, one-third scale AgustaWestland helicopter used in the climax of “Skyfall”.

James Bond’s deadliest cars, boats, and jet packs (pictures)

Bond in Motion also features the BMW controlled by an Ericsson phone; the car driven by the one woman who managed to tie James Bond down; the jet pack from “Thunderball”; the submersible Lotus Esprit S1 from “The Spy Who Loved Me,” nicknamed “Wet Nellie”; and the mini-helicopter from “You Only Live Twice,” nicknamed “Little Nellie”.

We employed our spy skills to sneak in for an early look, so join us by hopping into the iconic Aston and taking the wheel for a high-speed chase through Bond history — oh, just don’t touch that big red button…

Bond in Motion is at the London Film Museum in Covent Garden from 21 March, running throughout 2014.

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/1Rr_QszUmco/

OK, Glass, have an NBA player dunk in my face

Sacramento Kings guard Ray McCallum slams home a dunk during a scrimmage while wearing Google Glass.


(Credit: James Martin/CNET)

SACRAMENTO, Calif. — “This is the real Google,” taunted Sacramento Kings guard Orlando Johnson.

Johnson leaned in, dribbling a basketball, ready to explode to the hoop. Only teammates Ray McCallum and Jason Thompson stood in the way. Through the Google Glass I was wearing, I watched Thompson prepare to stop Johnson. From Thompson’s exact point of view.

Moments earlier, I’d watched as McCallum had dribbled in, jumped high in the air, and dunked the ball hard. My view? A look at the rim from a couple of feet away, close enough to see the stitches on the net, again from Thompson’s vantage point.

Each of the three Kings was wearing Glass, and each was recording as they worked their way through an informal shootaround hours before the night’s game against the New Orleans Pelicans. As they played, they taunted and bragged, well aware of the technology they were wearing. “Google, record that,” one shouted as he made a sweet shot. “Google, stop Ray,” Johnson commanded.

Johnson, McCallum, and Thompson were wearing Google Glass as part of a new program the Kings have started that is designed to let fans see things like shootarounds, pre-game workouts, and even in-game huddles from the players’ perspective. Using technology developed by San Francisco’s CrowdOptic, the Kings plan on making feeds from Glass being worn by players, announcers, the team’s mascot, and even its cheerleaders, available during games to anyone running its app on their own Glass, on TV, and on the arena’s JumboTron. Unfortunately, players will not wear Glass during actual game action.

A Sacramento Kings cheerleader dances while wearing Google Glass. Her view was broadcast to the team’s JumboTron during a game against the New Orleans Pelicans.


(Credit: James Martin/CNET)

The Kings’ experiment is an interesting one that promises to offer fans a unique new look at game day action. Along with other experiments, like accepting Bitcoin, using drones to shoot video inside the team’s Sleep Train Arena, and even incorporating Oculus Rift, the Kings are trying to take the lead among NBA teams when it comes to using technology to enhance fans’ experiences.

And no wonder, given that the team’s ownership group is packed full of tech heavyweights like Tibco Software founder Vivek Ranadive; Paul, Hal, and Jeff Jacobs (whose father founded Qualcomm); Leap Motion President (and former Apple vice president) Andy Miller; and former Facebook chief privacy officer Chris Kelly. Thanks to those connections, the team, in its search for new tech to try out, is “literally one phone call away from every tech CEO in the world,” said Kings senior vice president for marketing and strategy Ben Gumpert.

But back to Glass. Here’s how it works.

When Glass records video, it can broadcast that feed; CrowdOptic’s software can capture it and send it back out, allowing anyone running its app to “inherit” the feed. Although there’s a short delay, it means that an average Glass wearer — or, later, someone running the CrowdOptic app on a smartphone — will be able to see just what I saw when I watched Thompson, Johnson, and McCallum play 1-on-2: an up-close and very personal view of getting dunked on.

NBA dons Google Glass to put you in the game (pictures)

To start with, the Kings bought 10 pairs of Glass, meaning that at any one time there are only a few possible feeds for fans to inherit. But over time, as the team buys more, or as fans’ own Glass or smartphone feeds are incorporated into the mix, CrowdOptic’s algorithms will be brought to bear to help find the most compelling views for fans. As Jon Fisher, the company’s CEO, explained, its technology is able to analyze multiple feeds coming from a similar location and choose the best one to share. Ultimately, when there are hundreds, or even thousands, of feeds to choose from, “the fans will be in charge,” said vice president of business development (and former NFL linebacker) Jim Kovach. “They’re going to see what they want to see.”

As far as the players are concerned, wearing Glass and using the hot wearable technology to give fans a little more access is a no-brainer. According to Thompson, the best way to use it is when doing “tricks and dunks, and flashy things….[You can] see different things, like the way people talk.”

That’s exactly what CrowdOptic is hoping pro sports teams will realize. In addition to the Kings, the company is working with a half-dozen other (as yet unnamed) NBA franchises, as well as some college teams. The technology, said Kovach, lets fans have a much closer look at players’ personalities. “They have their quirks, and you can’t pick that up from the stands,” Kovach said, referring to things like players messing around during workouts, or on the sidelines. “It’s just interesting to see.”

Sacramento Kings players Orlando Johnson, Ray McCallum, and Jason Thompson (left to right) scrimmage while wearing Google Glass.


(Credit: James Martin/CNET)

To be sure, this technology isn’t ready for widespread deployment. Though the Kings have tested it out during two recent games, the team has so far only pushed the feeds to the arena’s JumboTron screen. For now, network support is the limiting factor. But soon, Glass wearers will be able to see what it’s like to get dunked on by an NBA player.

“This is a new century,” Thompson said. “It’s 2014, and this is definitely the future, not just of basketball, but of the world.”

Then again, maybe McCallum put it better as he scrimmaged against Johnson and Thompson. “Oooooooh, Google,” the 22-year-old guard said as he drained a pretty bucket over his teammates.

Article source: http://feedproxy.google.com/~r/cnet/pRza/~3/giTuRIID3qM/