Google’s Project Tango will enable cool augmented reality games (hands-on demo)

Larry Yang of Google demonstrates Project Tango's augmented reality shooter game.

Larry Yang, a former member of Microsoft’s Xbox 360 hardware team, recently gave me a demo at his new job at Google. Yang is the lead project manager for Project Tango, which equips a mobile device with 3D-sensing capabilities so that developers can create games and other apps that are aware of their surroundings. He showed me a game that uses Tango, and it was like stepping into a future where games, 3D sensing, motion tracking, cameras, and animated overlays all combine into a very cool augmented reality experience.

Project Tango is yet another way to create the feeling of AR, in which animated objects are inserted into a real 3D space and viewed through special glasses or a tablet’s screen. In our demo, we used a modified tablet, but the same approach could work with upcoming AR glasses. Augmented reality is expected to become a $120 billion market by 2020, according to tech advisor Digi-Capital. But first, companies such as Google have to build the platforms that make it possible. Google is demoing the technology this week at the 2016 International CES, the big tech trade show in Las Vegas.

With Tango’s technology, the mobile device uses its sensors to detect the physical space around it and then inserts animations into that 3D space. So you could hunt killer robots through your own home and shoot them with your smart device.
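
For developers, the core trick looks roughly like this: the device continuously tracks its own pose (position and orientation) in a fixed world frame, and every frame the app re-expresses a world-anchored virtual object in the camera’s coordinate frame before drawing it. Here is a minimal, self-contained sketch of that math in Java; the Vec3 and Pose helpers are hypothetical illustrations, not part of any Tango SDK.

```java
// A minimal sketch (not Google's code) of the core AR idea behind Tango: a
// virtual robot is anchored at a fixed point in the room, and each frame the
// device's tracked pose is used to express that point in the camera's
// coordinate frame before rendering it.
public final class ArAnchorSketch {

    // Hypothetical helper: a 3D point/vector.
    static final class Vec3 {
        final double x, y, z;
        Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
        @Override public String toString() { return "(" + x + ", " + y + ", " + z + ")"; }
    }

    // Hypothetical helper: a device pose, i.e. where the tablet is and which way it faces.
    static final class Pose {
        final double[] q; // orientation as a unit quaternion {x, y, z, w}
        final Vec3 t;     // position in the fixed world frame, in meters
        Pose(double[] q, Vec3 t) { this.q = q; this.t = t; }
    }

    // Express a world-anchored point in the device (camera) frame:
    // p_device = q^-1 * (p_world - t) * q
    static Vec3 worldToDevice(Pose pose, Vec3 p) {
        Vec3 d = new Vec3(p.x - pose.t.x, p.y - pose.t.y, p.z - pose.t.z);
        double x = -pose.q[0], y = -pose.q[1], z = -pose.q[2], w = pose.q[3]; // conjugate
        double tx = 2 * (y * d.z - z * d.y);
        double ty = 2 * (z * d.x - x * d.z);
        double tz = 2 * (x * d.y - y * d.x);
        return new Vec3(
                d.x + w * tx + (y * tz - z * ty),
                d.y + w * ty + (z * tx - x * tz),
                d.z + w * tz + (x * ty - y * tx));
    }

    public static void main(String[] args) {
        // A virtual robot anchored 2 meters in front of where tracking started.
        Vec3 robotWorld = new Vec3(0, 0, -2);
        // The device has walked 1 meter forward with no rotation (identity quaternion).
        Pose devicePose = new Pose(new double[] {0, 0, 0, 1}, new Vec3(0, 0, -1));
        // The robot now appears only 1 meter in front of the camera.
        System.out.println("Robot in camera frame: " + worldToDevice(devicePose, robotWorld));
    }
}
```

In a real app, the device-frame point would then be projected through the camera intrinsics onto the screen, so the robot appears pinned to its spot in the room no matter how you move the tablet.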

Tango taps technologies such as computer vision, image processing, and special vision sensors. When I arrived at Google’s headquarters in Mountain View, Calif., Yang greeted me with a tablet that had Tango running on it. I asked to go to the restroom. He held the tablet out in front of him and ran a query for the restroom. The screen showed the live scene in front of him, overlaid with an animated green line that led the way through the building. Yang followed the green line through the corridors to the restroom. The tablet knew exactly where he was and which direction he was facing, thanks to the motion-tracking sensors.
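
That continuous stream of poses is what Tango’s motion-tracking API exposes to apps. The sketch below is only loosely based on the Tango Java API (com.google.atap.tangoservice) as it was documented at the time; treat the exact class names, config keys, and callback signatures as approximations to check against the SDK rather than a definitive listing.

```java
// A rough sketch of subscribing to Tango pose updates. Names are recalled from
// the Tango Java API documentation of the time and may not match the final
// SDK exactly.
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import com.google.atap.tangoservice.TangoCoordinateFramePair;
import com.google.atap.tangoservice.TangoEvent;
import com.google.atap.tangoservice.TangoPoseData;
import com.google.atap.tangoservice.TangoXyzIjData;

import java.util.ArrayList;

public class MotionTrackingSketch {
    public void startTracking(Tango tango) {
        // Enable motion tracking in the session config.
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        tango.connect(config);

        // Ask for the device's pose relative to where tracking started.
        ArrayList<TangoCoordinateFramePair> framePairs = new ArrayList<>();
        framePairs.add(new TangoCoordinateFramePair(
                TangoPoseData.COORDINATE_FRAME_START_OF_SERVICE,
                TangoPoseData.COORDINATE_FRAME_DEVICE));

        tango.connectListener(framePairs, new Tango.OnTangoUpdateListener() {
            @Override
            public void onPoseAvailable(TangoPoseData pose) {
                // pose.translation is the device position (meters) and
                // pose.rotation its orientation as a quaternion -- enough to
                // know where the tablet is and which way it is facing.
            }

            @Override
            public void onXyzIjAvailable(TangoXyzIjData xyzIj) { /* depth handled elsewhere */ }

            @Override
            public void onFrameAvailable(int cameraId) { }

            @Override
            public void onTangoEvent(TangoEvent event) { }
        });
    }
}
```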


Above: I’m actually shooting robots on a screen, not the guy sipping his coffee.

Image Credit: Dean Takahashi

And once Yang mapped out the Google building’s interior, Tango remembered the layout. Project Tango devices can use visual cues to help recognize the world around them. They can self-correct errors in motion tracking and become reoriented in areas they’ve seen before.
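
This “area learning” capability was also exposed to developers: learning mode records an area description of a space, and loading that saved description later lets the device relocalize and correct drift against what it has already seen. The sketch below is again only loosely based on the Tango Java API of the era, with config keys that may not match the SDK exactly.

```java
// A sketch of how area learning was configured (names approximate): one
// session walks the building with learning mode on so Tango can build a map,
// and later sessions load that saved area description so the device can snap
// its pose back into place when it recognizes a familiar area.
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;

public class AreaLearningSketch {
    // First visit: record the building's layout while tracking motion.
    TangoConfig learningConfig(Tango tango) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_LEARNINGMODE, true);
        return config;
    }

    // Later visits: load a previously saved area description, identified here
    // by a UUID string obtained from the Tango service's list of saved areas.
    TangoConfig relocalizingConfig(Tango tango, String savedAreaUuid) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_MOTIONTRACKING, true);
        config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION, savedAreaUuid);
        return config;
    }
}
```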

When we arrived at a break room, Yang showed me a demo of a shooter game called Project Tango Blaster.


Above: Here’s the kind of image we saw on the Tango Blaster tablet screen.

Image Credit: Google

Tango Blaster uses the motion-tracking ability of the Project Tango developer platform, an Android tablet modified to include a wide-angle camera and a 3D depth sensor, both of which feed the Project Tango software stack.
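
The depth sensor is what gives a game like Tango Blaster its sense of the room’s real geometry: each frame it delivers a cloud of 3D points lying on the surfaces in view. The hedged sketch below (again loosely based on the Tango Java API, with approximate names) shows one way an app might scan that point cloud, for example to stop a virtual robot before it walks through a real table.

```java
// A sketch of reading the depth sensor's point cloud (names approximate).
// Each point is an (x, y, z) position in the depth camera's frame, in meters.
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import com.google.atap.tangoservice.TangoXyzIjData;

public class DepthSketch {
    // Enable depth sensing in the session config.
    TangoConfig depthConfig(Tango tango) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        config.putBoolean(TangoConfig.KEY_BOOLEAN_DEPTH, true);
        return config;
    }

    // Called with the data from onXyzIjAvailable(): find the nearest real
    // surface roughly straight ahead of the camera so a virtual robot can be
    // made to stop or steer around it.
    double nearestObstacleAhead(TangoXyzIjData xyzIj) {
        double nearest = Double.MAX_VALUE;
        for (int i = 0; i < xyzIj.xyzCount; i++) {
            float x = xyzIj.xyz.get(3 * i);
            float y = xyzIj.xyz.get(3 * i + 1);
            float z = xyzIj.xyz.get(3 * i + 2); // depth along the camera axis
            // Only consider points in a narrow corridor directly ahead.
            if (Math.abs(x) < 0.2 && Math.abs(y) < 0.2 && z > 0 && z < nearest) {
                nearest = z;
            }
        }
        return nearest;
    }
}
```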

Yang slotted the tablet into the top of a toy gun and pointed it in one direction as the game started. He fired at the killer robots in front of him, but he had to keep spinning around to find more of them. The robots wandered around the room, maneuvering around tables and chairs. I tried it out and had to shoot the robots quickly before a timer ran out. It got me moving and sweating a little bit. So yes, it’s another one of those game experiences that get lazy gamers off the couch.

Check out the video below of Yang demonstrating Project Tango. Here’s a link to a bunch of the other apps.

CES2016 - 6