
Envision Center hosting Google Project Tango technology demo for faculty researchers

  • Envision Center, STEW B001

Close your eyes. Walk into the room with the tablet computer in your hands and your ears open. Release your fear of stumbling blindly into a conference table, chair, podium or wall.

“The force” is not guiding you around the obstacle-strewn space. No Jedi trick here; it's more like sonar in a submarine or the echolocation of bats. Your guide is the variation in the sounds coming from the tablet, which clues you in on whether you're about to bruise yourself on an obstacle or are navigating a clear path.

The device enabling this is from a Google research and development effort called Project Tango that now includes Purdue’s Envision Center. The center is looking for faculty to partner on research projects making use of the technology, which the center’s staff will demonstrate at an event set for noon to 1 p.m. Friday, April 24.

The demonstration will be in the Envision Center, Stewart Center, Room B001, located off the tunnel between the Stewart Center and the Purdue Memorial Union. No registration is required. For more information contact George Takahashi, Envision Center technical lead, 46-67888, gtakahas@purdue.edu.

“We want to let researchers know we have this device and some of the ways we’re using it,” Takahashi says. “We would like to get other faculty interested in partnering on research with it related to any ideas they might have for employing technology like this.”

Google says the idea of Project Tango is to give mobile devices the ability to navigate the physical world like humans do by combining advanced computer vision, image processing and sensors.

Google's prototype Project Tango tablet uses infrared sensors to detect objects and surfaces in the surrounding environment in real time. That depth data can be combined with data from cameras, accelerometers, gyroscopes, GPS capability and other features now common in smartphones and tablets.

An Envision Center-developed application enables the Google tablet to convert the detected information into sound to aid unsighted navigation. The same process could also be used for navigating virtual environments without being able to see them. The center developed the app in partnership with Brad Duerstock, professor of engineering practice and director of the Purdue Institute for Accessible Science.
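To make the idea concrete, here is a minimal sketch of one way depth readings could be turned into audio cues, in the spirit of the sonar-like feedback described above. The Envision Center's actual app is not public, so the function names, ranges and mappings below are illustrative assumptions, not its real implementation.

```python
def depth_to_audio_cue(depth_m, min_depth=0.5, max_depth=5.0):
    """Map a distance reading in meters to a (frequency_hz, volume) pair.

    Hypothetical mapping: nearer obstacles produce a higher-pitched,
    louder tone, so the listener hears urgency as an obstacle approaches.
    """
    # Clamp the reading to a plausible sensor range.
    d = max(min_depth, min(max_depth, depth_m))
    # Normalize proximity: 0.0 at maximum range (far), 1.0 at minimum (near).
    proximity = (max_depth - d) / (max_depth - min_depth)
    # Sweep pitch from A3 (220 Hz, far) up toward A6 (1760 Hz, near).
    frequency_hz = 220.0 + proximity * (1760.0 - 220.0)
    # Keep a faint tone even at full range so the user knows the app is live.
    volume = 0.1 + 0.9 * proximity
    return frequency_hz, volume

# An obstacle 1 m away yields a higher, louder tone than one 4 m away.
near = depth_to_audio_cue(1.0)
far = depth_to_audio_cue(4.0)
```

A real app would feed these values to the tablet's audio output for each frame of depth data; the sketch only shows the distance-to-sound mapping itself.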

Operated by ITaP, the Envision Center uses a blend of technology and art to help enhance research and teaching, among other ways by graphically representing data and information. It specializes in technology and techniques such as data visualization and analysis; virtual simulation; human-computer interaction; and media creation, including video, animations and publication-quality stills.

Besides operating the hardware and software, expert staff and students at the Envision Center consult on ways to use the technology in research and teaching. They also collaborate on grants, including building proof-of-concept demos for proposals.
