
MGF 2012: Khronos' Neil Trevett on why augmented reality, computer vision and Kinect-style features are the future for mobile

#mgf2012 Making use of the predicted 100-fold performance boost

Dealing with hardware issues at Mobile Games Forum 2012 was Neil Trevett, who, as well as his day job as veep of mobile content at Nvidia, is president of the Khronos Group.

A good person then to talk about 'How next generation mobile devices are testing the limits of current processors'.

Fast, faster

Starting with an overview of the current situation, he pointed out that while smartphones are the largest-volume device sector, tablets have taken off faster.

"Over time, I think Android will dominant the mobile markets because of the diversity of OEMs supporting it," he argued. Of course, Nvidia hardware currently only runs Android.

"The market will end up like the Windows and Mac market for PCs, with 80 percent being Windows," he added.

As an example, Trevett pointed to high-spec, low-priced devices such as Asus' MeMO 370T 7-inch tablet, which runs Android 4.0 on Tegra 3 architecture and will be priced at $249 in the US.

Riding the wave

Nvidia plans to release a new generation of processors every 12 months, with Tegra 3 to be followed by chips codenamed Wayne, Logan and Stark, and a predicted system performance boost across CPU and GPU of around 75 times over 2010's Tegra 2.

So, Trevett asked, what should developers be doing with this power?

He argued that the most interesting trends will come from taking advantage of a device's sensors, such as using the camera as a Kinect-style sensor for image recognition and gesture processing.
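As a rough illustration of the idea (not anything Trevett demoed), camera-as-gesture-sensor work often starts with something as simple as comparing successive frames. A minimal OpenCV sketch that flags large frame-to-frame motion from a webcam might look like this; the motion threshold is an arbitrary assumption:

```python
# Minimal sketch: treating a webcam as a crude gesture sensor via
# frame differencing with OpenCV. Threshold values are illustrative.
import cv2

cap = cv2.VideoCapture(0)               # default camera
ok, prev = cap.read()
if not ok:
    raise RuntimeError("no camera available")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pixels that changed since the last frame indicate motion
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    if cv2.countNonZero(mask) > 50_000:  # arbitrary 'wave-sized' motion
        print("gesture-scale motion detected")

    prev = gray
    cv2.imshow("motion mask", mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```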

Trevett also pointed to the work being carried out on vision-based augmented reality, which he said would give these devices magical properties.

All about sensors

Part of this push comes from the industry standards Khronos Group.

Its StreamInput standard is designed to reduce fragmentation and improve co-ordination and synchronisation between sensors such as cameras, touch, microphones, wireless controllers, GPS and accelerometers, so their data can be made available to apps and games.
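To give a flavour of the problem StreamInput is chasing, here's a minimal sketch of aligning samples from two independent sensors onto a common clock. The SensorSample type and align() helper are purely illustrative assumptions, not part of any Khronos API:

```python
# Illustrative sketch: pairing samples from two sensors by timestamp
# on a shared monotonic clock. Not the StreamInput API.
from dataclasses import dataclass
from bisect import bisect_left

@dataclass
class SensorSample:
    timestamp_ns: int   # common monotonic clock
    value: tuple        # e.g. (x, y, z) for an accelerometer

def align(reference, other, max_skew_ns=5_000_000):
    """Pair each reference sample with the nearest-in-time sample from
    'other' (sorted by timestamp), dropping pairs that are too far apart."""
    times = [s.timestamp_ns for s in other]
    pairs = []
    for ref in reference:
        i = bisect_left(times, ref.timestamp_ns)
        candidates = [c for c in (i - 1, i) if 0 <= c < len(other)]
        if not candidates:
            continue
        best = min(candidates, key=lambda c: abs(times[c] - ref.timestamp_ns))
        if abs(times[best] - ref.timestamp_ns) <= max_skew_ns:
            pairs.append((ref, other[best]))
    return pairs
```

The point of a standard layer is that this kind of bookkeeping happens once, below the app, rather than being reinvented per game for every combination of handset and sensor.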

Even more advanced is the codenamed Computer Vision Hardware Acceleration Layer (CV HAL), a low-level library that hardware vendors can plug into.

Both standards are expected to be made available in their initial forms in 2012.