In Insert Coin, we look at an exciting new tech project that requires funding before it can hit production. If you'd like to pitch a project, please send us a tip with "Insert Coin" as the subject line.
Now that Google Glass and Oculus Rift have entered the zeitgeist, might VR and AR products start popping up on every street corner? Perhaps, but Meta has just launched an interesting take on the concept by marrying see-through stereoscopic display glasses with a Kinect-style depth sensor. That opens up the possibility of placing virtual objects in the real world: you could "pick up" a computer-generated 3D architectural model and spin it around in your hand, for instance, or gesture to control a virtual display that appears on an actual wall.

To make it work, you connect a Windows PC to the device, which consists of a pair of 960 x 540 Epson displays embedded in the transparent glasses (with a detachable shade, as shown in the prototype above) and a depth sensor mounted on top. That lets the Meta 1 track your gestures, individual fingers and physical surfaces such as walls, all of which the PC processes with motion-tracking tech to create the illusion of virtual objects anchored to the real world.