[Image: a person wearing augmented reality glasses (note: characters are fictional)]
Do you think your natural senses are inadequate? Have you ever wished for more information about your surroundings? Augmented reality (AR) glasses could solve your problem. AR refers to the idea of overlaying a computer display on top of your field of view. Last week Google demonstrated its work on this technology, Project Glass, although the focus was on the glasses' camera rather than the display. Project Glass is still a few years away from being a consumer product. Right now it seems good only for streaming videos and pictures, though we are told it has a small computer display. The concept video below shows Google's idea of how we might one day use such technology, but that level of versatility could be many years in the making.

Many people use their smartphones to compare prices for products seen in "brick and mortar" stores. This requires a shopper to take the smartphone out of a pocket and either snap a photo or manually search for the product. With AR glasses, one could simply stare at a product for a few seconds, and online prices would automatically appear in the shopper's vision. The glasses should make accessing information feel more natural and personal, to the point where you forget you are using technology to look up prices. In the eyes of the user, the physical and digital worlds blend seamlessly into an information-rich environment.

One major obstacle for this technology is control. How do you communicate with a pair of glasses? The three most likely solutions are voice commands, gesture commands, and using a smartphone as a remote. It seems natural to build AR glasses like the next generation of Bluetooth headsets: an extension of a smartphone rather than a standalone device. That way, many verbal commands could be processed by software like Siri, and much of the time spent handling the smartphone would be eliminated. Another method of control is gesture recognition, a field where much progress has already been made through Microsoft's Kinect and MIT's SixthSense. To take a picture, a user could frame the shot with their hands; the glasses would recognize the gesture and act accordingly. Pointing at something might initiate a search query. A handshake could activate a social media function. Sign language could be translated in real time.
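The gesture-to-action idea above amounts to a simple dispatch table: the recognizer names a gesture, and the glasses look up what to do. Here is a minimal Python sketch of that mapping; the gesture names and actions are invented for illustration, and a real system would pair this with an actual camera-based recognizer.

```python
# Hypothetical sketch of gesture dispatch for AR glasses.
# Gesture labels and handlers are made up for illustration only;
# a real recognizer (Kinect-style, camera-based) would emit the labels.

def take_photo():
    return "photo captured"

def start_search():
    return "search started"

def share_contact():
    return "contact shared"

# Map each recognized gesture to its handler, mirroring the examples
# in the text: framing a shot, pointing, shaking hands.
GESTURE_ACTIONS = {
    "frame_with_hands": take_photo,
    "point": start_search,
    "handshake": share_contact,
}

def handle_gesture(gesture: str) -> str:
    """Run the action for a recognized gesture, if one is registered."""
    action = GESTURE_ACTIONS.get(gesture)
    return action() if action else "unrecognized gesture"
```

The hard part, of course, is the recognition itself; once a gesture is labeled, routing it to an action is trivial.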

Google's main selling point seems to be that parents will want an unobtrusive way of filming their children. When you photograph your month-old baby with AR glasses, the baby is looking into your eyes, not into a camera. Parents wearing the glasses all day could start recording at a moment's notice. The baby's first words, first steps, and countless other moments that might otherwise go unrecorded would be preserved. But this accessibility extends beyond parents. Think of YouTube videos. How many more embarrassing moments, personal injuries, and hilarious cat videos will be uploaded when people wear cameras on their heads most of the time? Tutorial videos might also get a boost in quantity, with instructors providing a first-person view of what they are doing. AR glasses could also have telepresence applications: anyone wearing a pair could broadcast a live video stream (as Google demonstrated with the skydivers and bicyclists). Concerts, conferences, street performances, movies, and many other events could have countless online viewers seeing through the eyes of someone who is physically present.

Google's Project Glass is loaded with sensors. It can sense the wearer's global position, direction of view, and viewing angle. With so much data ready to be streamed online, Google could create a highly detailed and constantly updated digital model of the world. Nature trails, tour routes, store departments, museums, caves... all could be easily mapped and uploaded to Google Earth. Photos could be sent automatically to Street View. Anyone wearing Google glasses would serve as a virtual cartographer for Google Earth. With so much geographical data, people could have GPS guidance to their favorite book section, the nearest food stand, or even the nearest restroom.


  1. Another competitor for Google's Project Glass is the Olympus MEG4.0, which attaches to the frames of ordinary glasses. It connects to a smartphone via Bluetooth and works as a wearable display rather than as an independent system.

  2. Obviously, this technology raises serious concerns about privacy (and piracy). While wearable displays give people little to worry about, wearable cameras are at the heart of the controversy. Having cameras aimed in your direction while you eat at a restaurant can be unnerving, and might even lead to (alleged) assaults. Andy Ihnatko, Dan Benjamin, and the folks at Engadget do a great job of discussing the social consequences Google glasses might bring.