A few posts back, we delved into the world of first-person user interfaces. This is a fascinating world where, rather than offering a convenient extension of reality, user interface design actually mimics reality. First-person interfaces let users experience design without leaving the familiar perspective of real life behind.
Two great examples of first-person user interface design can be found in augmentation and direct interaction. Augmentation works on an informational level: it enhances real-world imagery with a wide variety of helpful facts and images about a user's surroundings. Direct interaction lets users supply input, such as their location, in exchange for a real-time image or list of features tailored specifically to them.
In this post, we’ll take a look at these two examples as well as a few sites that demonstrate how each works in design. We’ll begin with augmentation.
The best thing about the first-person user interface is its ability not only to mimic reality but also to take it up a notch. A first-person interface can augment real-world images with information not normally visible. These augmenting applications layer user interface features onto heads-up or camera images to inform and expand our sense of vision and place.
For example, using IBM's Seer application during the Wimbledon tennis tournament, users can see how long the wait is for a taxi or for an afternoon Coke, even while they manipulate navigation tools to get their bearings.
Even more fascinating is an application called Wikitude. It takes user interface elements to the next level by displaying augmented information based on the direction the user points the camera. Excellent for travelers and tourists, it provides details such as landmarks and interesting facts and statistics about the user's current location.
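The core idea behind direction-based display — showing only the points of interest that fall within the camera's current field of view — can be sketched in a few lines. This is a simplified illustration, not Wikitude's actual code; the POI names, bearings, and function names are all invented for the example.

```python
import math

# Hypothetical points of interest: (name, compass bearing in degrees from the user)
POIS = [("Museum", 40.0), ("Cafe", 95.0), ("Bridge", 350.0)]

def angle_diff(a, b):
    """Smallest absolute difference between two compass bearings (0-180)."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def visible_pois(camera_heading, fov=60.0, pois=POIS):
    """Return names of POIs whose bearing lies within the camera's field of view."""
    half = fov / 2
    return [name for name, bearing in pois
            if angle_diff(bearing, camera_heading) <= half]

print(visible_pois(10.0))  # → ['Museum', 'Bridge']
```

Note how the bearing comparison wraps around 360°, so a landmark at 350° is still "in view" for a camera pointing at 10°.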
A variety of design elements tie these apps together. First, each makes good use of a bottom-right corner indicator, giving users a summary of direction that can be adjusted to preference. In addition, all of the applications use icons as a legend, allowing users to select information of interest.
For example, there are icons for restrooms and nearby eateries, as well as ATM locations and a host of other possibilities. In Wikitude, the icons also display the distance of each attraction or point of interest from the user's current position. Others, like Layar, let users pick and choose the type of information they want displayed.
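Those distance labels boil down to computing the great-circle distance between the user and each point of interest. A minimal sketch using the haversine formula follows; the coordinates are illustrative, not taken from any of the apps above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two latitude/longitude points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Distance from a user near Wimbledon to central London (illustrative coordinates)
d = haversine_km(51.4343, -0.2144, 51.5074, -0.1278)
print(f"{d:.1f} km")
```

An app like Wikitude would run this against every nearby POI and attach the result to each icon.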
Interacting with the Stuff Around You
Direct-interaction user interfaces work through informational input from the user. For instance, users simply take a picture of a CD, DVD, book, or video game, and SnapTell identifies it, returning a detailed summary with price and reviews.
The Augmented ID concept from TAT is a proposed application in which facial recognition is the key input. Here, with a click of the camera, the user receives digital information about people in the immediate vicinity.
Sometimes, rather than a digital photo, the key input is an object. For instance, the Touch research project interprets objects with RFID tags attached, as the video below illustrates.
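The pattern behind RFID-based interaction is straightforward: the ID read from a tag keys into a database of object descriptions, and the interface reacts to whatever comes back. A toy sketch of that lookup step — the tag IDs, records, and function names here are all hypothetical:

```python
# Hypothetical mapping from RFID tag IDs to object metadata
TAG_DB = {
    "04:A3:2B:1C": {"object": "coffee mug", "action": "show washing instructions"},
    "04:9F:77:D0": {"object": "library book", "action": "show due date"},
}

def handle_tag(tag_id):
    """Look up a scanned RFID tag and describe the associated object."""
    record = TAG_DB.get(tag_id)
    if record is None:
        return "Unknown tag"
    return f"{record['object']}: {record['action']}"

print(handle_tag("04:9F:77:D0"))  # → library book: show due date
```

In a real system the dictionary would be a remote service and the tag ID would arrive from an NFC/RFID reader, but the interaction model is the same.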
The 6th Sense project from the MIT Media Lab takes things one step further, with a projector that displays the price, weight, size, and more directly on each product as you pick it up in a store or library.
All this is just the tip of the iceberg. With first-person interface design, the sky is the limit. It's exciting to think about what could possibly come next. Feel free to share your ideas.