How would you like to bring your design style into the real world? This may at first seem like a daunting task. It doesn’t have to be, of course, but, most likely because of that perceived difficulty, software interface design rarely reflects real-world experiences.
However, the real world we each experience first-hand will translate more and more into user interface design, producing what are called first-person user interfaces. Here, user interfaces take note of the real world and deliver it, unaltered, through specific design features.
Because this real-world style matches the way people actually perceive their surroundings, it fits naturally into the navigation context. In this post, we’ll take a look at how first-person user interfaces help us bring reality to navigation devices and see, for ourselves, just how easy it’s becoming to treat the real world as one great big user interface.
Navigating the Real World
It’s common for people to lose their cell phones, but these days it’s also common for your cell phone to know exactly where you are. The advent of GPS and cell-tower triangulation has made getting lost a thing of the past. Moreover, clever combinations of location and directional orientation, such as a digital compass used alongside GPS-enabled applications, now make it possible to reveal both position and direction.
A world of new ways of using interfaces then opens up for designers. Now, designers can design based on a user’s current location and the direction they’re facing. Take a look at the TomTom navigation system screens shown below. Notice the difference in perspective from left to right.
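To make the location-plus-direction idea concrete, here is a minimal sketch, independent of any particular device API: given the user’s GPS coordinates and a point of interest, it computes the compass bearing the user would have to face to look straight at it, using the standard great-circle bearing formula. The coordinates in the usage example are hypothetical.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial compass bearing (degrees, 0 = north, clockwise)
    from point 1 to point 2. Coordinates are in decimal degrees.
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    # atan2 gives -180..180; normalize to a 0..360 compass bearing.
    return math.degrees(math.atan2(y, x)) % 360.0

# Hypothetical example: a user in central London looking toward a
# point slightly to the west; the result is roughly 274 degrees.
print(round(bearing_to(51.5007, -0.1246, 51.5014, -0.1419), 1))
```

Comparing this bearing with the heading reported by the device’s digital compass is enough to tell whether the target is in front of the user or off to one side.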
The beauty of the first-person interface lies in its ability to directly translate real-life perspectives. When people are on the go, this is, without question, an excellent convenience, as following a route becomes significantly easier with it than without. First-person user interfaces help eliminate the time and frustration spent decoding 2-dimensional images into a 3-dimensional context.
The TomTom software essentially recreates real-world imagery and, as in the latest version of its software, overlays graphics and text that let the user follow a given route much more easily. But what happens when the real world is not just recreated, but simply photographed, line by line and curb by curb?
Nearest Tube is a first-person navigation application that uses a cell phone camera’s changing views to locate the nearest subway stations. Users simply aim the camera in the direction they’re headed, and the application delivers the information needed, using red markers to indicate the locations of subway entrances.
The pointers in the video above show only the subway stations within the user’s current field of vision, but much more is possible. There are actually three different types of information delivered based on the way the camera is held: the locations of subway lines, nearby subway stations, and subway stations farther off in the distance. Tilting the camera gradually delivers everything you’d need to know about subway stations, near and far, based on your current location.
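A plain sketch of how such an overlay might decide what to draw (the station data, function names, and thresholds here are hypothetical, not Nearest Tube’s actual logic): keep only the stations whose bearing from the user falls within the camera’s horizontal field of view, then split those into near and distant tiers by walking distance.

```python
# Hypothetical station list: (name, bearing from user in degrees,
# distance in meters). In a real app these would come from the GPS
# fix plus a bearing calculation against known station coordinates.
STATIONS = [
    ("Westminster", 10.0, 250),
    ("Embankment", 45.0, 900),
    ("Waterloo", 200.0, 600),
]

def angle_diff(a, b):
    """Smallest signed difference between two compass headings (degrees)."""
    return (a - b + 180.0) % 360.0 - 180.0

def visible_stations(heading, fov=60.0, near=500.0):
    """Split stations inside the camera's field of view into near/far tiers."""
    in_view = [s for s in STATIONS
               if abs(angle_diff(s[1], heading)) <= fov / 2]
    near_tier = [name for name, _, dist in in_view if dist <= near]
    far_tier = [name for name, _, dist in in_view if dist > near]
    return near_tier, far_tier

# Facing roughly north (heading 20 degrees) with a 60-degree field of view:
print(visible_stations(20.0))  # → (['Westminster'], ['Embankment'])
```

The `angle_diff` helper handles the wrap-around at 360°, so a station at bearing 355° still counts as “in view” for a user facing 5°.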
In many ways, design has taken on a defined role as reality extension rather than reality reflection. This is understandable when one thinks of the trade-offs involved – better communication, more useful conveniences and so on. However, times are changing and moving fast toward fascinating real-world user interface experiences that give an interesting take on the meaning of “reality.”