Augmenting reality

Augmented reality, or AR, apps work by annotating real-world scenes with information gathered from the virtual world. Imagine wandering around a foreign city as a tourist: as you look up at a historic building, your view of it is enhanced with information scoured from the internet about the building’s designer, its construction style, its cultural and historical significance, and comments left by previous visitors. Then, as you turn around, information about a dozen other tourist attractions is added to the scene in front of you so you can choose your next destination. In future this could involve Geordi La Forge-style headsets, data projected onto your spectacles, or even tiny displays on a chip embedded inside the eye. But if you have an Android-powered mobile phone or an iPhone, augmented reality needn’t be a futuristic dream: there are apps available for both handsets now.

I got an HTC Hero running Android 1.5 just before Christmas, and have been playing around with two augmented reality apps: Layar and Wikitude. Both apps work by pinpointing your location using your handset’s GPS receiver (or cell location) and using its inbuilt compass to detect the direction you’re facing. The apps then trawl a variety of online sources for information about your surrounding area: Wikipedia (historical information), Panoramio and Flickr (photos), Twitter (general chatter), and Foursquare or Brightkite (comments from other visitors). As you hold your camera in front of you, all of this information is overlaid on your phone screen, on top of the image seen by the camera, in real time. It’s already a very impressive application of geotagged information on the web being delivered over the mobile internet to a powerful computing device in your hand, but I can see a wealth of other interesting ways this could be applied in future…
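
Under the hood, the core of both apps comes down to a bit of spherical trigonometry: work out the compass bearing from your location to each point of interest, compare it with the direction the camera is facing, and convert the difference into a position on the screen. The sketch below is my own illustration of that idea in plain Java – it is not Layar’s or Wikitude’s actual code, and the example coordinates, field of view and screen width are just illustrative.

```java
// A minimal sketch (not Layar's or Wikitude's actual code) of the core AR maths:
// given the phone's GPS fix and compass heading, work out where on the screen a
// geotagged point of interest should be drawn.
public class PoiOverlay {

    /** Initial bearing from the observer to the POI, in degrees clockwise from north. */
    static double bearingTo(double lat1, double lon1, double lat2, double lon2) {
        double phi1 = Math.toRadians(lat1), phi2 = Math.toRadians(lat2);
        double dLon = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dLon) * Math.cos(phi2);
        double x = Math.cos(phi1) * Math.sin(phi2)
                 - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
        return (Math.toDegrees(Math.atan2(y, x)) + 360) % 360;
    }

    /**
     * Maps the POI's bearing to a horizontal pixel position, given the compass heading
     * the camera is pointing at and its horizontal field of view.
     * Returns -1 if the POI is outside the camera's view.
     */
    static int screenX(double poiBearing, double cameraHeading,
                       double fieldOfViewDeg, int screenWidthPx) {
        double offset = ((poiBearing - cameraHeading + 540) % 360) - 180; // -180..180
        if (Math.abs(offset) > fieldOfViewDeg / 2) return -1;             // off-screen
        return (int) ((offset / fieldOfViewDeg + 0.5) * screenWidthPx);
    }

    public static void main(String[] args) {
        // Illustrative example: observer near Tower Bridge looking towards St Paul's Cathedral.
        double bearing = bearingTo(51.5055, -0.0754, 51.5138, -0.0984);
        int x = screenX(bearing, 300.0, 60.0, 480); // heading 300 degrees, 60 degree FOV, 480px-wide screen
        System.out.println("POI bearing " + Math.round(bearing) + " degrees, draw label at x=" + x);
    }
}
```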

For example, homebuyers in the market for a new-build house might wander around a building site where only the foundations have been laid, but through the camera on their phone they see what the buildings will eventually look like, how close together they will be, how the public spaces will be landscaped, and so on. A new, immersive take on the 3D fly-throughs often rendered by architects. Imagine wandering around the site of the Olympic Village in London and seeing how it will look two years from now (or wandering around at the opening ceremony in two years’ time and seeing how it looked before the bulldozers moved in).

This could have great applications in education too. Last year I visited the Port Arthur former penal colony in Tasmania: lots of old crumbling buildings that were once used as cells, workhouses, staff accommodation and so on. Today visitors wander round with a paper leaflet or follow a tour guide to learn about the site, but the experience would be much more immersive if that information were delivered straight to your phone screen. Imagine students seeing how the buildings would have looked 180 years ago, or pointing their phone at the ground on an archaeological site to see what lies beneath.

Another application of AR is giving directions – your AR display becomes your own personal signpost. Hold it in front of you and it shows you what is north, south, east or west of your current location, or which direction you should travel to reach your destination. I can see this being used in theme parks, festivals and large campuses, as well as for general directions from A to B.
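
Building on the bearing calculation in the earlier sketch, a personal signpost only needs to turn that bearing into something human-readable. Again, this is just a sketch of my own, with invented names and thresholds rather than anything taken from an existing AR app.

```java
// A small sketch of the "personal signpost" idea: given the bearing to a destination
// (e.g. from a bearingTo() helper like the one above) and the compass heading you're
// facing, produce a human-readable direction hint.
public class Signpost {

    static final String[] CARDINALS = {"N", "NE", "E", "SE", "S", "SW", "W", "NW"};

    /** Rounds a bearing in degrees to the nearest of eight compass points. */
    static String cardinal(double bearingDeg) {
        int index = (int) Math.round(((bearingDeg % 360) + 360) % 360 / 45.0) % 8;
        return CARDINALS[index];
    }

    /** How far, and which way, to turn from the current heading to face the destination. */
    static String turnHint(double destinationBearing, double currentHeading) {
        double offset = ((destinationBearing - currentHeading + 540) % 360) - 180;
        if (Math.abs(offset) < 10) return "straight ahead";
        return Math.abs(Math.round(offset)) + " degrees to the " + (offset > 0 ? "right" : "left");
    }

    public static void main(String[] args) {
        // e.g. facing north-east (45 degrees), destination off to the north-west (310 degrees)
        System.out.println("Destination is " + cardinal(310) + " of here, "
                + turnHint(310, 45));
    }
}
```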

Current apps overlay text or thumbnail images, but I see no reason why they couldn’t also overlay video – avatars of a tour guide, or locals from 100 years ago in period dress acting out their daily lives – the display screen becomes a portal that allows you to see the scene in front of you as it would have been at a different point in time.

I’m a Formula 1 fan, and I can imagine walking around a Grand Prix circuit and using AR to point out the names of the different corners of the race track, highlight significant events that happened in previous races at the spot where I’m standing, and even show replays on my phone screen, overlaid on the image of the track in front of me.

Current augmented reality apps are still in their infancy, and one of the problems I’m already seeing is that more and more apps are becoming available with no compatibility between them. If I want to see London Underground stations I have to install Layar; if I want to see Flickr photos I have to install Wikitude; and so on. To stop the fragmentation of augmented reality apps we need standardisation of the data format used for delivering geotagged data to the AR app. I suspect that the app developers are reluctant to standardise because they see “their” data as their USP (although Wikitude now lets users create their own data via the wikitude.me site). If the app developers can crack this, then there should be an exciting future ahead for augmented reality.
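
To make the idea concrete, here is a rough sketch of the kind of minimal, provider-neutral point-of-interest record a shared format might standardise on, so that any AR browser could render data from any source. The field names are my own invention for illustration, not part of any existing specification.

```java
// Purely illustrative: a minimal point-of-interest record of the kind a shared
// geotagged-data format might standardise on. These field names are invented for
// this sketch, not taken from any real spec.
public class GeoPoi {
    public final double latitude;      // WGS84 degrees
    public final double longitude;     // WGS84 degrees
    public final String title;         // label drawn on the camera view
    public final String description;   // longer text shown when the label is tapped
    public final String source;        // e.g. "wikipedia", "flickr", "foursquare"
    public final String url;           // link back to the original content

    public GeoPoi(double latitude, double longitude, String title,
                  String description, String source, String url) {
        this.latitude = latitude;
        this.longitude = longitude;
        this.title = title;
        this.description = description;
        this.source = source;
        this.url = url;
    }
}
```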
