At its most basic, augmented reality (AR) refers to the superposition of supplemental digital information over a live view of a user’s environment. The dominant idea behind the technology is to provide more context, depth, experience and utility to everyday activities; for example, the first-down lines displayed in football broadcasts.
The military has employed head-up displays (HUDs), a form of AR, to provide visual cues for items such as airspeed, horizon line and weapon crosshairs for more than five decades, while Boeing was among the first to bring AR into the commercial workplace. In fact, Boeing researchers coined the term “augmented reality” when they developed a head-tracking HUD connected to a waist-mounted computer for overlaying airplane wiring schematics atop a sprawling assembly board.
As the hardware and applications associated with AR continue to evolve, those who benefit from it are growing in number. In recent years, especially with the rise of computing power and camera systems in handheld devices and eyewear (such as Google Glass), AR has made slow but steady inroads into people’s pockets (and onto their heads). Apps ranging from games to restaurant finders that display chain logos over a live camera view are now available.
But the question remains: Will AR technology have a place in the IT world and, if so, what sort of applications might be expected?
AR in IT

“In an office or IT environment, these augmented reality tools can be used for seeing information about the things being looked at,” says IT futurist, author and speaker Ross Dawson. “For example, there’s this idea of workflow learning, meaning we get necessary information or learning modules when we need them. If we’re working on a development project where a problem is particularly troublesome, we can call in help such that it looks like others are physically in the same room with us. Anybody from around the world can be right there, looking at and engaging with the same things.”
Dawson notes that the chances of AR taking off in corporate and data center environments are “quite high,” if only because humans are data-driven animals — even more so on the job than off. AR provides more information in a given situation in a more intuitive manner.
Consider scanning a shelf of inventory or lines of code. An AR system could visually highlight changes from the previous day and identify who made them.
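As a rough sketch of the data such an overlay would need, the snippet below diffs yesterday's and today's versions of a file and pairs each changed line with its author. The file contents and the `authors` mapping are invented for illustration; in practice this information would come from a version control system's history or "blame" output.

```python
import difflib

# Hypothetical file contents: yesterday's and today's versions.
yesterday = ["total = 0", "for x in items:", "    total += x"]
today = ["total = 0", "for x in items:", "    total += x * weight"]

# Hypothetical per-line authorship, as version control "blame" would report it.
authors = {"    total += x * weight": "r.dawson"}

# Collect lines added or changed since yesterday, tagged with who made them;
# an AR layer could then highlight these lines in the live camera view.
changes = []
matcher = difflib.SequenceMatcher(a=yesterday, b=today)
for tag, _, _, j1, j2 in matcher.get_opcodes():
    if tag in ("replace", "insert"):
        for line in today[j1:j2]:
            changes.append((line, authors.get(line, "unknown")))

print(changes)  # [('    total += x * weight', 'r.dawson')]
```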
In the same vein, IT groups are only now beginning to make the jump from data analytics into visualization, and from there into 3D visualization. AR offers an opportunity to bring visualization off the monitor and into the world. Imagine being able to see data center server hot spots or areas of network congestion as visible entities hovering above the real world. How much more effectively could IT pitch a project to the CFO if data projections could be seen as floating objects subject to collaborative manipulation as hand gestures represent the adjustment of different variables?
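A minimal sketch of the data side of such a hot-spot overlay: mapping per-rack temperature readings to an alert color that an AR renderer could float above each rack. The rack names, readings and thresholds here are invented for illustration.

```python
# Hypothetical temperature readings (degrees C) per rack in a data center.
readings = {"rack-01": 24.5, "rack-02": 31.0, "rack-03": 41.2}

def overlay_color(temp_c, warn=30.0, hot=40.0):
    """Map a temperature to a color an AR layer could render above the rack."""
    if temp_c >= hot:
        return "red"      # hot spot: needs immediate attention
    if temp_c >= warn:
        return "yellow"   # warming: keep an eye on it
    return "green"        # within normal range

overlays = {rack: overlay_color(temp) for rack, temp in readings.items()}
print(overlays)  # {'rack-01': 'green', 'rack-02': 'yellow', 'rack-03': 'red'}
```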
Naturally, many of the popular objections to AR will bleed over into the IT workplace. Justifiable concerns about privacy remain unanswered, and many object to the intrusive aesthetics of wearable systems such as Google Glass. Technology itself may remedy some of these concerns. The University of Washington and others are making progress toward LED-driven contact lenses, which might someday do away with cumbersome eyewear. Some objections will persist longer than others, but the merits of AR seem likely to outweigh the detriments over the long term.
“The applications are there,” says Dawson. “The degree to which we can make AR technologies look good and feel good will absolutely drive their uptake. But the value of providing more information, allowing people to do their jobs better, means that AR is very likely to grow dramatically and be a significant part of work life moving forward.”