If you’re distracted while flying, it is difficult to shoot your enemy. During World War II — a notoriously distracting period in human history — this profound observation provoked the development of a variety of largely mechanical user aids that we now categorize as “head-up displays” (HUDs): effectively transparent information displays that allowed fighter and bomber pilots to keep their attention on the actual horizon, not the gauges or handheld maps inside the cockpit, the better to put bullets into enemies or bombs on top of their war stuff.
Despite what you might think based on Iron Man or Star Wars, layering information is only “futuristic” if you ignore the incremental practical progression from gun sights (raised tabs of metal that, when aligned, show that a straight barrel sits more or less on a ray from the gun to the target) to reticles (crosshairs and other aiming marks) to simulated horizon lines or projected radar displays, which, during the multinational innovation laboratory of a global conflict, progressed pretty dang quickly. (Am I saying that gun sights were the first augmented-reality technology? Well, I’m not not saying it.)
At the beginning of World War II, airplane armament was aimed using reflector sights and gyroscopically stabilized reticles that could “lead” a target — clever stuff, but not hugely advanced over technology used during the previous World War. By the end of WWII, some bombers had onboard microwave radar systems with television-based displays, although only a select few specialized night fighters, like the zippy wooden-framed de Havilland Mosquito, were equipped with what we would now consider a true HUD, which reflected the radar information, including an artificial horizon, onto a piece of glass just in front of the pilot controls.
So that’s the head-up display, completely sorted in the early ’40s: Take some hopefully useful information from some electronics, project it onto some glass, keep your hands on ten and two, try not to put your machine where any bullets are, and hopefully get your airframe safely back home. And while postwar engineers continued to develop the HUD for the burgeoning commercial aircraft market, it didn’t take too long before automakers — staffed by many veterans themselves — began to see the utility of a HUD in a car, albeit at a far more leisurely pace.
Designers at General Motors were at least sketching out the idea of putting a HUD in a vehicle by 1965, during development of the Mako Shark II, a concept car that informed the curves of the late-’60s, early-’70s C3 Chevrolet Corvette. (The fiberglass-bodied one, stereotypically driven by pricks in period movies, but also astronauts and half the population of the mid-century Midwest. They’re beautiful cars, and readily available for purchase at a reasonable price even today, since Chevy sold zillions of them, and they look faster than they actually drive.) However, the HUD never made it off the page into any Mako Shark II concept car that was actually built.
It would take more than 20 years before a regular car buyer could actually purchase a car with a head-up display. After the acquisition of Hughes Aircraft by General Motors in 1985, and a subsequent merger with GM’s in-house electronics division Delco, the faintly phosphorescent stars aligned: Fifty 1988 Oldsmobile Cutlass Convertibles (Indy Pace Car Edition) were equipped with a Hughes-derived head-up display that projected a digital speedometer and turn-signal indicators onto the windshield. (Full disclosure: I’ve done freelance work with General Motors in the past.) General Motors hired legendary test pilot Chuck Yeager to drive the drop-top Cutlass around the track during the car’s launch, the better to tie together the whole aircraft-lineage story, and soon began to offer a HUD as an option across the company’s various car brands (including on Corvettes), although, since that first Cutlass run was a limited edition, Nissan arguably beat GM to the first mass-market car HUD with the 1989-model 240SX and Maxima.
These days, almost every luxury car brand offers at least an optional head-up display that does pretty much the same thing: reflect some information from a small TFT panel onto the windshield, usually something similar to what is shown on the instrument cluster display — speed, GPS-guided turn-by-turn directions, maybe what song is streaming through Spotify. (The rebooted Lincoln Navigator’s head-up display is a good example — large, elegantly kerned, art-directed with restraint.)
Yet driver affinities for HUDs are decidedly mixed. Some drivers I know wouldn’t buy a car without a HUD; others find them an annoyance. (Famously driver-focused Porsche doesn’t offer a HUD in its sports cars at all, though it finally relented and offers them as options in its SUV and grand tourer, the Cayenne and Panamera.)
Even the best automotive HUDs share a simple technical limitation: They can only project a two-dimensional image into the driver’s field of view. You may be able to fiddle with a dial and change exactly where the ghostly pane floats in your view, but a two-dimensional display can only ever look like a translucent tablet screen superimposed over the real world.
“I cannot emphasize this variable enough,” says Juliana Clegg, CEO of Falcon AR, one of several companies designing what they hope will be the car head-up display of the future. “Depth, depth, depth.”
“Depth — variable depth between the driver and, say, 30 meters in front of the vehicle — is vital to make a real safety impact.”
Volumetric HUDs, like those from Falcon AR or competitors like WayRay, promise something like true augmented reality: not just projecting flat information in front of a driver’s face, but positioning little overlays — lane markers, or GPS arrows — so that they appear to float in the real world. A tiny, flat arrow that indicates a left turn is handy, but your phone or touchscreen center-stack display can already do that, if you glance down or to the side. A 3-D arrow that curves in front of you to show exactly where the next intersection is in real life is much handier.
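The difference between a flat overlay and a depth-anchored one comes down to basic perspective geometry. Here’s a minimal sketch (my own illustration, not any vendor’s actual system — the anchor coordinates and focal length are made up for the example): a flat HUD draws its arrow at a fixed spot on the glass, while a volumetric HUD anchors the arrow to a point in the world and reprojects it every frame, so it slides and grows on screen exactly as the real intersection does.

```python
def project(point_world, eye_pos, focal=1.0):
    """Pinhole projection of a 3-D world point (x: right, y: up, z: forward,
    in meters) into normalized 2-D display coordinates, as seen from the
    driver's eye at eye_pos."""
    x = point_world[0] - eye_pos[0]
    y = point_world[1] - eye_pos[1]
    z = point_world[2] - eye_pos[2]
    if z <= 0:
        return None  # the anchor is behind the driver; nothing to draw
    return (focal * x / z, focal * y / z)

# A turn arrow anchored 30 m ahead, 2 m to the left, 1.2 m below eye level.
anchor = (-2.0, -1.2, 30.0)

# As the car moves forward, the world-locked arrow's on-screen position
# changes every frame; a flat HUD's arrow would just sit still on the glass.
for travelled in (0.0, 10.0, 20.0):
    print(project(anchor, eye_pos=(0.0, 0.0, travelled)))
```

Running this, the projected coordinates drift left and down as the car closes on the anchor point — which is exactly the cue that makes the overlay read as an object in the world rather than a sticker on the windshield.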
If any of the augmented-reality HUDs can be made to work in practice, that is. Falcon AR has more modest ambitions: a relatively inexpensive technology that adds depth to a head-up display for just the driver. WayRay, by comparison, aims to create a holographic display that can project across the entire front windshield — think navigation information for the driver, Netflix for the passengers — but requires specially bonded layers of windshield glass that add to the overall cost of the car.
WayRay’s CEO Vitaly Ponomarev says that manufacturers might be able to subsidize the added expense through “advertising options” — virtual billboards could entice passengers to stop for a quick plate of chicken, for instance — which might sound far-fetched but seems entirely plausible if, say, the self-driving taxi business ends up being structured like the smartphone app economy. (And automakers like General Motors and Ford are already mulling selling the tremendous amounts of data they collect from cars to advertisers.)
However it shakes out, the problem of keeping a driver’s eyes scanning the real world, not a screen, will continue to be solved by attempting to put a different type of screen in front of drivers. And as cars are increasingly packed with sensors like radar and lidar, as well as machine-learning-edumacated cameras, there is a real opportunity to increase drivers’ awareness of hazards in their field of view: kids on bikes, potholes, high-calorie milkshakes. Cars already attempt to be aware of these hazards so that safety systems, like lane-keeping assistance or automatic emergency braking, can operate.
Whether telling the driver about these hazards through an augmented-reality overlay is a safety boon remains to be seen.