Mobility of the Future

An Interactive Trade-Fair Installation Showcasing Concepts for the Mobility of the Future

Year
  • 2022
Tasks
  • Development

How can we present research on future mobility in a fun and attention-grabbing way?

Electric cars have rapidly become more common, leading to challenges like energy storage, charging infrastructure, and securing renewable energy sources. The Institute of Vehicle Concepts, part of the German Aerospace Center, researches carbon-neutral electric mobility and develops concepts for related technologies.

To enhance their trade fair presentations, the institute asked ART+COM to create an interactive exhibit. The centerpiece is a small model car with a camera that drives through a physical city model. Visitors can choose the car’s destination, triggering animations that explain the institute’s research on energy management for electric vehicles in simple terms.

The live camera feed from the model car is combined with AR effects, creating a large-format augmented reality experience that immerses visitors in the scene. This engaging approach quickly draws in the audience at the trade fair stand.

A self-driving car drives through the Mobility of the Future installation, showing its view overlaid with AR content on a large horizontal screen.

Results & Process

With the Mobility of the Future installation, we took the concept of AR and flipped it around: instead of letting the visitor control a handheld or head-mounted device to see the real world overlaid with additional virtual content, we mounted a drone camera in a small car and streamed "what the car sees" onto a large vertical screen.

Through this reversal, the miniature city suddenly appears life-sized and the visitors become giants. Additional AR elements such as roads, trees, and traffic signs further strengthen the illusion, reliably drawing the attention of trade-fair visitors.

Implementing this came with two core challenges. First, the client wanted the car to drive on its own, without any visible controls or tracks. Second, standard AR devices normally rely on a variety of sensors, such as a gyroscope and an accelerometer, as well as image-recognition algorithms, to determine where the device is in the real world. Since our car was not equipped with any sensors, we had to come up with our own custom AR solution.

Implementing a self-driving car

To let the car drive on its own without actually implementing any steering logic, we repurposed a CNC milling machine, which can be controlled very precisely using G-code, the standard command language in CNC machining and 3D printing. In place of the milling head, we attached a magnet to the machine, with a counterpart magnet mounted at the front of the car. By fitting the CNC rails under the table, we could then drag the car across the surface using the magnetic coupling!
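To illustrate the idea, here is a minimal Python sketch of how a driving path could be turned into G-code for the machine. The function name, feed rate, and waypoints are illustrative assumptions, not the project's actual code; the real installation drove the CNC machine from Arduino-based hardware.

```python
# Hypothetical sketch: converting a path of (x, y) waypoints (in millimetres)
# into G-code linear moves that drag the magnet (and thus the car) along it.
# Feed rate and waypoint values are illustrative only.

def path_to_gcode(waypoints, feed_mm_per_min=1200):
    """Convert (x, y) waypoints into a list of G-code commands."""
    lines = ["G21", "G90"]  # G21: millimetre units, G90: absolute positioning
    for x, y in waypoints:
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed_mm_per_min}")
    return lines

if __name__ == "__main__":
    for cmd in path_to_gcode([(0, 0), (120.5, 0), (120.5, 80.25)]):
        print(cmd)
```

Each `G1` command is a straight move at a fixed feed rate, so the shape and speed of the car's lap are fully determined by the command list, with no steering logic on the car itself.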

Of course, this was not as easy as it sounds: Frank Fietzek went through round after round of fine-tuning the car to make it drive as smoothly as possible. I hacked the Arduino G-code platform to increase the sender's update rate (Hz), so that positions could be sent as fine-grained as possible.
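Why the sender's update rate mattered can be sketched as follows: instead of issuing one long move, a path segment is chopped into many short moves, so the magnet's position is updated often and the car glides smoothly. This is a hedged, illustrative Python sketch; the segment length and function name are assumptions, not the project's actual firmware code.

```python
import math

# Hypothetical sketch: splitting a long straight move into many short
# segments, so positions can be streamed to the CNC machine at a high rate.
# The 2 mm maximum segment length is an illustrative value.

def split_move(start, end, max_segment_mm=2.0):
    """Split a straight (x, y) move into segments no longer than max_segment_mm."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    steps = max(1, math.ceil(length / max_segment_mm))
    return [(start[0] + dx * i / steps, start[1] + dy * i / steps)
            for i in range(1, steps + 1)]

if __name__ == "__main__":
    for point in split_move((0, 0), (10, 0)):
        print(point)
```

Streaming many short segments only helps if the sender can keep up, which is why raising its update rate was necessary.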

First tests of the self-driving mechanism, using a more or less glued-together car and a plotted road, to check whether the car stayed on track.

Implementing "time-based AR"

The even larger challenge, however, was to implement an AR solution that worked without any sensors (because we couldn't fit any into the car) or image-recognition algorithms (because we didn't have the time to experiment with them).

We therefore made a bold assumption: if the CNC mill is so accurate that every lap it drives the exact same path at the exact same speed, and if the car is calibrated so finely that it is dragged in exactly the same way every lap, then we know what the car will see at each point in time during its lap.

With this assumption, all we needed to do was record the car's camera view during one full lap through the city. We could then use this video recording as a blueprint to rotate the virtual AR camera so that it looks in the correct direction at specific points in time, and interpolate between those keyframes to get a smooth image. To our surprise, it worked, and still does today! The installation is still being shown at the German Aerospace Center HQ in Stuttgart, Germany.
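The core of this "time-based AR" idea can be sketched in a few lines: because the car repeats the exact same lap, the virtual camera's orientation can be looked up purely by lap time and blended between the nearest recorded keyframes. The sketch below uses linear interpolation of a yaw angle in plain Python for clarity; the keyframe values are made up, and in the actual Unity build one would blend full rotations (e.g. with `Quaternion.Slerp`) rather than a single angle.

```python
import bisect

# Hypothetical sketch of time-based AR: the camera yaw for any lap time is
# interpolated from manually recorded (time, yaw) keyframes. Keyframe values
# below are illustrative, not the installation's real calibration data.

def camera_yaw_at(keyframes, t):
    """Linearly interpolate the camera yaw (degrees) for lap time t (seconds).

    keyframes: list of (time, yaw) pairs, sorted by time, covering one lap.
    """
    times = [k[0] for k in keyframes]
    i = bisect.bisect_right(times, t)
    if i == 0:
        return keyframes[0][1]   # before the first keyframe
    if i == len(keyframes):
        return keyframes[-1][1]  # after the last keyframe
    (t0, y0), (t1, y1) = keyframes[i - 1], keyframes[i]
    alpha = (t - t0) / (t1 - t0)
    return y0 + alpha * (y1 - y0)

if __name__ == "__main__":
    lap = [(0.0, 0.0), (2.0, 90.0), (4.0, 90.0), (6.0, 180.0)]
    print(camera_yaw_at(lap, 1.0))  # halfway into the first turn
```

The same lookup runs every frame, keyed by the time elapsed since the lap started, which is why the approach works without any sensors on the car.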

First, I recorded the car's camera view and its position at every point in time while it drove through the city.

I could then manually set its rotation for a given point in time by simply rotating the virtual car so that its view overlaps with the still image.

Used Tools

Below, you'll find a non-exhaustive list of tools I used in this project.

Hardware

A drone camera, one large screen, a CNC mill control, several Arduinos and other microcontrollers, and a custom-made car

Software

Unity, Arduino

Languages

C#, Arduino