A steering wheel with an embedded touchscreen lets humans interact with autonomous vehicles.
We have already seen various developments related to autonomous driving, such as an add-on kit that turns any car partially autonomous and new software that could lead to self-delivering cars. Now the German company ZF has come up with an innovative way for drivers of autonomous cars to direct a vehicle: a gesture-controlled steering wheel.
The gesture control aims to make autonomous driving easier and safer. It is designed for Level 3 autonomous cars, which switch between autonomous driving and manual control. The idea is still in the concept phase, but the device will have an embedded touchscreen that lets users select their destination. Gestures commonly used on smartphones would trigger the horn, turn on the lights or adjust the climate control. An LCD screen and LED lights will signal whether the car is in autonomous mode or the driver is in control. Another feature is a driver airbag concept that deploys around the screen interface, helping to protect the driver in a crash.
This idea could improve the relationship between car and driver and ease the transition to self-driving cars. How else could technological developments influence the design of autonomous cars? And how else could drivers communicate with self-driving vehicles?