Auto Infotainment: Evolution of UI to UX

Image Source: metamorworks/Stock.adobe.com

By Brandon Lewis for Mouser Electronics

Published November 8, 2023

User interface (UI) design simplifies how a user interacts with a system. For example, a UI designer ensures that buttons intuitively display new information or activate functions when engaged. But safety-critical environments like automotive applications add another layer of complexity to UI design, as an elegant UI that distracts drivers from the road for even a split second reduces overall vehicle safety.

For this reason, automotive UI is evolving into automotive user experience (UX). Automotive UX differs from UI by defining how a vehicle interacts with a driver instead of the other way around. While a UI typically displays information on a screen and lists available functions, a UX actively conveys information to the driver through visual, aural, or tactile channels.

A well-integrated automotive UX technology stack notifies drivers of important information while posing little or no distraction. However, integrating new UX technologies like heads-up displays (HUDs), hands-free audio, and gesture control requires new vehicle architectures and higher-performance embedded processors that transform how automotive system engineers approach design.

HUDs Keep Eyes on the Road

HUDs are not only one of the most significant innovations in infotainment technology in recent memory but also one of the most important advancements in automotive technology since the rise of the digital cockpit.

HUDs use light to project information onto a car’s front windshield directly in a driver’s field of view (Figure 1). State-of-the-art HUDs integrate video sources to animate the projected light in use cases like displaying directional indicators or pulsating when the speed limit is exceeded—all without the driver ever having to look down at a single gauge on the instrument cluster.

Figure 1: HUDs improve the driving experience and safety by projecting information onto a car's front windshield. (Source: Mike Mareen/stock.adobe.com)

Such “smart” gauges are built on automotive-grade embedded processors and digital micromirror devices (DMDs)—light modulators composed of arrays of tiny mirrors that project images. In a HUD deployment, input video from a host processor is sent to the DMD controller over an LVDS interface. The controller adjusts the timing and amplitude of the illumination LEDs to produce the color and brightness needed to display full-color content on the HUD.
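The color-mixing idea behind that last step can be sketched in a few lines: brighter channel values translate into longer LED on-times within each frame. This is only an illustrative model, not a real DMD driver API; the frame period and linear scaling are assumptions for the sake of the example.

```python
# Illustrative sketch: mapping an 8-bit RGB pixel value to per-color
# LED on-times for a DMD-style projector. Real automotive DMD chipsets
# use vendor drivers; this only models trading LED on-time for brightness.

FRAME_PERIOD_US = 8333  # ~120 Hz refresh, assumed for illustration

def led_on_times(r: int, g: int, b: int) -> dict:
    """Convert 8-bit RGB intensities into per-color LED on-times (microseconds)."""
    def scale(channel: int) -> int:
        if not 0 <= channel <= 255:
            raise ValueError("channel must be 0-255")
        # Linear scaling: full intensity keeps the LED on for the whole frame.
        return channel * FRAME_PERIOD_US // 255
    return {"red_us": scale(r), "green_us": scale(g), "blue_us": scale(b)}
```

In a real system the scaling would be gamma-corrected and synchronized with the mirror timing, but the principle—modulating on-time to mix color—is the same.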

A HUD powered by advanced digital light-projection circuitry can play a critical role in future vehicle safety, such as an advanced driver assistance system (ADAS) that presents important data to drivers without requiring them to divert their gaze from the road. Outside vehicle-generated data, the additional real estate available through a HUD can be leveraged for alerts about potential road hazards, traffic signs, and more.

Audio Enhancements Enable Hands-Free Control

HUDs will become more tightly integrated with other vehicle subsystems as they increase in popularity. However, the most common HUD integrations today are with smartphones—devices that aren’t native to the car.

The integration of HUDs and smartphones offers an immersive, multi-sensory approach to navigation, playing music, making and receiving phone calls and text messages, and other non-driving functions that are available to most car owners today. Commands can be issued vocally and confirmed by the system through visual or audio means to help drivers maintain focus regardless of their surroundings.

This is truly hands-free control and a powerful application of technology that streamlines UX and increases vehicle safety; when drivers can vocally control the infotainment system, they can keep their hands on the wheel.
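The issue-and-confirm loop described above can be sketched as a simple dispatcher: a recognized utterance is matched to an action, and the system always speaks a confirmation (or a re-prompt) back to the driver. The command names and handlers here are invented placeholders, not any vehicle's actual voice API.

```python
# Hypothetical sketch of a hands-free command flow: execute a recognized
# voice command, then confirm it back to the driver audibly so their
# eyes never need to leave the road. Commands and replies are invented.

COMMANDS = {
    "call home": lambda: "dialing home",
    "play music": lambda: "starting playback",
    "navigate home": lambda: "routing to home",
}

def handle_voice_command(utterance: str) -> str:
    """Execute a recognized command and return the spoken confirmation."""
    action = COMMANDS.get(utterance.strip().lower())
    if action is None:
        # Unrecognized input gets a spoken re-prompt, not a silent failure.
        return "Sorry, I didn't catch that."
    return f"OK: {action()}"
```

The key UX property is that every path—success or failure—produces audible feedback, so the driver never has to glance at a screen to learn whether the command was accepted.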

A vital aspect of an efficient hands-free system is ease of use, and audio is a far more intuitive interface than menu navigation for non-driving-critical applications like navigation, calls, music, and temperature controls. But it wasn’t always that way: Initial hands-free systems installed in cars had complex menus that were difficult to navigate, especially when searching for infrequently used functions. Another challenge these older systems faced was managing different drivers, which resulted in nuisances such as having to re-pair a primary driver’s phone after someone else used the vehicle (Figure 2).

Figure 2: Early hands-free in-car audio systems connected to smartphones over Bluetooth® were often difficult to configure. (Source: Tomasz Zajda/stock.adobe.com)

Many automotive technologies share an origin story with hands-free audio, starting as an aftermarket technology and progressing to a bolt-on infotainment function as user demand outpaced automotive development lifecycles. From a user perspective, this often resulted in an assortment of infotainment-related platforms with diverse menus, systems, and options. From an architectural standpoint, this meant multiple boxes from multiple vendors across disparate infotainment systems.

Today, there is a move toward functional consolidation, integrating capabilities from different vendors onto a single platform. Aside from reducing space, power, cost, and design complexity, minimizing the different audio and visual interfaces required by each subsequent system results in fewer applications and a less-convoluted UX for drivers and passengers alike.
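One concrete benefit of consolidation can be sketched as follows: when a single platform owns a shared resource such as the cabin audio channel, it can arbitrate among competing features by priority, instead of separate vendor boxes talking over one another. The feature names and priority values below are illustrative assumptions.

```python
# Sketch of priority-based audio arbitration on a consolidated platform.
# Priorities are invented for illustration; a safety alert always wins.

PRIORITY = {"safety_alert": 3, "navigation": 2, "phone": 2, "media": 1}

def arbitrate_audio(requests: list) -> str:
    """Grant the audio channel to the highest-priority requester."""
    if not requests:
        return "idle"
    return max(requests, key=lambda r: PRIORITY.get(r, 0))
```

With disparate bolt-on systems, no single component has the visibility to make this call; consolidation is what makes a coherent arbitration policy possible.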

Infotainment within Reach

The final hands-free modality revolutionizing today’s vehicles is, ironically, touch. Touch controls ergonomically extend the operation of buttons, sliders, and menus on traditional control consoles, increasing the possible size and functionality of in-car displays. But that’s last-generation technology.

Modern automotive touch interfaces introduce haptic feedback for the driver—tactile responses to commands, such as a button vibration, so users can feel when a command has been accepted. Haptics can also generate safety alerts, such as shaking the steering wheel if the car starts to veer off the road.
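Both uses of haptics—command confirmation and safety alerts—amount to mapping UX events onto vibration patterns of different intensity and duration. The event names and pattern values in this sketch are assumptions, not any specific vehicle's haptics API.

```python
# Illustrative mapping of UX events to haptic patterns, expressed as
# (amplitude 0-1, duration in ms). Values are invented for illustration.

HAPTIC_PATTERNS = {
    "button_accepted": (0.3, 40),   # short, light tick on the touch display
    "lane_departure": (1.0, 400),   # strong, sustained steering-wheel shake
}

def haptic_for(event: str) -> tuple:
    """Return (amplitude, duration_ms) for an event, or no feedback."""
    return HAPTIC_PATTERNS.get(event, (0.0, 0))
```

The asymmetry matters for UX: confirmations should be barely perceptible, while safety alerts must be unmistakable even through road vibration.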

Next-generation vehicles will move from touch to touchless thanks to built-in gesture controls. Instead of looking down at a screen to identify buttons and other controls, drivers can manipulate a range of infotainment, navigation, and other vehicle functions using touchless hand gestures that don’t divert their focus from the road.
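At the software level, gesture control reduces to dispatching classified gestures to infotainment actions, with a confidence gate so accidental hand movement near the console doesn't trigger anything. The gesture labels, actions, and threshold below are illustrative placeholders for whatever a camera-based classifier would emit.

```python
# Sketch of a gesture-to-function dispatch table. Gesture labels would
# come from a camera-based classifier; labels, actions, and the 0.8
# confidence threshold are all assumptions for illustration.

GESTURE_ACTIONS = {
    "swipe_right": "next_track",
    "swipe_left": "previous_track",
    "rotate_cw": "volume_up",
    "rotate_ccw": "volume_down",
    "palm_toward": "mute",
}

def dispatch_gesture(gesture: str, confidence: float):
    """Map a classified gesture to an infotainment action.

    Low-confidence detections are ignored so incidental hand movement
    near the console does not trigger unintended commands.
    """
    if confidence < 0.8:
        return None
    return GESTURE_ACTIONS.get(gesture)
```

The confidence gate is the safety-relevant design choice here: a false positive that mutes the radio is a nuisance, but the same pattern applies when gestures control more consequential functions.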

The Impact of UX on Safety and Design

An effective UX improves convenience and safety by keeping the driver's attention on the road. It enables more complex interactions than are possible with just gauges and buttons, and responsiveness is faster when a driver can hear and see alerts on a HUD rather than having to scan a dashboard for flashing lights.

The Road to the Future Is Paved with Information

Looking to the future, when vehicle systems and sensors generate more information, there will be an even greater opportunity to improve infotainment platforms using AI. Instead of just supporting multiple "driver profiles" that the user can customize, analyzing infotainment data with AI presents the opportunity to predict user habits. At that point, and with capable manycore infotainment ECUs, the UI can truly become a UX wherein the vehicle anticipates driver preferences.
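The prediction idea can be illustrated with a deliberately tiny sketch: log which action each driver takes in a given context, then suggest the most frequent one. A production system would use far richer features and models; the log format here is invented purely to show the concept.

```python
# Toy sketch of habit prediction from logged infotainment use. Counts
# which action each driver takes most often in a context (e.g., time of
# day) and suggests it. The log format is an invented illustration.

from collections import Counter, defaultdict

def build_predictor(log):
    """log: iterable of (driver, context, action) tuples."""
    counts = defaultdict(Counter)
    for driver, context, action in log:
        counts[(driver, context)][action] += 1

    def predict(driver, context):
        # Suggest the historically most frequent action, if any.
        ranked = counts.get((driver, context))
        return ranked.most_common(1)[0][0] if ranked else None

    return predict
```

Even this frequency count captures the shift the article describes: instead of the driver configuring a profile, the vehicle infers a preference from behavior.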

Today, it's important to deploy advanced infotainment technologies sparingly, especially where driver interaction is concerned. Too much information can easily obscure what is truly important and valuable; drivers don't need constant reminders that the tires are full of air. Context matters, so information should only be displayed when helpful or pertinent.
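That context-gating principle can be sketched as a simple rule set: a reading is surfaced only when it is actionable, and nominal values stay silent. The signal names and thresholds below are illustrative assumptions, not real vehicle parameters.

```python
# Sketch of context gating: show a reading only when it is actionable.
# Signal names and thresholds are invented for illustration.

def should_display(signal: str, value: float) -> bool:
    """Suppress nominal readings; surface only pertinent information."""
    rules = {
        "tire_pressure_psi": lambda v: v < 30.0,   # only when low
        "fuel_percent": lambda v: v < 15.0,        # only when near empty
        "speed_over_limit_kph": lambda v: v > 0,   # any overspeed
    }
    check = rules.get(signal)
    return bool(check(value)) if check else False
```

A fully inflated tire never reaches the HUD; a low one does—exactly the "display only when helpful or pertinent" behavior the paragraph above calls for.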

With the right enabling technologies, a well-designed UX will play an essential role in how people feel about their cars. An intuitive UX creates a dynamic experience, helping drivers connect positively and emotionally with vehicles. With the right technology and components, combined with ease of use, automotive UX will be one of the key considerations for new car buyers in the decades to come.

About the Author

Brandon has been a deep tech journalist, storyteller, and technical writer for more than a decade, covering software startups, semiconductor giants, and everything in between. His focus areas include embedded processors, hardware, software, and tools as they relate to electronic system integration, IoT/industry 4.0 deployments, and edge AI use cases. He is also an accomplished podcaster, YouTuber, event moderator, and conference presenter, and has held roles as editor-in-chief and technology editor at various electronics engineering trade publications. When not inspiring large B2B tech audiences to action, Brandon coaches Phoenix-area sports franchises through the TV.
