Human Machine Interfaces in 2024 & Beyond
Introduction
Through movies and TV, we've seen the evolution of "futuristic" human-machine interfaces (HMIs), from the many dials and levers of the factory in Metropolis to the holograms of Star Trek and the touch interfaces and augmented reality (AR) of Minority Report. Even the gesture-driven 3D holographic interfaces of the near-future Marvel Cinematic Universe seem just around the corner. But where are we today?
This article dives into the technology behind current HMIs, including touch, gesture, voice, and even brain-computer interfaces, and explains how each has been implemented.
Primer On HMIs
HMI—also referred to as operator interface terminal, local operator interface, or operator terminal—is a broad category of technology touching on the myriad ways humans can communicate with, control, program, and receive feedback from machines. The term could be used to describe any intermediary between humans and machines; however, it is most often used when discussing industrial HMIs, such as the consoles and terminals used to control and provide feedback to large industrial systems or complex machines. HMI can refer to the system used for input from an operator, such as a keypad or touch control, the feedback system, or any combination of technologies used to provide input to and receive feedback from a machine system. HMIs are common in major industries such as commercial, industrial, automotive, healthcare, and aerospace/defense.
Historically, control consoles, keypads, mouse-driven computer controls, command-line interfaces, graphical user interfaces (GUIs), and light-up displays have been used to communicate with machines. To enhance efficiency and streamline interface design, many modern systems are moving to computers, tablets, smartphones, and virtual control methods with minimal HMI hardware at the site of a system.
More recently, touch displays have become more prevalent, and gesture- and voice-controlled systems are now mainstream. Several new technologies leveraging gesture and voice control are accelerating the adoption of more modern HMI technologies, including AR and virtual reality (VR).
Where HMIs Are Today
Current projections predict the HMI market will grow at a compound annual growth rate (CAGR) of roughly 10.4 percent to US$11.6 billion by 2030. There is growing demand for HMIs as many legacy industrial systems and machines are replaced, upgraded, or retrofitted with more modern HMI technologies. These upgrades are typically made to enhance efficiency and provide a higher level of operational data and control to better fit modern techniques that leverage machine learning (ML), artificial intelligence (AI), and sophisticated analytics.
Current HMIs are a mix of legacy versions from older systems still in operation, modern systems, and more recent experimental systems that feature more futuristic interfaces. As with many industrial systems, mainstream HMIs lag somewhat behind technological advancements due to safety considerations, the need to train operators, and vendor, distributor, and other market dynamics. It can take years to decades for new HMI technology to penetrate specific applications, and some legacy HMI technology is even built into standards, safety protocols, and regulations.
The following are just some of the present-day HMI technologies:
- Console/terminal interface (physical actuator)
- Command-line interface
- GUI
- Web-based and cloud-based interface
- Touch display (touchscreen) interface
- Tablet and smartphone interface (i.e., using portable, wirelessly connected hardware)
- Gesture-based interface
- Voice control/voice-activated interface
- AR and VR headset and control accessory interface
- Brain-computer interface
Until recently, most machine systems used console- and terminal-based interfaces, but touch display interfaces are now commonly included with modernized machine systems. Another recent advance is the accessibility of smartphone- and tablet-controlled interfaces, often available through a vendor's app or even third-party apps that enhance the control options of existing equipment. Modular upgrades and complete retrofit systems with more modernized interfaces, such as touch display, web-based/cloud-based, and tablet/smartphone interfaces, are available for many legacy machine systems.
Voice-controlled systems that use natural language processing enabled by ML/AI are now ubiquitous in smart home controllers, speakers, and audio/visual entertainment systems—though they are less commonly used in industrial systems for security and safety reasons. In industrial and healthcare applications, voice-controlled interfaces are often only a secondary HMI due to the pace of operations, safety/security concerns, and operator familiarity; the primary HMI is usually a method that involves tactile input or hand control.
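The secondary-HMI role of voice control can be sketched as a command dispatcher that executes benign requests immediately but holds safety-critical ones until a tactile confirmation arrives. This is a minimal illustration under assumed command names and policy; real industrial voice systems use trained speech models and site-specific safety interlocks.

```python
# Toy dispatcher showing voice as a *secondary* HMI: safety-critical
# commands recognized by voice still require tactile confirmation
# (e.g., a physical button press) before acting. The command sets and
# return strings below are illustrative assumptions, not a real product API.

SAFE_COMMANDS = {"status", "pause"}            # may execute on voice alone
CRITICAL_COMMANDS = {"start", "emergency stop"}  # need tactile confirmation

def handle_voice_command(text: str, tactile_confirmed: bool = False) -> str:
    cmd = text.strip().lower()
    if cmd in SAFE_COMMANDS:
        return f"executed: {cmd}"
    if cmd in CRITICAL_COMMANDS:
        if tactile_confirmed:
            return f"executed: {cmd}"
        return f"pending confirmation: {cmd}"  # wait for the physical input
    return "unrecognized"

print(handle_voice_command("Status"))                          # safe, runs now
print(handle_voice_command("Start"))                           # held for button press
print(handle_voice_command("Start", tactile_confirmed=True))   # now allowed
```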
For most of these modernized systems, the new interfaces are typically modules licensed or purchased directly by manufacturers or vendors rather than designed in-house. This is a significant departure from the past, when many vendors custom-designed their own interfaces as part of the product development cycle. Using third-party modules for interfaces creates some challenges in troubleshooting and customer service. Still, these modules allow a machine system to offer modernized interfaces that are far more capable than legacy ones.
The latest entrants to the HMI marketplace are AR/VR and brain-computer interfaces. Currently, AR/VR headsets are in the early stages of release and are most often used for troubleshooting, maintenance, training, and site/system evaluation.
How HMIs Are Implemented In 2024
Currently, HMIs are made using conventional and legacy hardware and software, which rely on wireless or wired analog and digital communications to handle the data traffic among central processing units, microcontrollers, and peripherals. The makeup of modern HMIs includes communication protocols, information processing, storage, analog and digital interfaces, and software and operating systems.
Many modern HMIs are built on microcontrollers or computer systems with a wide array of digital interfaces and peripherals integrated into the main chipset. This enables organizations developing HMIs to rapidly add functions like Bluetooth®, Wi-Fi®, and USB communication interfaces, other protocols, and peripherals without creating these features from scratch. This flexibility supports the trend of offering mobile HMI features (e.g., tablets and smartphones) and applications alongside dedicated HMIs. The applications are typically hosted by cloud services, but users can often access them without internet connectivity if the applications can connect to the machine directly via Bluetooth, Wi-Fi, or some other common interface. In this way, an operator can use a tablet or smartphone to control or monitor a machine system directly without being tethered to a fixed console.
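The direct-connection path described above, a tablet polling a machine over the local network with no internet hop, can be sketched as a tiny client/server exchange. This is a minimal illustration with an assumed "STATUS"/JSON protocol; real machine systems use vendor-specific or industrial protocols such as OPC UA or Modbus.

```python
import json
import socket
import threading

# Machine side: a toy one-shot TCP status service on an ephemeral port.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]

def machine_status_service() -> None:
    """Answer a single JSON status request, then shut down."""
    conn, _ = srv.accept()
    if conn.recv(1024).decode().strip() == "STATUS":
        # The schema here is an illustrative assumption, not a vendor protocol.
        conn.sendall(json.dumps({"state": "running", "rpm": 1450}).encode())
    conn.close()
    srv.close()

def poll_machine(port: int) -> dict:
    """Tablet side: connect directly over the local network and poll status."""
    with socket.create_connection(("127.0.0.1", port), timeout=2) as s:
        s.sendall(b"STATUS")
        return json.loads(s.recv(1024).decode())

threading.Thread(target=machine_status_service, daemon=True).start()
status = poll_machine(port)
print(status)  # machine state as seen from the mobile HMI, no cloud involved
```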
Web-based HMIs, or HMIs with web portals that allow monitoring access and/or control, have been around for a few decades. More recently, cloud-based services have become available to support these internet-based features for machine systems, and they can even support several machine systems simultaneously. Legacy web-based systems typically require a direct connection with the HMI hosting the web portal. In some cases, proprietary systems could poll and even allow some control of intranet-connected HMIs. However, modern cloud-based systems are often supported by software-as-a-service (SaaS) cloud products that can be adapted to various machine systems. These reconfigurable and reprogrammable cloud-based systems often provide analytics, data storage, automation, and remote access from secure accounts. This allows skilled operators to access machine system data and controls either locally or remotely from any terminal with access to the cloud-based HMI, and even to operate several machine systems simultaneously from the same terminal, regardless of where the machines are located. Such an ability is especially useful when a highly skilled operator is needed to complete a task with a machine system and transporting that operator is infeasible.
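The one-operator, many-machines pattern described above can be sketched as a small fleet service. This is a minimal illustration under hypothetical names (CloudHMIService, register, command, fleet_status); real SaaS HMI products add authentication, persistent storage, and analytics on top of this kind of registry.

```python
from dataclasses import dataclass, field

# Toy cloud-style HMI service: one operator terminal monitors and commands
# several registered machine systems through a shared service. All names
# and the command vocabulary are illustrative assumptions.

@dataclass
class Machine:
    machine_id: str
    state: str = "idle"
    log: list = field(default_factory=list)  # retained for analytics/auditing

class CloudHMIService:
    def __init__(self) -> None:
        self._fleet: dict[str, Machine] = {}

    def register(self, machine_id: str) -> None:
        self._fleet[machine_id] = Machine(machine_id)

    def command(self, machine_id: str, cmd: str) -> None:
        machine = self._fleet[machine_id]
        machine.state = "running" if cmd == "start" else "idle"
        machine.log.append(cmd)

    def fleet_status(self) -> dict:
        """One view of every registered machine, wherever it is located."""
        return {mid: m.state for mid, m in self._fleet.items()}

svc = CloudHMIService()
for mid in ("press-A", "press-B", "lathe-C"):
    svc.register(mid)
svc.command("press-A", "start")
print(svc.fleet_status())  # operator sees all machines from one terminal
```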
One example of such a need is in remote surgery (also known as telesurgery), where a skilled surgeon can operate robotic surgery equipment remotely through the internet via cloud-based HMI services. Remote surgery has provided patients in remote locations, or those needing particular types of surgery, access to surgeons with specific expertise necessary to perform a given surgery without the surgeon needing to be located near the patient. Remote surgery systems typically require a robotic surgery interface (i.e., a surgical console) that is compatible with the automated surgical equipment on site. However, future advances could enable several different robotic surgery interfaces to be compatible with different types of robotic surgery systems, further expanding the access and operator skill with these systems.
Though early prototypes and pioneering methods of AR/VR began appearing decades ago, it is only in the past few years that these HMI systems have become more prevalent. With advances in computing miniaturization and concerted development toward more user-friendly and practical AR/VR systems, AR/VR HMIs are gaining traction in industry. VR is largely used for training exercises and remote system control. A main limitation of deploying VR HMIs is that the user is, by design, unaware of the outside world beyond the immersive virtual experience, which can be dangerous or cumbersome in many industrial environments. However, using VR as both the display and control mechanism for remote robotic systems or other machine systems is a viable path for VR HMIs, and it is growing in popularity. One example is VR operators controlling autonomous mobile robots (AMRs) for training or during an override event.
AR is being used as an enabling technology for applications such as construction, troubleshooting, maintenance, inspection, quality control, assembly, and training. Current AR systems include headsets with goggle-like transparent or projecting displays that overlay critical information, computer displays, and small readouts within a user's field of view (e.g., Microsoft HoloLens, Lenovo ThinkReality, RealWear Navigator, and Apple Vision Pro). There are also AR systems with complete visual overlays, like VR goggles, which use camera systems and displays to provide vision beyond the goggles and overlays. However, this approach is still less common than AR glasses that provide see-through capability. Voice-controlled HMIs, usually enabled by natural language processing ML/AI technologies, are often paired with AR systems to allow hands-free control of the AR glasses.
Lastly, there are ongoing efforts to pioneer and popularize brain-computer interfaces (BCIs) as HMIs. These are still in the experimental phase; typically, electroencephalogram (EEG) brainwave signals or electromyography (EMG) muscle signals are used to derive user control signals. However, implantable BCIs can also sense brain signals directly, and a user can even be trained to operate these systems as an extra faculty. Future efforts may attempt to provide bidirectional communication, where a neural or cortical shunt directly interfaces with a human brain (e.g., Neuralink's implantable BCI chip). The prominent BCI use cases today are aiding people with disabilities by restoring lost senses (such as with cochlear implants and visual prostheses), augmenting human capabilities through more seamless human-machine communication, and facilitating brain research.
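The idea of deriving a control signal from a biosignal can be illustrated with a toy example: compute short-window signal power over an EEG-like trace and threshold it into a binary "intent" signal. This is a deliberately simplified sketch; real BCIs use multi-channel recordings, band-pass filtering, and trained classifiers, and the window size and threshold below are arbitrary assumptions.

```python
import math

def window_power(samples: list) -> float:
    """Mean squared amplitude over one window of samples."""
    return sum(s * s for s in samples) / len(samples)

def eeg_to_control(signal: list, window: int = 8, threshold: float = 0.5) -> list:
    """Threshold per-window power into a binary control signal (0 = rest, 1 = intent)."""
    controls = []
    for i in range(0, len(signal) - window + 1, window):
        controls.append(1 if window_power(signal[i:i + window]) > threshold else 0)
    return controls

# Synthetic trace: low-amplitude "rest" followed by high-amplitude "intent".
rest = [0.1 * math.sin(0.5 * i) for i in range(16)]
intent = [1.5 * math.sin(0.5 * i) for i in range(16)]
print(eeg_to_control(rest + intent))  # rest windows map to 0, intent windows to 1
```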
Conclusion
HMIs are evolving to enable more advanced, configurable, intuitive, and seamless control of machines. As display, control interface, and computer-processing technologies continue to be miniaturized and become more familiar, more advanced HMIs are being deployed across industries. Where HMIs used to be fixed in place and dedicated to a single machine system, mobile device HMIs such as tablets and smartphones or AR/VR systems can now control machine systems across the globe through cloud-based internet systems, whether hardwired or wireless via 5G and advanced Wi-Fi. Future HMIs may even be brain-computer interfaces implanted into an operator's brain that provide visual information and process control signals from the operator without physical or audible input.
[1] "Human Machine Interface Market Size, Share & Trends Analysis Report by Product (Display Terminals, Interface Software, Industrial PCs, Others), by Application, by Region, and Segment Forecasts, 2023 - 2030," Research and Markets, October 2023, https://www.researchandmarkets.com/reports/5899530/human-machine-interface-market-size-share-and.
[2] Kavyanjali Reddy et al., "Advancements in Robotic Surgery: A Comprehensive Overview of Current Utilizations and Upcoming Frontiers," Cureus 15, no. 12 (December 2023): e50415, https://doi.org/10.7759/cureus.50415.
[3] Baraka Maiseli et al., "Brain-Computer Interface: Trend, Challenges, and Threats," Brain Informatics 10, no. 1 (August 2023): 20, https://doi.org/10.1186/s40708-023-00199-3.