Driver Monitor System Yields Safety Gains
Jim Harrison for Mouser Electronics
Vehicle safety systems have taken a giant leap in effectiveness in recent years. Most people who have driven a late-model car will tell you they can't imagine operating without a backup camera, lane-departure warning, blind-spot detection, and automatic emergency braking.
Euro NCAP, the body responsible for European vehicle safety ratings and testing, has an audacious idea. In September 2017, it unveiled its "Pursuit of Vision Zero" Roadmap 2025, with the goal of zero automotive accidents. Imagine that: zero accidents. According to the World Health Organization, more than 1.3 million people die in road accidents every year, and 94% of these fatal accidents are caused by human error.
Zero accidents could be possible with Advanced Driver Assistance Systems (ADAS). Carmakers are marching ahead with Level 2 automation that can take over lane centering, provide adaptive cruise control, and perform automatic braking, allowing drivers to briefly take their hands off the wheel and their feet off the pedals. But there is mounting evidence that the longer drivers use partial automation, the less attentive they become.
Figure 1: DMS will keep an eye on a sleepy driver. (Image Source: Have a nice day/Adobe Stock)
Now We Have DMS
Driver attention monitors, or driver monitoring systems (DMS), were once thought of as nice but unimportant. They are now recognized as an absolute necessity for any partially autonomous vehicle. A DMS detects when drivers are not paying attention to the driving task and alerts them. The technology has been used for years in commercial trucking and transport fleets worldwide to detect and mitigate distraction-related driving events and get truck drivers home safely to their families (and keep company lawyers away from their keyboards). Euro NCAP has identified driver monitoring as a primary safety feature; since 2020, it has been required for any new on-road vehicle in Europe to achieve a 5-star safety rating.
No technology can yet determine whether someone's mind is focused on driving. However, technology can monitor a person's gaze, head posture, or hand position to ensure they are consistent with someone actively engaged in driving. And this technology has been improving rapidly of late: image sensors have better sensitivity, lighting control is cheaper and more accurate, and the processing applies AI and machine-learning techniques running on the latest digital signal processing chips.
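As a rough illustration of how such cues might be combined, here is a minimal Python sketch. The signal names and the 30-degree yaw threshold are invented for demonstration and are not taken from any production DMS:

```python
# Illustrative sketch only: fuses hypothetical per-frame signals
# (gaze, head pose, hand position) into a simple attention verdict.
# All field names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class FrameSignals:
    gaze_on_road: bool      # gaze vector intersects the forward scene
    head_yaw_deg: float     # head rotation away from straight ahead
    hands_detected: bool    # at least one hand near the wheel

def is_attentive(sig: FrameSignals, max_yaw_deg: float = 30.0) -> bool:
    """Return True if the driver appears engaged in driving."""
    return (sig.gaze_on_road
            and abs(sig.head_yaw_deg) <= max_yaw_deg
            and sig.hands_detected)

# A production system would debounce over many frames before alerting,
# e.g., flag distraction only after a couple of seconds of bad frames.
print(is_attentive(FrameSignals(True, 12.0, True)))   # True
print(is_attentive(FrameSignals(False, 45.0, True)))  # False
```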
And we have found it is easier, faster, and cheaper to make human drivers safer than to replace human drivers with technology. We are probably a long way from SAE Level 4 or 5 autonomous driving.
How to Implement a DMS
Driver monitoring systems use a CMOS camera with filters, usually with a resolution of 1 to 2 megapixels, mounted behind the steering wheel or up near (or in) the rear-view mirror (where a mirror used to reside in the old days). The camera's image sensor could, for example, be the 1/3-inch (8.467mm) format AR0135CS from onsemi, which features a global shutter, meaning it captures the entire image area simultaneously rather than line by line. The global shutter also ensures synchronization with pulsed light sources. The sensor offers parallel and serial data interfaces, superior low-light performance with 10X lower dark current and 4X higher shutter efficiency versus previous-generation products, and HD 720p image quality. An evaluation kit is available for this sensor.
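One reason the data interfaces matter becomes clear from a back-of-envelope data-rate calculation. The 60fps frame rate and 12-bit raw depth below are assumptions for illustration, not AR0135CS specifications:

```python
# Rough data-rate arithmetic for a DMS camera stream. Frame rate and
# bit depth are assumed values, not sensor datasheet figures.
width, height = 1280, 720    # HD 720p, per the text
fps = 60                     # assumed frame rate
bits_per_pixel = 12          # assumed raw sensor bit depth

mbps = width * height * fps * bits_per_pixel / 1e6
print(f"{mbps:.0f} Mb/s raw")  # ~664 Mb/s the interface must carry
```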
An essential part of the system is the illumination of the driver's face. This is done with near-infrared light, which is not visible to the naked eye. Using infrared avoids distracting the person driving and helps eliminate unpredictable external effects such as sunlight or streetlights. Infrared light also passes relatively unobstructed through sunglasses.
Controlling the LEDs used in scene illumination is critical to the task. Because infrared light can also be a safety hazard, the pulse duration and repetition rate of the light must be carefully selected. The NCV7694 safety controller from onsemi can drive a string of infrared LEDs using a single external MOSFET and provides full eye-safety functionality. The PWM output of this driver IC drives the MOSFET to yield a constant LED current. The FLASH signal input from the image sensor initiates the LED drive output, and the controller ensures correct pulse timing.
The chip's safety functionality prevents the IR LEDs from running too long due to an inappropriate exposure time or from being turned on too frequently. The IC also has circuit fault detection and shutdown and includes complete ESD protection. Faults are reported via the DIAG pin, which can disable the DC/DC converter (Figure 2) and signal the ADAS.
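The flavor of this timing supervision can be sketched in a few lines of Python. The limit values below are placeholders, not figures from the NCV7694 datasheet or any eye-safety standard:

```python
# Illustrative sketch only: validates IR LED strobe parameters against
# assumed eye-safety limits before arming the illuminator.
MAX_PULSE_MS = 5.0      # assumed maximum single-pulse on-time
MAX_DUTY_CYCLE = 0.10   # assumed maximum on-time fraction per period

def strobe_is_safe(pulse_ms: float, period_ms: float) -> bool:
    """Check a FLASH-synchronized strobe against the assumed limits."""
    if pulse_ms <= 0 or period_ms <= pulse_ms:
        return False                      # malformed timing
    if pulse_ms > MAX_PULSE_MS:
        return False                      # single pulse too long
    return (pulse_ms / period_ms) <= MAX_DUTY_CYCLE

# Example: a 2 ms exposure strobed every 33.3 ms (30 fps) passes.
assert strobe_is_safe(pulse_ms=2.0, period_ms=33.3)
```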
Figure 2: System diagram for typical design with the image sensor and NCV7694 near-infrared safety controller. (Source: onsemi)
An evaluation/development tool for the NCV7694, going by the moniker NCV7694-GEVB (Figure 3), comes as a 21mm x 21mm card with two selectable LED light sources: the NIR OSRAM SFH4725S or the white CREE XPGBWT-L1-0000-00G51. The default LED peak current is set to 3A by a shunt resistor. This kit is for lighting evaluation only and does not include an image sensor.
Figure 3: onsemi’s NCV7694-GEVB Safety Controller Illumination Evaluation Board. (Source: onsemi)
The system's two LED light sources must illuminate the area with the proper amount of light. The lamps are commonly mounted on either side of the image sensor, and the amount of light needed is calculated from the average distance to the driver.
The wavelength of the light sources is typically either 850nm or 940nm. 850nm can cause a "red glow" effect from the light source, which the human eye perceives as red flickering. At 940nm, there is no red flickering, although the image may show a bit more glare. One lamp example, the SFH 4725AS 940nm NIR LED from OSRAM, delivers a radiant intensity of 1350mW/sr at 1.5A with a forward voltage of 3.1V.
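Treating such a lamp as a point source, the inverse-square law gives a quick estimate of the irradiance it delivers at the driver's face. The 0.7m driver distance below is an assumed figure:

```python
# Back-of-envelope sketch: irradiance on the driver's face from an
# LED's radiant intensity, using the point-source inverse-square law.
def irradiance_mw_per_m2(intensity_mw_per_sr: float, distance_m: float) -> float:
    """E = I / d^2 for a point source."""
    return intensity_mw_per_sr / distance_m ** 2

# SFH 4725AS figure from the text: 1350 mW/sr at 1.5 A drive current.
print(irradiance_mw_per_m2(1350.0, 0.7))  # ~2755 mW/m^2 at an assumed 0.7 m
```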
The camera image data is evaluated either by a dedicated vision processor/DSP or directly in the vehicle's ADAS system. The really impressive things a DMS can do require AI and machine learning. Today, nearly every industry uses AI to perform tasks more efficiently; in our case, a DMS really wouldn't be practical without it. Artificial intelligence preceded machine learning, but machine learning (and deep learning) supercharged it.
Figure 4: With level 2 ADAS, my car can almost drive me home. (Source: Have a nice day/Adobe Stock)
The imager/processor can keep an 'eye' on everything from blink frequency to the direction of the driver's gaze, even detecting whether the person's eyes are wide open. From these cues, the processor infers driver distraction or drowsiness and can alert the driver with appropriate warning signals or recommend a break. AI processors used in DMS range from high-performance FPGAs and DSPs to specialized ADAS processors from a number of companies.
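One widely published eye-closure cue is the eye aspect ratio (EAR) of Soukupová and Čech, computed from six landmarks around each eye. The Python sketch below uses invented landmark coordinates and a commonly cited threshold of roughly 0.2 purely for illustration:

```python
# Illustrative EAR sketch: the ratio drops toward zero as the eye
# closes, so sustained low values over many frames suggest drowsiness.
import math

def eye_aspect_ratio(eye: list[tuple[float, float]]) -> float:
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Invented landmarks for a wide-open eye (p1..p6, clockwise from the
# inner corner); real values would come from a facial-landmark model.
open_eye = [(0, 0), (2, 2), (4, 2), (6, 0), (4, -2), (2, -2)]
print(eye_aspect_ratio(open_eye))  # ~0.67; values below ~0.2 suggest closure
```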
Software Considerations
The development of a DMS usually involves integrating different computer vision and deep-learning components. Machine-learning algorithms are programs (math and logic) that adjust themselves to perform better as they are exposed to more data. Many companies offer software development kits (SDKs) specifically for DMS. Some, such as Smart Eye AB (Sweden), claim to be hardware-agnostic; others, such as PathPartner Technology (California), say their software is available on a range of automotive-grade platforms, including ARM, Intel, NXP, AMD, Renesas, and Broadcom.
Driver Sense from Cipia (Israel) is a software-based DMS for OEMs and Tier 1s; it is said to have lean processing requirements, allowing it to be ported to low-cost processors while enabling full compliance. Seeing Machines (Australia) offers its FOVIO embedded driver monitoring engine (e-DME) chip and software. The system is currently used in General Motors' Super Cruise driver-assistance feature and the new Mercedes-Benz S-Class and EQS sedans. It is built on a very large driving data set and is said to enable designers to seamlessly integrate DMS into their ADAS and semi-automated driving systems.
Microprocessor companies specializing in automotive also offer DMS development tools with artificial intelligence (AI), automatic deep-learning functions, and software development kits. These vendors often aim to be part of a bigger picture: the vehicle's complete ADAS. Automotive-grade packages for specific processors let developers transition their algorithms seamlessly from the development environment to full implementation.
The "learning" part of machine learning means that these programs change how they process data over time. Implementing machine learning usually starts with a framework, and there are many standard ones: TensorFlow from Google, PyTorch from Facebook AI Research (with NVIDIA CUDA support), Caffe from Berkeley AI Research, Chainer (written purely in Python), and Microsoft's CNTK. Almost all support the Python and C++ programming languages.
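For a taste of what framework-based development looks like, here is a minimal PyTorch sketch of one training step for a hypothetical three-class driver-state classifier (attentive/drowsy/distracted). The dataset, input size, and architecture are all invented for illustration, not a production DMS model:

```python
# Minimal PyTorch sketch: one optimization step on stand-in data.
import torch
import torch.nn as nn

model = nn.Sequential(                       # tiny CNN classifier
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 3),              # 3 classes for 64x64 IR inputs
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

images = torch.randn(8, 1, 64, 64)           # stand-in batch of IR face crops
labels = torch.randint(0, 3, (8,))           # stand-in driver-state labels

logits = model(images)                       # forward pass
loss = loss_fn(logits, labels)               # compare to labels
optimizer.zero_grad()
loss.backward()                              # backpropagate
optimizer.step()                             # adjust the weights
print(float(loss))
```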
Nearly all driver monitoring system designs require ASIL-A safety verification. ASIL stands for Automotive Safety Integrity Level, a risk classification scheme defined by ISO 26262, the standard for functional safety of road vehicles. ISO 26262 identifies four ASILs: A, B, C, and D, with ASIL A representing the lowest degree of rigor and ASIL D the highest. ASIL is a system-level safety requirement covering both software and hardware.
Conclusions
The DMS of the future may go beyond facial imaging and analysis to incorporate facial recognition for security. It could recognize who is driving and customize vehicle settings such as seat position, mirrors, heating, and music, and it could prevent anyone other than the owner from operating the vehicle. The DMS may also integrate biometric sensors, such as heart-rate monitors and breathalyzers, to ensure that a driver is both alert and capable.
Until SAE Level 5 driving automation (no driver required) arrives, your vehicle will have a joint control system: ADAS and human. Which one is in charge? The driver is undoubtedly monitoring the automation, and the automation must be watching the driver.