Intelligence at the Edge
The immense potential of Machine Learning (ML) has been widely demonstrated with successful cloud-based applications that people use every day.
While centralized cloud computing offers unquestionable
economies of scale, STMicroelectronics (ST) believes
that distributed edge computing is the future of ML for
real-time applications such as autonomous vehicles that
cannot be limited by network latency. Privacy, security,
and speed favor "Edge-AI" for many ML use-cases.
Paving the Way for AI at the Edge
Louis Gobin, STMicroelectronics
Successful products are developed for a specific
problem or need. Similarly, ML algorithm selection
requires an understanding of system limitations,
use case needs, and machine states that will be
encountered. The role of a classification model is to learn all expected machine states so that it can recognize each one and report anomalous deviations.
Imagine an ML application that logs current sensor
readings from a brushless motor. Performing signal
analysis directly on the device reduces lag time,
failure points, and data transfers while increasing
system reliability. The brushless motor states for the ML algorithm to learn may include normal operation, slippage, belt wear, shaft misalignment, and bearing problems. These example classes of motor states are those that add user value.
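As a rough illustration of how such learned motor states can be recognized on-device, here is a minimal, self-contained nearest-centroid classifier in C. The two features (e.g., RMS current and a vibration-band amplitude) and all centroid values are invented placeholders for illustration, not data from the article or from any real motor:

```c
#include <float.h>

#define N_STATES   5
#define N_FEATURES 2  /* e.g. RMS current, vibration-band amplitude (assumed) */

typedef enum {
    STATE_NORMAL,
    STATE_SLIPPAGE,
    STATE_BELT_WEAR,
    STATE_SHAFT_MISALIGNMENT,
    STATE_BEARING_PROBLEM
} motor_state_t;

/* Per-class feature centroids, learned from labeled signal windows.
 * Values here are illustrative placeholders only. */
static const float centroids[N_STATES][N_FEATURES] = {
    {1.0f, 0.1f},  /* STATE_NORMAL */
    {1.4f, 0.3f},  /* STATE_SLIPPAGE */
    {1.1f, 0.6f},  /* STATE_BELT_WEAR */
    {1.8f, 0.2f},  /* STATE_SHAFT_MISALIGNMENT */
    {1.3f, 0.9f},  /* STATE_BEARING_PROBLEM */
};

/* Report the learned state whose centroid is closest (squared
 * Euclidean distance) to the current window's feature vector. */
motor_state_t classify_motor_state(const float features[N_FEATURES])
{
    motor_state_t best = STATE_NORMAL;
    float best_d2 = FLT_MAX;
    for (int s = 0; s < N_STATES; s++) {
        float d2 = 0.0f;
        for (int f = 0; f < N_FEATURES; f++) {
            float d = features[f] - centroids[s][f];
            d2 += d * d;
        }
        if (d2 < best_d2) {
            best_d2 = d2;
            best = (motor_state_t)s;
        }
    }
    return best;
}
```

A production library would use richer features and a more capable model, but the integration point is the same: one small function called on each signal window, with no network round trip.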
NanoEdge AI™ Demonstration
Most IoT and consumer edge devices are powered by
microcontrollers (MCUs) with limited memory. But new
ST tools and optimized libraries now allow engineers to
train and deploy ML models and neural networks (NNs) on
resource-constrained devices, with training performed directly
in the embedded environment. This is a "game changer"
because it means that self-learning models can be
deployed almost anywhere for snap decision-making
without cloud dependency.