Chapter 3 | Simplifying Development: The Role of Pretested Building Blocks
which include components such as high-performance microprocessors or microcontrollers, DRAM, NAND flash storage, power management integrated circuits, and sometimes even AI accelerators. These modules are often designed to support specific AI frameworks and inference engines, such as TensorFlow Lite or ONNX Runtime, allowing for easier deployment of pretrained models.
The use of pretested modules can significantly reduce the complexity of hardware design, particularly in areas like signal integrity, power integrity, and thermal management. For instance, high-speed interfaces like LPDDR4X or LPDDR5X require careful printed circuit board layout and impedance matching to maintain signal quality at multigigabit speeds. By using a prevalidated SOM, designers can avoid the intricacies of these high-speed designs and focus on system-level integration.
Moreover, these modules often come with board support packages that include optimized drivers for integrated components and middleware that supports various AI operations. This support can include libraries for efficient tensor operations, quantization tools for model optimization, and drivers for utilizing hardware accelerators like NPUs or vision processing units.
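As an illustration of what such quantization tooling computes under the hood, here is a minimal sketch of int8 affine quantization in plain Python. The scale/zero-point scheme is the standard one used by post-training quantization tools; the function names and parameter choices here are illustrative, not any particular library's API:

```python
def choose_qparams(xmin, xmax, qmin=-128, qmax=127):
    """Pick an affine scale/zero-point mapping that projects the float
    range [xmin, xmax] onto the int8 range [qmin, qmax]."""
    xmin, xmax = min(xmin, 0.0), max(xmax, 0.0)  # range must include 0.0
    scale = (xmax - xmin) / (qmax - qmin)
    zero_point = round(qmin - xmin / scale)
    return scale, int(zero_point)

def quantize(x, scale, zero_point, qmin=-128, qmax=127):
    """Float -> clamped int8 code."""
    q = round(x / scale) + zero_point
    return max(qmin, min(qmax, q))

def dequantize(q, scale, zero_point):
    """Int8 code -> approximate float."""
    return (q - zero_point) * scale

# Symmetric float range [-1, 1] quantized to int8:
scale, zp = choose_qparams(-1.0, 1.0)
q = quantize(0.5, scale, zp)
x = dequantize(q, scale, zp)  # ≈ 0.502, within one quantization step of 0.5
```

The round trip loses at most half a quantization step per value, which is why int8 models typically stay close to their float accuracy when the calibration range is chosen well.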
From a software perspective, working with validated building blocks can simplify the development of AI pipelines. Many of these modules support edge AI frameworks that allow for easy model deployment and inference. For example, a designer may use TensorFlow Lite for Microcontrollers to deploy a quantized model onto a low-power microcontroller.
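To make the microcontroller side of that deployment concrete, the sketch below mimics how a quantized fully connected layer is evaluated with integer-only arithmetic, in the spirit of TensorFlow Lite for Microcontrollers kernels running on MCUs without a floating-point unit. All names and values are illustrative assumptions, not the actual TFLite Micro API:

```python
def quantized_dense(q_in, q_w, bias_q, in_zp, out_zp, m):
    """Evaluate one fully connected layer entirely in integers.

    q_in:   int8 input activations
    q_w:    rows of int8 weights, one row per output neuron
    bias_q: int32 biases
    in_zp, out_zp: input/output zero points
    m: combined rescale factor (input_scale * weight_scale / output_scale)
    """
    out = []
    for row, b in zip(q_w, bias_q):
        # Accumulate int8 products in a wide (int32-style) accumulator,
        # as MCU SIMD instructions would.
        acc = sum((x - in_zp) * w for x, w in zip(q_in, row)) + b
        q = round(acc * m) + out_zp          # requantize to the output scale
        out.append(max(-128, min(127, q)))   # clamp back into int8
    return out

# Tiny illustrative call: 3 inputs, 2 output neurons.
print(quantized_dense([10, -5, 3], [[1, 2, 3], [4, 5, 6]],
                      [0, 0], 0, 0, 0.05))  # → [0, 2]
```

On real hardware the rescale by `m` is itself done with a fixed-point multiply and shift rather than a float, but the structure of the computation is the same.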
"By leveraging pretested building blocks, developers can focus on the innovative aspects of their AI projects rather than spending time on troubleshooting and integration challenges."
Barry Chang
Director of Advantech Edge Server & AI Group, Advantech
5 Experts on Addressing the Hidden Challenges of Embedding Edge AI into End Products