Chapter 1 | Unlocking the Power: Memory and Storage in AI Applications
Storage capacity is another important
factor, as edge devices often need
to store large AI models locally and
collect and temporarily store data for
processing. Insufficient storage can limit
the complexity of the models that can be
deployed or restrict how much data can be
processed locally or buffered before
transmission to the cloud.
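As a rough illustration (not from the source), the hypothetical sketch below estimates the local storage an edge device might need for a quantized model plus a short buffer of sensor data; the parameter count, quantization width, and video bit rate are assumed values chosen only to show the arithmetic.

# Hypothetical sizing sketch for edge-device storage (assumed numbers).

def model_storage_bytes(num_params: int, bits_per_weight: int) -> int:
    """Approximate on-device footprint of a quantized model's weights."""
    return num_params * bits_per_weight // 8

def buffer_storage_bytes(bitrate_mbps: float, buffer_seconds: float) -> int:
    """Approximate space needed to hold sensor/video data before processing or upload."""
    return int(bitrate_mbps * 1_000_000 / 8 * buffer_seconds)

# Example: a 100M-parameter model quantized to INT8, plus 10 minutes of 8 Mbps video.
model = model_storage_bytes(100_000_000, 8)      # ~100 MB of weights
buffer = buffer_storage_bytes(8.0, 10 * 60)      # ~600 MB of buffered video
print(f"model ~ {model / 1e6:.0f} MB, buffer ~ {buffer / 1e6:.0f} MB")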
Finally, the interface between storage and
processing units significantly impacts
system performance. High-bandwidth
interfaces like Peripheral Component
Interconnect Express (PCIe) or Universal
Flash Storage (UFS) allow for faster
data transfer between storage and AI
accelerators or central processing units,
reducing bottlenecks in the AI pipeline.
The latest UFS 4.0 storage technology
clocks in at twice the performance of
previous-generation UFS 3.1.
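To make the bandwidth point concrete, the short sketch below (not from the source) estimates how long it takes to load a model from storage at different sustained sequential read speeds; both the model size and the throughput figures are illustrative assumptions, not measured or vendor-specified values.

# Hypothetical illustration: model load time vs. storage interface bandwidth.
# Throughput figures below are assumed round numbers, not specifications.

def load_time_seconds(model_gb: float, read_gb_per_s: float) -> float:
    """Time to stream a model of model_gb gigabytes at a sustained read rate."""
    return model_gb / read_gb_per_s

model_gb = 4.0  # assumed model size
for name, gbps in [("UFS 3.1 (assumed ~2 GB/s)", 2.0),
                   ("UFS 4.0 (assumed ~4 GB/s)", 4.0),
                   ("PCIe NVMe (assumed ~7 GB/s)", 7.0)]:
    print(f"{name}: {load_time_seconds(model_gb, gbps):.1f} s")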
Micron is developing memory and storage
solutions for edge AI that
• Address the unique challenges of
embedded systems, including resource
limitations, power constraints, and
diverse use cases across industries like
transportation, factory automation, and
video security
• Provide high-performance, low-latency
options to meet the demanding
requirements of AI processing at the edge
• Offer a range of technologies—from
DRAM and NAND to multi-chip package
devices—to handle a broad range of AI
applications and performance needs in
edge computing environments
Mark Harvey
Principal FAE, SiMa.ai
"Memory and storage solutions are
pivotal in AI applications because
they enable the foundation of
deploying and scaling AI, which
requires solutions to be ultra
low latency, power-efficient,
responsible, and secure, which is
possible only at the edge."