Issue link: https://resources.mouser.com/i/1442826
Ready to Add AI Functionality into Your Edge Device? Let's Begin!
By Vikram Tank, Product Manager, Coral

"Google's Coral platform enables low-cost, rapid prototyping for Internet of Things (IoT) devices with local AI capability."

Traditionally, data from edge devices was sent to large compute instances housed in centralized data centers, where machine learning (ML) models could operate at speed. As the number of inferences needed for machine learning continues to grow exponentially, so too has the demand to run them at the edge in order to preserve privacy and to reduce latency, bandwidth usage, and cost. Coral™ is a platform of hardware and software components from Google that helps you build devices with local artificial intelligence (AI), providing hardware acceleration for neural networks (NNs) right on the edge device.

Local AI acceleration has many benefits. The first is speed: processing an inference with the Edge TPU™ (Google's proprietary ASIC for NN acceleration, found on Coral products) can take as little as 2ms. Fast inference times allow for fast action times in critical scenarios such as automotive control systems. A car moving at 100km/h traverses roughly 3m in 100ms; delays in processing can lead to avoidable accidents.

The Edge TPU's speed also allows you to cascade data through several NNs, creating more complex interactions. Imagine a system designed to monitor feeding and watering in vertical farming. One could use an object detection model to locate the individual plants within a given growing tray and then use an image classifier to triage each one based on its growth stage. Slow-growing plants could be given extra nutrients or water, or removed entirely, thus optimizing the yield of a particular tray.

Local AI enables privacy by giving customers control over their data, and it can give users fine-grained control over what is shared with the cloud and what is not. Consider a medical device manufacturer that wants to do real-time analysis of ultrasound images using image recognition neural networks. Building Coral into its devices allows patient data to be processed in a Health Insurance Portability and Accountability Act (HIPAA)-compliant way. Manufacturers can also give patients using at-home devices confidence that data handled on the device doesn't go out of their control.

Coral's local AI also helps you save on data transfer and storage costs and operate in remote field scenarios. Local AI doesn't presuppose the absence of a cloud: hybrid solutions, in which edge devices filter the data sent to the cloud, are becoming a popular topology for efficient network use. A 4K security camera that stares at an unused door 24 hours a day consumes roughly 24MB of storage with every uncompressed frame of video. Using an ML model to classify anomalous events and transmit only those to the server can lead to huge savings. In no-connectivity scenarios, such as on oil rigs in the middle of the ocean, local AI processing allows you to take action and optimize system performance without a round trip to the cloud.

The Coral platform scales for many different use cases. For devices at the edge where power is a concern, the Edge TPU was designed to be a power-efficient solution, capable of performing trillions of operations per second while drawing only about 2W.
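
To make the vertical-farming cascade described above concrete, here is a minimal sketch using the PyCoral API, the Python library Coral documents for on-device inference. The model file names, the input image, and the meaning of the classifier's labels are placeholder assumptions; any Edge TPU-compiled SSD detector and image classifier could stand in for them.

```python
# Minimal detection -> classification cascade on the Edge TPU using PyCoral.
# All file names below are hypothetical placeholders.
from PIL import Image

from pycoral.adapters import classify, common, detect
from pycoral.utils.edgetpu import make_interpreter

# One Edge TPU-compiled model per stage (placeholder file names).
detector = make_interpreter('plant_detector_edgetpu.tflite')
classifier = make_interpreter('growth_stage_classifier_edgetpu.tflite')
detector.allocate_tensors()
classifier.allocate_tensors()

tray = Image.open('growing_tray.jpg').convert('RGB')

# Stage 1: locate the plants in the tray.
_, scale = common.set_resized_input(
    detector, tray.size, lambda size: tray.resize(size, Image.LANCZOS))
detector.invoke()
plants = detect.get_objects(detector, score_threshold=0.5, image_scale=scale)

# Stage 2: classify each detected plant by growth stage.
for plant in plants:
    box = plant.bbox
    crop = tray.crop((int(box.xmin), int(box.ymin), int(box.xmax), int(box.ymax)))
    common.set_input(classifier, crop.resize(common.input_size(classifier)))
    classifier.invoke()
    stage = classify.get_classes(classifier, top_k=1)[0]
    print(f'Plant at {box}: growth-stage class {stage.id} (score {stage.score:.2f})')
```

Because both models run on the same Edge TPU, the second stage only sees the small crops produced by the first, which is what keeps the cascade fast enough to run on every tray image.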
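The hybrid-cloud filtering idea from the security-camera example might look like the following sketch. This is only an illustration under stated assumptions: the model file, the "nothing of interest" label id, the score threshold, and the capture/upload helpers are hypothetical stand-ins for whatever the actual deployment provides.

```python
# Minimal edge-filtering loop: classify each frame on the Edge TPU and upload
# only frames flagged as anomalous. Model, label id, threshold, and the
# capture/upload helpers are hypothetical placeholders.
import time

from PIL import Image

from pycoral.adapters import classify, common
from pycoral.utils.edgetpu import make_interpreter

NORMAL_CLASS_ID = 0     # assumed label id for "empty doorway, nothing happening"
SCORE_THRESHOLD = 0.6   # assumed confidence cutoff for reporting an event

interpreter = make_interpreter('door_monitor_edgetpu.tflite')  # placeholder model
interpreter.allocate_tensors()

def is_anomalous(frame: Image.Image) -> bool:
    """Run the classifier and report whether the frame shows something unusual."""
    common.set_input(interpreter, frame.resize(common.input_size(interpreter)))
    interpreter.invoke()
    top = classify.get_classes(interpreter, top_k=1)[0]
    return top.id != NORMAL_CLASS_ID and top.score >= SCORE_THRESHOLD

def capture_frame() -> Image.Image:
    """Placeholder for the camera pipeline; here it just reads the latest frame from disk."""
    return Image.open('latest_frame.jpg').convert('RGB')

def upload_frame(frame: Image.Image) -> None:
    """Placeholder for the transport layer (MQTT, HTTPS, etc.); here it just saves locally."""
    frame.save(f'/tmp/anomaly_{int(time.time())}.jpg')

while True:
    frame = capture_frame()
    if is_anomalous(frame):
        upload_frame(frame)   # only anomalous frames ever leave the device
    time.sleep(1.0)
```

Only the frames the model flags are ever transmitted, so the bulk of the 24-hour video stream never consumes uplink bandwidth or cloud storage.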