Emza Visual Sense and Alif Semiconductor have demonstrated an optimized face detection model running on Alif’s Ensemble microcontroller, which is based on Arm IP. The two companies found the pairing well suited to enhancing low-power artificial intelligence (AI) at the edge.
The emergence of optimized silicon, optimized models and AI and machine learning (ML) frameworks has made it possible to run advanced AI inference tasks such as eye tracking and face identification at the edge, at low power and low cost. This opens up new use cases in areas such as industrial IoT and consumer applications.
Making edge devices orders of magnitude faster
By using Alif’s Ensemble microcontroller unit (MCU), which Alif claims is the first MCU to use the Arm Ethos-U55 microNPU, the AI model ran “an order of magnitude” faster than a CPU-only solution on the Cortex-M55 at 400MHz. It appears Alif meant two orders of magnitude: its footnotes state that the high-performance U55 configuration took 4ms compared with 394ms for the M55 alone, roughly a 98x speedup. The high-efficiency U55 configuration executed the model in 11ms. The Ethos-U55 is part of Arm’s Corstone-310 subsystem, for which Arm launched new solutions in April.
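As a quick sanity check on those claims, the cited latencies imply the following speedups (the variable names below are ours, not Alif’s):

```cpp
// Speedups implied by the latency figures Alif cites:
// 394 ms CPU-only on the Cortex-M55, versus 4 ms and 11 ms on the Ethos-U55.
#include <cstdio>

int main() {
  const double m55_only_ms = 394.0;  // Cortex-M55, CPU-only inference
  const double u55_hp_ms = 4.0;      // high-performance Ethos-U55 configuration
  const double u55_he_ms = 11.0;     // high-efficiency Ethos-U55 configuration

  std::printf("High-performance U55: %.1fx faster\n", m55_only_ms / u55_hp_ms);  // 98.5x
  std::printf("High-efficiency U55:  %.1fx faster\n", m55_only_ms / u55_he_ms);  // 35.8x
  return 0;
}
```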
Emza said it trained a “sophisticated” face detection model for the NPU that performs face detection, yaw (face angle) estimation and facial landmark detection. The complete application code has been contributed to Arm’s open-source AI repository, the “ML Embedded Eval Kit,” making Emza the first Arm AI ecosystem partner to do so. The repository can be used to gauge runtime, CPU demand and memory allocation before silicon is available.
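Applications in the ML Embedded Eval Kit are built on TensorFlow Lite Micro, with the model compiled ahead of time by Arm’s Vela tool so that supported operators execute on the Ethos-U55. The sketch below shows what that inference path typically looks like on a Cortex-M55 plus Ethos-U55 target; it is not Emza’s contributed code, symbols such as face_detect_model_data, kArenaSize and RunFaceDetection are placeholders, and it assumes a recent TensorFlow Lite Micro build with Ethos-U support enabled.

```cpp
#include <cstdint>
#include <cstring>

#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

// Model flatbuffer, compiled with Arm's Vela so the NPU-supported portion of
// the graph becomes a single Ethos-U custom operator (placeholder symbol).
extern const unsigned char face_detect_model_data[];

namespace {
constexpr size_t kArenaSize = 512 * 1024;      // tensor arena; size is illustrative
alignas(16) uint8_t tensor_arena[kArenaSize];
}  // namespace

int RunFaceDetection(const uint8_t* image, size_t image_bytes) {
  const tflite::Model* model = tflite::GetModel(face_detect_model_data);

  // Only the Ethos-U custom op is registered here; a real model may keep a
  // few operators on the Cortex-M55 and would register those as well.
  static tflite::MicroMutableOpResolver<1> resolver;
  resolver.AddEthosU();

  static tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                              kArenaSize);
  if (interpreter.AllocateTensors() != kTfLiteOk) {
    return -1;
  }

  // Copy the camera frame into the model's quantized input tensor.
  TfLiteTensor* input = interpreter.input(0);
  std::memcpy(input->data.int8, image, image_bytes);

  // Invoke() drives the NPU command stream; this is the call whose latency
  // and memory use the Eval Kit reports before silicon is available.
  if (interpreter.Invoke() != kTfLiteOk) {
    return -1;
  }

  // Output tensors (bounding boxes, yaw angle, landmarks) are model-specific.
  return 0;
}
```

The same application code can be exercised before silicon on Arm’s Corstone Fixed Virtual Platform models, which is how the Eval Kit produces its timing and memory estimates.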
“To unleash the potential of endpoint AI, we need to make it easier for IoT developers to access higher performance, less complex development flows and optimized ML models,” said Mohamed Awad, vice president of IoT and embedded at Arm. “Alif’s MCU is helping redefine what is possible at the smallest endpoints and Emza’s contribution of optimized models to the Arm AI open-source repository will accelerate edge AI development.”
Emza claims its visual sensing technology is already shipping in millions of products, and with this demonstration it is expanding its optimized algorithms to SoC vendors and OEMs.
“As we look at the dramatically expanding horizon for TinyML edge devices, Emza is focused on enabling new applications across a broad array of markets,” said Yoram Zylberberg, CEO of Emza. “There is virtually no limit to the types of visual sensing use cases that can be supported by new powerful, highly efficient hardware.”