Expanding our AI Capabilities with Sensory Inc.

11/23/2022 | Tamas Daranyi | 2 Min Read

At Silicon Labs, our focus is on creating connected devices that transform the way people interact with their environments, whether that is in the home or in industrial and commercial settings. One of the more exciting emerging trends that can help expand those interactions is the infusion of artificial intelligence (AI) and machine learning (ML) in embedded devices and at the edge. Beyond refining existing use cases like wake word detection or glass break detection, AI/ML also opens up all-new use cases like building occupancy detection based on sound instead of movement (we’ve all been in the situation where the motion detector doesn’t pick up the fine movements of fingers on a keyboard and the office lights switch off while we’re still there).


Silicon Labs is Your AI/ML Partner at all Stages of the IoT Development Process

We recognize that everyone is at a different stage of their AI/ML journey, and that’s why Silicon Labs has created simple categories to help package the right solutions for the right level of AI/ML expertise: ML Experts, ML Explorers, and ML Solutions.


  • Experts: An ML Expert is someone with extensive experience working on ML projects who is familiar with TensorFlow and Python. These developers understand how to pre-process raw data to draw out its key features, how to design an appropriate network of convolutional layers, and how to interpret the steady stream of probabilistic output that inferencing produces (a short sketch of this kind of workflow follows the list).
  • Explorers: An ML Explorer is an experienced embedded developer who is familiar with ML concepts but may be working on their first ML project, or who is exploring how ML can help differentiate their product. Developers at this level are interested in a tool that offers end-to-end coverage of the workflow, or prefer GUI-based tools over code-based solutions.
  • Solutions: Developers at this level require little, if any, experience with ML applications and look for solutions focused on their specific use case that they can integrate into their current application. The tools offered at this level use ML under the hood but do not require any ML experience.
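
To make the Expert end of that spectrum concrete, here is a minimal sketch of the kind of pipeline an ML Expert typically builds by hand in Python with TensorFlow: extracting spectrogram features from raw audio, defining a small convolutional classifier, and reading the probabilistic scores that inferencing produces. The feature parameters, model architecture, and label set below are illustrative assumptions, not the Silicon Labs or Sensory implementation.

```python
# Illustrative only: a hand-rolled keyword-spotting pipeline of the kind an
# ML Expert might build with TensorFlow and Python. Feature parameters,
# model architecture, and labels are assumptions for demonstration.
import numpy as np
import tensorflow as tf

LABELS = ["silence", "unknown", "wake_word"]  # hypothetical label set

def preprocess(audio_16k: np.ndarray) -> tf.Tensor:
    """Turn 1 s of 16 kHz audio into a log-magnitude spectrogram 'image'."""
    stft = tf.signal.stft(audio_16k.astype("float32"),
                          frame_length=400, frame_step=160)   # 25 ms / 10 ms
    spectrogram = tf.math.log(tf.abs(stft) + 1e-6)
    return spectrogram[..., tf.newaxis]                        # add channel dim

def build_model(input_shape) -> tf.keras.Model:
    """A small convolutional classifier suitable for embedded targets."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.MaxPool2D(),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(len(LABELS), activation="softmax"),
    ])

# Inference produces a probability per label on every audio window; the
# developer decides how to smooth and threshold that stream of scores.
features = preprocess(np.zeros(16000))                 # placeholder audio
model = build_model(features.shape)
scores = model(tf.expand_dims(features, 0)).numpy()[0]
print(dict(zip(LABELS, scores.round(3))))
```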


Expanding our ML Solutions for IoT Devices with Sensory Inc.

Today, we’re pleased to announce that we are extending our AI/ML development capabilities by entering a partnership with Sensory Inc. Sensory is the market leader in wake word detection and command and continuous speech recognition, and it already provides AI/ML capabilities on over three billion devices. With this new partnership, Sensory is ready to deploy on Silicon Labs’ robust, AI/ML-focused SoCs. Specifically, you can get started with Sensory on EFR32 Series 1 and Series 2 devices, including the MG24, which has a built-in AI/ML accelerator that can offload specialized machine learning tasks from the main Arm Cortex-M MCU, increasing performance by 8x and reducing energy use by 6x for those machine learning operations.
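
Hardware ML accelerators like the one in the MG24 generally operate on compact, quantized models; in a typical TensorFlow workflow, a trained model is converted to a fully int8-quantized TensorFlow Lite flatbuffer before being deployed to a microcontroller-class part. Below is a minimal, hedged sketch of that conversion step: the tiny model, input shape, calibration generator, and file name are placeholders, and the exact deployment flow for a given EFR32 device is handled by the Silicon Labs tooling rather than shown here.

```python
# Illustrative only: converting a trained Keras model to a fully int8-quantized
# TensorFlow Lite flatbuffer, a common format for compact on-device ML models.
# The tiny model and calibration generator below are stand-ins for a real
# trained wake-word model, not part of the Silicon Labs or Sensory tooling.
import numpy as np
import tensorflow as tf

# Stand-in for a trained wake-word classifier (normally loaded from disk).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(49, 40, 1)),           # e.g. 49 frames x 40 features
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(3, activation="softmax"),
])

def representative_data():
    """Yield calibration samples so the converter can choose int8 scales."""
    for _ in range(100):
        yield [np.random.rand(1, 49, 40, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]             # enable quantization
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8                         # int8 end to end
converter.inference_output_type = tf.int8

with open("wake_word_int8.tflite", "wb") as f:                   # placeholder name
    f.write(converter.convert())
```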

“Silicon Labs’ emphasis on low-power, wireless connectivity for multiple protocols like Zigbee is a great complement for extending the reach of machine learning for Sensory’s user base,” said Todd Mozer, CEO at Sensory. “Together, Silicon Labs and Sensory are opening new markets for low-power wireless applications.”

Sensory is perfect for those who don’t need a custom-built AI/ML application and instead are ready to choose from one of Sensory’s pre-packaged, fully-baked code sets. If you are interested, you can look at the sample code we have on GitHub. This small demonstration shows how you can use wake word detection at the edge.
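
To give a feel for what such a demonstration does, here is a minimal sketch of the general streaming wake-word pattern: score each incoming audio window and trigger only when the wake-word probability stays high across a few consecutive windows. This illustrates the pattern only; it is not the code from the GitHub sample, and the scores and threshold values below are simulated assumptions.

```python
# Illustrative only: a streaming wake-word detection loop showing the general
# pattern an edge demo follows. The scores here are simulated; on a real device
# they would come from microphone audio run through the wake-word model.
import random
from collections import deque

THRESHOLD = 0.8    # minimum wake-word probability per window (assumed value)
CONSECUTIVE = 3    # consecutive hits required before triggering (assumed value)

def score_next_window() -> float:
    """Stand-in for capturing an audio window and running model inference."""
    return random.random()

def detection_loop(num_windows: int = 200) -> None:
    recent = deque(maxlen=CONSECUTIVE)
    for _ in range(num_windows):
        recent.append(score_next_window() >= THRESHOLD)
        # Debounce: require several consecutive confident windows so one-off
        # spikes from background noise do not trigger the device.
        if len(recent) == CONSECUTIVE and all(recent):
            print("Wake word detected")
            recent.clear()

if __name__ == "__main__":
    detection_loop()
```

In practice, the threshold and debounce length are tuned to trade off false accepts against missed detections for the target acoustic environment.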

In addition, Silicon Labs is joining Sensory and other influential machine learning organizations in the tinyML Foundation. By joining this community, Silicon Labs will support the Foundation’s efforts to drive adoption and usage of tinyML applications on embedded devices, share knowledge to educate users and the market, explore new use cases and problems to solve with tinyML techniques, and grow market adoption as a result.


Get Started with AI/ML for IoT with Sensory and Silicon Labs

As mentioned above, you can get started with Sensory right now using our sample code on GitHub. Sensory also participated in our recent Works With 2022 Developer Conference, and you can get more hands-on by watching the on-demand training session that uses Sensory: AI/ML IoT Development Training: Works With 2022.

You can also read my colleague Dan Kozin’s technical how-to, Use your own Wake Word in this ML example by just Typing the Wake Word, on our Silicon Labs Community site to see more examples of what you can do with Silicon Labs and Sensory.  

Tamas Daranyi
Senior Product Manager, AI/ML | Silicon Labs