Artificial Intelligence & Machine Learning Developer Journey

What is AI/ML at Tiny Edge?

In the IoT industry, "edge" refers to devices that perform computation locally instead of relying on cloud computing. The latest development, Tiny Edge, brings computation even closer to where data is generated, such as sensor nodes. This shift moves from a centralized, cloud-based solution to a distributed network of edge nodes that collect, process, and run inference on data locally. By 2027, over 3 billion devices are expected to be sold with TinyML, a subset of AI focused on deploying machine learning models on Tiny Edge devices. This growth is driven by societal trends such as the need for speed, privacy, and connectivity. Additionally, the transition from wired to wireless technology is further accelerating the adoption of Tiny Edge devices.

Applications of Machine Learning using Silicon Labs’ SoCs

Silicon Labs' Wireless SoCs support a range of ML applications, such as sensor signal processing for predictive and preventative maintenance, bio-signal analysis for healthcare, and cold chain monitoring. They also enable audio pattern matching for security applications, voice commands for smart device control, and low-resolution vision for tasks like people counting and presence detection. The SoCs offer various RAM sizes to accommodate different application requirements. Machine learning models are applied to data from sensors such as microphones, cameras, and those measuring time-series data like acceleration and temperature. These models include audio pattern matching, wake word/command word detection, fingerprint reading, always-on vision, and image/object classification and detection. The detected events can then be further processed according to the requirements.

AI/ML Journey with Silicon Labs

Silicon Labs can accelerate the development of AI/ML devices by outlining each step in the process and supporting you at every stage of your project. We are here to simplify your development journey and help you get your devices to market faster and more efficiently.
We have outlined below three key stages of the AI/ML Developer Journey, along with what is required to successfully complete each stage.

Get Started
  1. Buy Kits
  2. Create User Account
  3. Development Environment
  4. Explore Demos

Build Your Own Solution
  1. Build Model
  2. Test and Validate
  3. Deploy Model

Pre-Built Solution
  1. Partners

1. Buy Kit: Hardware and Examples

Silicon Labs offers several development and explorer kits, ranging from ultra-low-cost, small form factor boards to compact, feature-packed platforms designed for robust networks. We have several exciting demos, including wake-word detection, Pac-Man, and gesture control. These feature-rich kits support multiple protocols and come in different memory configurations with a wide variety of sensors and peripherals for quick debugging and rapid prototyping. The demos are hardware agnostic, so select the kit below that best fits your needs based on the demos you are interested in.

 
EFR32xG24 Dev Kit (OPN: xG24-DK2601B)
  Protocols Supported: Bluetooth, Matter, Proprietary, Thread, Zigbee
  Description: The EFR32xG24 Dev Kit is a compact, feature-packed development platform. It provides the fastest path to develop and prototype wireless IoT products.
  Price: $79 USD
  Flash/RAM: 1536 kB / 256 kB
  Sensors: Inertial Sensor, Stereo Microphones, Pressure Sensor, Ambient Light Sensor

EFR32xG28 Explorer Kit (OPN: xG28-EK2705A)
  Protocols Supported: Bluetooth, Sidewalk, Wi-SUN, Z-Wave
  Description: The EFR32xG28 Explorer Kit is a small form factor development and evaluation platform based on the EFR32xG28 SoC, focused on rapid prototyping and concept creation of IoT applications for Sub-GHz and Bluetooth LE.
  Price: $34 USD
  Flash/RAM: 512 kB / 32 kB
  Sensors: Temperature Sensor

EFR32xG26 +10 dBm Dev Kit (OPN: xG26-DK2608A)
  Protocols Supported: Bluetooth, Matter, Proprietary, Thread, Zigbee
  Description: The EFR32xG26-DK2608A Dev Kit is a compact, feature-packed development platform. It provides the fastest path to develop and prototype wireless IoT products.
  Price: $89 USD
  Flash/RAM: 3.2 MB / 512 kB
  Sensors: Inertial Sensor, Stereo Microphones, Pressure Sensor, Ambient Light Sensor

SiWx917 Wi-Fi 6 & Bluetooth LE Dev Kit (OPN: SiWx917-DK2605A)
  Protocols Supported: Bluetooth, Wi-Fi
  Description: The SiWx917 Wi-Fi 6 and Bluetooth LE 5.4 Dev Kit is a compact yet feature-packed development platform for testing, developing, and prototyping wireless IoT applications quickly.
  Price: $40 USD
  Flash/RAM: 8 MB Flash / 8 MB external PSRAM
  Sensors: Temperature Sensor, Humidity Sensor, Inertial Sensor, Digital Microphone, Ambient Light Sensor

*ML enablement in alpha, contact sales

2. Create User Account

While you wait for your Development Kit, we recommend setting up your user accounts.

Silicon Labs Account

This account gives you access to our developer community, Getting Started guides, private GitHub repositories, and our Simplicity Studio development environment. You can create your account or verify access to your existing account here.


3. Set Up Development Environment

We know you have many options when it comes to choosing your development environment, but we believe Simplicity Studio is the right choice for developing your Silicon Labs device. Here's why:

  • Includes your programmer and debugger functions so you don’t have to worry about manual setup.
  • Recognizes boards you’ve purchased and identifies which sample apps you can use.

Need help setting up your environment? Our Getting Started Guide will have you up and running in no time. 

Download the Full Online Installer Version of Simplicity Studio v5:

System Requirements

Operating System
  Windows: Windows 10 (64-bit), Windows 11
  macOS: 10.14 Mojave, 10.15 Catalina*, 11.x Big Sur*, 12.x Monterey*
  Linux: Ubuntu 20.04 LTS
  *If trying to use the Keil 8051 or IAR toolchains, click here.

CPU: 1 GHz or better
Memory: 1 GB RAM (8 GB recommended for wireless protocol development)
Disk Space: 600 MB for a minimum FFD installation; 7 GB for wireless dynamic protocol support

4. Explore Demos

Here is a list of additional ideas that can easily be realized with minimal coding by modifying the referenced example applications as suggested below. These use cases are not provided as ready-to-go demos; instead, they provide a perfect context for further evaluation.

Voice Control Light

Detects the spoken keywords "on" and "off" to turn the on-board LED on and off.

Suggested Kit:   EFR32xG24 Dev Kit

Get up and running quickly with the pre-built application in 10 minutes.

Learn to create the ML application from a trained model in 30 minutes.

Additional Demos

Because starting application development from scratch is difficult, our Simplicity SDK comes with a number of built-in demos and examples covering the most frequent use cases.



Pac-Man

Play the popular Pac-Man game using keywords spoken aloud: Go, Left, Right, Up, Down, Stop. The application uses keyword detection, and the board can be controlled from Simplicity Studio. The demo is also available as part of Simplicity Studio.

Suggested Kit:



Audio Classifier

This application uses TensorFlow Lite for Microcontrollers to classify audio data recorded from the microphone in a Micrium OS kernel task. The classification result is used to control an LED on the board. The demo is also available as part of Simplicity Studio.

Suggested Kit:



Magic Wand

This application demonstrates a model trained to recognize various hand gestures with an accelerometer. The detected gestures are printed to the serial port. The demo is also available as part of Simplicity Studio.

Suggested Kit:



Blink

This application demonstrates a model trained to replicate a sine function. The model is continuously fed values ranging from 0 to 2π, and its output is used to control the intensity of an LED. The demo is also available as part of Simplicity Studio.

Suggested Kit:
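For reference, a model like the one used in the Blink demo can be reproduced with a few lines of TensorFlow. The sketch below is illustrative only; the layer sizes, training settings, and file names are assumptions rather than the exact configuration shipped with the demo. It trains a tiny dense network to approximate sin(x) on [0, 2π] and exports it as a .tflite file that TensorFlow Lite for Microcontrollers can run.

```python
import numpy as np
import tensorflow as tf

# Generate training data: x in [0, 2*pi], y = sin(x) with a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, size=(1000, 1)).astype(np.float32)
y = (np.sin(x) + 0.05 * rng.standard_normal(x.shape)).astype(np.float32)

# A tiny fully connected regression model, small enough for a microcontroller.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=200, batch_size=32, verbose=0)

# Convert to a .tflite flatbuffer for deployment on the embedded target.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("sine_model.tflite", "wb") as f:
    f.write(tflite_model)
```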




1. Build Model

Already have your .tflite file ready to go? Skip to the next step, "Test and Validate".

Train your model and prepare it for conversion into a deployable format.

If you are familiar with ML development, follow these steps:


Customized Code

Begin by designing and training your AI/ML model. This involves gathering and preprocessing data, selecting an appropriate model architecture, and setting up training parameters.

To help you build your model from scratch, we provide a Python package with command-line utilities and scripts.

Refer to the TensorFlow documentation for support building the machine learning model, and to the LiteRT documentation for support converting the model to .tflite.
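As a concrete example of the conversion step, the sketch below uses the TensorFlow Lite converter to produce a fully int8-quantized .tflite file from a trained Keras model. The model file name, input shape, and representative dataset are placeholders for your own project; full integer quantization is typically what you want for a resource-constrained embedded target.

```python
import numpy as np
import tensorflow as tf

# Placeholder: load the Keras model you trained earlier.
model = tf.keras.models.load_model("my_trained_model.h5")

def representative_dataset():
    # Yield a few hundred samples resembling real sensor input so the
    # converter can calibrate the int8 quantization ranges.
    for _ in range(200):
        sample = np.random.rand(1, 49, 40, 1).astype(np.float32)  # example input shape
        yield [sample]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full integer quantization so the model runs efficiently on the MCU.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
print(f"Wrote model_int8.tflite ({len(tflite_model)} bytes)")
```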

If you are new to ML development, follow these steps:


Low Code

We've partnered with top AI platforms to help you design and build models with minimal coding. These platforms provide user-friendly GUIs and automated workflows to simplify the process.

SensiML

Beaverton, Oregon

Edge Impulse

San Jose, California

Neuton AI

San Jose, California

Eta Compute

Sunnyvale, California

If you are looking for a pre-built machine learning solution, jump to the last tab, "Pre-Built Solution".


2. Test and Validate

Evaluate your model's performance against the embedded target and validate that the model meets the required performance metrics.
 

Optional Tool: MLTK Model Profiler

The MLTK model profiler provides information about how efficiently a model may run on an embedded target. It allows a .tflite model file to be executed in a simulator or on a physical embedded target.

Note: This tool is optional and is not yet officially supported by Silicon Labs.
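If you prefer a plain-Python sanity check before moving to hardware, the standard TensorFlow Lite interpreter can run the same .tflite file against a held-out test set. The sketch below is a minimal, generic accuracy check (not the MLTK profiler); it assumes an int8-quantized classification model and NumPy arrays x_test / y_test, which are placeholders for your own data.

```python
import numpy as np
import tensorflow as tf

# Placeholder test data: x_test has the model's input shape, y_test holds class labels.
x_test = np.load("x_test.npy")
y_test = np.load("y_test.npy")

interpreter = tf.lite.Interpreter(model_path="model_int8.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

correct = 0
for sample, label in zip(x_test, y_test):
    data = sample[np.newaxis, ...].astype(np.float32)
    if inp["dtype"] == np.int8:
        # Quantize the float input using the scale/zero-point stored in the model.
        scale, zero_point = inp["quantization"]
        data = np.round(data / scale + zero_point).astype(np.int8)
    interpreter.set_tensor(inp["index"], data)
    interpreter.invoke()
    scores = interpreter.get_tensor(out["index"])[0]
    correct += int(np.argmax(scores) == label)

print(f"Accuracy on {len(y_test)} samples: {correct / len(y_test):.3f}")
```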


3. Deploy Model

Integrate and deploy your validated model onto the embedded device.

  • Add AI/ML SDK Extension
  • Configure TensorFlow Lite Micro Component in Studio: set up the component to select the correct kernel for your embedded device
  • Include and Run the Model: copy your .tflite model into the config folder of your Simplicity Studio project
  • Implement Post-Processing: add any necessary post-processing steps to handle the model's output and integrate it with your application's logic (see the sketch after this list)
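What post-processing looks like depends on the model, but for keyword detection it typically means smoothing the raw scores over time and only reporting a detection once a class stays confidently above a threshold. The sketch below illustrates that logic in Python for clarity; on the device it would normally be implemented in C/C++ against the TensorFlow Lite Micro output tensor, and the window length and threshold shown are assumptions to tune for your own model.

```python
from collections import deque
import numpy as np

class KeywordPostprocessor:
    """Average recent score vectors and report a keyword only when its
    smoothed score exceeds a threshold, to suppress one-off spikes."""

    def __init__(self, num_classes, window=5, threshold=0.8):
        self.history = deque(maxlen=window)
        self.threshold = threshold
        self.num_classes = num_classes

    def update(self, scores):
        """scores: one softmax output vector from the model (length num_classes).
        Returns the detected class index, or None if nothing is confident yet."""
        self.history.append(np.asarray(scores, dtype=np.float32))
        smoothed = np.mean(self.history, axis=0)
        best = int(np.argmax(smoothed))
        if smoothed[best] >= self.threshold:
            return best
        return None

# Example: feed the model output for each audio frame into the post-processor.
post = KeywordPostprocessor(num_classes=3)
for frame_scores in [[0.1, 0.8, 0.1], [0.05, 0.9, 0.05], [0.1, 0.85, 0.05]]:
    detection = post.update(frame_scores)
    if detection is not None:
        print("Detected keyword class:", detection)
```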

Turnkey Solutions

Pre-built, ready-to-deploy AI/ML solutions on Silicon Labs SoCs that simplify the development process and accelerate time-to-market.

Sensory

Santa Clara, California

Aizip, Inc.

Cupertino, California

MicroAI

Irving, Texas

Neuton AI

San Jose, California

Design Partners

Silicon Labs has pre-screened and certified the following third-party AI/ML design service companies to help you design and develop your customized AI/ML solution.

Klika Tech, Inc.

Miami, Florida

AITAD GmbH

Offenburg, Germany

embedUR

Fremont, California
