The Project board is for sharing projects based on Silicon Labs' components with other community members. View Projects Guidelines ›


      • EFR32MG21A010F768 PWM

        li ye | 11/317/2019 | 07:43 AM

        To output PWM from the Silicon Labs wireless SoC EFR32MG21, the TIMER's CC channel must be routed to a GPIO pin, as follows:









      • Control a Wi-Fi-based Device with a Tmall Genie

        Victor Hu | 12/363/2018 | 01:25 AM

        1 Introduction

        People used to regard mobile phones, TVs, or routers as the control centers of smart homes until the emergence of smart speakers. Ideally, you should be able to control your smart devices just by saying a sentence such as "Turn on the living room light," rather than by tapping on a phone screen. There are now many smart speakers on the market for controlling smart devices; two of the most famous are Google Home from Google and Echo from Amazon. In China, the most popular smart speakers are XiaoAi from Xiaomi and Tmall Genie from Alibaba. All of these companies have developed IoT platforms so that manufacturers of smart home devices can connect their products to these smart speakers. This project introduces how to control a Silicon Labs Wi-Fi device with Tmall Genie.

        The software architecture of the project consists of:

        • Firmware running on an EFM32GG11 STK
        • A web server running in the cloud.

        The hardware consists of:

        • An EFM32GG11 STK
        • A WGM110 Wi-Fi Expansion Kit
        • A Tmall Genie smart speaker.

        Users can control LED0 on the EFM32GG11 STK through one of the two methods below.

        We treat the combination of the EFM32GG11 STK and the WGM110 module as a single smart light. The block diagram is illustrated below.

        2 Preparation

        2.1 Device Connection

        Connect the boards as illustrated in the picture below. Set the EFM32GG11 STK board switch to AEM and the WGM110 Expansion Kit switch to High Power.

        2.2 Firmware

        Import the firmware project into Simplicity Studio, open the file "app_wifi_cfg.h", change the macro APP_WIFI_AP_SSID to your Wi-Fi SSID and APP_WIFI_AP_PWD to your Wi-Fi password, as shown in the picture below, and then compile it. Program the Hex file to the EFM32GG11 Giant Gecko Starter Kit (SLSTK3701A) to make it act as a smart light device. A compiled firmware image is located in the Bin/ directory; its default Wi-Fi SSID is "netis_1F7506" and its password is "password". If you want to use your own Wi-Fi network, recompile the firmware with your own credentials.
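        For reference, the edit in "app_wifi_cfg.h" looks roughly like this; only the macro names APP_WIFI_AP_SSID and APP_WIFI_AP_PWD come from the post, and the values shown are the defaults to be replaced:

```c
/* app_wifi_cfg.h -- set your own access point credentials before compiling */
#define APP_WIFI_AP_SSID  "netis_1F7506"  /* replace with your Wi-Fi SSID */
#define APP_WIFI_AP_PWD   "password"      /* replace with your Wi-Fi password */
```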

        When the firmware runs for the first time, it reports itself to silabs-iot: it issues an HTTP request so that the silabs-iot web server adds the device's information to its database. The device can then be bound and used by a user.

        2.3 Bind the device with your account

        You need to bind the device and declare that it belongs to you before you can control it. Please follow the steps below to bind a device.

        • After you sign up or log in, you will enter the MyDevice page, as shown in the picture below.

        • Click the "+" icon to add a device and fill in the MAC address and device name in the pop-up window. The MAC address is shown on the EFM32GG11 STK LCD screen when you push BTN1. (Before that, make sure your device has connected to Wi-Fi correctly; you can only bind a device that is connected to the internet and has not been bound by another user. For testing, we provide three fictitious MAC addresses: 0123456789, 0123456788, and 0123456787; choose one for testing and remember to unbind it afterwards.) Click "OK" to finish binding.

        3 Control device by Browser

        After binding your device, you can see it on the MyDevice page. Click the device and a window with a switch will pop up; you can turn the LED on or off just by clicking the switch.

        4 Control device by Tmall Genie smart speaker

        First, download the Tmall Genie app to your phone and bind your Tmall Genie by following the app's hints. Then follow the steps below to bind silabs-iot with Tmall Genie.

        • Click "我的" ("Mine")
        • Click "添加智能设备" ("Add Smart Device")
        • Find SiliconLabs in device list and click it
        • Click "绑定账号" ("Bind Account")
        • Input your silabs-iot account and password and click "LOGIN AND AUTHORIZE".
        • Return to "我的" ("Mine"); the device will be shown in the window.

        Now you can control the device just by talking to Tmall Genie, for example "天猫精灵,开灯" ("Tmall Genie, turn on the light"). The attached video demonstrates how to control the device with Tmall Genie by voice.


        Note: Tmall Genie only supports Chinese, and in this case it supports turning the light on/off and querying its state.

        5 Web Server

        The web server of silabs-iot is developed with ThinkPHP 5.1. ThinkPHP is a free, open-source, fast, and simple object-oriented PHP framework for web application development.

        5.1 Processes of Control by Tmall Genie

        The following image illustrates the process of communication among the user, Tmall Genie, the silabs-iot web server, the Alibaba web server, and the device.



        • The Alibaba web server needs to obtain limited access to silabs-iot so that it can get device information and control devices; see section 5.2, Authorization between Alibaba and silabs-iot, for details.
        • Alibaba issues an HTTP request to silabs-iot to get all device information. The silabs-iot web server checks the identity of the visitor and returns the list of devices owned by the user.
        • Alibaba gets the device information and displays it in the device list of the Tmall Genie app.
        • The user says a command to Tmall Genie.
        • Tmall Genie uploads the voice to the Alibaba web server.
        • Alibaba parses the voice.
        • Alibaba issues a new HTTP request to silabs-iot according to the result of voice parsing, for example "turn on the light". The silabs-iot web server updates the database after receiving the request and returns a response immediately.
        • The device polls the state by issuing an HTTP request to silabs-iot periodically and updates itself; see section 5.3, Communication between device and silabs-iot, for details.
        • Alibaba returns a response to Tmall Genie.
        • Tmall Genie plays a voice response to the user's voice command.

        5.2 Authorization between Alibaba and silabs-iot

        The authorization protocol between them is based on OAuth 2.0, an open protocol that allows secure authorization in a simple and standard way from web, mobile, and desktop applications. The OAuth 2.0 server in this case is based on the library from Brent Shaffer, available on GitHub. The process is divided into the following steps, as shown in the picture below.


        1. The Alibaba web server jumps to the silabs-iot login page when you click 3rd-party login in the Tmall Genie app.
        2. Input your account and password to log in; if successful, the silabs-iot web server generates a code.
        3. The new code is returned to the Alibaba web server; it can be used only by Alibaba and only once.
        4. Alibaba receives the code and uses it to request an access_token.
        5. The silabs-iot web server checks the code and the identity of the visitor, and generates an access_token and a refresh_token on success. The access_token is time-limited and its lifetime is configurable; it is 2 days in this case. The refresh_token is used to request a new access_token when the access_token has expired.
        6. The Alibaba web server stores the access_token and refresh_token. It can now access silabs-iot with the access_token.


        5.3 Communication between device and silabs-iot

        The communication between the devices and the web server is based on HTTP. The device needs to visit the web server periodically so that it can update LED0 once Tmall Genie changes the state of the light.

        1. The device issues a new HTTP request to silabs-iot.
        2. The web server gets the MAC address and queries the state in the database.
        3. The web server returns the state value of the device.
        4. The device parses the HTTP response and updates LED0 according to the result.

      • Win a Wireless Xpress BGX13P Starter Kit!

        Nari Shin | 10/295/2018 | 09:31 AM

        We’re excited to introduce the new Wireless Xpress BGX13P starter kit, which helps you jumpstart your design with no software development necessary. Some of the key features of this kit include:  

        • Bluetooth 5 BGX13 module requiring no firmware development
        • Zero-overhead serial-to-Bluetooth cable replacement solution
        • Smartphone app for Bluetooth LE command, control, and sensing
        • Secure connections with encrypted communication, bonding, and ‘just works’ and passkey pairing options
        • Ideal solution for smart home products requiring Bluetooth control with a mobile app, and the ability to add point-to-point wireless interface to industrial applications

        Want to try it for yourself? We’re giving away five Wireless Xpress BGX13P starter kits to our community members, and this is your chance to get your hands on one.

        How to Participate:

        Explain up to 2 ideas on how you want to use the Xpress kit for your project and why. You can submit your ideas by leaving a comment below on this page by November 11th (CDT).

        Judging Criteria

        Our Wireless marketing team will judge submissions based on the following criteria:

        • Market Potential (50%)
        • Differentiation (30%)
        • The fit between the Xpress kit and your project (20%)


        Number of Winners: 5

        Prize: 1 x Wireless Xpress BGX13P starter kit

        Contest Period

        Oct 22nd 2018 – Nov 11th 2018 (CDT)

        The winners will be announced on this page soon after the end of the contest.

        By entering the contest, you acknowledge that you have read and agree to the attached Community Contest Terms and Conditions.

      • Oscilloscope Simulation System

        Victor Hu | 09/257/2018 | 02:57 AM

        1 Introduction

        The Oscilloscope Simulation System based on EFM8UB1 works as a tool to sample an ADC value and display it as waveform on a computer screen. Although it is a combination of hardware and software, it looks like a real oscilloscope.

        The software architecture of this system consists of firmware running on EFM8UB1 STK and an application running on a computer. The GUI application running on the computer was developed using python 2.7, pyside2, Qt5, and the USBxpress library.

        The hardware consists of a computer host and an EFM8UB1EK device. The EFM8UB1EK is connected to the computer with a USB cable. You can use the EFM8UB1 to sample the ADC voltage value and send the collected data to the host via USB. The block diagram is illustrated below.


        2 Preparation

        2.1 Install Python 2.7 32-bit

        Most of the libraries provided by Silicon Labs were developed with Python 2.7 32-bit. Please download and install Python 2.7 32-bit. Here is the download link:

        2.2 Install Pyside2 and Qt

        Official release wheels of Qt for Python can be installed regularly via pip:

        pip install pyside2

        Pre-release (snapshot) wheels containing the latest code changes are available at

        For example, you can install the latest 5.11 snapshot wheel using:

        pip install --index-url= pyside2 --trusted-host

        See more info from

        2.3 Firmware Programming

        The compiled firmware is located in the directory Bin/. Program the Hex file to the EFM8 Universal Bee Starter Kit (SLSTK2000A) to make it act as an oscilloscope device.


        Connect the EFM8UB1EK device and a computer with a USB cable.

        2.4 Execute GUI application

        Download the software/ to your local device and run the application with the command below.


        3 Overview GUI

        The application will start with a login widget as shown below. Component 1 is a device combobox that shows the connected EFM8UB1EK devices, and component 2 is a pushbutton.


        The device shown in the combobox will be opened, and the application will jump to the main widget when the pushbutton is pressed.


        District 1 is a canvas used to display the waveform. When the cursor is in this district, the time and ADC value at the cursor position will be shown.

        District 2 is the display district. The buttons in this district are the zoom in pushbutton, the zoom out pushbutton, the enable ADC pushbutton, and the single trigger button from left to right.

        District 3 is used to control the 3 LEDs on the EFM8UB1EK device.

        District 4 is used to enable channels. There are two channels available, routed to GPIO pins 1.7 and 1.2 respectively.

        District 5 is the frequency district. The sample frequency will be set to the value shown in the combobox when the set pushbutton is pushed.

        The current information will be shown in district 6.

        4 Start sampling

        Once you have all the required environments and equipment, you can start sampling by following the steps below.

        • Connect the device and host
        • Execute the GUI application
        • Choose the required device in combobox and press the enter button
        • Check the required channel
        • Choose the appropriate frequency and then press the set button
        • Push the enable ADC button in district 2

        After that, the waveform will be displayed in district 1.

        5 Software development and functional development

        In order to facilitate continued function expansion, a flexible protocol was developed. Every operation is a command item which consists of a preamble byte, a command byte, a value length, and a value. The preamble byte is the fixed value 0xAA, the command byte identifies the specific operation, and the third byte, which is followed by the value, gives the length of the value. The table shown below lists the commands that have already been assigned. If you want to develop a new command, you can start from 0x60.


        6 Conclusion

        The oscilloscope simulation system is used to sample an analog voltage and show it as a waveform. It provides two channels for sampling the voltage signal, supports zooming in and out, and can display the voltage and time at the mouse cursor position, making it feel like a real oscilloscope. It is a handy tool for engineers measuring analog voltages.

        7 Source Code

        The source code of the GUI tool and firmware is attached for reference.

      • Zigbee to Modbus TCP/IP Gateway

        gettogupta | 07/195/2018 | 03:43 PM


        Zigbee to Modbus Gateway

        Fact #1: Industrial IoT is one of the biggest slices of the global IoT pie.

        Fact #2: Zigbee is a popular industrial wireless communication standard.

        Fact #3: Modbus is an indispensable part of industrial automation.

        Our technical team realized this opportunity and came up with this Zigbee to Modbus TCP/IP gateway, which can act as an interface between an industrial Zigbee network and a Modbus TCP/IP or even a general TCP/IP network, and hence the internet at large.

        Naturally, when it came to selecting a Zigbee module we decided to go with the Telegesis ETRX357-LRS module, for a host of reasons. For the TCP/IP end we used the Xpico module by Lantronix, because the Xpico comes in a general TCP/IP stack version as well as an industrial Modbus TCP/IP stack version.

        To make the product more versatile we decided to add an ARM Cortex-M4 MCU for custom firmware and device management. To make the product truly easy to use and deploy in the field, we added a PoE+ (IEEE 802.3at) power option.

        We look forward to comments and suggestions from the experts here on potential use cases and opportunities for the product.

      • IoT Party Button

        Mark Mulrooney | 02/33/2018 | 05:58 PM

        The following is a project write-up from a recent hackathon that took place with the Silicon Labs MCU and Micrium Application Engineering teams. The members of this team were Mark Mulrooney, Michael Dean, Alan Sy and Joe Stine.


        Project Summary:

        The goal of this project was to create an IoT-enabled Party Button that would allow a user to press a button and trigger a number of party lights to all turn on at the same time. This was accomplished using a combination of Silicon Labs EFM32GG11 Starter Kits, Silicon Labs EFR32MG12 Starter Kits, Silicon Labs Smart Outlets, Silicon Labs Si8751-KIT isolators, a Dream Cheeky Big Red Button, and a lot of party lights/disco balls. Using this hardware, a signal was sent from the Big Red Button to an MQTT broker, which then propagated out to Giant Gecko kits that were listening for a signal. Some of the GG11s had the isolator connected directly to the board and would toggle their specific party light; other GG11s were connected over serial to a Mighty Gecko kit. The Mighty Gecko would send a ZCL on/off message over Zigbee to other Mighty Geckos or the Silicon Labs Smart Outlet to control the other party lights.


        Project Background:

        Since our team typically works on the EFM32 platform or with software other than Micrium OS, our main goals of this project were to become familiar with the EFR32 chips/tools and to use Micrium OS to add internet connectivity to a LAN IoT ecosystem such as a Zigbee network. As we found out, the project did prove to be a good exercise in both the EFR32 and Micrium OS.

        The project can be divided into three main sections: MQTT, Zigbee, and isolation. Since the project was somewhat complicated and involved a lot of moving parts, this division allowed our team members to work on different parts of the project without holding up the rest of the team. The following sections describe the different parts of the project and how they operated.



        MQTT Diagram

        The IoT Party Button project used MQTT as the communication protocol between the Big Red Button and the GG11 nodes. MQTT is a lightweight publish-subscribe IoT protocol that sits on top of TCP. For our project we used the Mosquitto broker as the MQTT broker for all of the clients to connect to. The Mosquitto broker was hosted on an AWS EC2 instance and implemented a simple username/password for some basic security. In a real-world application you would ideally use TLS in conjunction with MQTT to encrypt your connection to the MQTT broker.

        The project used MQTT for control of the trigger for a few reasons. The biggest reason was flexibility. Initially, during the planning of the project we discussed the possibility of plugging the Big Red Button into a Giant Gecko. This would have allowed us to use the Micrium OS USB stack to detect the button press and Micrium OS Net’s MQTT client to publish the button push to the MQTT broker. Since we only had a few days to complete this project we were unsure if there would be enough time to complete this portion, so we set that part aside.

        For testing purposes, we created a simple button simulator in Node.JS that could run on anyone’s computer and publish a message to the MQTT topic for a button press. Since we did not have enough time to complete the USB portion on a GG11, we ended up using a Node.JS script to listen for a button press while the button was plugged into one of our computers. When the Node.JS script detected the button press it sent an MQTT message to the trigger topic.

        Another advantage of using MQTT for control of the trigger is it opened up the ability to have the trigger sent from a number of places. We use Slack as a communication tool in the office, but we also have a helper bot that you can send commands to. It is possible that we could have had the bot send the same MQTT command to the MQTT broker to trigger the IoT party.

        All of the GG11s that were subscribed to the MQTT topic for the button trigger used Micrium OS and the Micrium OS Network MQTT client. Once the Micrium OS portion was set up, the subscribed nodes had one of two functions: either trigger an isolator connected to the node, or send a serial command to a Mighty Gecko to trigger its local network via Zigbee. For simplicity's sake, we used the same application on all of the GG11s. This allowed us to program them all without the need for individual code changes.



        Zigbee diagram


        Since the Giant Gecko kit only has Ethernet, we found it was not practical to use Giant Geckos for every node in our project. Instead, it was easier to use one Giant Gecko with an Ethernet connection and have a Mighty Gecko send the command out wirelessly to all nodes in its network. This also allowed us to use some of the Silicon Labs Smart Outlets as nodes in our project.

        Zigbee networks typically have three different types of nodes in them: Coordinator, Router and End Device. In our project, the coordinator was connected to the Giant Gecko via serial to receive the on or off command and would then relay that command out to all of the nodes in the network. The coordinator was configured using AppBuilder in Simplicity Studio. AppBuilder allows you to specify what packages should be included and generates the necessary code. Since the Giant Gecko was connected to the Mighty Gecko over serial, we enabled the command line as a simple way for the Giant Gecko to send commands to the Mighty Gecko.

        We took advantage of the Zigbee Cluster Library in this project to simplify the format of the on/off message being sent to the nodes. Also, the Silicon Labs Smart Outlets use the ZCL on/off library by default, so we did not have to do any configuration on the outlets. Once the command line and the ZCL on/off library were enabled, AppBuilder was able to put our project together and we were able to flash our coordinator.

        The rest of the nodes in our project were configured either as routers or end devices. Similar to the coordinator, we used AppBuilder to generate a project that listens for ZCL on/off messages, but in this case we did not need the command line. We did, however, have to add code to the ZCL on/off hooks to toggle a GPIO, which would in turn toggle the power switcher connected to our party lights.


        Power Switching:

        To be energy friendly, we decided our system should control the power of the disco light. Our MCU board is running off DC power, but the disco light is powered by AC from a standard wall outlet. So we needed to control AC power from a DC system. To be safe, we should isolate the AC power from the DC power and use a high power MOSFET to turn on and off the AC power to the disco light. Fortunately, Silicon Labs makes an evaluation kit that does just this.

        The Si8751-KIT contains an evaluation board that takes care of isolating two power systems and allows for a digital input on the low voltage, DC side to control the MOSFET on the high power, AC side. Set up was as simple as configuring a few jumpers on the board, connecting the low power side to the VDD, GND, and a GPIO of the MCU, and then connecting the high-power side to the AC outlet and the disco light.

        We also had another disco light that operated from 12V DC, and fortunately, the SI8751-KIT also has high voltage DC isolation capabilities. So, we used a second Si8751-KIT to isolate the 12V DC from our low voltage DC system on the wireless MCU.


        Lessons Learned:

        This project required a fine balance between several different protocols all within the same network. This meant there were a lot of moving parts to deal with so sometimes it was difficult to determine where a problem may be occurring. Over the course of the week our debugging skills became a little more fine-tuned but we definitely had some hiccups at first.

        By far our biggest challenge was working with Zigbee. This was mainly because none of us were familiar with the tools or the development kits. The Zigbee tools, as we found out, have a bit of a learning curve and a few tricks to them. We also got unlucky when the first example project we chose to try didn’t work because of a software problem in a newly released SDK. After determining the issue was with the project we moved on to a known working example, Dynamic Multi-Protocol. Once we started working with that project we quickly realized that we were using an example that had a lot of extra overhead we did not need and was confusing us.

        After our failed experiments with some sample projects we decided to start from scratch and build up our own project in the App Builder. After jumping through a few hoops we were able to get a project configured the way we wanted. We found that starting small and building off that was a much better approach than trying to use a complicated example and trim off the excess features. We also found that complicated projects like Zigbee can have a steep learning curve and we underestimated the amount of time it would take to complete the Zigbee portion. Luckily, we were able to complete the Micrium OS portion on the Giant Gecko rather quickly which gave us extra time to focus on Zigbee.


        Next Steps:

        Due to some issues with our Zigbee configuration, our project was not complete at the end of the hackathon week. Our final presentation had the ability to send a message from the Big Red Button to the MQTT broker and down to the EFM32GG11 boards, the ability to send a serial command from the EFM32GG11 to the EFR32MG12, and the ability to switch the isolators from either the EFM32GG11 or EFR32MG12. The one gap in the project was sending the ZCL on/off message correctly to all of the nodes in our network. An obvious next step for this project would be to rectify the issues in our Zigbee network configuration.

        Beyond getting the Zigbee network configured correctly, we had a few other improvements that could be implemented. First, the Big Red Button could be connected to an EFM32GG11 running Micrium OS USB Host to read the button state and send it via MQTT using Micrium OS Network. The second improvement we discussed was actually hooking up a Slack chat bot with a command to trigger the party instead of using the Big Red Button.



        While we were not able to get the project working as intended, it proved to be a very valuable exercise to explore the EFR32MG12 and a fun way to do it.

      • Wireless PC Remote for volume and media control

        BrianL | 02/33/2018 | 04:00 PM

        Recently, our MCU Applications team and our Micrium OS team decided to spend a few days, in teams, on a "Hackathon". This allowed us the opportunity to work on a larger, real-world application, in an effort to gain more insight into our products and uses.

        Our team consisted of Brian Lampkin, Janos Magasrevy, and Yanko Sosa. For our project, we decided to create a PC media controller for wireless volume and media control. This would consist of a wireless USB dongle, connected to a wireless remote controller, to provide media controls such as volume up/down, next track/last track, mute, etc.

        1. Requirements

        1.1 Hardware

        1. A USB ‘Dongle’ to provide wireless connectivity to the controller.

          This required a USB interface MCU to communicate with the host PC and a radio MCU to communicate with the remote. For the USB MCU, we chose an EFM32HG, since it is relatively small, and our application – a simple UART to USB HID command bridge – would require little flash. For our Radio MCU, we chose an EFR32FG12 device, which could cover any proprietary protocol we chose to implement. This would provide our UART to Wireless bridge.
        2. A wireless ‘Remote’ to provide the user interface for the media controller.

          We chose another EFR32FG12 radio MCU, to pair with the other on the USB dongle. Since this was to be a battery powered remote, we needed an MCU that could be run in a low duty-cycle, low power mode. To provide the user interface, buttons and a joystick on an expansion board were used.

          The completed remote and dongle hardware, using an EFM32HG STK with a Wireless Expansion Board and EFR32FG12, along with an EFR32FG12 Wireless STK with a Joystick Expansion Board, are shown below:


        1.2 Software

        An additional requirement was added – the project must integrate Micrium OS in some manner. We chose to implement this on the EFR32FG12 wireless devices to help manage wireless connectivity and low power features.

        2. System Overview

        The system block diagram is as follows:


        Button and joystick input is taken by the remote's Flex Gecko MCU and converted into wireless packets that represent media commands. These are transmitted to the dongle's Flex Gecko, which converts them into UART transmissions to the dongle's Happy Gecko MCU. Finally, these are interpreted as HID media control commands and sent to the host PC over USB.

        The joystick expansion board was mapped to the following media control functions:


        2.1 Wireless Protocol

        Our project has a very simple wireless communication requirement. When a button is pressed on the remote, this button status must be transmitted from the remote to the dongle's receiver. Since there are few functions, a single-byte payload was used to transmit this data. The remote never needs to receive any information from the dongle, so the dongle can be kept in RX mode while the remote transmits a byte whenever the state of its buttons changes. This is an extremely simple communication protocol, so we decided to use the lower-level Radio Abstraction Interface Layer (RAIL) directly rather than a stack such as Zigbee or Connect.

        Since no stack is used, the protocol is effectively proprietary. 2.4 GHz was chosen for the radio’s communication band, as opposed to a Sub GHz band, as this allows for a smaller antenna, useful for a handheld remote.

        2.2 Energy Concerns

        As a battery powered device, low energy consumption is a huge priority for the remote. However, since the dongle is USB powered, there is little reason to limit the power consumption there. Thus, the dongle can be awake and in RX mode continuously with little drawback when connected to the PC's USB. On the remote side, however, consideration was given into keeping the device in lower energy modes whenever possible. Due to the dongle always being in RX mode, we can effectively keep the remote in a low energy state until a button press is made, triggering a new media function update. In our design, this means that the remote only wakes to transmit a packet, then immediately re-enters sleep mode.

        3 The Dongle

        3.1 USB HID Media Device

        The first step in the project was to create a device that could communicate with the PC as a media controller. We decided to implement a HID device, which allows for driverless communication with a PC host for a limited set of known functions. For this project, we implemented what is called a USB HID "Consumer Control" device. The description of the options available in this interface is provided in a table in section 15, "Consumer Page", of the USB HID Usage Tables document. Some of the available commands found in this interface are:

        This interface includes many of the media controls that you would normally use during the use of a media application on a PC (Playing video, music, etc): Play, Pause, Record, etc. In this project, we chose to implement the following commands on our remote:

        1. Play/Pause – ID: 0xCD
        2. Scan Next Track – ID: 0xB5
        3. Scan Previous Track – ID: 0xB6
        4. Mute – ID: 0xE2
        5. Volume Increment – ID: 0xE9
        6. Volume Decrement – ID: 0xEA
        7. Play (Unused) – ID: 0xB0
        8. Stop (Unused) – ID: 0xB7

        We eventually decided not to use the Play and Stop commands, as the Play/Pause command that we found implemented the functionality we desired, and allowed us to reduce the total number of inputs to six, which would map neatly to our expansion board’s two buttons and joystick with four cardinal directions.

        3.1.1 HID Report Descriptor

        To interface with a host using the HID interface, a HID report descriptor, describing the functionality of the device, must be constructed. We used the HID Usage Tables document, which included several examples (Specifically, Appendix A.1 Volume Control contained a useful example on volume +/-), and the HID Descriptor Tool to construct the following HID Descriptor:

        // HID Report Descriptor for Interface 0
        const char hid_reportDesc[39] SL_ATTRIBUTE_ALIGN(4) =
        {
          0x05, 0x0C,       // USAGE_PAGE (Consumer)
          0x09, 0x01,       // USAGE (Consumer Control)
          0xA1, 0x01,       // COLLECTION (Application)
          0x15, 0x00,       //   LOGICAL_MINIMUM (0)
          0x25, 0x01,       //   LOGICAL_MAXIMUM (1)
          0x75, 0x01,       //   REPORT_SIZE (1)
          0x95, 0x08,       //   REPORT_COUNT (8)
          0x09, 0xCD,       //   USAGE (Play/Pause)
          0x09, 0xB5,       //   USAGE (Scan Next Track)
          0x09, 0xB6,       //   USAGE (Scan Previous Track)
          0x09, 0xE2,       //   USAGE (Mute)
          0x09, 0xE9,       //   USAGE (Volume Increment)
          0x09, 0xEA,       //   USAGE (Volume Decrement)
          0x09, 0xB0,       //   USAGE (Play)
          0x09, 0xB7,       //   USAGE (Stop)
          0x81, 0x02,       //   INPUT (Data,Var,Abs)
          0x75, 0x08,       //   REPORT_SIZE (8)
          0x95, 0x01,       //   REPORT_COUNT (1)
          0x81, 0x03,       //   INPUT (Cnst,Var,Abs)
          0xC0              // END_COLLECTION
        };

        This constructs a HID report with two bytes of data. The first byte implements eight 1-bit options, one for each of the HID commands. The second is a placeholder byte, unused by our application (but available for additional functions in the future).

        As a basis for our EFM32HG USB project, we used the usbhidkbd example, which implements a USB HID Keyboard. The conversion for this was rather simple, as the USB side only required a quick swap from the HID Keyboard descriptor to the new HID Consumer Device descriptor, above. With this change, the EFM32HG device now enumerated on the host PC as a media controller.

        3.1.2 USB to Radio Interface

        The next step in the process was to develop an interface that could communicate between the dongle’s radio MCU and the USB MCU to tell the PC when a media button had been pressed. For this, we implemented a simple UART interface.

        The media control functions are represented by single bits in the HID report's first byte's bitfield, as described below:

        typedef enum {
          PLAY = 0x01,
          SCAN_NEXT = 0x04,
          SCAN_LAST = 0x02,
          MUTE = 0x08,
          VOL_UP = 0x10,
          VOL_DOWN = 0x20
        } report_t;

        On the dongle side, the radio MCU merely sends one byte of data over UART with the appropriate bit set for the desired media function. This is then transmitted over USB by sending a HID report. When the EFM32HG MCU receives a byte over UART, the report is updated:

        void USART0_RX_IRQHandler(void)
        {
          /* Reading RXDATA also clears the RX interrupt flag */
          report = USART0->RXDATA;
        }


        Then, in the main loop, if the report has changed since the last one that was sent to the host, it is sent over USB:

            if (report != lastReport) {
              /* Pass keyboard report on to the HID keyboard driver. */
              lastReport = report;
              /* ...the updated report is handed to the host here... */
            }

        Note the function names and comments left over from the usbhidkbd example - a result of the limited modifications we had to make to this example to implement the media controller.

        3.2 Dongle Radio Receiver

        The Dongle’s radio receiver was built with an EFR32FG12 Wireless MCU, using RAIL as the radio interface layer. The firmware is extremely simple: a single byte packet is received from the remote device, and this packet is transmitted to the EFM32HG USB device over UART.

        3.2.1 Radio Configuration

        The EFR32FG12’s radio was configured using AppBuilder. We used the default settings for a 2.4 GHz, 1 Mbps PHY, modifying it for single byte packets. No other changes were made to this default profile’s settings.

        3.2.2 Radio to UART Implementation

        Once configured, the radio initialization is simple – the device’s radio is initialized and put into RX mode, while an RX callback is registered to handle the reception of the packet and its transmission over UART. The device then waits forever in a while loop to receive packets. The initialization routines are simply:

          // Put the radio into receive mode on the configured channel
          RAIL_Idle(railHandle, RAIL_IDLE, true);
          RAIL_StartRx(railHandle, channel, NULL);

          // Wait forever; packets are handled in the RX callback
          while (1) {
          }


        In the RX callback, the packet is received, the radio is put back into RX mode, and the packet is transmitted over UART:

        void RAILCb_Generic(RAIL_Handle_t railHandle, RAIL_Events_t events)
        {
          report_t packet;
          if (events & RAIL_EVENT_RX_PACKET_RECEIVED) {
            RAIL_RxPacketInfo_t packetInfo;
            // Fetch the newest received packet's info
            RAIL_GetRxPacketInfo(railHandle, RAIL_RX_PACKET_HANDLE_NEWEST, &packetInfo);
            // Receive the packet's one-byte payload
            packet = *(packetInfo.firstPortionData);
            RAIL_Idle(railHandle, RAIL_IDLE, true);
            RAIL_StartRx(railHandle, channel, NULL);
            // TX packet over UART to the EFM32HG
            USART_Tx(USART0, (uint8_t) packet);
          }
        }


        3.3 The Remote

        The remote has two main components – the user interface and the radio, used for transmitting user inputs.

        3.3.1 User Interface

        The user interface of the remote uses a joystick expansion board, which provides two buttons and an analog joystick for inputs. This expansion board is described in section 8 of its documentation.

        The analog joystick has a single output pin whose voltage changes depending on the direction in which the joystick is pressed. To interface this joystick with the EFR32FG12, the device’s ADC samples the voltage on the joystick’s output every 25 ms, triggered by the RTCC. This voltage is then converted into a direction in the ADC’s interrupt handler.

        #define ADC_MAX_CODES (0x0FFF)
        #define JOY_NONE_THRESH  (0.93 * ADC_MAX_CODES)
        #define JOY_UP_THRESH    (0.81 * ADC_MAX_CODES)
        #define JOY_RIGHT_THRESH (0.68 * ADC_MAX_CODES)
        #define JOY_LEFT_THRESH  (0.55 * ADC_MAX_CODES)
        #define JOY_DOWN_THRESH  (0)
        void ADC0_IRQHandler(void)
        {
          uint16_t sample;
          ADC_IntClear(ADC0, ADC_IF_SINGLE);
          sample = ADC0->SINGLEDATA;
          if (sample > JOY_NONE_THRESH) {
            joyState = JOY_NONE;
          } else if (sample > JOY_UP_THRESH) {
            joyState = JOY_UP;
          } else if (sample > JOY_RIGHT_THRESH) {
            joyState = JOY_RIGHT;
          } else if (sample > JOY_LEFT_THRESH) {
            joyState = JOY_LEFT;
          } else {
            joyState = JOY_DOWN;
          }
        }


        For the pushbuttons, GPIO interrupts were enabled for each pushbutton pin, which update the status of the buttons.

        void BTN_Handler(void)
        {
          bool BTN2, BTN3;
          BTN2 = GPIO_PinInGet(BTN2_PORT, BTN2_PIN);
          BTN3 = GPIO_PinInGet(BTN3_PORT, BTN3_PIN);
          if (BTN2 == BUTTON_PRESSED) {
            BTN2State = BTN2_PRESSED;
          } else {
            BTN2State = BTN2_RELEASED;
          }
          if (BTN3 == BUTTON_PRESSED) {
            BTN3State = BTN3_PRESSED;
          } else {
            BTN3State = BTN3_RELEASED;
          }
        }


        3.3.2 Radio and Packet Transmission

        The radio on the EFR32FG12 device was configured exactly the same as on the dongle. In fact, the exact same AppBuilder project was used as a basis for both devices. Instead of remaining in RX mode, however, the remote is powered down between ADC measurements and button state changes. If the state of the inputs has changed (i.e. a button has been pressed or released since last sleeping), the Report Handler constructs a new report packet and transmits it. When the packet has been transmitted, the device is permitted to transition back to sleep mode.

        To construct the report packet, the states of each button and the joystick are simply ORed together, since these states are mapped to the respective bit of their function in the HID report bitfield:

        void Report_Handler(void)
        {
          report_t report_current;
          static report_t report_previous = 0;
          while (1) {
            report_current = BTN3State | BTN2State | joyState;
            if (report_current != report_previous) {
              report_previous = report_current;
              /* ...transmit the new report packet over the radio... */
            } else {
              /* ...no change: allow the device to re-enter sleep... */
            }
          }
        }


        3.4 Integration of Micrium OS

        As an additional challenge, we were required to integrate Micrium OS into our project. For this, we decided to integrate this only on our EFR32FG12 devices, since the EFM32HG USB device was limited in flash, and it would not benefit from the addition of an operating system due to the simplicity of the firmware running on the device.

        Adding Micrium OS to the Flex Gecko EFR32FG12

        One of the challenges we faced early in the project was that Micrium OS did not natively support the EFR32FG12, which meant we had to develop a Micrium OS board support package (BSP) for it.

        1. Micrium OS Board Support Package (BSP)

        1. Compiler-specific Startup (Micrium_OS/bsp/siliconlabs/efr32fg12/source/startup/iar/startup_efr32fg12p.s)

          We first created the standard Micrium OS BSP folder structure within the Micrium_OS/bsp/siliconlabs folder, using the EFM32GG11 as our reference BSP due to its similarities in the startup code. We then started modifying the compiler-specific startup file for the EFR32FG12. This step was fairly straightforward given that most ARM Cortex-M devices share the same initialization code, with the obvious difference being the number of interrupt vector sources among the various devices.

          The Micrium OS kernel port relies on two ARM Cortex-M core interrupt sources: PendSV and SysTick. In our compiler-specific startup code, we had to include these two sources found in the Micrium OS kernel port with the use of the EXTERN assembly directive:
        EXTERN  OS_CPU_PendSVHandler

        EXTERN  OS_CPU_SysTickHandler

        Then we allocated memory for the two handlers with:

        DCD    OS_CPU_PendSVHandler

        DCD    OS_CPU_SysTickHandler

        We now have a Micrium OS compatible compiler-specific startup file.

        2. Device-specific Startup (Micrium_OS/bsp/siliconlabs/efr32fg12/source/startup/system_efr32fg12p.c)

          A device-specific startup file was required for the clock initialization. For this, we looked inside the Gecko SDK and found the corresponding startup for the EFR32FG12P (system_efr32fg12p.c). This file was added as-is into the Micrium OS BSP.
        3. Micrium OS Tick BSP (Micrium_OS/bsp/siliconlabs/efr32fg12/source/bsp_os.c)

          The Micrium OS Tick BSP file essentially handles the kernel tick initialization in either periodic mode or in dynamic mode depending on the power consumption requirements of the project. We left this file the same as the one found in the EFM32GG11 and ran in periodic mode. As one of the potential improvements later on, we could switch to dynamic tick in order to improve the power consumption of our device.
        4. Micrium OS CPU BSP (Micrium_OS/bsp/siliconlabs/efr32fg12/source/bsp_cpu.c)

          The Micrium OS CPU BSP file deals with the setup of timestamp timers that are required by the OS for statistical purposes and other features. This was once again left the same as in the EFM32GG11.
        5. Micrium OS Interrupt Sources definitions (Micrium_OS/bsp/siliconlabs/efr32fg12/include/bsp_int.h)

          In this file, the various interrupt source definitions are specified. Although not necessary for our project, this file is included in bsp_os.c to assign BSP_INT_ID_RTCC as a kernel aware interrupt source when dynamic tick is enabled.
        6. Micrium OS generic BSP API (Micrium_OS/bsp/include/bsp.h)

          This is the final piece of the Micrium OS BSP puzzle. In this file, the prototypes for BSP_SystemInit(), BSP_TickInit(), and BSP_PeriphInit() are defined. Some of these functions will later be used in our program main().


        2. Micrium OS main.c


        1. main()

          In the standard Micrium OS main(), the CPU is initialized with CPU_Init(), followed then by the board initialization via BSP_initDevice() and BSP_initBoard(), both from the Gecko SDK. After the CPU and the board clocks are initialized, the OS follows with OSInit() which initializes the kernel. Once the OS is initialized, our startup task is then created by calling OSTaskCreate() (see section 2b.). Finally, after the startup task has started its execution, the kernel starts by calling OSStart().
        2. StartupTask

          In the Startup Task, the kernel tick is initialized using BSP_TickInit() from the Micrium OS BSP. Other services such as the UART are also initialized here. In our case, USART2 is used. It is important to mention that the Startup Task has a 500-millisecond delay inside an infinite loop in order for it to yield CPU time to other tasks when running in a multithreaded environment.

          Since our project utilizes proprietary wireless, the RAIL library is included and therefore initialized in the Startup Task at 2.4 GHz.

          In order to demonstrate different kernel services, a RAIL receive (Rx) semaphore object is created in this task.
        3. RAIL Rx Task

          Our model consists of two tasks: Startup Task and the RAIL Rx Task.

          In the RAIL Rx Task, the program pends on the RAIL Rx semaphore created in the Startup Task. Once data from the wireless remote is received by our device, an interrupt fires and a callback function dissects the packet and posts the first byte of data to the RAIL Rx semaphore. The RAIL Rx task then transmits the data received via USART2 to the Happy Gecko. The callback function briefly puts the radio in an idle state before waking the receiver once again to obtain the next radio packet.


        5. Next Steps

        With the project complete and functional using STKs and pre-made expansion boards, we want to pursue creating custom PCBs for both the remote and dongle. This would require a fair amount of work, laying out two MCUs plus a USB connector on the dongle board, and another MCU in a reasonable hand-held remote form factor for the wireless remote. Additional challenges may arise in laying out the wireless-specific portions of the board, especially with regard to antenna design and placement. We hope to accomplish this early this year, and have remotes and dongles constructed for each team member to use. Overall, this has been an interesting and challenging project, and it would be great to see it to completion with a physical, practical media remote designed and built.

        6. Attached Projects

        All firmware projects can be found here:
        This includes firmware to run on the dongle's EFM32HG USB MCU and EFR32FG12 Wireless MCU, and the remote's EFR32FG12 Wireless MCU. These are:

        1. Dongle_EFM32HG - firmware for the dongle's EFM32HG to perform UART to USB HID Media Control

        2. Dongle_EFR32FG12_Micrium - firmware for the dongle's EFR32FG12 wireless receiver, with Micrium OS integration

        3. Dongle_EFR32FG12_simple - firmware for the dongle's EFR32FG12 wireless receiver, before Micrium OS integration (simple while loop)

        4. Remote_EFR32FG12 - firmware for the remote's EFR32FG12 wireless transmitter, which captures and transmits user input

      • Building a Digital Tuner from Scratch

        JohnB | 02/32/2018 | 07:54 PM
        by Silicon Labs MCU and Micrium OS applications team members John Bodnar, Sharbel Bousemaan, Mitch Crooks, and Fernando Flores

        What is tuning and why does it matter?

        If you play a musical instrument, especially if you play a wind instrument, you’re going to want to tune once you’re sufficiently warmed up. For the not so musically-inclined, tuning is the process of making an adjustment to your instrument so that the notes you play, in particular notes which correspond naturally to the construction of the instrument, are produced accurately. Electronically speaking, you could say the notes are reproduced with the correct frequency.

        Figure 1. Clarinet

        For a simple example, consider the clarinet shown above. As with any tubular musical instrument, its fundamental pitch (the note it most naturally plays) is proportional to its length. In particular, the clarinet above is a B-flat clarinet, so by slightly lengthening or shortening it, the fundamental note it produces can be made to match a concert B-flat.

        Without going into too much detail, modern instruments tune to notes relative to A = 440 Hz above middle C (think the middle key on a piano). In the case of a B-flat clarinet, its fundamental pitch has a frequency of 466.164 Hz. Thus, a clarinet is “in tune” when a player adjusts his/her embouchure (the relative tension of the facial muscles and positioning of the lips and teeth) to play a B-flat and the sound that comes out of the instrument has a frequency of 466.164 Hz.

        If the sound that comes out of a clarinet when attempting to play a B-flat has a frequency that is lower than expected, the instrument is said to be flat. Similarly, if the frequency of the sound is too high, the instrument is said to be sharp.

        On a clarinet, the mouthpiece, which is the plastic and metal assembly against which the player blows, can be pushed in or pulled out slightly to adjust its tuning. So, if the player’s B-flat is sharp, the mouthpiece can be pulled out a little to lower the frequency and bring it in tune. Likewise, if the B-flat is flat (too low), the mouthpiece is pushed in slightly to raise the instrument’s pitch. Tuning an instrument to its proper concert pitch (B-flat in the case of our clarinet example) is a necessary first step to getting the other notes it can produce to also be in tune when they are played.

        What is a tuner?

        Experienced musicians and people with perfect pitch can tune by ear simply by listening to the note produced and adjusting the instrument’s tuning mechanism accordingly. The rest of us generally rely upon a device called a tuner that compares the frequency of the note we play to its mathematically calculated frequency. A modern digital tuner can be a standalone electronic device or even an application for a smartphone.  An example is shown in Figure 2.


        Figure 2. OEM Digital Tuner

        These devices, which can be had for as little as $15, are generally powered by inexpensive, 8-bit microcontrollers. Knowing this, we can probably assume that such a tuner does not make use of digital signal processing (e.g. finding the fundamental frequency by means of an FFT) to compare the frequency of the note played to what it ideally should be. Instead, we figured such a device would probably operate in the time domain, directly comparing the frequency of the note played to its ideal value.

        Every instrument produces a unique sound that is colored by timbral impurities. These impurities are introduced by the shape of the instrument, the materials from which it’s constructed, and by the uniqueness of the musician’s embouchure (for wind players) or touch (for string and percussion players). The net result of these variations is that the waveform of the sound produced on a given instrument as played by any one musician is not spectrally pure but instead consists of a fundamental frequency superimposed with waveforms at various overtones (integer multiples of the fundamental frequency).

        Knowing this, we felt that a tuning method that operates purely in the time domain with no consideration of spectral content would be most suitable for a low-cost processor. Considering that dedicated digital tuners run off one or two AA or AAA batteries, such an approach would also have the benefit of being particularly energy efficient.

        Project Summary

        Our goal was to construct a digital instrument tuner that could:

        1. distinguish the fundamental frequency of the note being played,
        2. determine the nearest equal temperament musical note (based on A = 440 Hz modern tuning),
        3. visually display whether the note is sharp or flat relative to the target note/frequency, and
        4. function without resorting to computationally intensive DSP concepts in order to minimize energy use.

        We implemented what might be considered a very simple analog-to-digital converter that takes the output from an analog microphone and turns it into a pulse train with a frequency equal to that of the note’s fundamental. The pulse train is then easily captured by a microcontroller, which can then perform all of the aforementioned tasks.

        Detailed Description

        For hardware, we opted to use the EFM32 Series 1 Giant Gecko Starter Kit. While the Series 1 Giant Gecko microcontroller might be a bit overkill for the project at hand, the starter kit has a nice dot matrix memory LCD to use for output. Optimization for a smaller EFM32 microcontroller could follow later once the whole concept and application code had been proven.

        To capture the sound from the instrument and output the pulse train, we used an analog MEMS microphone with an amplifier circuit connected to a 74VHC14 Schmitt-triggered inverter. Because we would need to both measure the frequency of the note being played and keep the tuner display updated, a multi-threaded software foundation was a no-brainer.

        We used Micrium’s µC/OS real-time kernel to provide this environment, along with the kernel services needed to protect shared resources and synchronize the tasks. This RTOS foundation allowed us to simplify the design and implementation of our application code, which, at its most basic level, required just two tasks: one for sampling the pulse train and the other for displaying the tuner’s output.

        For the display task to know what pitch is being detected, the measurement task needs some way to communicate results. To do this, we opted for a simple shared variable which the measurement task updates after each sampling period.

        In multi-threaded applications, shared data must be protected by a kernel mechanism, such as a semaphore. Pending on a semaphore usually means that a task might block (be stuck waiting for new data) until it becomes available. This behavior is undesirable for our measurement task because it must sample the pulse train periodically.

        µC/OS allows a non-blocking pend on semaphore, which provides resource safety without the risk of having a task block indefinitely. The drawback of this method is that some measurements might never be communicated to the display task. In practice, this is not an issue for the tuner because we are more concerned with a fast response to changes in pitch.

        Display Task

        The design of our display task (Figure 3) follows the general outline of the flowchart below. However, we have included a signaling semaphore that the measurement task uses to notify the display task when the frequency variable has been updated. The display task blocks on this semaphore to avoid updating the display multiple times with the same data. This helps to reduce the overall energy use of the application.

        Figure 3. Display Task Flowchart

        Once it is signaled, the display task tries to access the shared frequency variable. It does so by trying to grab the tuner semaphore which is used to protect the shared data. Eventually, the semaphore will become available, allowing us to read the frequency value. The frequency is converted into a pitch using a simple lookup table. The remainder of the task deals with how the user interface will look when the pitch is displayed. We decided on a minimal design which provides a visual representation of how in or out of tune the played note is, as shown in Figure 4.

        Figure 4. Tuner Display Output

        Measurement Task

        The measurement task (Figure 5) implements an algorithm for reading the pulse train and calculating its average frequency. Pulses are captured using the WTIMER0 peripheral, while the LDMA reads a timestamp from WTIMER0 for each pulse detected. The timestamps are copied into a memory buffer over a period of 125 ms, as measured by the CRYOTIMER peripheral. Once the 125 ms has elapsed, the CRYOTIMER interrupt notifies the measurement task that the sample is ready.

        Figure 5. Measurement Task Flowchart

        The task averages the periods between pulses to calculate the frequency of the pulse train. This value is reported to the display task using the mechanisms described above. The LDMA and timer peripherals are then reinitialized for the next sample, and the task waits for the next CRYOTIMER interrupt.

        Microphone and Pulse Generation Circuit

        A primary goal of this project was to devise a means of detecting the frequency of a note played by a musical instrument without the use of complex and computationally expensive signal processing algorithms.  Use of such techniques, for example, FFTs, complicates software development, requires a substantial number of processing cycles, and increases energy use.

        We needed a computationally simpler and less energy-intensive solution that would still permit reliable detection of the frequency of the note being played.  A combination of hardware signal processing and software capture and analysis allowed us to do this with a substantially smaller computational footprint and, thus, less energy, than an FFT-based or similar approach.

        The hardware front end of the frequency measurement portion of the tuner consists of an ADMP401 analog MEMS microphone with preamplifier circuitry followed by a 74VHC14 inverting Schmitt trigger.  Figure 6 shows the signal flow through each stage of the hardware.

        Figure 6. Audio Signal Flowchart through Hardware Front-end Stages

        Although not currently implemented due to time constraints, a digitally-tunable low pass filter could be placed between the microphone and the Schmitt trigger in a future revision. Ideally, this would filter out harmonics (overtones) above the fundamental frequency of the note being played in order to improve the quality of the input to the Schmitt trigger.  Figure 7 shows the hardware front end prototype.

        Figure 7. Hardware Front-end Prototype

        The MEMS microphone captures the note played and passes an analog signal to the input of the Schmitt trigger. Depending on the instrument being played, this analog waveform will have a different envelope or shape, but it will still be periodic in nature and have the fundamental frequency of that note.  As such, it will have periodic vertical crossings of the high and low Schmitt trigger threshold voltages if the input signal is properly scaled. In Figure 8, CH1 shows the input analog waveform at a frequency of about 866 Hz.

        Figure 8. Oscilloscope Capture showing analog audio signal (microphone output) on CH1 and Schmitt-triggered output on CH2

        A Schmitt trigger is essentially a comparator circuit with hysteresis, and, in this application, it functions as a 1-bit analog-to-digital converter. Thus, as the input signal rises above the Schmitt trigger input high threshold voltage (VIH), the inverting Schmitt trigger output transitions from logic high to logic low. Likewise, when the analog signal falls below the input low threshold voltage (VIL), the inverting Schmitt trigger output transitions from logic low to logic high.

        Note that the Schmitt trigger implements hysteresis where VIH > VIL resulting in a more stable digital output in the presence of a noisy or non-monotonic input signal. The resulting output is a pulse train with the same frequency as the input analog waveform, in this case about 880 Hz.  This digital signal is then routed to one of the MCU’s timer input pins where its edges are captured and used to quickly calculate the frequency of the note being played.

        Results and Lessons Learned

        Surprisingly, the application ran almost exactly as expected when first tested with simulated instrument sounds. However, we did encounter a problem at higher frequencies, where the resulting pulse trains did not correspond to the expected frequencies. The problem was caused by a failure to reset the timer before each new sampling period. This meant that any pulses occurring after one measurement period ended up being counted in, and skewing, the frequency calculated in the next run of the measurement task.

        The system responded well to clean inputs from frequency generators, sine waves recorded through the microphone, and some simulated instrument sounds. However, accuracy began to degrade when more complex tones were played, such as those from a brass instrument.

        As noted above, one aspect of the project originally conceived but not yet implemented is a low-pass filter that can be tuned to strip out harmonics coloring the sound from instruments as played by real people. Time constraints prevented this feature from being integrated into the demonstrated project. Naturally, more effort can still be spent to optimize energy use and get the entire system to provide substantial operating life from one or two alkaline batteries.

      • Electronic Label Wireless Transmission System

        yucheng | 01/30/2018 | 06:42 AM

        1. Project Summary:

        QR codes are everywhere now: they appear on business cards, product packaging, electronic labels, mobile payments, and so on.

        The goal of this project is to develop an electronic label wireless transmission system based on QR codes. A GUI application built with Python and PySide encodes text strings into QR code images and sends them to a cloud server, such as Amazon. A Python script running on the cloud server monitors a network port; once it receives a QR code image, it distributes the image to the EFM32GG11 + WGM110. The EFM32GG11 runs Micrium OS and controls the WGM110 via the EXP interface; when it receives a QR code image, it displays both the image and the decoded QR code text string.


        2. Project Description:

        The figure below shows the target overview of the project.

        The GUI tool on the PC generates a QR code image and sends it to the cloud server; the remote server receives and distributes the image; and the EFM32GG11, running Micrium OS, decodes the QR code and shows its content on the LCD. The EFM32GG11 can also encode an information string into a QR code and transfer the image to the cloud server, from which the GUI fetches and decodes it.

        2.1 QR Code Introduction:

        QR code (Quick Response Code) is the trademark for a type of matrix barcode (or two-dimensional barcode) first designed for the automotive industry in Japan. It uses four standardized encoding modes (numeric, alphanumeric, byte/binary, and kanji) to efficiently store data; extensions may also be used.
        The symbol versions of QR Code range from Version 1 to Version 40. Each version has a different module configuration or number of modules. (The module refers to the black and white dots that make up QR Code.)
        "Module configuration" refers to the number of modules contained in a symbol, commencing with Version 1 (21 × 21 modules) up to Version 40 (177 × 177 modules). Each higher version number comprises 4 additional modules per side. 

The resolution of the built-in LCD-TFT on the EFM32GG11 STK board is 128 × 128. Because each LCD pixel is very small, the project renders each QR code module as a 4 × 4 block of pixels to improve recognition reliability, and adopts Version 3 as the QR code version. The amount of data a QR code symbol can store depends on its version and error-correction level, as shown below; for example, a Version 3 (29 × 29) symbol can store up to 77 alphanumeric characters with level-L ECC.
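The 4-pixel-per-module scale follows directly from the display and symbol geometry; a quick check in Python (the centering margin computed here is an assumption for illustration, not taken from the firmware):

```python
LCD_SIDE = 128          # built-in LCD-TFT resolution on the EFM32GG11 STK
QR_MODULES = 29         # Version 3 symbol: 29 x 29 modules

# Largest integer scale that still fits the symbol on the display.
scale = LCD_SIDE // QR_MODULES         # 4 pixels per module
image_side = QR_MODULES * scale        # 116 pixels
margin = (LCD_SIDE - image_side) // 2  # 6 pixels on each side, if centered
```

A Version 4 symbol (33 × 33) would only allow a scale of 3, which is one reason Version 3 is a good fit for this display.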


        2.2 QR Code Generator/Decoder GUI

The GUI tool below was developed with Python and PySide; the PyQRCode module is used as the QR code encoder and the qrtools module as the decoder.

The tool encodes a URL or text string into a QR code image and connects to the remote cloud server to upload it; it can also decode a QR code image fetched from the remote server.

Below is a screenshot of the GUI tool. After entering text in the edit box and encoding it into a QR code image, pressing the "Upload" button sends the encoded image to the remote server.

Likewise, after receiving a QR code image from the remote server, pressing the "Decode" button decodes the image and shows the decoded string.


        2.3 QR Code Decoding on the MCU

We use the EFM32GG11 STK and the WGM110 Wi-Fi Expansion Kit to receive, decode, and display the QR code image along with the decoded text string.

Several third-party libraries for QR code decoding are available on Git, such as quirc (licensed under ISC) and qsantos (licensed under GPL v3.0). The library used in this project is qsantos.

Please refer to the figure below for a detailed flow chart of QR code reception and decoding on the MCU.


        3. Demonstration

The figures below demonstrate the project. The on-board LED blinks while the QR code image is being received; once reception completes, the image is displayed immediately, and pressing Button 0 or Button 1 switches the display mode.

        4. What We Did in the Project

The block diagram illustrated in the "Project Description" section is the final target of the project; as a first step, however, we implemented the project following the diagram below.
        The GUI tool is able to generate a QR code from a string, decode a QR code PNG file, and show the content on the PC. The EFM32GG11 STK and Wi-Fi module act as an access point to which the PC GUI tool can connect. Once the connection between the PC GUI tool and the EFM32GG11/Wi-Fi module is established, the GUI tool sends the encoded QR code image to the EFM32GG11, which decodes it and shows the text string on the LCD screen.


        5. Lessons learned

        • QR code encoding/decoding in Python.
        • GUI Design with Python and Pyside.
        • Python Network communication.
        • Wi-Fi module transmission.
        • Micrium OS + Network.


        6. Next Steps

The next step for this project is to bring the remote server into the loop for QR code image distribution, which will make the electronic label wireless transmission system more flexible.


        7. Source code

The source code of the GUI tool and the firmware is attached for reference.

      • How to create your own BLE Mobile App in 10 minutes

        Juan Benavides | 01/29/2018 | 10:43 PM

        Hackathon Project by Juan Benavides




        Have you ever wanted to get into mobile apps development but didn't have the time to learn other programming languages?

        In this blog I will show you how to create your own mobile app that connects to your Bluetooth Low Energy device.

As part of a Hackathon project, I was interested in creating my own mobile app to connect to a Zigbee-based light that doubles as a BLE device. In other words, it is a light that supports two protocols, Zigbee and BLE, over the same radio.

I wanted to support both Android and iOS, but I only had a background in embedded systems and a couple of days to get it done. So, after a quick Google search I was lucky to find a solution that only takes 10 minutes to develop (including the time spent installing the tools).

        The solution is called Evothings Studio and it offers a way to develop your mobile app using familiar web technologies such as HTML, CSS and JavaScript.

One of the most amazing things is that the mobile app runs live on your mobile device while you develop, so you can instantly see code changes in your app.

        The process of developing the mobile app can be summarized as follows:

        1. Get a Bluetooth Device running.
        2. Install the Evothings tools on your PC and your phone.
        3. Select an example from the Evothings list of examples.
        4. Modify the example by using the UUIDs from your BLE Device. 
        5. Connect your phone to your BLE Device to control it.



        Step-by-step guide to reproduce the project


        Getting a Bluetooth Device running

If you already have a BLE device running, then the only thing you need to do in this step is make note of the following unique identifiers (UUIDs):

        • BLE Unique Service Identifier: this is the UUID of the BLE Device Service you want to control (e.g. bae55b96-7d19-458d-970c-50613d801bc9).
        • BLE Characteristic Unique Identifier: this is the UUID of the BLE Device Characteristic you want to control (e.g. 76e137ac-b15f-49d7-9c4c-e278e6492ad9).
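As a small aside (not part of the Evothings workflow itself), it can be worth sanity-checking UUID strings before pasting them into the example; Python's standard uuid module parses and normalizes them, so typos and case differences are caught early:

```python
import uuid

# The example UUIDs from the list above.
SERVICE_UUID = uuid.UUID("bae55b96-7d19-458d-970c-50613d801bc9")
CHARACTERISTIC_UUID = uuid.UUID("76e137ac-b15f-49d7-9c4c-e278e6492ad9")

# uuid.UUID normalizes case, so comparisons are reliable regardless of
# how a scanner or a GATT table happens to report the identifier.
assert SERVICE_UUID == uuid.UUID("BAE55B96-7D19-458D-970C-50613D801BC9")
```

A malformed string (wrong length, stray characters) raises ValueError instead of silently producing a UUID that will never match your device.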

        If you don't have your BLE Device ready to go, then you can use the same one that I used from the Dynamic Multiprotocol (DMP) Light/Switch Demo. Click here and follow steps 1-5 to get it running.

        Once you get it running, open the Light Project in Simplicity Studio, open the file DynamicMultiprotocolDemoLightSoc.isc, select the Bluetooth GATT tab and make note of the BLE Unique Service Identifier and the BLE Characteristic Unique Identifier as illustrated in the image below:

        Figure 1
        Finding your BLE Device's UUIDs in Simplicity Studio



        Installing the Evothings Tools

In this step you need to install both the Evothings Studio application on your PC and the Evothings mobile app on your phone. Click here to install the Evothings tools and verify that everything is operational by running their Hello World example.



        Select a Mobile App BLE example as a starting point

Once you have verified that you can run the Hello World example, locate the example Arduino101 LED On/Off BLE and click the Copy button as shown in Figure 2:

        Figure 2
        Selecting one of the BLE Examples as a starting point



        A window will be displayed for you to edit the project's name. You can call it dmp-light as shown in the following image:

        Figure 3
        Customizing the example with your own project name



        Modifying the Mobile App BLE example with your own UUIDs

The next step is to edit the code. Select the MyApps tab in Evothings Studio to display your newly created example, then click the Edit button; your text editor will open the folder where the example is located.

        First edit the file evothings.json and replace the property uuid with your own BLE Device Service UUID as shown in Figure 4:

        Figure 4
        Customizing the example with your own BLE device's service UUID



Similarly, open the file index.html and replace the UUIDs in the functions that turn the LED on and off with your own BLE Device Characteristic UUID, as shown in the image below:

        Figure 5
        Customizing the example with your own BLE device's characteristic UUID



Finally, find out the advertised name of your BLE device by scanning all the BLE devices within range from your phone, as shown in Figure 6:

        Figure 6
        Scanning BLE devices within range from your phone



Open the file index.html, locate the function app.connect, and replace the advertised device name in the code with your own BLE device's name, as illustrated in Figure 7:

        Figure 7
        Customizing the example with your own BLE device name



        Connecting your new mobile app to your BLE device

To connect to your BLE device, follow the same steps you used when running the Hello World example: start Evothings Workbench, paste your Cloud Token under the Connect tab, and get a connect key using the GET KEY button. Launch Evothings Viewer on your phone and enter the connect key to hook up with your Workbench. On your computer, click the RUN button on your new DMP Light example in the Workbench.

        Now you can turn the Light On/Off from your phone as shown in the image below:

        Figure 8
        Your new mobile app





The Evothings Studio platform is a fast and easy way to develop mobile applications for Android and iOS from a single code base, using the three most basic web technologies (HTML, JavaScript, and CSS).

        For more information on how to deploy your mobile app to an app store check the following page:

        For more information on how to get started developing your own Bluetooth Low Energy devices visit our website at: