• Zigbee to Modbus TCP/IP Gateway

        gettogupta | 07/195/2018 | 11:43 AM

        Fact #1: Industrial IoT is one of the biggest slices of the global IoT pie.

        Fact #2: Zigbee is a popular wireless communication standard in industrial settings.

        Fact #3: Modbus is an indispensable part of industrial automation.

        Our technical team recognized this opportunity and came up with this Zigbee to Modbus TCP/IP gateway, which can act as an interface between an industrial Zigbee network and a Modbus TCP/IP network, or even a general TCP/IP network and hence the internet at large.

        Naturally, when it came to selecting a Zigbee module, we decided to go with the Telegesis ETRX357-LRS module for a host of reasons. For the TCP/IP end we used the Xpico module by Lantronix, because the Xpico comes in both a general TCP/IP stack version and an industrial Modbus TCP/IP stack version.

        To make the product more versatile, we decided to add an ARM Cortex-M4 MCU for custom firmware and device management. To make the product truly easy to use and deploy in the field, we added a PoE+ (IEEE 802.3at) power option.

        Look forward to comments and suggestions from experts here on potential use cases and opportunities with the product. 

      • IoT Party Button

        MarkM | 02/33/2018 | 12:58 PM

        The following is a project write-up from a recent hackathon that took place with the Silicon Labs MCU and Micrium Application Engineering teams. The members of this team were Mark Mulrooney, Michael Dean, Alan Sy and Joe Stine.


        Project Summary:

        The goal of this project was to create an IoT-enabled Party Button that would allow a user to press a button and trigger a number of party lights to all turn on at the same time. This was accomplished using a combination of Silicon Labs EFM32GG11 Starter Kits, Silicon Labs EFR32MG12 Starter Kits, Silicon Labs Smart Outlets, Silicon Labs Si8751-KIT isolators, a Dream Cheeky Big Red Button and a lot of party lights/disco balls. Using this hardware, a signal was sent from the Big Red Button to an MQTT broker, which then propagated out to the Giant Gecko kits that were listening for a signal. Some of the GG11s had an isolator connected directly to the board and would toggle their specific party light; other GG11s were connected over serial to a Mighty Gecko kit. The Mighty Gecko would send a ZCL on/off message over Zigbee to other Mighty Geckos or the Silicon Labs Smart Outlet to control the other party lights.


        Project Background:

        Since our team typically works on the EFM32 platform or with software other than Micrium OS, our main goals of this project were to become familiar with the EFR32 chips/tools and to use Micrium OS to add internet connectivity to a LAN IoT ecosystem such as a Zigbee network. As we found out, the project did prove to be a good exercise in both the EFR32 and Micrium OS.

        The project can be divided into three main sections: MQTT, Zigbee and isolation. Because the project was somewhat complicated and involved a lot of moving parts, this division allowed our team members to work on different parts of the project without holding up the rest of the team. The following sections describe the different parts of the project and how they operated.



        MQTT Diagram

        The IoT Party Button project used MQTT as the communication protocol between the Big Red Button and the GG11 nodes. MQTT is a lightweight publish-subscribe IoT protocol that sits on-top of TCP. For our project we used the Mosquitto broker as our MQTT broker for all of the clients to connect to. The Mosquitto broker was hosted on an AWS EC2 instance and implemented a simple username/password for some basic security. In a real-world application you would ideally use TLS in conjunction with MQTT to encrypt your connection to an MQTT broker.

        The project used MQTT for control of the trigger for a few reasons. The biggest reason was flexibility. Initially, during the planning of the project we discussed the possibility of plugging the Big Red Button into a Giant Gecko. This would have allowed us to use the Micrium OS USB stack to detect the button press and Micrium OS Net’s MQTT client to publish the button push to the MQTT broker. Since we only had a few days to complete this project we were unsure if there would be enough time to complete this portion, so we set that part aside.

        For testing purposes, we created a simple button simulator in Node.JS that could run on anyone’s computer and publish a message to the MQTT topic for a button press. Since we did not have enough time to complete the USB portion on a GG11, we ended up using a Node.JS script to listen for a button press while the button was plugged into one of our computers. When the Node.JS script detected the button press it sent an MQTT message to the trigger topic.

        Another advantage of using MQTT for control of the trigger is it opened up the ability to have the trigger sent from a number of places. We use Slack as a communication tool in the office, but we also have a helper bot that you can send commands to. It is possible that we could have had the bot send the same MQTT command to the MQTT broker to trigger the IoT party.

        All of the GG11s that were subscribed to the MQTT topic for the button trigger used Micrium OS and the Micrium OS Network MQTT client. Once the Micrium OS portion was set up, the subscribed nodes had one of two functions: either trigger an isolator connected to them or send a serial command to a Mighty Gecko to trigger its local network via Zigbee. For simplicity's sake, we used the same application on all of the GG11s. This allowed us to program them all without the need for individual code changes.



        Zigbee diagram


        Since the Giant Gecko kit only has an Ethernet connection, we found it was not practical to use Giant Geckos for every node in our project. Instead, it was easier to use one Giant Gecko that had an Ethernet connection and have a Mighty Gecko send the command out wirelessly to all nodes in its network. This also allowed us to use some of the Silicon Labs Smart Outlets as nodes in our project.

        Zigbee networks typically have three different types of nodes in them: Coordinator, Router and End Device. In our project, the coordinator was connected to the Giant Gecko via serial to receive the on or off command and would then relay that command out to all of the nodes in the network. The coordinator was configured using AppBuilder in Simplicity Studio. AppBuilder allows you to specify what packages should be included and generates the necessary code. Since the Giant Gecko was connected to the Mighty Gecko over serial, we enabled the command line as a simple way for the Giant Gecko to send commands to the Mighty Gecko.

        We took advantage of the Zigbee Cluster Library (ZCL) in this project to simplify the format of the on or off message being sent to the nodes. Also, the Silicon Labs Smart Outlets use the ZCL on/off library by default, so we did not have to do any configuration on the outlets. Once the command line and the ZCL on/off library were enabled in AppBuilder, it generated our project and we were able to flash our coordinator.
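        For reference, sending the on command from the coordinator's command line looks roughly like the following. This is a sketch of typical EmberZNet CLI usage, not a transcript from the project; the node ID and endpoint numbers are placeholders:

```
# Queue a ZCL On/Off "on" command...
zcl on-off on
# ...then send it to node 0x1234, from local endpoint 1 to remote endpoint 1
send 0x1234 1 1
```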

        The rest of the nodes in our project were configured either as routers or as end devices. As with the coordinator, we used AppBuilder to generate a project that listens for ZCL on/off messages, but in this case we did not need the command line. We did, however, have to add code to the ZCL on/off hooks to toggle a GPIO, which would in turn toggle the power switch connected to our party lights.


        Power Switching:

        To be energy friendly, we decided our system should control the power to the disco light. Our MCU board runs off DC power, but the disco light is powered by AC from a standard wall outlet, so we needed to control AC power from a DC system. To do this safely, the AC side must be isolated from the DC side, with a high-power MOSFET switching the AC power to the disco light on and off. Fortunately, Silicon Labs makes an evaluation kit that does just this.

        The Si8751-KIT contains an evaluation board that takes care of isolating two power systems and allows for a digital input on the low voltage, DC side to control the MOSFET on the high power, AC side. Set up was as simple as configuring a few jumpers on the board, connecting the low power side to the VDD, GND, and a GPIO of the MCU, and then connecting the high-power side to the AC outlet and the disco light.

        We also had another disco light that operated from 12V DC, and fortunately, the SI8751-KIT also has high voltage DC isolation capabilities. So, we used a second Si8751-KIT to isolate the 12V DC from our low voltage DC system on the wireless MCU.


        Lessons Learned:

        This project required a fine balance between several different protocols all within the same network. This meant there were a lot of moving parts to deal with so sometimes it was difficult to determine where a problem may be occurring. Over the course of the week our debugging skills became a little more fine-tuned but we definitely had some hiccups at first.

        By far our biggest challenge was working with Zigbee. This was mainly because none of us were familiar with the tools or the development kits. The Zigbee tools, as we found out, have a bit of a learning curve and a few tricks to them. We also got unlucky when the first example project we chose to try didn’t work because of a software problem in a newly released SDK. After determining the issue was with the project we moved on to a known working example, Dynamic Multi-Protocol. Once we started working with that project we quickly realized that we were using an example that had a lot of extra overhead we did not need and was confusing us.

        After our failed experiments with some sample projects we decided to start from scratch and build up our own project in the App Builder. After jumping through a few hoops we were able to get a project configured the way we wanted. We found that starting small and building off that was a much better approach than trying to use a complicated example and trim off the excess features. We also found that complicated projects like Zigbee can have a steep learning curve and we underestimated the amount of time it would take to complete the Zigbee portion. Luckily, we were able to complete the Micrium OS portion on the Giant Gecko rather quickly which gave us extra time to focus on Zigbee.


        Next Steps:

        Due to some issues with our Zigbee configuration, our project was not complete at the end of the hackathon week. Our final demonstration could send a message from the Big Red Button to the MQTT broker and down to the EFM32GG11 boards, send a serial command from the EFM32GG11 to the EFR32MG12, and switch the isolators from either the EFM32GG11 or the EFR32MG12. The one gap in the project was sending the ZCL on/off message correctly to all of the nodes in our network. An obvious next step for this project would be to rectify the issues in our Zigbee network configuration.

        Beyond getting the Zigbee network configured correctly, we had a few other improvements that could be implemented. First, the Big Red Button could be connected to an EFM32GG11 running Micrium OS USB Host to read the button state and send it via MQTT using Micrium OS Network. The second improvement we discussed was actually hooking up a Slack chat bot with a command to trigger the party instead of using the Big Red Button.



        While we were not able to get the project working as intended, it proved to be a very valuable exercise to explore the EFR32MG12 and a fun way to do it.

      • Wireless PC Remote for volume and media control

        BrianL | 02/33/2018 | 11:00 AM

        Recently, our MCU Applications team and our Micrium OS team decided to spend a few days, in teams, on a "Hackathon". This gave us the opportunity to work on a larger, real-world application in an effort to gain more insight into our products and their uses.

        Our team consisted of Brian Lampkin, Janos Magasrevy, and Yanko Sosa. For our project, we decided to create a PC media controller for wireless volume and media control. This would consist of a wireless USB dongle communicating with a wireless remote controller to provide media controls such as volume up/down, next track/previous track, mute, etc.

        1. Requirements

        1.1 Hardware

        1. A USB ‘Dongle’ to provide wireless connectivity to the controller.

          This required a USB interface MCU to communicate with the host PC and a radio MCU to communicate with the remote. For the USB MCU, we chose an EFM32HG, since it is relatively small, and our application – a simple UART to USB HID command bridge – would require little flash. For our Radio MCU, we chose an EFR32FG12 device, which could cover any proprietary protocol we chose to implement. This would provide our UART to Wireless bridge.
        2. A wireless ‘Remote’ to provide the user interface for the media controller.

          We chose another EFR32FG12 radio MCU, to pair with the other on the USB dongle. Since this was to be a battery powered remote, we needed an MCU that could be run in a low duty-cycle, low power mode. To provide the user interface, buttons and a joystick on an expansion board were used.

          The completed remote and dongle hardware, using an EFM32HG STK with a Wireless Expansion Board and EFR32FG12, along with an EFR32FG12 Wireless STK with a Joystick Expansion Board, are shown below:


        1.2 Software

        An additional requirement was added – the project must integrate Micrium OS in some manner. We chose to implement this on the EFR32FG12 wireless devices to help manage wireless connectivity and low power features.

        2. System Overview

        The system block diagram is as follows:


        Button and joystick input is taken by the remote's Flex Gecko MCU and converted into wireless packets that represent media commands. These are transmitted to the dongle's Flex Gecko, then converted into UART transmissions to the dongle's Happy Gecko MCU. Finally, these are interpreted as HID media control commands and sent to the host PC over USB.

        The joystick expansion board was mapped to the following media control functions:


        2.1 Wireless Protocol

        Our project has a very simple wireless communication requirement. When a button is pressed on the remote, the button status must be transmitted from the remote to the dongle's receiver. Since there are few functions, a single-byte payload was used to transmit this data. The remote never needs to receive any information from the dongle, so the dongle can be kept in RX mode while the remote transmits a byte whenever the state of its buttons changes. This is an extremely simple communication protocol, so we decided to use the lower-level Radio Abstraction Interface Layer (RAIL) directly rather than a full stack such as Zigbee or Connect.

        Since no stack is used, the protocol is effectively proprietary. 2.4 GHz was chosen for the radio’s communication band, as opposed to a Sub GHz band, as this allows for a smaller antenna, useful for a handheld remote.

        2.2 Energy Concerns

        Because the remote is battery powered, low energy consumption is a huge priority for it. However, since the dongle is USB powered, there is little reason to limit power consumption there. Thus, the dongle can stay awake in RX mode continuously with little drawback while connected to the PC's USB port. On the remote side, however, consideration was given to keeping the device in lower energy modes whenever possible. Because the dongle is always in RX mode, we can keep the remote in a low energy state until a button press triggers a new media function update. In our design, this means the remote only wakes to transmit a packet, then immediately re-enters sleep mode.

        3 The Dongle

        3.1 USB HID Media Device

        The first step in the project was to create a device that could communicate with the PC as a media controller. We decided to implement a HID device, which allows for driverless communication with a PC host for a limited set of known functions. For this project, we implemented what is called a USB HID “Consumer Control” device. The options available in this interface are described in a table in Section 15, “Consumer Page,” of the USB HID Usage Tables document.

        This interface includes many of the media controls that you would normally use with a media application on a PC (playing video, music, etc.): Play, Pause, Record, and so on. In this project, we chose to implement the following commands on our remote:

        1. Play/Pause – ID: 0xCD
        2. Scan Next Track – ID: 0xB5
        3. Scan Previous Track – ID: 0xB6
        4. Mute – ID: 0xE2
        5. Volume Increment – ID: 0xE9
        6. Volume Decrement – ID: 0xEA
        7. Play (Unused) – ID: 0xB0
        8. Stop (Unused) – ID: 0xB7

        We eventually decided not to use the Play and Stop commands, as the Play/Pause command that we found implemented the functionality we desired, and allowed us to reduce the total number of inputs to six, which would map neatly to our expansion board’s two buttons and joystick with four cardinal directions.

        3.1.1 HID Report Descriptor

        To interface with a host using the HID interface, a HID report descriptor, describing the functionality of the device, must be constructed. We used the HID Usage Tables document, which included several examples (Specifically, Appendix A.1 Volume Control contained a useful example on volume +/-), and the HID Descriptor Tool to construct the following HID Descriptor:

        // HID Report Descriptor for Interface 0
        const char hid_reportDesc[39] SL_ATTRIBUTE_ALIGN(4) =
        {
          0x05, 0x0C,       // USAGE_PAGE (Consumer)
          0x09, 0x01,       // USAGE (Consumer Control)
          0xA1, 0x01,       // COLLECTION (Application)
          0x15, 0x00,       //   LOGICAL_MINIMUM (0)
          0x25, 0x01,       //   LOGICAL_MAXIMUM (1)
          0x75, 0x01,       //   REPORT_SIZE (1)
          0x95, 0x08,       //   REPORT_COUNT (8)
          0x09, 0xCD,       //   USAGE (Play/Pause)
          0x09, 0xB5,       //   USAGE (Scan Next Track)
          0x09, 0xB6,       //   USAGE (Scan Previous Track)
          0x09, 0xE2,       //   USAGE (Mute)
          0x09, 0xE9,       //   USAGE (Volume Increment)
          0x09, 0xEA,       //   USAGE (Volume Decrement)
          0x09, 0xB0,       //   USAGE (Play)
          0x09, 0xB7,       //   USAGE (Stop)
          0x81, 0x02,       //   INPUT (Data,Var,Abs)
          0x75, 0x08,       //   REPORT_SIZE (8)
          0x95, 0x01,       //   REPORT_COUNT (1)
          0x81, 0x03,       //   INPUT (Cnst,Var,Abs)
          0xC0              // END_COLLECTION
        };

        This constructs a HID report with two bytes of data. The first byte implements eight 1-bit flags, one for each of the HID commands. The second is a placeholder byte, unused by our application (but it could be used for additional functions in the future).

        As a basis for our EFM32HG USB project, we used the usbhidkbd example, which implements a USB HID Keyboard. The conversion for this was rather simple, as the USB side only required a quick swap from the HID Keyboard descriptor to the new HID Consumer Device descriptor, above. With this change, the EFM32HG device now enumerated on the host PC as a media controller.

        3.1.2 USB to Radio Interface

        The next step in the process was to develop an interface that could communicate between the dongle’s radio MCU and the USB MCU to tell the PC when a media button had been pressed. For this, we implemented a simple UART interface.

        The media control functions are represented by single bits in the HID report's first byte's bitfield, as described below:

        typedef enum {
          PLAY = 0x01,
          SCAN_NEXT = 0x04,
          SCAN_LAST = 0x02,
          MUTE = 0x08,
          VOL_UP = 0x10,
          VOL_DOWN = 0x20,
        } report_t;

        On the dongle side, the radio MCU merely sends one byte of data over UART with the appropriate bit set for the desired media function. This is then transmitted over USB by sending a HID report. When the EFM32HG MCU receives a byte over UART, the report is updated:

        void USART0_RX_IRQHandler(void)
        {
          // Latch the received byte; reading RXDATA clears the RX interrupt flag
          report = USART0->RXDATA;
        }


        Then, in the main loop, if the report has changed since the last one that was sent to the host, it is sent over USB:

            if (report != lastReport) {
              /* Pass keyboard report on to the HID keyboard driver. */
              lastReport = report;
            }

        Note the function names and comments left over from the usbhidkbd example - a result of the limited modifications we had to make to this example to implement the media controller.

        3.2 Dongle Radio Receiver

        The Dongle’s radio receiver was built with an EFR32FG12 Wireless MCU, using RAIL as the radio interface layer. The firmware is extremely simple: a single byte packet is received from the remote device, and this packet is transmitted to the EFM32HG USB device over UART.

        3.2.1 Radio Configuration

        The EFR32FG12’s radio was configured using AppBuilder. We used the default settings for a 2.4 GHz, 1 Mbps PHY, modifying it for single byte packets. No other changes were made to this default profile’s settings.

        3.2.2 Radio to UART Implementation

        Once configured, the radio initialization is simple – the device’s radio is initialized and put into RX mode, while an RX callback is registered to handle the reception of the packet and its transmission over UART. The device then waits forever in a while loop to receive packets. The initialization routines are simply:

          // Idle the radio, then start receiving on the configured channel
          RAIL_Idle(railHandle, RAIL_IDLE, true);
          RAIL_StartRx(railHandle, channel, NULL);
          while (1) {
            // Wait forever; received packets are handled in the RX callback
          }


        In the RX callback, the packet is received, the radio is put back into RX mode, and the packet is transmitted over UART:

        void RAILCb_Generic(RAIL_Handle_t railHandle, RAIL_Events_t events) {
          report_t packet;
          if (events & RAIL_EVENT_RX_PACKET_RECEIVED) {
            RAIL_RxPacketInfo_t packetInfo;
            // Look up the newest received packet (this call was elided in the original listing)
            RAIL_GetRxPacketInfo(railHandle, RAIL_RX_PACKET_HANDLE_NEWEST, &packetInfo);
            // Receive the packet's one-byte payload
            packet = *(packetInfo.firstPortionData);
            RAIL_Idle(railHandle, RAIL_IDLE, true);
            RAIL_StartRx(railHandle, channel, NULL);
            // TX packet over UART to EFM32HG
            USART_Tx(USART0, (uint8_t) packet);
          }
        }


        3.3 The Remote

        The remote has two main components – the user interface and the radio, used for transmitting user inputs.

        3.3.1 User Interface

        The user interface of the remote uses a joystick expansion board, which provides two buttons and an analog joystick for input. This expansion board is described in section 8 of the expansion board's documentation.

        The analog joystick has a single output pin whose voltage changes depending on the direction in which the joystick is pressed. To interface this joystick with the EFR32FG12, the device’s ADC is used to sample the voltage on the joystick’s output every 25 ms, triggered by the RTCC. This voltage is then converted into a direction in the ADC’s interrupt handler.

        #define ADC_MAX_CODES (0x0FFF)
        #define JOY_NONE_THRESH  (0.93 * ADC_MAX_CODES)
        #define JOY_UP_THRESH    (0.81 * ADC_MAX_CODES)
        #define JOY_RIGHT_THRESH (0.68 * ADC_MAX_CODES)
        #define JOY_LEFT_THRESH  (0.55 * ADC_MAX_CODES)

        void ADC0_IRQHandler(void)
        {
          uint16_t sample;
          ADC_IntClear(ADC0, ADC_IF_SINGLE);
          sample = ADC0->SINGLEDATA;
          if (sample > JOY_NONE_THRESH) {
            joyState = JOY_NONE;
          } else if (sample > JOY_UP_THRESH) {
            joyState = JOY_UP;
          } else if (sample > JOY_RIGHT_THRESH) {
            joyState = JOY_RIGHT;
          } else if (sample > JOY_LEFT_THRESH) {
            joyState = JOY_LEFT;
          } else {
            joyState = JOY_DOWN;
          }
        }


        For the pushbuttons, GPIO interrupts were enabled for each pushbutton pin, which update the status of the buttons.

        void BTN_Handler(void)
        {
          bool BTN2, BTN3;
          BTN2 = GPIO_PinInGet(BTN2_PORT, BTN2_PIN);
          BTN3 = GPIO_PinInGet(BTN3_PORT, BTN3_PIN);
          if (BTN2 == BUTTON_PRESSED) {
            BTN2State = BTN2_PRESSED;
          } else {
            BTN2State = BTN2_RELEASED;
          }
          if (BTN3 == BUTTON_PRESSED) {
            BTN3State = BTN3_PRESSED;
          } else {
            BTN3State = BTN3_RELEASED;
          }
        }


        3.3.2 Radio and Packet Transmission

        The radio on the EFR32FG12 device was configured exactly the same as on the dongle. In fact, the exact same AppBuilder project was used as a basis for both devices. Instead of remaining in RX mode, however, the remote is powered down between ADC measurements and button state changes. If the state of the inputs has changed (i.e. a button has been pressed or released since last sleeping), the Report Handler constructs a new report packet and transmits it. When the packet has been transmitted, the device is permitted to transition back to sleep mode.

        To construct the report packet, the states of each button and the joystick are simply ORed together, since these states are mapped to the respective bit of their function in the HID report bitfield:

        void Report_Handler(void)
        {
          report_t report_current;
          static report_t report_previous = 0;
          while (1) {
            report_current = BTN3State | BTN2State | joyState;
            if (report_current != report_previous) {
              report_previous = report_current;
              // Transmit the new report over the radio (transmit call elided here)
            } else {
              // No change since the last report; allow the device to sleep (elided here)
            }
          }
        }


        3.4 Integration of Micrium OS

        As an additional challenge, we were required to integrate Micrium OS into our project. For this, we decided to integrate this only on our EFR32FG12 devices, since the EFM32HG USB device was limited in flash, and it would not benefit from the addition of an operating system due to the simplicity of the firmware running on the device.

        Adding Micrium OS to the Flex Gecko EFR32FG12

        One of the challenges we faced early in the project was that Micrium OS did not natively support the EFR32FG12, meaning a Micrium OS board support package (BSP) had to be developed for it.

        1. Micrium OS Board Support Package (BSP)

        1. Compiler-specific Startup (Micrium_OS/bsp/siliconlabs/efr32fg12/source/startup/iar/startup_efr32fg12p.s)

          We first created the standard Micrium OS BSP folder structure within the Micrium_OS/bsp/siliconlabs folder, using the EFM32GG11 as our reference BSP due to its similarities in the startup code. We then started modifying the compiler-specific startup file for the EFR32FG12. This step was fairly straightforward, given that most Arm Cortex-M devices share the same initialization code, with the obvious difference being the number of interrupt vector sources amongst the various devices.

          The Micrium OS kernel port relies on two Arm Cortex-M core interrupt sources: PendSV and SysTick. In our compiler-specific startup code, we had to reference the two handlers found in the Micrium OS kernel port using the EXTERN assembly directive:
        EXTERN  OS_CPU_PendSVHandler

        EXTERN  OS_CPU_SysTickHandler

        Then we placed the two handlers into the vector table with:

        DCD    OS_CPU_PendSVHandler

        DCD    OS_CPU_SysTickHandler

        We now have a Micrium OS compatible compiler-specific startup file.

        2. Device-specific Startup (Micrium_OS/bsp/siliconlabs/efr32fg12/source/startup/system_efr32fg12p.c)

          A device-specific startup file was required for the clock initialization. For this, we looked inside the Gecko SDK and found the corresponding startup file for the EFR32FG12P (system_efr32fg12p.c). This file was added as-is into the Micrium OS BSP.
        3. Micrium OS Tick BSP (Micrium_OS/bsp/siliconlabs/efr32fg12/source/bsp_os.c)

          The Micrium OS Tick BSP file handles the kernel tick initialization in either periodic mode or dynamic mode, depending on the power consumption requirements of the project. We left this file the same as the one found in the EFM32GG11 BSP and ran in periodic mode. As a potential later improvement, we could switch to the dynamic tick to reduce the power consumption of our device.
        4. Micrium OS CPU BSP (Micrium_OS/bsp/siliconlabs/efr32fg12/source/bsp_cpu.c)

          The Micrium OS CPU BSP file deals with the setup of the timestamp timers that are required by the OS for statistical purposes and other features. This was once again left the same as in the EFM32GG11 BSP.
        5. Micrium OS Interrupt Source Definitions (Micrium_OS/bsp/siliconlabs/efr32fg12/include/bsp_int.h)

          In this file, the various interrupt source definitions are specified. Although not necessary for our project, this file is included in bsp_os.c to assign BSP_INT_ID_RTCC as a kernel-aware interrupt source when the dynamic tick is enabled.
        6. Micrium OS Generic BSP API (Micrium_OS/bsp/include/bsp.h)

          This is the final piece of the Micrium OS BSP puzzle. In this file, the prototypes for BSP_SystemInit(), BSP_TickInit(), and BSP_PeriphInit() are defined. Some of these functions are later used in our program's main().


        2. Micrium OS main.c


        1. main()

          In the standard Micrium OS main(), the CPU is initialized with CPU_Init(), followed by the board initialization via BSP_initDevice() and BSP_initBoard(), both from the Gecko SDK. After the CPU and the board clocks are initialized, OSInit() is called to initialize the kernel. Our startup task is then created by calling OSTaskCreate() (see section 2b.). Finally, OSStart() is called to start the kernel, which begins executing the startup task.
        2. StartupTask

          In the Startup Task, the kernel tick is initialized using BSP_TickInit() from the Micrium OS BSP. Other services such as the UART are also initialized here. In our case, USART2 is used. It is important to mention that the Startup Task has a 500-millisecond delay inside an infinite loop in order for it to yield CPU time to other tasks when running in a multithreaded environment.

          Since our project utilizes proprietary wireless, the RAIL library is included and therefore initialized in the Startup Task at 2.4GHz.

          In order to demonstrate different kernel services, a RAIL receive (Rx) semaphore object is created in this task.
        3. RAIL Rx Task

          Our model consists of two tasks: Startup Task and the RAIL Rx Task.

          In the RAIL Rx Task, the program pends on the RAIL Rx semaphore created in the Startup Task. Once data from the wireless remote is received by our device, an interrupt fires and a callback function dissects the packet and posts the first byte of data to the RAIL Rx semaphore. The RAIL Rx task then transmits the data received via USART2 to the Happy Gecko. The callback function briefly puts the radio in an idle state before waking the receiver once again to obtain the next radio packet.


        5. Next Steps

        With the project complete and functional using STKs and pre-made expansion boards, we want to pursue creating custom PCBs for both the remote and the dongle. This would require a fair amount of work: laying out two MCUs plus a USB connector on the dongle board, and another MCU in a reasonable hand-held form factor for the wireless remote. Additional challenges may arise in laying out the wireless-specific portions of the boards, especially with regard to antenna design and placement. We hope to accomplish this early this year and have remotes and dongles constructed for each team member to use. Overall, this has been an interesting and challenging project, and it would be great to see it through to completion with a physical, practical media remote designed and built.

        6. Attached Projects

        All firmware projects can be found here:
        This includes firmware to run on the dongle's EFM32HG USB MCU and EFR32FG12 Wireless MCU, and the remote's EFR32FG12 Wireless MCU. These are:

        1. Dongle_EFM32HG - firmware for the dongle's EFM32HG to perform UART to USB HID Media Control

        2. Dongle_EFR32FG12_Micrium - firmware for the dongle's EFR32FG12 wireless receiver, with Micrium OS integration

        3. Dongle_EFR32FG12_simple - firmware for the dongle's EFR32FG12 wireless receiver, before Micrium OS integration (simple while loop)

        4. Remote_EFR32FG12 - firmware for the remote's EFR32FG12 wireless MCU for user input

      • Building a Digital Tuner from Scratch

        JohnB | 02/32/2018 | 02:54 PM
        by Silicon Labs MCU and Micrium OS applications team members John Bodnar, Sharbel Bousemaan, Mitch Crooks, and Fernando Flores

        What is tuning and why does it matter?

        If you play a musical instrument, especially if you play a wind instrument, you’re going to want to tune once you’re sufficiently warmed up. For the not so musically-inclined, tuning is the process of making an adjustment to your instrument so that the notes you play, in particular notes which correspond naturally to the construction of the instrument, are produced accurately. Electronically speaking, you could say the notes are reproduced with the correct frequency.

        Figure 1. Clarinet

        For a simple example, consider the clarinet shown above. As with any tubular musical instrument, its fundamental pitch (the note it most naturally plays) is determined by its length. In particular, the clarinet above is a B-flat clarinet, so by slightly lengthening or shortening it, the fundamental note it produces can be made to match a concert B-flat.

        Without going into too much detail, modern instruments tune to notes relative to A = 440 Hz above middle C (think the middle key on a piano). In the case of a B-flat clarinet, its fundamental pitch has a frequency of 466.164 Hz. Thus, a clarinet is “in tune” when a player adjusts his/her embouchure (the relative tension of the facial muscles and positioning of the lips and teeth) to play a B-flat and the sound that comes out of the instrument has a frequency of 466.164 Hz.

        If the sound that comes out of a clarinet when attempting to play a B-flat has a frequency that is lower than expected, the instrument is said to be flat. Similarly, if the frequency of the sound is too high, the instrument is said to be sharp.

        On a clarinet, the mouthpiece, which is the plastic and metal assembly against which the player blows, can be pushed in or pulled out slightly to adjust its tuning. So, if the player’s B-flat is sharp, the mouthpiece can be pulled out a little to lower the frequency and bring it in tune. Likewise, if the B-flat is flat (too low), the mouthpiece is pushed in slightly to raise the instrument’s pitch. Tuning an instrument to its proper concert pitch (B-flat in the case of our clarinet example) is a necessary first step to getting the other notes it can produce to also be in tune when they are played.

        What is a tuner?

        Experienced musicians and people with perfect pitch can tune by ear simply by listening to the note produced and adjusting the instrument’s tuning mechanism accordingly. The rest of us generally rely upon a device called a tuner that compares the frequency of the note we play to its mathematically calculated frequency. A modern digital tuner can be a standalone electronic device or even an application for a smartphone.  An example is shown in Figure 2.

        Figure 2. OEM Digital Tuner

        These devices, which can be had for as little as $15, are generally powered by inexpensive, 8-bit microcontrollers. Knowing this, we can probably assume that such a tuner does not make use of digital signal processing (e.g. finding the fundamental frequency by means of an FFT) to compare the frequency of the note played to what it ideally should be. Instead, we figured such a device would probably operate in the time domain, directly comparing the frequency of the note played to its ideal value.

        Every instrument produces a unique sound that is colored by timbral impurities. These impurities are introduced by the shape of the instrument, the materials from which it’s constructed, and by the uniqueness of the musician’s embouchure (for wind players) or touch (for string and percussion players). The net result of these variations is that the waveform of the sound produced on a given instrument as played by any one musician is not spectrally pure but instead consists of a fundamental frequency with various superimposed overtones (integer multiples of the fundamental frequency).

        Knowing this, we felt that a tuning method that operates purely in the time domain with no consideration of spectral content would be most suitable for a low-cost processor. Considering that dedicated digital tuners run off one or two AA or AAA batteries, such an approach would also have the benefit of being particularly energy efficient.

        Project Summary

        Our goal was to construct a digital instrument tuner that could:

        1. distinguish the fundamental frequency of the note being played,
        2. determine the nearest equal temperament musical note (based on A = 440 Hz modern tuning),
        3. visually display whether the note is sharp or flat relative to the target note/frequency, and
        4. function without resorting to computationally intensive DSP concepts in order to minimize energy use.

        We implemented what might be considered a very simple analog-to-digital converter that takes the output from an analog microphone and turns it into a pulse train with a frequency equal to that of the note’s fundamental. The pulse train is then easily captured by a microcontroller, which can then perform all of the aforementioned tasks.

        Detailed Description

        For hardware, we opted to use the EFM32 Series 1 Giant Gecko Starter Kit. While the Series 1 Giant Gecko microcontroller might be a bit overkill for the project at hand, the starter kit has a nice dot matrix memory LCD to use for output. Optimization for a smaller EFM32 microcontroller could follow later once the whole concept and application code had been proven.

        To capture the sound from the instrument and output the pulse train, we used an analog MEMS microphone with an amplifier circuit connected to a 74VHC14 Schmitt-triggered inverter. Because we would need to both measure the frequency of the note being played and keep the tuner display updated, a multi-threaded software foundation was a no-brainer.

        We used Micrium’s µC/OS real-time kernel to provide this environment, along with the kernel services needed to protect shared resources and synchronize the tasks. This RTOS foundation allowed us to simplify the design and implementation of our application code, which, at its most basic level, required just two tasks: one for sampling the pulse train and the other for displaying the tuner’s output.

        For the display task to know what pitch is being detected, the measurement task needs some way to communicate results. To do this, we opted for a simple shared variable which the measurement task updates after each sampling period.

        In multi-threaded applications, shared data must be protected by a kernel mechanism, such as a semaphore. Pending on a semaphore usually means that a task might block (be stuck waiting for new data) until it becomes available. This behavior is undesirable for our measurement task because it must sample the pulse train periodically.

        µC/OS allows a non-blocking pend on a semaphore, which provides resource safety without the risk of having a task block indefinitely. The drawback of this method is that some measurements might never be communicated to the display task. In practice, this is not an issue for the tuner because we are more concerned with a fast response to changes in pitch.

        Display Task

        The design of our display task (Figure 3) follows the general outline of the flowchart below. However, we have included a signaling semaphore that the measurement task uses to notify the display task when the frequency variable has been updated. The display task blocks on this semaphore to avoid updating the display multiple times with the same data. This helps to reduce the overall energy use of the application.

        Figure 3. Display Task Flowchart

        Once it is signaled, the display task tries to access the shared frequency variable. It does so by trying to grab the tuner semaphore which is used to protect the shared data. Eventually, the semaphore will become available, allowing us to read the frequency value. The frequency is converted into a pitch using a simple lookup table. The remainder of the task deals with how the user interface will look when the pitch is displayed. We decided on a minimal design which provides a visual representation of how in or out of tune the played note is, as shown in Figure 4.

        Figure 4. Tuner Display Output
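        The pitch lookup follows from the equal-temperament relation: the nearest note to a frequency f is round(12·log2(f/440)) semitones from A4 = 440 Hz, and the leftover deviation is expressed in cents (hundredths of a semitone). A small Python sketch of that mapping (our firmware uses a precomputed lookup table, but the math is the same):

```python
import math

NOTE_NAMES = ["A", "A#", "B", "C", "C#", "D", "D#", "E", "F", "F#", "G", "G#"]

def nearest_note(freq: float, a4: float = 440.0):
    """Return (note name, cents offset) of the nearest equal-temperament note."""
    semitones = 12 * math.log2(freq / a4)   # distance from A4 in semitones
    n = round(semitones)
    cents = 100 * (semitones - n)           # residual tuning error in cents
    return NOTE_NAMES[n % 12], round(cents, 1)

print(nearest_note(466.164))   # ('A#', 0.0)  -- concert B-flat, in tune
print(nearest_note(445.0))     # ('A', 19.6)  -- sharp by ~20 cents
```

        A positive cents value means the note is sharp and a negative one means it is flat, which maps directly onto the left/right indicator of the display.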

        Measurement Task

        The measurement task (Figure 5) implements an algorithm for reading the pulse train and calculating its average frequency. Pulses are captured using the WTIMER0 peripheral, while the LDMA reads a timestamp from WTIMER0 for each pulse detected. The timestamps are copied into a memory buffer over a period of 125 ms, as measured by the CRYOTIMER peripheral. Once the 125 ms has elapsed, the CRYOTIMER interrupt notifies the measurement task that the sample is ready.

        Figure 5. Measurement Task Flowchart

        The task averages the periods between pulses to calculate the frequency of the pulse train. This value is reported to the display task using the mechanisms described above. The LDMA and timer peripherals are then reinitialized for the next sample, and the task waits for the next CRYOTIMER interrupt.
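        Averaging the periods amounts to dividing the elapsed timer ticks by the number of pulse intervals. A Python sketch of the calculation; the 1 MHz timer clock here is an illustrative assumption, not the actual WTIMER0 configuration:

```python
def average_frequency(timestamps, timer_hz=1_000_000):
    """Average pulse-train frequency from edge timestamps (timer ticks)."""
    if len(timestamps) < 2:
        return 0.0                              # not enough edges in the window
    periods = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean_period = sum(periods) / len(periods)   # ticks per pulse interval
    return timer_hz / mean_period               # pulses per second

# Edges roughly 2273 ticks apart at 1 MHz -> ~440 Hz
edges = [0, 2273, 4545, 6818, 9091]
print(round(average_frequency(edges), 1))   # 440.0
```

        Averaging over the whole 125 ms buffer smooths out jitter in individual pulse intervals before the result is handed to the display task.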

        Microphone and Pulse Generation Circuit

        A primary goal of this project was to devise a means of detecting the frequency of a note played by a musical instrument without the use of complex and computationally expensive signal processing algorithms.  Use of such techniques, for example, FFTs, complicates software development, requires a substantial number of processing cycles, and increases energy use.

        We needed a computationally simpler and less energy-intensive solution that would still permit reliable detection of the frequency of the note being played.  A combination of hardware signal processing and software capture and analysis allowed us to do this with a substantially smaller computational footprint and, thus, less energy, than an FFT-based or similar approach.

        The hardware front end of the frequency measurement portion of the tuner consists of an ADMP401 analog MEMS microphone with preamplifier circuitry followed by a 74VHC14 inverting Schmitt trigger.  Figure 6 shows the signal flow through each stage of the hardware.

        Figure 6. Audio Signal Flowchart through Hardware Front-end Stages

        Although not currently implemented in the project due to time constraints but available for future implementation, a digitally-tunable low pass filter is placed between the microphone and the Schmitt trigger. Ideally, this would filter out harmonics (overtones) above the fundamental frequency of the note being played in order to improve the quality of the input to the Schmitt trigger.  Figure 7 shows the hardware front end prototype.

        Figure 7. Hardware Front-end Prototype

        The MEMS microphone captures the note played and passes an analog signal to the input of the Schmitt trigger. Depending on the instrument being played, this analog waveform will have a different envelope or shape, but it will still be periodic in nature and have the fundamental frequency of that note.  As such, it will have periodic vertical crossings of the high and low Schmitt trigger threshold voltages if the input signal is properly scaled. In Figure 8, CH1 shows the input analog waveform at a frequency of about 866 Hz.

        Figure 8. Oscilloscope Capture showing analog audio signal (microphone output) on CH1 and Schmitt-triggered output on CH2

        A Schmitt trigger is essentially a comparator circuit with hysteresis, and, in this application, it functions as a 1-bit analog-to-digital converter. Thus, as the input signal rises above the Schmitt trigger input high threshold voltage (VIH), the inverting Schmitt trigger output transitions from logic high to logic low. Likewise, when the analog signal falls below the input low threshold voltage (VIL), the inverting Schmitt trigger output transitions from logic low to logic high.

        Note that the Schmitt trigger implements hysteresis where VIH > VIL resulting in a more stable digital output in the presence of a noisy or non-monotonic input signal. The resulting output is a pulse train with the same frequency as the input analog waveform, in this case about 880 Hz.  This digital signal is then routed to one of the MCU’s timer input pins where its edges are captured and used to quickly calculate the frequency of the note being played.
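        The hysteresis logic is easy to model: the inverting output changes state only when the input rises above VIH or falls below VIL. The sketch below simulates that 1-bit conversion on an 880 Hz sine; the threshold voltages are illustrative values, not the actual 74VHC14 datasheet limits:

```python
import math

def schmitt_pulse_train(signal, v_ih=2.0, v_il=1.0):
    """1-bit 'ADC': inverting Schmitt trigger with hysteresis."""
    out, state = [], 1                 # inverting: output high while input is low
    for v in signal:
        if state == 1 and v > v_ih:    # input crosses VIH going up -> output low
            state = 0
        elif state == 0 and v < v_il:  # input crosses VIL going down -> output high
            state = 1
        out.append(state)
    return out

# 880 Hz sine (1.5 V offset, 1.5 V amplitude) sampled at 44.1 kHz for 10 ms
fs, f = 44_100, 880
sig = [1.5 + 1.5 * math.sin(2 * math.pi * f * n / fs) for n in range(441)]
pulses = schmitt_pulse_train(sig)
falling_edges = sum(1 for a, b in zip(pulses, pulses[1:]) if a == 1 and b == 0)
print(falling_edges)   # 9 falling edges in ~10 ms, i.e. ~880 Hz
```

        Because VIH > VIL, small noise around a single threshold cannot make the output chatter, which is why the pulse train tracks only the fundamental crossings of the input.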

        Results and Lessons Learned

        Surprisingly, the application ran almost exactly as expected when first tested with simulated instrument sounds. However, we did encounter a problem with higher frequencies resulting in pulse trains that did not correspond with the expected frequencies. In this case, our problem was caused by a failure to reset the timer before each new sampling period. This meant that any pulses occurring after one measurement period ended up being counted and affected the frequency calculated in the next run of the measurement task.

        The system responded well to clean inputs from frequency generators, sine waves recorded through the microphone, and some simulated instrument sounds. However, accuracy began to degrade when more complex tones were played, such as those from a brass instrument.

        As noted above, one aspect of the project originally conceived but not yet implemented is a low-pass filter that can be tuned to strip out harmonics coloring the sound from instruments as played by real people. Time constraints prevented this feature from being integrated into the demonstrated project. Naturally, more effort can still be spent to optimize energy use and get the entire system to provide substantial operating life from one or two alkaline batteries.

      • Electronic Label Wireless Transmission System

        yucheng | 01/30/2018 | 01:42 AM

        1. Project Summary:

        QR codes are everywhere now: they appear on business cards, commodity packaging, electronic labels, mobile payments, and so on.

        The goal of this project is to develop an electronic label wireless transmission system based on QR codes. A GUI application built with Python and PySide is responsible for encoding text strings into QR codes and sending the QR code images to a cloud server, such as Amazon. A Python script running on the cloud server monitors a network port; once it receives a QR code image, it distributes the image to an EFM32GG11+WGM110 board. The EFM32GG11 runs Micrium OS and controls the WGM110 via the EXP interface; once it receives a QR code image, it displays the image along with the decoded QR code text string.


        2. Project Description:

        The figure below shows the target overview of the project.

        The GUI tool on the PC generates a QR code image and sends it to the cloud server; the remote server receives and distributes the QR code image; and the EFM32GG11, running Micrium OS, decodes the QR code and shows the content on the LCD. The EFM32GG11 can also encode an information string into a QR code and transfer the image to the cloud server, from which the GUI tool fetches and decodes it.
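        The PC-to-server leg is plain TCP traffic: the GUI streams the PNG bytes over a socket, and the server-side script reads until the sender closes the connection. A minimal loopback sketch of that exchange (the real project talks to a cloud host and uses its own framing; everything here runs locally for illustration):

```python
import socket
import threading

received = bytearray()
ready = threading.Event()
port_holder = {}

def server():
    """Stand-in for the cloud-side Python script monitoring a net port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))            # let the OS pick a free port
        port_holder["port"] = srv.getsockname()[1]
        srv.listen(1)
        ready.set()                           # tell the client we're listening
        conn, _ = srv.accept()
        with conn:
            while chunk := conn.recv(4096):   # read until the sender closes
                received.extend(chunk)

t = threading.Thread(target=server)
t.start()
ready.wait()

qr_png = b"\x89PNG\r\n\x1a\n" + b"\x00" * 64  # dummy PNG-like payload
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port_holder["port"]))
    cli.sendall(qr_png)                       # the GUI tool uploads the image

t.join()
print(received == qr_png)   # True
```

        Closing the client socket is what tells the server the image is complete; a production setup would more likely prefix the payload with its length so one connection can carry several images.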

        2.1 QR Code Introduction:

        QR code (Quick Response Code) is the trademark for a type of matrix barcode (or two-dimensional barcode) first designed for the automotive industry in Japan. It uses four standardized encoding modes (numeric, alphanumeric, byte/binary, and kanji) to efficiently store data; extensions may also be used.
        The symbol versions of QR Code range from Version 1 to Version 40. Each version has a different module configuration or number of modules. (The module refers to the black and white dots that make up QR Code.)
        "Module configuration" refers to the number of modules contained in a symbol, commencing with Version 1 (21 × 21 modules) up to Version 40 (177 × 177 modules). Each higher version number comprises 4 additional modules per side. 

        The resolution of the built-in TFT LCD on the EFM32GG11 STK board is 128 × 128, and because each pixel on the LCD is very small, we use 4 pixels for each QR code module (bit) to improve recognition, and we adopt Version 3 as the QR code version for the project. The amount of data that can be stored in a QR code symbol is shown below. For example, a Version 3 (29 × 29) QR code can store a total of 77 alphanumeric characters with level L ECC.
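        The sizing can be checked quickly: a Version v symbol has 21 + 4(v − 1) modules per side, so Version 3 gives 29, and drawing each module 4 pixels wide (one reading of the "4 pixels as 1 bit" choice above) still fits the 128 × 128 display:

```python
def qr_modules(version: int) -> int:
    """Modules per side: Version 1 is 21x21, plus 4 modules per version step."""
    return 21 + 4 * (version - 1)

version = 3
side = qr_modules(version)          # 29 modules per side for Version 3
pixels = side * 4                   # 4-pixel-wide modules on the LCD
print(side, pixels, pixels <= 128)  # 29 116 True -> fits the 128x128 LCD
```

        Version 4 (33 modules) at the same scale would need 132 pixels and no longer fit, which is another reason Version 3 is the natural choice here.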


        2.2 QR Code Generator/Decoder GUI

        The GUI tool shown below was developed with Python and PySide; the PyQRCode module is used for QR code encoding, and the qrtools module for decoding.

        The tool can convert a URL or text string into a QR code image and connect to the remote cloud server to send it; it can also decode a QR code image fetched from the remote server.

        Below is a screenshot of the GUI tool. After inputting text in the edit box and encoding it into a QR code image, pressing the "Upload" button sends the encoded QR code image to the remote server.

        Likewise, after receiving a QR code image from the remote server, pressing the "Decode" button decodes the image and shows the decoded string.


        2.3 QR Code decode in MCU

        We use the EFM32GG11 STK and the WGM110 Wi-Fi Expansion Kit to receive the QR code image, decode it, and display both the image and the decoded text string.

        Several third-party libraries for QR code image decoding are available on Git hosting sites, such as quirc (licensed under ISC) and qsantos (licensed under GPL v3.0). The library used in this project is qsantos.

        Please refer to the figure below for the detailed flow chart of QR code receiving and decoding on the MCU.


        3. Demonstration

        The figures below demonstrate the project. The on-board LED blinks while the QR code image is being received; once reception is complete, the QR code image is displayed immediately, and pushing Button 0 or 1 switches the display mode.

        4. What we have done in the project

        The block diagram illustrated in the "Project Description" section is the final target of the project; as a first step, however, we implemented the project following the diagram below.
        The GUI tool is able to generate a QR code from a string, decode a QR code PNG file, and show the content on the PC. The EFM32GG11 STK and Wi-Fi module act as an access point to which the PC GUI tool can connect. Once the connection between the PC GUI tool and the EFM32GG11/Wi-Fi module is established, the GUI tool can send the encoded QR code image to the EFM32GG11; after receiving the image, the EFM32GG11 decodes the QR code and shows the text string on the LCD screen.


        5. Lessons learned

        • QR code encoding/decoding in Python.
        • GUI Design with Python and Pyside.
        • Python Network communication.
        • Wi-Fi module transmission.
        • Micrium OS + Network.


        6. Next Steps

        The next step for this project is to involve the remote server in QR code image distribution, which will make the electronic label wireless transmission system more flexible.


        7. Source code

        The source code of the GUI tool and the firmware is attached for reference.

      • How to create your own BLE Mobile App in 10 minutes

        Juan Benavides | 01/29/2018 | 05:43 PM

        Hackathon Project by Juan Benavides




        Have you ever wanted to get into mobile apps development but didn't have the time to learn other programming languages?

        In this blog I will show you how to create your own mobile app that connects to your Bluetooth Low Energy device.

        As part of a Hackathon project, I was interested in creating my own mobile app to connect to a Zigbee-based Light that doubles as a BLE Device. In other words, it is a Light that supports two protocols, Zigbee and BLE, over the same radio.

        I wanted to support Android and iOS and I only had a background in Embedded Systems and a couple of days to get it done. So, after a quick Google search I was lucky to find a solution that only takes 10 minutes to develop (that's including the time spent installing the tools).

        The solution is called Evothings Studio and it offers a way to develop your mobile app using familiar web technologies such as HTML, CSS and JavaScript.

        One of the most amazing things is that the mobile app runs live on mobile devices while you develop and you can instantly see code changes on your apps.

        The process of developing the mobile app can be summarized as follows:

        1. Get a Bluetooth Device running.
        2. Install Evothings Tools in your PC and your phone.
        3. Select an example from the Evothings list of examples.
        4. Modify the example by using the UUIDs from your BLE Device. 
        5. Connect your phone to your BLE Device to control it.



        Step-by-step guide to reproduce the project


        Getting a Bluetooth Device running

        If you already have a BLE Device running, then the only thing you need to do in this step is make note of the following Unique Identifiers (UUID):

        • BLE Unique Service Identifier: this is the UUID of the BLE Device Service you want to control (e.g. bae55b96-7d19-458d-970c-50613d801bc9).
        • BLE Characteristic Unique Identifier: this is the UUID of the BLE Device Characteristic you want to control (e.g. 76e137ac-b15f-49d7-9c4c-e278e6492ad9).

        If you don't have your BLE Device ready to go, then you can use the same one that I used from the Dynamic Multiprotocol (DMP) Light/Switch Demo. Click here and follow steps 1-5 to get it running.

        Once you get it running, open the Light Project in Simplicity Studio, open the file DynamicMultiprotocolDemoLightSoc.isc, select the Bluetooth GATT tab and make note of the BLE Unique Service Identifier and the BLE Characteristic Unique Identifier as illustrated in the image below:

        Figure 1
        Finding your BLE Device's UUIDs in Simplicity Studio



        Installing the Evothings Tools

        In this step you need to install both the Evothings Studio application on your PC and the Evothings Mobile App on your phone. Click here to install the Evothings tools and verify that everything is operational by running their Hello World example.



        Select a Mobile App BLE example as a starting point

        Once you have verified that you can run their Hello World example, locate the example Arduino101 LED On/Off BLE and click the Copy button as shown in Figure 2:

        Figure 2
        Selecting one of the BLE Examples as a starting point



        A window will be displayed for you to edit the project's name. You can call it dmp-light as shown in the following image:

        Figure 3
        Customizing the example with your own project name



        Modifying the Mobile App BLE example with your own UUIDs

        The next step is to edit the code. Select the MyApps tab in Evothings Studio to display your newly created example, then click the Edit button; your text editor will open the folder where your example is located.

        First edit the file evothings.json and replace the property uuid with your own BLE Device Service UUID as shown in Figure 4:

        Figure 4
        Customizing the example with your own BLE device's service UUID



        Similarly, open the file index.html and replace the uuids in the functions to turn the LED On and Off, with your own BLE Device Characteristic UUID as shown in the image below:

        Figure 5
        Customizing the example with your own BLE device's characteristic UUID



        Finally, find out the advertised name of your BLE Device by scanning all the BLE devices within range from your phone as shown in Figure 6:

        Figure 6
        Scanning BLE devices within range from your phone



        Open the file index.html, locate the function app.connect and replace the advertised name of the BLE device with your own name as illustrated in Figure 7:

        Figure 7
        Customizing the example with your own BLE device name



        Connecting your new mobile app to your BLE device

        To connect to your BLE device, follow the same steps you took when trying the Hello World example: start Evothings Workbench and, under the Connect tab, paste your Cloud Token and get a connect key using the GET KEY button. Launch Evothings Viewer on your phone and enter the connect key to hook up with your Workbench. On your computer, click the RUN button on your new DMP Light example in the Workbench.

        Now you can turn the Light On/Off from your phone as shown in the image below:

        Figure 8
        Your new mobile app





        The Evothings Studio platform is a fast and easy way to develop mobile applications for Android and iOS while supporting one single code base through the use of the three most basic web technologies (HTML, JavaScript and CSS).

        For more information on how to deploy your mobile app to an app store check the following page:

        For more information on how to get started developing your own Bluetooth Low Energy devices visit our website at:


      • How to control a Zigbee-based Light with an Occupancy Sensor, Alexa and your Phone

        Juan Benavides | 01/22/2018 | 03:05 PM

        Hackathon project by Manasa Rao, Stephen To and Juan Benavides




        Would you like to learn an easy way to control a Zigbee-based Light with Alexa voice commands?

        Or how about controlling a Zigbee-based Light automatically with an Occupancy Sensor?

        Furthermore, how would you like to learn how to easily create your own Bluetooth mobile application to control a Zigbee-based Light and access the Internet?

        In this Hackathon project we demonstrate how to easily accomplish all of these cool features.

        Figure 1
        Project Presentation




        In November 2017, Silicon Labs released the new Dynamic Multiprotocol (DMP) software for our Wireless Geckos. In summary, the DMP software enables the simultaneous operation of Zigbee and Bluetooth Low Energy (BLE) on a single radio chip.

        The best way to showcase this technology is by controlling and monitoring a Zigbee-based Light directly over Bluetooth with a smartphone mobile application. This example is actually available with the DMP SDK, and our Hackathon project is based on it.




        Project Overview

        The following image illustrates the scope of this blog:

        Project Overview
        Figure 2
        Project Block Diagram


        The Demo featured in this blog is based on the official DMP Light/Switch Demo that consists of two devices:

        Light: it runs Micrium OS to switch between Zigbee and Bluetooth on a single radio. 

        Switch: it supports Zigbee to provide wireless control of the Light.

        Click here to get started with this demo.

        In the original demo, users press a push button on the wireless Switch device, which in turn, sends a Zigbee Light Link (ZLL) command to toggle the Light On or Off.

        The Light device runs Micrium OS to switch between Zigbee and Bluetooth while sharing the same radio. This allows a mobile application to connect to the Light via Bluetooth.

        Users tap the mobile application to toggle the Light On/Off. 
        The Light device receives the Bluetooth notification and not only toggles the Light On/Off  but also updates the corresponding ZLL Cluster Attribute so the Switch remains in sync.

        This blog shows you how to add an Occupancy Sensor to the Switch device by connecting the Silicon Labs Optical Sensor Expansion Board (Si1133/Si1153) to the Switch's EXP Header (I2C).

        The Optical Sensor Expansion Board has a series of optical sensors to detect the presence of a person in the room. The occupancy sensor is used by the embedded application to control the light automatically. 

        Click here to see Stephen's blog on how to add this occupancy sensor to the DMP Light/Switch Demo.

        This blog is also going to show you how to control the Light with Alexa Voice Commands.

        To enable Alexa Voice Commands, we are also going to show you an easy way to create your own Bluetooth Mobile Application. This mobile application will use the smartphone as a gateway to access the Internet.

        Click here to see Juan's blog on how to create your own BLE mobile app to control the DMP Light/Switch Demo.

        Users will get to control the Light with voice commands, and this blog will describe how to set up the system in Amazon Web Services (AWS).

        Click here to see Manasa's blog on how to add Alexa capabilities to the DMP Light/Switch Demo.




        Useful Links




      • Windows Bluetooth system analysis – a SensorTag approach

        m_dobrea | 12/364/2017 | 11:49 AM

        In the following, I will analyze the Windows operating system (OS) from the point of view of communication with Bluetooth Low Energy devices – in our case, with different types of SensorTags: Thunderboard React, Thunderboard Sense (both produced by Silicon Labs), CC2650STK and CC2541DK (both developed by Texas Instruments).

        In what follows, I will analyze Windows 7, Windows 8.1 and the following Windows 10 versions:

        • Anniversary Update (released on August 2, 2016; end of support: tentatively March 2018),
        • Creators Update (released on April 5, 2017; end of support: tentatively September 2018) and
        • Fall Creators Update (released on October 17, 2017; end of support: tentatively March 2019).

        The analysis will be done from the following points of view:

        1. The ability of the operating system (OS) to pair with a SensorTag;
        2. The ability to get Generic Access data (this is a mandatory service);
        3. The ability to get Device Information (this service exposes manufacturer and/or vendor information related to a specific SensorTag);
        4. The ability to get the SensorTag’s data, using the reading approach and
        5. The ability to get the SensorTag’s data, using the notification approach.

        All the tests were done using the blessTags application. The blessTags application was built on top of the Windows SDK Bluetooth APIs (Bluetoothapis). Functions like BluetoothGATTGetCharacteristicValue, BluetoothGATTGetDescriptorValue, BluetoothGATTGetServices or BluetoothGATTSetCharacteristicValue were used.

        The blessTags (BLE SensorTags) application can be downloaded from the Windows Store. For more information, demos, practical applications, examples, etc., please visit the following blog:


        Windows 10 - Anniversary Update - Version 1607

        This version of Windows 10 is the best one from the point of view of Bluetooth Low Energy devices. It pairs without any problem with all of the SensorTags the blessTags application knows how to work with (CC2650STK, Thunderboard React, Thunderboard Sense and CC2541DK), regardless of the software version running on them, and all the information from the Generic Access and Device Information services is acquired without any problem.

        Analyzing the data acquisition speed (for the CC2650STK and CC2541DK devices) using the notification and reading mechanisms of data transfer, we can observe the following:

        1. through the notification mechanism, we can get data from all eight sensors every 150 [ms] without any problems;
        2. when we instead set the acquisition time to 150 [ms] and use the data reading mechanism, the best case gives a period of 713 [ms] and the worst case a period of 840 [ms].
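        To put those periods in perspective, here is a quick sketch of the effective sampling rates — plain arithmetic on the numbers quoted above, nothing more:

```python
# Effective sampling rate implied by each transfer mechanism,
# using the periods measured above on Windows 10 Anniversary Update.
def rate_hz(period_ms: float) -> float:
    """Samples per second for a given acquisition period."""
    return 1000.0 / period_ms

notify_rate = rate_hz(150.0)   # notification mechanism, ~6.7 Hz
read_best = rate_hz(713.0)     # reading mechanism, best case
read_worst = rate_hz(840.0)    # reading mechanism, worst case

print(f"notifications: {notify_rate:.2f} Hz")
print(f"reads: {read_worst:.2f} to {read_best:.2f} Hz "
      f"({713 / 150:.1f}x to {840 / 150:.1f}x slower)")
```

        In other words, on the Anniversary Update the reading mechanism costs roughly a 5x slowdown compared with notifications.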

        If we analyze the Thunderboard React and Thunderboard Sense, we get equivalent results – they work without any problem in the Windows 10 Anniversary Update environment.

        In fact, all the presentation movies of the blessTags application's main functions and of its different specific features (like Gadgets) were made on Windows 10 Anniversary Update.

        A small demonstration and proof of the above statements is given in the following movie:

        Movie I


        Windows 10 - Creators Update - Version 1703

        The Creators Update version of Windows 10 is the worst operating system (OS) from the point of view of Bluetooth Low Energy devices.

        Almost nothing works. Microsoft acknowledged that the Creators Update broke Bluetooth Low Energy (reference 1 and reference 2) and promised a hotfix as soon as possible. But since then they have released a newer version of Windows (the Fall Creators Update) and nothing has happened – to this day, Bluetooth Low Energy still does not work in the Windows 10 Creators Update.

        There are a large number of forum posts in which different people complain that various types of Bluetooth devices stopped working after upgrading to the Creators Update (see here, here, here, here etc.).

        The results I'm going to show were obtained after many tests: (1) on a desktop PC with a CSR 4.0 Bluetooth USB dongle (CSR8510 A10) and (2) on a Dell Inspiron P66F laptop with an integrated Bluetooth LE device. I know there are many solutions on the internet for fixing various Bluetooth issues. I tried almost all of them (updating the Bluetooth driver, running the Windows troubleshooter, disabling and re-enabling Bluetooth-related services etc.), but nothing worked.

        So, let’s present the results:

        • Thunderboard React:

        The operating system behaves strangely when the pairing process is initiated: in the list of discovered devices, the SensorTag appears and disappears with a period of 1 to 1.5 s. Finally, when a mouse click on the SensorTag succeeds, the pairing process completes, and the blue and green LEDs on the Thunderboard React flash consecutively for a while in an atypical mode.

        Reading the characteristics of the Generic Access Service (0x1800) works without any problem, but reading from the Device Information Service (0x180A) fails on all four existing characteristics.

        Configuring the sensors embedded on the SensorTag and the data acquisition mode (on the Thunderboard React the only options are: (1) getting data through notifications from 3 sensors and (2) reading data from the other four sensors) is impossible. The impossibility of obtaining any actual sensor data follows directly from this.

        • Thunderboard Sense:

        The same pulsating behavior observed for the Thunderboard React also appears with the Thunderboard Sense when we attempt the pairing process. But here things are even worse: after pairing, the blessTags application cannot detect the SensorTag at all. So, no active device – no entity from which the blessTags application can acquire data.

        • CC2650STK:
          • On firmware version 1.40, pairing the SensorTag device with Windows is impossible (I repeated the process several times, at least 8-10 times, turning Bluetooth off and on and trying again – the result was always the same: it was impossible to add this device).
          • On firmware version 1.20, the PC discovered the SensorTag and I was able to pair it with the PC.

               Also, I was able to get Generic Access data. But for the Device Information service, only 6 of the 9 characteristics responded, and only from those was it possible to get information.

              However, I could not set up the device, and I could not retrieve data from the sensors either through the read mechanism or through notifications.

        • CC2541DK:

        The behavior is identical to that of the CC2650STK (firmware version 1.40): at each connection attempt, you get the error message "Try connecting your device again".

        So, in conclusion, under this version of Windows 10 (Creators Update) it is impossible to communicate with any of the four types of SensorTags pointed out above. I mention once again that I used the same software version here as in all the tests made on Windows 10 Anniversary Update.


        Windows 10 – Fall Creators Update - Version 1709

        This version of Windows 10 (1709 – OS Build 16299.19) is a huge step forward compared with the Windows 10 Creators Update (where almost nothing BLE-related works), but it still has a long way to go to reach the level of the Windows 10 Anniversary Update (1607).

        But let's see why I made this statement:

        • Thunderboard React:

        Just as in the Windows 10 Creators Update, the SensorTag appears and disappears when we want to add a new Bluetooth device. The same behavior can be seen in the Action Center, on the Bluetooth quick-action button, where "Not connected" and "Thunderboard React" are displayed alternately (see this process in the following movie, starting from time index 5.14 s). One might immediately conclude that the Thunderboard React is at fault, due to a flawed implementation of the advertising mechanism by Silicon Labs engineers. But, searching the internet, we notice that other users have reported the same problem with other types of BLE devices after installing the Fall Creators Update – e.g. see this movie on YouTube.

        After pairing the SensorTag, the blessTags application is not able to find the Thunderboard React device. So, at this point, nothing works: neither the Generic Access and Device Information services nor data acquisition from the sensors embedded on the Thunderboard React SensorTag.

        • Thunderboard Sense:

        The behavior is similar to that of the Thunderboard React: this Bluetooth device is displayed and disappears repeatedly. When the pairing process succeeds, it is possible to get data from the Generic Access service. But from this point on, nothing works anymore.

        • CC2650STK and CC2541DK:

        I will treat these two devices simultaneously because their behavior under the Windows 10 (1709) operating system is similar.

        The pairing operation and the reading from the Generic Access and Device Information services work perfectly, without any kind of problem.

        The problems only occur when we want to read information from the sensors. The data transfer mechanism through notifications does not work at all.

        The only way to get data from the sensors embedded in the SensorTag is the direct reading mechanism. This approach has two issues: (1) a lower data transfer speed (as shown above) and (2) while all the sensors support both data transfer methods (reading and notification), the buttons on the SensorTag can be interrogated only through the notification mechanism – so with notifications broken, the buttons cannot be read at all. Thanks to this "feature" of the Windows 10 (1709) OS, the blessTags application also implements, starting with version, the reading method for data acquisition.

        A problem appears with the CC2650STK SensorTag running firmware version 1.20. While the pairing process and data reading from the Generic Access service work very well, reading from the Device Information service is not possible. Moreover, reading the sensors (on this SensorTag with this firmware version) does not work through either of the two possible mechanisms (reading or notification).

        In conclusion, up to now on Windows 10 Fall Creators Update (1709, build 16299.19) only the SensorTags produced by TI (CC2650STK and CC2541DK) work. Moreover, they work only in reading mode. But attention! The CC2650STK works in this mode only with firmware version 1.40. Unfortunately, when you buy a CC2650STK there is a very high chance of receiving a device with firmware revision 1.20. So, to be able to communicate with such a SensorTag, an upgrade to at least firmware version 1.40 is necessary.

        Below I present a movie that proves all these statements made above for Windows 10 Fall Creators Update.

        Since the first release of Windows 10 Fall Creators Update (build 16299.19) on October 17th, 2017, there had been no improvements or error corrections related to Bluetooth LE up to KB4054517 (released on December 12th, 2017). In KB4054517 (OS Build 16299.125) there is a key change to Bluetooth LE (see here): "Addresses issue with personalized Bluetooth devices that don't support bonding". Since this message is very cryptic, I decided to redo all the analyses done so far and see whether there are any improvements compared to the first release of Windows 10 Fall Creators Update (build 16299.19). And, a little surprise: now I am able to get (1) data from the Thunderboard Sense (from the sensors embedded on the SensorTag, but only through the reading mechanism) and (2) all the information from the Generic Access and Device Information services. There are no other improvements.


        Windows 8.1

        As the first Microsoft OS with BLE support, its implementation is satisfactory, but it is far from excellent. The only devices that work with this operating system are the CC2650STK and CC2541DK.

        Setting the acquisition time to 150 [ms] for the CC2650STK, we can get the data from all embedded sensors at the full 150 [ms] sampling rate through the notification mechanism without any problems. Unfortunately, using the reading mechanism on the CC2650STK, we can get data from all the sensors only with a period of about 2 seconds.

        The situation gets worse with the CC2541DK. Through the notification mechanism, the data are obtained with a period of 0.4 to 0.6 seconds, while using the reading mechanism we can retrieve the data only with a fluctuating period of 2.8 to 3 seconds. The conditions are the same: an acquisition period of 150 [ms] from all the sensors embedded on the CC2541DK SensorTag.


        Windows 7

        Microsoft added support for the Bluetooth Low Energy (BLE) stack starting with the Windows 8 operating system, providing an API that enables applications to access BLE devices.

        But Microsoft has not ported the BLE APIs to Windows 7. Windows 7's built-in stack supports only Bluetooth 2.1/3.0; there is no support for BLE (4.0, 4.1 or 4.2). So, from a developer's point of view, it is impossible to communicate with a BLE device in Windows 7 using the built-in stack.

        TI has a program called BLE Device Monitor that is able (1) to run on Windows 7 and (2) to communicate with a SensorTag. But for this you must use a special USB dongle (e.g. the CC2540 Bluetooth Low Energy USB dongle). While the firmware source code for the USB dongle is freely available, the source code for BLE Device Monitor is not – it is for TI's internal use only.



        Conclusions

        The Windows 10 Anniversary Update (Version 1607) is the best Windows version ever made by Microsoft from the point of view of Bluetooth Low Energy (BLE) devices – SensorTags in our case. Obviously, this is also due to the considerable number of improvements made at the Bluetooth LE level in the successive OS builds (see for more info: 14393.51, 14393.105, 14393.189, 14393.222, 14393.321, 14393.351, 14393.726 and 14393.1083).


        Synthesizing all of the above results, we get the table below.


      • Thunderboard Sense Battery Life

        Nick_W | 10/289/2017 | 11:43 PM
        I’ve been playing with this great new board for a few days now. One issue I have run into is that the battery life is very poor. I’m only getting a couple of days out of a coin cell.

        I’ve dug into the code, and a couple of things become clear. First, when running from battery, when you connect to the board, all the sensors switch on, and stay on, until you disconnect.

        Second, when you read the sensor values, you aren’t reading the sensors directly, you are reading a cached copy of the values, which are updated every 3 seconds, continuously while you are connected - irrespective of how often you actually read the values out.

        Thirdly, on USB power, sensors are powered on all the time (OK, not a big deal).

        The conclusion is that if you connect, stay connected, and read the sensors once per minute, the sensors are live all the time, and the battery dies quickly. In fact, I’m not sure the device sleeps at all when connected.
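        For a rough sense of scale, here is a back-of-envelope estimate. The CR2032 capacity and all the current figures below are assumed round numbers for illustration, not measurements from the Thunderboard Sense:

```python
# Coin-cell battery life estimate. The capacity and current figures
# are assumptions for illustration, not measured values from the board.
CR2032_MAH = 225.0  # typical CR2032 capacity

def life_days(avg_current_ma: float, capacity_mah: float = CR2032_MAH) -> float:
    """Battery life in days for a given average current draw."""
    return capacity_mah / avg_current_ma / 24.0

# Connected with every sensor powered: assume ~5 mA average draw.
always_on = life_days(5.0)

# Duty-cycled: assume 2 s awake at 5 mA each minute, ~5 uA in deep sleep.
avg_ma = (2.0 / 60.0) * 5.0 + (58.0 / 60.0) * 0.005
duty_cycled = life_days(avg_ma)

print(f"always-on: {always_on:.1f} days, duty-cycled: {duty_cycled:.0f} days")
```

        On numbers like these, duty-cycling is the difference between a couple of days and a couple of months on a coin cell – consistent with the behavior described above.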

        If you connect, read, then disconnect, you have the problem that the device times out after 30 seconds (i.e. it goes into deep sleep and can only be woken with a button push).

        I have changed this timeout to 1 hour, reduced the advertising cycle to 500ms (from 100ms) and reduced the led flashes (during advertising) to 20ms every 10 seconds, but the battery life is still poor when connecting, reading, disconnecting once every minute. I’m just reading the environmental sensors.

        It seems that although the sensors are off when not connected, the device does not sleep at all when advertising. (Although debug output does say BLE sleep mode enabled).

        Does anyone have any suggestions on how to reduce battery drain? Make the device sleep between advertisements? Or any other way to extend the battery life beyond a few days?

      • Simple WGM110 programming fixture

        madsci | 10/289/2017 | 11:29 AM

        This project isn't really an end in itself, just a simple tool for working with the WGM110 Wi-Fi module, but it was suggested that I share it here, so I'm re-posting.


        Sometimes it's useful to be able to pre-program modules before they're installed on a board, particularly if your board doesn't have space for a SWD header.  The earliest versions of the WGM110 also had a defective DFU bootloader (stop bits were 1/2 bit time) that wouldn't work with some hosts, which made in-circuit programming difficult without SWD.


        To address this I made a quick and dirty programming fixture for about $15.  The components are a Mill-Max 854-22-010-40-001101 0.050" pitch 10-position pogo pin header to connect to the WGM110, a Samtec SFSD-10-28-H-05.00-SR cable with 20-pin 0.050" pitch plug for the JTAG side, some heat shrink tubing, and a 3D printed holder that I whipped up in Alibre.


        This is not really the right JTAG connector - its polarity key is in the wrong place and it's latching - but it's the closest thing that Digi-Key happened to have in stock that had discrete wires.  It fits on the dev board as is, and it'll fit on a P&E Cyclone's shrouded header if you snip off the key.


        In the orientation shown in the top view photo, the pogo pins are wired to the following JTAG pins:



        • 3 (ground)
        • (nc) 
        • 1 (VDD) 
        • 10 (Reset) 
        • (nc)
        • 2 (SWDIO)
        • 4 (SWCLK)
        • (nc)
        • (nc)
        • (nc)


        If I was going to be using this much I'd have machined mine from Delrin or anti-static ABS, but for only doing a few dozen units at most it didn't seem worth firing up a milling machine.


        Without the holder piece, I can use the pogo pin plug to re-flash modules in circuit that have been bricked by DFU failures.  This depends on having the PCB lands extend far enough beyond the periphery of the module to make contact.  If you followed the recommended layout in the datasheet, it ought to work. I wouldn't want to have to do many boards this way without an alignment jig to hold it in place, but it works fine to just hold it in place if you're just fixing an occasional mistake and not doing production quantities.


        Top view:


        2017-10-14 12.34.07.jpg

        Bottom view, with wiring to pogo pins:

        2017-10-14 12.34.19.jpg

        The STL file and original model in Alibre format are attached.  Keep in mind that I set the dimensions to make it work with my own 3D printer and the fit may require tweaking on yours.


        It's no substitute for a proper in-circuit programming setup for production, but it's handy for prototyping and I figured I'd share it here in case anyone else needs such a gadget.