Official Blog of Silicon Labs

    Publish
     
• Happy Gecko starter kit review

        uSasha | 08/243/2015 | 05:06 PM

        0 Hardware

The board is pretty unique compared with other vendors' starter kits. Modern evaluation boards are usually bare PCBs with only the necessary elements: the MCU itself, a debugger, a couple of LEDs, and a button or two if you are lucky.

Gecko series starter kits are full-featured development kits that contain a lot of functionality beyond this bare minimum:

• A fully featured Segger J-Link, which is a really fast and hassle-free debugger and is supported in all IDEs on all OS types. And it's not restricted to only this board; you can debug your own hardware based on Silabs silicon.
• An energy-aware debugger - a high-precision circuit (we are talking about µA) which measures EFM energy consumption in real time. It's also possible to pass data through SWO to associate current consumption with C functions and IRQ handlers!
• A cute Sharp Memory LCD, which is ultra low power (less than 3 µA) and a great friend of the Geckos.
• An Si7021 - an integrated digital humidity and temperature sensor with an I2C interface.
• Of course there are also some LEDs, a button, touch buttons, USB, USB-UART, and all pins are brought out on breakout pads.
• There is also a CR2032 battery holder and a switch to select the power source: debugger, MCU's USB, or battery.

         


         

         

I have an old Giant Gecko starter kit, so let's compare what has changed over two generations of STKs:

The first thing I noticed is that the board became a little bigger. There are now two 16-pin rows on the upper and lower edges of the board, with 32x2 = 64 breakout pins in total instead of the 22x2 = 44 pins on the older boards.

The new board has two 1.27mm JTAG headers instead of the one 2.54mm header on the older board. Both are standard 20-pin headers. One is marked DBG and the other is not marked at all; the schematics don't know about it either. Let's just call it the "Mysterious Header".

No light sense or LC sense on the board, because there is no LESENSE peripheral on the Happy Gecko.

No NAND flash, because there is no NAND controller on such a small device.

No OPAMP footprint, as there is no OPAMP in the HG. No backup power domain with a supercap either.

The expansion header is the same, but on the new board all pins are labeled on the back side, which is very useful.

The MCU's USB port has moved to the right and no longer covers the breakout pins, which is good too.

The segment LCD has been replaced with a crisp graphical one.


         


         

My conclusion is that, just as 3 years ago, it's the best starter kit quality- and feature-wise, with some usability improvements. Of course some of the Giant's features are missing, but hey, the Happy Gecko is a tiny MCU and the kit exposes all of its features. At the new price point of $30 it is ridiculously cheap, even cheaper than a J-Link EDU. I do miss the 2.54mm SWD header from the older board, though, which was really helpful.

         

         

        1 Getting started

First, download the new Simplicity Studio 3.1: https://www.silabs.com/products/mcu/Pages/simplicity-studio.aspx

         

Then connect the HG-STK and SS will download and install all the documentation, examples, and software you need. After installation, click the demo button. It's a good place to start playing with the new kit; in a few clicks you can run some fun demos and check the system's current consumption. My favorite is the analog watch demo, and there are also a lot of ready-to-use USB classes, which is great.

         


         

Then I recommend checking the mbed web page: https://developer.mbed.org/

You should register and add your board; then you'll get access to the online IDE with a lot of built-in libraries, which are good for beginners or for prototyping.


Last but not least, check out the University Program button in SS.

        In my opinion it’s the best way to learn EFMs and one of the best for MCUs in general.

It's very dense material, with presentations and hands-on examples about everything you need to get started. I wish I'd had this material when I started learning MCUs and coding.

         


The next step is to start a pet project and read the application notes for your specific peripherals and use cases.

         

      • The Fastest Isolated Current Sense Amplifier for the World’s Harshest Conditions

        lethawicker | 08/243/2015 | 04:08 PM

There's a problem with the way we measure current in high-voltage systems. Inverters and high-power systems need current information to maintain safe operating conditions, improve system efficiency, and respond quickly to load changes, but measuring current on a high-voltage rail can be a real challenge: sensors have to be electrically isolated from the system controller, sensors have long delays limiting response time, accuracy over temperature ranges is difficult to maintain, and systems are noisy.

         

         

How does the Si8920 isolated amplifier solve these challenges? Let's take a look:

         

Problem: Sensors have to be electrically isolated from the system controller
Solution: Robust galvanic isolation keeps the controller safe even with working voltages up to 1200 V

Problem: Sensors have long delays limiting response time
Solution: Low signal delay means the controller can respond quickly, with bandwidth up to 750 kHz and an unprecedented 0.75 µs signal delay

Problem: Accuracy over temperature ranges is difficult to maintain
Solution: Tiny offset (1 µV/°C) and gain drift ensure accuracy over the entire -40 to 125 °C temperature range

Problem: Systems are noisy
Solution: Excellent transient noise immunity, with more than 25 years of field operation expected

         

In contrast with traditional amplifiers, the Si8920 is the industry's fastest isolated current sense amplifier. It provides precise current shunt measurement for power control systems, including motor drivers and inverters.
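To make the shunt arithmetic concrete, here is a rough sketch (in C) of how a controller might turn the digitized amplifier output back into amps. The shunt value, gain, and ADC figures below are illustrative assumptions, not Si8920 datasheet numbers:

#include <stdio.h>

/* Illustrative values only - substitute your own shunt, amplifier gain and ADC data. */
#define SHUNT_OHMS      0.002f   /* assumed 2 milliohm shunt in the DC link */
#define AMP_GAIN        8.0f     /* assumed amplifier gain                  */
#define ADC_FULL_SCALE  3.3f     /* ADC reference voltage, volts            */
#define ADC_COUNTS      4096.0f  /* 12-bit ADC                              */

/* Convert a raw ADC code (sampling the amplifier output) into a current in amps. */
static float shunt_current_amps(unsigned adc_code)
{
      float v_out   = (adc_code / ADC_COUNTS) * ADC_FULL_SCALE;  /* amplifier output     */
      float v_shunt = v_out / AMP_GAIN;                          /* voltage across shunt */
      return v_shunt / SHUNT_OHMS;                               /* Ohm's law: I = V/R   */
}

int main(void)
{
      printf("I = %.2f A\n", shunt_current_amps(992));  /* example reading, ~50 A */
      return 0;
}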

         

        Ideal use cases for the Si8920 include industrial motor drivers, solar inverters, high-voltage power systems, uninterruptible power supplies (UPS) and electric/hybrid-electric (EV/HEV) vehicle systems.

         

        Here’s an example of an AC Motor Drive that uses the Si8920 to measure current both on the high-voltage DC Link (+), as well as on the legs of the motor.

         

        motordriver.PNG

        For more information on the Si8920, visit our website.

      • Superfast Sensor Evaluation Using the EFM8 Sleepy Bee

        Anonymous | 08/239/2015 | 04:13 PM

        Quickly Evaluating a Sensor with EFM8 Sleepy Bee MCU

Getting a sensor and an MCU to communicate reliably can be a challenge—especially if you are new to the MCU. When you just want to quickly evaluate a sensor, the best solution is a fast way to configure the peripherals (an ADC in this case) and capture the readings from the different sensors.

         

        To illustrate this, I designed an experiment to test a flex sensor with the Silicon Labs EFM8 Sleepy Bee SB1 8-bit MCU. I chose the EFM8 Sleepy Bee MCU because my (fictional) target application will run on a battery and needs to be ultra-low power.

         

EFM8 Sleepy Bee EVB

         

The Sleepy Bee is a great fit for this application because its ADC has configurable low-power modes, a feature that is hard to find in a low-cost MCU. The ADC supports 12-bit sampling at 75 ksps, or 300 ksps in 10-bit mode, which is the mode I chose so I could take many rapid readings from the sensor.

         

        Which Sensors?

Next I selected the sensors I wanted to evaluate. I chose a couple of different flex sensors: one from Spectra Symbol (part number SEN-08606) and one from Flexpoint Sensor Systems (part number 176-3-001).

         

Flex Sensor Options from Spectra Symbol and Flexpoint

         

Each sensor can be powered at 5 V or 3.3 V, depending on the performance and signal level you need from the MCU. They work as variable resistors: as the bend of the sensor changes, the resistance changes as well, altering the voltage across it. This voltage drop corresponds to how far the sensor is bent or arched. I used 3.3 V for this design to reduce power consumption, so the sensor divides that supply and returns a voltage somewhere in the 0.5-2.5 V range.
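To sanity-check that 0.5-2.5 V window, here is a quick sketch of the divider math. The pot setting and the flat/bent sensor resistances below are assumed example values, not measurements of the actual parts:

#include <stdio.h>

/* Assumed example values - the real sensors and pot setting will differ. */
#define VDD     3.3f      /* supply feeding the divider             */
#define R_POT   10000.0f  /* potentiometer setting to ground, ohms  */
#define R_FLAT  10000.0f  /* assumed sensor resistance when flat    */
#define R_BENT  35000.0f  /* assumed sensor resistance when bent    */

/* Voltage at the ADC input: sensor on top of the divider, pot on the bottom. */
static float divider_volts(float r_sensor)
{
      return VDD * R_POT / (r_sensor + R_POT);
}

int main(void)
{
      /* With these assumptions: ~1.65 V flat, ~0.73 V bent - more bend, lower read-back. */
      printf("flat: %.2f V  bent: %.2f V\n", divider_volts(R_FLAT), divider_volts(R_BENT));
      return 0;
}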

         

        Configuring the EFM8 Sleepy Bee ADC

        Next I needed to configure the ADC on the MCU. For this, I downloaded Simplicity Studio which includes a free IDE and compiler.

         

Simplicity Studio also includes an ADC reference design, which made getting the ADC up and running super fast. Loading the reference design was simple, and it had the ADC configured with the settings I needed above.

         

Simplicity Studio ADC Input Pin Dialog Box

         


        Simplicity Studio ADC Configuration Dialog

         

Once I finished configuring the ADC settings and pins, the Simplicity Studio reference design spit out the code for me too. In other words, I didn't have to write any code to evaluate these sensors. All I had to do was select configuration options and check boxes.
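For a feel of what the generated project boils down to, here is a minimal polled-read sketch in the EFM8 style. Treat the register/bit names (ADC0CN0_ADBUSY, ADC0CN0_ADINT) and the enter_DefaultMode_from_RESET() init call as assumptions to check against the code Simplicity Studio actually generates for your kit:

#include <SI_EFM8SB1_Register_Enums.h>   // device header from Simplicity Studio
#include "InitDevice.h"                   // configurator-generated init (assumed file name)

// Start one conversion and wait for it to complete (polled, no interrupts)
static uint16_t adc_read_polled(void)
{
      ADC0CN0_ADINT = 0;                  // clear the conversion-complete flag
      ADC0CN0_ADBUSY = 1;                 // kick off a conversion
      while (!ADC0CN0_ADINT);             // busy-wait until it finishes
      return (ADC0H << 8) | ADC0L;        // 10-bit result, right-justified
}

void main(void)
{
      enter_DefaultMode_from_RESET();     // apply the pin and ADC settings from the configurator

      while (1)
      {
            volatile uint16_t sample = adc_read_polled();
            (void)sample;                 // watch it in the debugger or log it out the UART
      }
}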

         


        Simplicity Studio Code Snippet

         

        Sensor Evaluation Set-Up

Next I configured the setup on my desktop. In the sensor setup image, the red wire runs 3.3 V to the sensor, the green wire is the input to the ADC, and the brown wire goes to a potentiometer and then to ground, so I can adjust the pot's resistance to create the largest voltage swing.

         

        Set-up Image.jpg

        The last step was to see if my set-up worked, and to take some power measurements.

         

        As I bent the sensor, the voltage drop increased and the voltage read back decreased. Success! My set-up was working.

         


        Output of Voltage Readback from Sensor

         

         

        Simplicity Studio Energy Profiler

Now to test power consumption. Simplicity Studio's Energy Profiler allowed me to see the auto-generated code's power consumption in real time.

         


        Simplicity Studio Energy Profiler Power Consumption Readings

         

Energy Profiler also lets the measurement be broken down against the code to provide insight into where the most power is consumed. This is a simple example where I am just reading the ADC, so the current consumption is relatively flat. When I am ready to develop the application further, this will be a powerful tool for extending battery life.

         

        Summary

Ten minutes is all it took to evaluate two sensors and compare their current consumption. The EFM8 Sleepy Bee is super flexible and has a perfectly suited low-power ADC, and Simplicity Studio has a ton of EFM8 examples that make it easy to configure and use its functionality, including the ADC, SPI, LCD, and more.

         

        Check out the EFM8 Sleepy Bee

        Check out Simplicity Studio

         

         

         

      • The USB and Battery Power Balancing Act

        lethawicker | 08/238/2015 | 03:36 PM

When designing a portable device, you want to maximize interoperability and user friendliness, so you choose an interface like USB. Incorporating USB also makes your gadgets host-agnostic: it doesn't matter whether your users connect to a PC, an iPhone, or an Android tablet. But when you connect all of these extra gadgets via USB to battery-powered mobile devices, something that was never a concern in the original USB specification, power consumption, suddenly becomes a top priority when choosing a USB-based solution.

         

        You don’t want to waste the precious battery life of a tablet or laptop just to communicate with the on-board peripherals. And you don’t want to design a simple add-on application for a smart phone that quickly drains its battery.

         

        By choosing the right USB-enabled hardware, you will be able to develop your device with a much smaller energy footprint since a universal M2M interface allows you to exclude almost all external components.

         

        USB and the IoT

In general, only the host can initiate transfers. Even if there is no communication, the host sends keep-alive messages to the device every millisecond. If the device has data available, it will reply. In this active mode, the device can draw up to 100 mA, and the host expects the device to provide an immediate response to any request. When the host stops sending these keep-alive messages for 3 ms, the device should enter a suspend state and immediately reduce its current draw below 3 mA.

         

        In the suspend state, most of the device can be switched off, and usually we can switch off the most power-hungry parts of the PHY. Even though a 3 mA suspend current should be easily achievable by any modern MCU, there is no reason to keep it that high. MCUs with well-thought-out energy modes, like the Silicon Labs EFM32 Happy Gecko MCU, should be able to achieve less than 3 µA in this mode, including the current draw of the PHY.

         

In active mode, however, inspecting the USB communication of a regular keyboard shows that active mode is still not very active; most of the time, the device is just waiting for the host to send data. But whenever the host requests a response from the device, the response must be immediate; that is why most implementations keep the USB peripheral running at 48 MHz at all times to allow sufficient response time. In this particular example, the lines are idle 97 percent of the time, even though we are enumerated and active.

         

        A USB implementation that decides when the clock is needed and for how long is uniquely optimized for battery-powered applications. Silicon Labs now has two patents pending for designs to make the USB interface truly usable in today’s battery-powered IoT world. Energy-efficient communications, even in active mode, are enabled by using crystal-less USB oscillators and by disabling the power-hungry part of USB connectivity between packets, as shown in Figure 3. This innovative approach greatly reduces system-level power consumption and creates a truly universal M2M interface offering exceptional energy efficiency.

         

        usb_blog1.png 

         

        Low-energy USB should be implemented in a way that is completely transparent to developers and to end users. What will be noticeable is significantly reduced power consumption through low-energy modes (LEM), as shown in Figure 4. When this technology is combined with other space- and cost-saving features such as crystal-less USB implementations and clock recovery, developers can realize a truly ultra-low-power universal M2M interface without the need for additional external components.

         

        USB_blog2.png

         

        EFM32 Happy Gecko—the USB MCU for the Future of the IoT

        When examining the evolution of the USB interface, it’s clear that the next step is to make USB the universal and power-friendly solution for battery-powered devices. MCUs like Silicon Labs EFM32 Happy Gecko make the minute decisions necessary to reduce power consumption dramatically, enabling USB to penetrate markets where it has not yet succeeded to its fullest potential. 

      • Add USB Easy as 1-2-3

        lethawicker | 08/236/2015 | 09:22 AM

        Sometimes we forget to add USB to our designs, or we need USB to access the design more efficiently from our development platform.

         

Don't worry. It's easy to drop USB connectivity into any design—old or new—with the fixed-function CP210x USB-to-UART bridge family from Silicon Labs. In fact, you can do it in just three quick steps.

         

Step 1 – Connect the CP210x EVK to your Windows PC and launch the Windows driver installer to walk through the wizard and set the driver name and configuration.

         


         

        Step 2 – Install the driver on the target device and reboot Windows to recognize it. No additional code writing necessary. 

         

This is the setup: two wires go from the UART ports on my device to the TX and RX ports of the Silicon Labs CP2102 device, and the USB cable goes to the host computer where the terminal is viewed.

         


         

Step 3 – Once the drivers are in place and the device is recognized, open a COM port and, USB-am!, start sending and receiving USB data.
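If you'd rather script step 3 than use a terminal program, here is a bare-bones Win32 sketch; COM3 and 115200-8-N-1 are assumptions, so substitute whatever port number the CP210x driver enumerated and whatever settings your target's UART expects:

#include <windows.h>
#include <stdio.h>

int main(void)
{
      // Open the virtual COM port created by the CP210x driver (port number is an assumption)
      HANDLE port = CreateFileA("\\\\.\\COM3", GENERIC_READ | GENERIC_WRITE,
                                0, NULL, OPEN_EXISTING, 0, NULL);
      if (port == INVALID_HANDLE_VALUE)
      {
            printf("Could not open COM3\n");
            return 1;
      }

      // 115200-8-N-1; match the UART settings on your target device
      DCB dcb = { 0 };
      dcb.DCBlength = sizeof(DCB);
      GetCommState(port, &dcb);
      dcb.BaudRate = CBR_115200;
      dcb.ByteSize = 8;
      dcb.Parity   = NOPARITY;
      dcb.StopBits = ONESTOPBIT;
      SetCommState(port, &dcb);

      // Send a few bytes out the CP210x TX pin, then read back whatever arrives on RX
      DWORD n = 0;
      char rx[64];
      WriteFile(port, "hello\r\n", 7, &n, NULL);
      ReadFile(port, rx, sizeof(rx), &n, NULL);   // note: blocks until data arrives unless timeouts are set
      printf("Read %lu bytes\n", (unsigned long)n);

      CloseHandle(port);
      return 0;
}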

         


         

        Learn more at the Silicon Labs CP210x Page

        CP210x devices

         

        Download AN721 for more detailed instructions

        AN721, adding USB walkthrough

         

         

        Buy the CP210x EVK to get started

        Evaluation kit

         

         

        Customize the USB driver

        Custom driver info

         

        Feel free to share your thoughts in the comments!

      • Chapter 6: User Interface Experiments Part 2 - Switch Bounce

        lynchtron | 08/233/2015 | 09:11 AM

        makersguide_ch6_2.png

        Switch Bounce

Another issue in the design of button interfaces is switch bounce, also known as contact bounce or chatter. When a mechanical switch is pressed, the metal contacts inside actually bounce against each other for some amount of time before they settle down and remain in contact. The Starter Kit has two pushbuttons which are "debounced" via an RC filter. The resistor and capacitor create a low-pass filter that doesn't let the sharp rise and fall waveforms of switch bouncing through. The change in voltage passing through the pushbutton is smoothed to create a single button-press event. There are more robust debouncing circuits that involve flip-flops, and even integrated circuits dedicated to solving this problem in hardware. That may be right for your solution if response time is critical, but for most applications you can handle this entirely in software, without any external components at all.
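To put some rough numbers on that filter, here is a quick calculation; the 10 kΩ / 100 nF values are assumptions for illustration, not the actual parts on the Starter Kit schematic:

#include <stdio.h>

int main(void)
{
      // Assumed example values - check the kit schematic for the real components
      double R  = 10e3;                       // ohms
      double C  = 100e-9;                     // farads
      double PI = 3.14159265358979;

      double tau = R * C;                     // RC time constant
      double fc  = 1.0 / (2.0 * PI * R * C);  // -3 dB cutoff of the low-pass filter

      // tau = 1 ms, fc ~ 159 Hz: sub-millisecond bounce spikes get smoothed away,
      // while a deliberate button press (tens of milliseconds) still passes through
      printf("tau = %.1f ms, fc = %.1f Hz\n", tau * 1e3, fc);
      return 0;
}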

         

        4.4_user_pushbuttons.png

         

        Keep in mind that other user actions can cause indeterminate states.  Switches can bounce when they are pressed and again as they are released.  Sometimes, they don’t bounce at all and work perfectly.  As switches are used in the field and age, they can perform differently than the new switches that you test for your prototype.  In addition, you will get similar bouncing waveforms from a plug being inserted into a socket.  If your design is trying to detect that event, you will need to debounce the plug event.  To handle all of these situations, you will need to develop a software algorithm to detect the initial event and then wait for some amount of time until the bouncing has stopped before you allow another input event.

         

Since the pushbuttons on the Starter Kit are brand new and already debounced in hardware, I used a jumper wire attached to GPIO pin PF9 and touched it against the VMCU power rail to introduce a nasty switch-bounce effect. The following code counts the number of times the jumper has been touched to VMCU, which is the pin right next to PF9.

         

        #include "em_device.h"
        #include "em_chip.h"
        #include "em_cmu.h"
        #include "em_gpio.h"
        #include "utilities.h"
         
        #define TEST_JUMPER_PORT      gpioPortF
        #define TEST_JUMPER_PIN       9
         
        /**************************************************************************//**
         * @brief  Main function
         *****************************************************************************/
        int main(void)
        {
              // Chip errata
              CHIP_Init();
         
              setup_utilities();
         
              CMU_ClockEnable(cmuClock_GPIO, true);
         
              // Set up the user interface buttons
              GPIO_PinModeSet(TEST_JUMPER_PORT, TEST_JUMPER_PIN, gpioModeInput,  0);
         
              bool pressed = false;
              int number_of_presses __attribute__((unused)) = 0;
         
              while (1)
              {
                    if (GPIO_PinInGet(TEST_JUMPER_PORT, 9))
                    {
                          if (pressed == false)
                          {
                                number_of_presses++;
                                pressed = true;
                          }
                    }
                    else
                    {
                          pressed = false;
                    }
              }
        }

I keep track of the number of times the jumper has been touched against VMCU with the number_of_presses variable. I have added a special message for the compiler in the declaration of this variable, __attribute__((unused)), saying that I don't want to see any compiler warnings if this variable is unused. I like to do this whenever I declare a variable strictly for debugging purposes; it keeps my build console clean.

         

In order to use this code, compile and run the debugger, then put a breakpoint on the if (GPIO_PinInGet…) line, and the code should immediately break at that line. Verify that the value of the number_of_presses variable is zero by hovering your mouse over it. Remove the breakpoint and click the resume button. Then touch the jumper from PF9 to VMCU once and remove it. Set the breakpoint on the if line again, which should again cause the debugger to break in and let you examine the value of number_of_presses. If you only touched the jumper one time, it should contain a count of one, but it will sometimes be higher. It depends on how cleanly you touched the jumper to the pin.

         

        To debounce this switch, we need to start a timer when the jumper is touched to the pin, and not allow another touch or release until the timer expires.  Then, we need to do the same thing when the jumper is released from the pin.

         

        #define DEBOUNCE_TIME         300  // ms
         
              bool pressed = false;
              int number_of_presses __attribute__((unused)) = 0;
              int debounce_pressed_timeout = 0;
              int debounce_released_timeout = 0;
         
              while (1)
              {
                    if (GPIO_PinInGet(TEST_JUMPER_PORT, 9))
                    {
                          if (pressed == false && expired_ms(debounce_released_timeout))
                          {
                                // You could start some process here on the initial event
                                // and it would be immediate
                                number_of_presses++;
                                pressed = true;
                                debounce_pressed_timeout = set_timeout_ms(DEBOUNCE_TIME);
                          }
                    }
                    else
                    {
                          if (pressed == true && expired_ms(debounce_pressed_timeout))
                          {
                                // You could start some process here on the release event
                                // and it would be immediate
                                pressed = false;
                                debounce_released_timeout = set_timeout_ms(DEBOUNCE_TIME);
                          }
                    }
              }

I found that a DEBOUNCE_TIME of 100ms was not sufficient for this jumper-wire switch, and I had to look at an oscilloscope waveform to figure out why. Sure enough, touching a jumper wire to a pin actually connects and disconnects over a longer period of time than a switch, as much as 300ms. Sometimes it can toggle back and forth between VMCU and ground many times before settling on VMCU, so I set DEBOUNCE_TIME to 300ms to be safe. More characterization and testing is needed to ensure that this value will work for every user who interacts with your device.

         

        6.2_switch_bounce.png

         

Keep in mind when using GPIO interrupts that switch bounce will cause many rapid interrupts. If you always trigger an interrupt handler to do something on each button press, you will register more button presses than the user actually made. Therefore, you will either need to disable the interrupt for a time after a button is pressed, or allow the interrupt to fire but check a timer to see whether the switch-bounce time has passed before taking any action.
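As a rough sketch of that second approach, here is what an interrupt-driven version might look like with emlib, reusing the set_timeout_ms() and expired_ms() helpers from utilities.h. PF9 is an odd-numbered pin, so its interrupt should land in GPIO_ODD_IRQHandler, but verify the interrupt wiring against your own kit and pin choice:

#include "em_device.h"
#include "em_chip.h"
#include "em_cmu.h"
#include "em_gpio.h"
#include "utilities.h"

#define TEST_JUMPER_PORT      gpioPortF
#define TEST_JUMPER_PIN       9
#define DEBOUNCE_TIME         300  // ms

volatile int number_of_presses = 0;
volatile int debounce_timeout = 0;

// PF9 is an odd-numbered pin, so its edges are serviced here
void GPIO_ODD_IRQHandler(void)
{
      GPIO_IntClear(1 << TEST_JUMPER_PIN);

      // Ignore any edges that arrive while the debounce window is still open
      if (expired_ms(debounce_timeout))
      {
            number_of_presses++;
            debounce_timeout = set_timeout_ms(DEBOUNCE_TIME);
      }
}

int main(void)
{
      // Chip errata
      CHIP_Init();

      setup_utilities();
      CMU_ClockEnable(cmuClock_GPIO, true);

      GPIO_PinModeSet(TEST_JUMPER_PORT, TEST_JUMPER_PIN, gpioModeInput, 0);

      // Interrupt on the rising edge only; the handler filters out the bounces
      GPIO_IntConfig(TEST_JUMPER_PORT, TEST_JUMPER_PIN, true, false, true);
      NVIC_ClearPendingIRQ(GPIO_ODD_IRQn);
      NVIC_EnableIRQ(GPIO_ODD_IRQn);

      while (1)
      {
            // number_of_presses can be watched in the debugger, as before
      }
}

The interrupt still fires on every bounce edge; the handler simply refuses to count any edge that arrives before the previous debounce window has expired.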

         

        This jumper wire should perform much more poorly than an SMT switch, but you never know until you test it.

         

More research on this topic can be found in this article from The Embedded Muse, where an analysis of multiple switches is performed and hardware and software solutions to the problem are explained.

         

One more thing to consider: whenever a user connects power to your device, whether through a battery or a connector, the MCU can see power for a few milliseconds and then lose it, due to the same phenomenon we are studying in this lesson. Therefore, I always recommend putting a delay at the beginning of your program to allow everything to settle down before you start to process any of your inputs or sensors. You don't want to perform a one-time-only event, such as the first-ever power-on initialization, only to see power removed a few milliseconds after it was applied. While debugging, I will usually inject a delay(400) statement in my programs as soon as I enable the clocks and the SysTick interrupt, before the MCU tries to initialize anything else.
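Concretely, that start-of-main() ordering looks something like this, again using the utilities.h helpers from earlier in the chapter; the 400ms figure is just my debugging value, so tune it for your own hardware:

#include "em_device.h"
#include "em_chip.h"
#include "em_cmu.h"
#include "em_gpio.h"
#include "utilities.h"

int main(void)
{
      // Chip errata
      CHIP_Init();

      // Bring up the clocks and the SysTick-based delay/timeout helpers first
      setup_utilities();
      CMU_ClockEnable(cmuClock_GPIO, true);

      // Let the power connection settle before doing anything that should
      // only ever happen once (first power-on initialization, etc.)
      delay(400);

      // ... initialize peripherals and sensors, then enter the main loop ...
      while (1)
      {
      }
}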

         

        PREVIOUS NEXT 

      • Who Shrunk My Sleepy Bee?

        lethawicker | 08/231/2015 | 04:06 PM

        Sleepy Bee

When we say we've made the Sleepy Bee, the most energy-friendly of our Bee family of 8-bit MCUs, smaller, we're not kidding. It now measures a minuscule 1.72 x 1.66 mm. The best part of this shrink-ray process is that we've preserved the best features:

         

        • 12 robust capacitive touch channels to add touch sensing to even the most space-constrained environments
        • Ultra low-power consumption achieving a touch detection of ~1 µA
        • A full library of Capacitive Sense features and algorithms in Simplicity Studio
        • Fast wake up at 2 µs
        • Precision clocking to optimize for system and power requirements

        SB1 or SB2—which is Right for You?

         

        The Sleepy Bee is packaged in two form factors—the SB1 and the SB2. What’s the difference? Let’s take a look.

         

        The SB1 is great for applications that require:

• Low power
• A small package
• Robust capacitive sensing

        Some examples include set-top boxes, earphones, instrumentation panels, and keypads.

         

        The SB2 shines in applications that require:

        • Low power with a low standby current
        • Larger flash capacity for more complex processing
• An ideal solution for sitting between a sensor and a radio

        Examples of applications where you might want an SB2 include hand-held medical devices, remote controls, toys, or product tags.

         

        The bottom line is that no matter how small we’ve made the Sleepy Bee, you still get an MCU without compromise. Get a starter kit at our website and find out for yourself.

         

        Sleepy Bee Starter Kit

      • Meet the Captain of Oceanic Internet of Things

        deirdrewalsh | 08/229/2015 | 06:08 PM


         

        "O Captain! My Captain!" IoT Hero Phil Cruver is more than a savvy entrepreneur.  He is a passionate pioneer working to use Internet of Things technology to meet the growing demand for shellfish, create jobs, and reduce our nation’s $11 billion seafood deficit, all while preserving a healthy ocean. I had the opportunity to interview him on his amazing, responsible project. 

         

        So, Phil, tell me a bit about yourself. 

         

        I am currently the founder and CEO of Catalina Sea Ranch, which holds the first and only aquaculture permit for farming the vast seas in U.S. Federal waters.  

         

I believe a "Blue Revolution" is essential for feeding future generations. Jacques Cousteau presciently stated decades ago: "With Earth's burgeoning human populations to feed, we must turn to the sea with a new understanding and new technology. We need to farm it as we farm the land."

         

        Another luminary, Nobel Laureate Peter Drucker stated: “Aquaculture, not the Internet, represents the most promising investment opportunity of the 21st Century.” These quotes put into perspective the potential for an Ocean Internet of Things™ platform supporting sustainable aquaculture as the “Next Big Thing” for sustaining an exploding global population.

         

As an entrepreneur with six startups, and an outlier, I have a unique perspective on the traditional seafood industry. An Ocean Internet of Things will rapidly transform the $135 billion global aquaculture industry. Smart, connected products will unleash a disruptive new era of competition, opening a major global market opportunity.

         

        I know your company is at the forefront of “marine big data” and “oceanic IoT.”  Can you share details about your project? 

         

Catalina Sea Ranch's 100-acre aquaculture facility is located on the edge of about 26,000 acres (40 square miles) of U.S. Federal waters on the San Pedro Shelf. This is the broadest mainland continental shelf segment offshore California. The legs of three offshore oil platforms, located two miles away, are blanketed with shellfish naturally thriving on rich and abundant phytoplankton.

         

        I became interested in the potential for an Ocean Internet of Things platform to exceed the expectations of regulators with unassailable analytics from massive amounts of data showing no negative impact from our offshore operation. This would allow us to accelerate the timetable for scaling our sustainable shellfish ranch in a responsible way that would be in the interest of our country.

         

        Imagine a massive sensor array allowing us to remotely take the environmental pulse of our ranch with continuous, real-time measurements of depth, temperature, conductivity, dissolved oxygen, and measuring phytoplankton biomass. We are partnering with leading IoT companies that are designing and developing innovative technologies for the first Ocean Internet of Things platform.

         

        We were also proactive to develop a program for monitoring our ranch operations with leading institutions specializing in marine ecology and spatial planning. Our Ocean Internet of Things platform will provide science-based data for evaluating any environmental or social impacts. This will help policy makers decide how to allocate offshore waters for economic and societal benefits not conflicting with or perturbing marine life and ocean ecology. It will also provide a platform to showcase science-based solutions and best practices for advancing sustainable offshore aquaculture. This would help America assume a leadership role in this fastest growing global food production industry.

         

         

        Through our conversation, I know this is not just another IoT project.  Can you share your mission?

         

The United States Exclusive Economic Zone (EEZ), extending between 3 and 200 nautical miles from shore, is the largest of any nation, covering 4.5 million square miles. Studies show that less than 0.01% of the U.S. EEZ could produce up to 1.33 billion pounds or more of additional seafood.

         

As with petroleum, self-sufficiency and security are required to sustainably and reliably meet America's growing demand for seafood in an increasingly competitive global marketplace. Imports will become increasingly vulnerable to supply disruption attributed to global geopolitical tensions and major demographic trends. This dire and vulnerable situation suggests that seafood shortages should be elevated to a national security issue by our government.

         

        Climate change dictates that offshore aquaculture will become the next agriculture, particularly for regions confronted with water shortages and possessing ideal coastal conditions. California is the exemplar. As the world’s fifth largest supplier of food commodities with the strongest ocean economy in America, it has the potential to augment its land-based agriculture success with offshore aquaculture for increasing both economic prosperity and food security.

         


         

        Studies have shown that a small percentage of state and federal waters within the Southern California Bight could generate a multibillion-dollar offshore aquaculture industry. With proper planning employing data from an Ocean Internet of Things platform, California could emulate New Zealand, which has the goal to triple the value of aquaculture production to $1 billion by 2025 and in a sustainable way that preserves its pristine environment.

         

        Offshore aquaculture in Southern California has the potential to put a dent in our nation’s escalating $11 billion plus seafood deficit. Mollusk shellfish - mussels, scallops, oysters, and clams - are among the most lucrative and sustainable fisheries in the United States valued over $1 billion. The global export market for frozen shellfish exports to Asia is enormous. Consider that the two largest ports in America are located less than 10 miles away from 40 square miles of ocean space on the San Pedro Shelf that are ideal for sustainable shellfish ranching.

         

        I know you’ve been involved in other startups. What advice do you have for IoT entrepreneurs? 

         

        First and foremost, don’t run out of cash!

         

        Startups take more time and financial forecasts are merely models, which typically understate capital requirements and implementation time. Plan for the worst, hope for the best, and (using today’s VC vernacular) strive for a “moonshot” for your company to become a “unicorn.” Pioneers catching the crest of the IoT wave have an unprecedented opportunity to scale their startups to outrageous valuations.

         

IoT platforms, aggregating large data sets derived from wireless embedded sensor networks and processed with probability mathematics for predictive analytics, are having a transformative impact. Combined with the power of cloud computing, they form a technological tsunami of epic proportions, promising to usher in a new paradigm for forecasting future trends and predicting outcomes across a broad swath of industries.

         

        Early adopters will boost their commercial productivity by introducing intelligence into products for predictive, data-driven decisions based on empirical science rather than intuition. Analytics for anticipating industry trends will offer radical opportunities for companies to rethink their core businesses and competitive edge. Incumbents reluctant to cannibalize their legacy technologies will allow disruptive new entrants to capture their customers.

         

In Catalina Sea Ranch's case, we recognized that our company was uniquely positioned to exploit the erupting IoT phenomenon for the ocean sector, which is currently non-existent, has huge growth potential, and is ripe for innovation. With our "First Mover Advantage," it could become both a growth play for creating a multi-billion-dollar, metadata-driven marine industry and a defensive strategy for producing barriers to entry in the emerging offshore aquaculture industry.

         

        It’s a big question, but in your opinion, what does the future of IoT look like?

         

        It’s difficult to fathom the magnitude of these recent staggering IoT forecasts:

         

• McKinsey Global Institute: "By 2025, the potential economic impact of having 'sensors and actuators connected by networks to computing systems' could be more than $11 trillion annually."
        • New data from Juniper Research claims the number of Internet of Things connected devices will number 38.5 billion in 2020, up from 13.4 billion in 2015: a rise of over 285 percent.

         

        If the above predictions are accurate, our oceans will soon be swimming in sensors and the analysts drowning in data.

         

If the data can be harnessed, aggregated, and fused into coherent information systems providing valuable knowledge, an Ocean Internet of Things platform could provide scientific evidence for solving threats like ocean acidification, the "evil twin" of climate change. It could provide governmental agencies, the scientific community, and research institutions with a better understanding of the ocean, the final frontier on our planet.

         

        Furthermore, employing massive amounts of inexpensive miniaturized sensors embedded into wireless networks, an Ocean Internet of Things™ platform has the potential to be disruptive and game changing for the global maritime industry. Those companies capable of harnessing and translating the data into insightful information for making intelligent decisions will have a competitive advantage.

         

        Moreover, an Ocean Internet of Things platform would provide scientific evidence for planners to consider various uses of the ocean for better decisions and more responsible policies. In the near future this would enable taking the environmental pulse of an area in the ocean to understand the short and long-term trends, anticipate problems and devise mitigation measures for the immediate implementation of corrective actions. This would lead to sound regulations based upon solid science for the advancement of sustainable offshore aquaculture and responsible marine spatial planning.

      • Bluetooth Certification Hassles? We’ve Got It Handled.

        lethawicker | 08/229/2015 | 08:05 AM

        Bluetooth® Smart is a hot ticket in the world of IoT development. Hop straight into the fast lane for developing new solutions with Silicon Labs’ new BGM111 Module. The BGM111 Module super-charges your development with best-in-class integration, flexibility, energy-efficiency and tool-chain support. And when you get to the point where you want to use a System-on-a-Chip (SoC) instead of a module? You’ve got the advantage of an easy upgrade path that makes big headaches into routine changes.

         

About that Bluetooth Smart pre-certification: the BGM111 Module comes pre-loaded with a compliant Bluegiga Bluetooth 4.1 software stack and profiles, with field-upgradable capability to Bluetooth 4.2 and beyond. Why build your own software stack and put it through Bluetooth certification when you can let us do it for you?

         

         

In fact, Silicon Labs' wireless SDK gives you the flexibility to run either with a host or fully standalone through our easy-to-use Bluegiga BGScript™ scripting language. If you are familiar with BASIC-like syntax, you're good to go with BGScript, and you'll be creating Bluetooth applications quickly, without using external MCUs to run the application logic. If you're thinking that doing all the code execution on the module will reduce your cost and board space while speeding time to market, you'd be right. What's more, Silicon Labs provides an extensive library of application profiles and examples.

         

        Find out how to get your development into the fast lane by getting a starter kit at www.silabs.com/bluegecko.

         


      • Chapter 6: User Interface Experiments Part 1 - Human Reaction Time

        lynchtron | 08/224/2015 | 07:20 PM

        makersguide_ch6_1.png

        User Interface Experiments

        When a user picks up and interacts with your gadget, the physical interface will give that user a sense of the quality of the device.  Touch screens are becoming common, but may not be appropriate for all devices.  Many devices rely on electromechanical switches or buttons.  If the buttons are hard to press, feel flimsy, or if the functions are not intuitive, the user will have a bad impression.  You can fix all of those things with good physical product design, but just as important is how the device reacts to the user input.  If the interface is slow or if a single press results in multiple presses intermittently, that will leave a bad impression.

         

        Button Input Reaction Time

How fast does your code need to react to user input to be considered a snappy interface? I wrote some code that you can run on the Starter Kit to find out for yourself. The following code will illuminate TEST LED1 on the Starter Kit when button PB1 is pressed, after a DELAY_VALUE in milliseconds has elapsed. Note that the utilities.h file is described in previous lessons and is available on the blog website.

        #include "em_device.h"
        #include "em_chip.h"
        #include "em_cmu.h"
        #include "em_gpio.h"
        #include "utilities.h"        // Source available on the blog webpage
         
        #define DELAY_VALUE           100 // ms
         
        /**************************************************************************//**
         * @brief  Main function
         *****************************************************************************/
        int main(void)
        {
              // Chip errata
              CHIP_Init();
         
              setup_utilities();
         
              CMU_ClockEnable(cmuClock_GPIO, true);
         
              // Set up the user interface buttons
              GPIO_PinModeSet(BUTTON_PORT, SET_BUTTON_PIN, gpioModeInput,  0);
         
              while (1)
              {
                    if (get_button())
                    {
                          delay(DELAY_VALUE);
                          set_led(1, 1);
                    }
                    else
                    {
                          set_led(1, 0);
                    }
              }
        }

         

        In my tests, I found that delays of 100ms or so were fast enough to be considered responsive, but 150ms started to seem like a slow response, and 200ms seemed like an eternity.  We humans expect the interfaces to respond within 1/10th of a second. 

         

Reaction tests at Human Benchmark measure both the recognition of an event and the reaction to the event. The average reaction time is around 250ms. Interestingly enough, the site mentions that your own computer can introduce a lag of 30ms to the test, and that this lag is getting longer as technology moves forward, due to operating system overhead. My reaction times on my computer were around 400ms. I switched to my smartphone and found that the average dropped to 280ms. I have a wireless mouse for my computer, which could be contributing to the lag, likely due to power-saving sleep states in the mouse hardware: if it can sleep for 100ms and then wake up to check for an event, the batteries will last longer. These are all things that you will need to consider as you develop your gadget.

         

        To make things more interesting, I changed the program to illuminate TEST LED0 immediately, followed by the original TEST LED1:

              while (1)
              {
                    if (get_button())
                    {
                          set_led(0, 1);
                          delay(DELAY_VALUE);
                          set_led(1, 1);
                    }
                    else
                    {
                          set_led(0, 0);
                          set_led(1, 0);
                    }
              }

Now when I run the code and press PB0, I see an immediate response on TEST LED0, followed 100ms later by TEST LED1 illuminating. It is now very clear to me that TEST LED1 is showing a lag that would be unacceptable if both lights were intended to illuminate at the same time. By adjusting the value of DELAY_VALUE, I found that 27ms was an acceptable delay at which both lights still appeared to illuminate simultaneously, at least to me. Therefore, if you have code that needs to read the environment (an indicator light, for example) and act on it immediately, you have about 27 ms for your code to react to the event if you are hoping to provide a seemingly simultaneous response.

         

This 27ms translates to about 37Hz if the illumination were expressed in the frequency domain, like in a moving video. I am able to see display flicker at up to 75Hz. Those old monitors with a 60Hz refresh rate drove me crazy with flicker! We now have LED HDTVs that interpolate 24 to 60Hz video feeds and "upconvert" them to 120Hz, which is about 8ms between screen updates. That adds a hyper-realistic quality to video content and seems to be near the lower limit of human perception.

         

Note: The eye can distinguish colors, or wavelengths of light, in roughly the 450 to 700 nm range. However, we can't tell whether two colors such as red and blue are blended to create purple by alternating them faster than 30 Hz, or whether the color is truly purple. Likewise, the ear can distinguish frequencies between 30 Hz and 20 kHz, but a sound needs to hang around for a while to be detected by the brain. The eyes and ears are fantastic sensors, but the brain still needs time to make sense of the data.

         

Run this program on your Starter Kit and see what YOU think of these delays. This will help you determine the best response time for your gadget.

         

        PREVIOUS NEXT