
Official Blog of Silicon Labs

      • It All Started on the Back of a Napkin

        Tyson Tuttle | 04/22/2021 | 08:22 PM

        It all started on the back of a napkin. 

         

        Stay with me here. Back in 2014, I was at an industry event with famed EE Times editor Junko Yoshida when I took a napkin and scribbled something on it. No, it wasn’t my phone number. The drawing I showed Junko, as I’m sure she would corroborate, was a sketch of what became the first IoT SoC. And thus began Silicon Labs’ journey to become the undisputed leader of intelligent wireless connectivity for the IoT. 

        Not the actual napkin, but you get the idea

         

        Fast forward to the present day: Silicon Labs is doubling down to capitalize on the large and rapidly expanding global IoT opportunity by signing a definitive agreement to sell our Infrastructure & Automotive (I&A) business to Skyworks, a highly respected, publicly traded U.S. semiconductor company that manufactures chips for RF and mobile communications systems. When the transaction closes, Silicon Labs will be the world’s number one pure-play IoT silicon, software, and solutions company.

         

        I encourage you to read more about the decision to divest that business in our press release, but the key rationale for this move is to ensure that both businesses (which are already successful) get the focus they deserve to succeed and grow. As excited as I am for the next chapter in our 25-year history, I do feel a sense of nostalgia as I get ready to say goodbye to the truly outstanding people, products, and IP that make up our I&A business.

         

        I’ve spent the majority of my tenure at Silicon Labs as an active contributor to our I&A business, and it’s a part of who I am. In 2003, I started the company’s line of radio and TV tuner chips to receive over-the-air signals first in mobile phones and media players, and later in consumer products and automotive. I’ve been so lucky to work with wonderful people on a wide range of outstanding broadcast, timing, power, isolation and Internet infrastructure products and many of those people are still at Silicon Labs. In fact, some of the broadcast products we designed years ago are still being bought and sold to this day. That’s a testament to the design talent of this world-class I&A team. The $2.75 billion purchase price of the divestiture speaks volumes about the I&A team’s value, successes, talent and strong IP portfolio. I know that Skyworks is eager to welcome the I&A team into their ranks when the transaction closes. 

         

        That brings me to the rocket ship that is our IoT business. We have a clear path to becoming the pure-play leader of intelligent wireless connectivity for the IoT. With the massive growth in connected devices and a positive asset valuation environment, now is the time for Silicon Labs to be laser-focused on the large, diverse, growing IoT opportunity. Our wireless portfolio is unmatched in breadth and depth. We have the industry’s leading secure IoT hardware and software platform. And our strong and expanding set of ecosystem partnerships with Amazon, Google, Comcast, and Tuya is helping to deliver sustainable design win momentum. Most importantly, we have a continued mission to help developers quickly go from idea to innovation with IoT devices that transform industries, grow economies and improve lives.

         

        We can’t wait to tell our pure-play IoT story to the industry. As the Silicon Labs of the past 25 years evolves into a pure-play IoT company, we are as committed as ever to helping our world become a smarter, more connected place. I can honestly say that this is as excited as I have ever been in my career for what’s next. Perhaps it’s time to pick up a pen and grab another napkin!

      • How to Choose the Right Bluetooth Development Kit for Each Project

        tmonte | 04/12/2021 | 09:10 AM

        Choosing a Bluetooth development kit is like being a 10-year-old in a candy store. There are countless alternatives, and everything looks good on the surface, but it’s hard to choose the right one because each project imposes very different requirements on a kit. This blog explains how to choose the right Bluetooth development kit at various stages of your dev process: experimenting, prototyping, optimizing, and product development.

        Choosing the Right Bluetooth Development Kit

        There are no one-size-fits-all development kits. If you are simply experimenting with Bluetooth Low Energy (BLE), a kit with a few essential features and expansion sockets will do just fine. However, if you are building an IoT prototype and running field trials, you need to focus on other features, such as on-board sensors and coin-cell battery support. When optimizing RF performance, honing a device's energy consumption, or developing an actual product to be manufactured, you will probably need a pro-level Bluetooth development kit.

        Here is a rundown of what you should look at on a Bluetooth development kit, based on which stage of the dev process you are in – experimenting, prototyping, optimizing, or product development.

        Best Bluetooth Development Kit for Experimenting

        Which type of Bluetooth development kit is ideal for experimenting and testing? What should a newcomer look at in a kit to make the first dive into the world of embedded development?

        The first thing to consider on a Bluetooth development kit is an onboard debugger. It keeps experimentation simple because you can flash code and debug it as it runs on the target processor, and it saves you from buying and configuring an extra board. A built-in packet trace interface gives you invaluable information about the Bluetooth data packets in wireless links, providing in-depth insights for your experimenting. A virtual COM port, in turn, saves you from buying an external board for UART/USB bridging and takes the hassle out of your project. External connectors to hardware ecosystems such as MikroE and Qwiic are must-have features when experimenting with new things: the plug-in boards save time because you don’t have to build everything from scratch.

        Silicon Labs’ Explorer Kit is an ideal entry kit for your experiment. It includes all the essential features listed above, plus a few other powerful development features to make the most of your kit investment.

        Explorer Kit is fully supported by Simplicity Studio, the unified development environment for all Silicon Labs technology, which allows you to develop C-based applications using GCC and IAR compilers. Explorer Kit is easy from unboxing onwards – it automatically customizes and installs the right development environment and SDK for the Explorer Kit hardware (BGM220) to get you going. 

        Optimal Bluetooth Development Kit for Prototyping

        When building IoT prototypes and conducting field trials, a Bluetooth development kit with a built-in coin cell battery connector is optimal. The battery connector helps you get the prototype off your desk and out to field trials quickly. You don’t have to spend time tinkering with external batteries or power supplies.

        Silicon Labs Dev Kits are your go-to Bluetooth development kits when prototyping IoT devices and testing them in field trials. The built-in coin cell battery connector saves you time and money when preparing for trials. The kit provides all the features needed for prototyping: a 2.4 GHz chip antenna, board controller, J-Link debugger, packet tracing, virtual COM port, various onboard sensors, and more.

        Advanced Bluetooth Dev Kit for Optimization and Product Development

        When developing actual market-ready products or optimizing RF performance and energy consumption to perfection, you need the most advanced development features out there, and an energy profiler is undoubtedly the most critical of them. It allows you to optimize every line of code to achieve superior RF performance and energy efficiency.

        If advanced optimization and product development are your primary tasks, Silicon Labs Pro Kits have your back throughout the process, from the first kit boot-up to the final design. With the onboard Energy Profiler, you can optimize RF performance and energy consumption to perfection, while plug-in radio boards allow you to tune the Pro Kit based on your RF needs.

        Why Download a BLE Mobile App?

        Whichever Bluetooth development kit you choose, you should also download a generic Bluetooth Low Energy (BLE) mobile app. It saves time when debugging an embedded BLE application: the app lets you easily test and debug embedded applications and over-the-air (OTA) firmware update functionality during development.

        How to Order a Bluetooth Development Kit

        Silicon Labs’ Bluetooth development kits are divided into three categories based on your development need – whether you are experimenting, prototyping, or developing a market-ready product, our Bluetooth development kit portfolio has the right solution waiting for you!

      • Providing Developers with a Faster, Simpler Path to Automated Machine Learning

        Andrea Mencia | 03/29/2021 | 02:20 PM

        Integrating artificial intelligence (AI) and machine learning (ML) into edge devices is one of the most highly anticipated developments in IoT. Smart devices that are trainable, actionable, and capable of extracting information and learning from the environment are becoming more contextually aware, and ultimately more useful. Performing AI at the edge comes with significant advantages, including low latency, reduced bandwidth, and lower power and cost, as well as privacy and security. AI enables even small microcontrollers to achieve capabilities that were historically unheard of with conventional code, such as better decision-making in edge nodes. Adding embedded intelligence to IoT devices will create new opportunities for manufacturers – this is at the heart of why we are teaming up with SensiML, a leading provider of AI and ML.

        Accelerating AI IoT Development

        SensiML offers cutting-edge software that enables ultra-low power IoT endpoints that implement AI and transform raw sensor data into meaningful insights at the device itself. SensiML’s Analytics Studio also provides a comprehensive development platform that enables developers with minimal data science expertise to build intelligent endpoints up to 5X faster than what’s possible with hand-coded solutions. This means that customers can fast-track their development projects and get AI/ML embedded into their design in weeks instead of the couple of years that data science projects usually take. The combination of SensiML Analytics Studio and Silicon Labs’ wireless SoCs and MCUs will make it possible for developers to add features, reduce complexity, and take advantage of low-power, low-cost, small-footprint designs. The SensiML Analytics Toolkit suite automates each step of the process for creating optimized AI IoT sensor recognition code.

        What is the Difference Between AI and ML?

        AI and ML are closely related fields of computer science, and while many people use the terms interchangeably, they have different meanings.

        Artificial Intelligence

        • AI is the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals.
        • An AI system is concerned with maximizing the chances of success.
        • AI can help simple MCU-based systems solve complex problems.

        Machine Learning

        • ML is the study of computer algorithms that improve automatically through experience.
        • ML is a subset of AI which allows a machine to learn automatically from past data without explicit programming.
        • ML algorithms are used where it is difficult or unfeasible to develop conventional algorithms to perform the needed tasks.

         

        The Benefits of Automated Machine Learning and How it Works

        Automating the process of constructing machine learning models brings a host of benefits to developers when it comes to tasks that would otherwise require specialized backgrounds.  For example, without automated machine learning, or AutoML, the following tasks are left to the modeler to determine based on their own understanding of the problem, desired model performance, and – most critically – their expertise in the proper application of signal processing and machine learning classifiers:

        • Segmenting for regions of interest in the input data
        • Determining which pre-processing and feature transforms are needed to convert raw input data into a suitable input vector for the classifier
        • Selecting which type of machine learning classifier to use to deliver best results
        • Optimizing model parameters and tuning of hyperparameters
        • Assessing the need for post-processing to further enhance model performance

        AutoML helps by employing high-performance computing and search optimization algorithms to augment user knowledge in the task of constructing models. The advantages of AutoML include the ability to evaluate hundreds of thousands or even millions of model permutations in the time it would take a human data science expert to evaluate just a few. And with directed search constraints, AutoML in the hands of a skilled user can focus searches on the most promising permutations rather than just executing brute-force grid searches. This makes AutoML a powerful tool for algorithm development, whether it’s being used by an AI novice or a seasoned data science expert.
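        To make that search concrete, here is a minimal, purely illustrative sketch of the brute-force end of AutoML: it evaluates every combination of a candidate feature transform and a decision threshold on labeled sensor windows and keeps the best-scoring pair. The feature set and the toy threshold classifier are hypothetical inventions for illustration; real tools such as SensiML Analytics Studio search far larger spaces of segmenters, features, and classifiers.

```python
import math

# Candidate feature transforms for a window of raw sensor samples
# (hypothetical choices, for illustration only).
FEATURES = {
    "mean_abs": lambda w: sum(abs(v) for v in w) / len(w),
    "peak_to_peak": lambda w: max(w) - min(w),
    "rms": lambda w: math.sqrt(sum(v * v for v in w) / len(w)),
}

def auto_select(windows, labels, thresholds):
    """Brute-force model search: score every (feature, threshold) pair
    by training accuracy and return the best as (accuracy, name, threshold)."""
    best = None
    for name, feature in FEATURES.items():
        values = [feature(w) for w in windows]
        for th in thresholds:
            predictions = [v > th for v in values]
            accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
            if best is None or accuracy > best[0]:
                best = (accuracy, name, th)
    return best
```

        A production AutoML engine applies the same idea at much larger scale: directed search over segmenters, feature pipelines, classifier families, and hyperparameters, scored on held-out data rather than training accuracy.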

        With this partnership, we get closer to living in a smarter, more connected world, and we are proud to have SensiML as a partner in this journey. For more information on SensiML and our technology partner network, please visit our Design Partner Networks.

        To learn more about what we are doing with artificial intelligence and machine learning click here.

      • Timing 201 #9: The Case of the Really Slow Jitter – Part 1

        kgsmith | 03/19/2021 | 05:47 PM

        Introduction

        You have probably read or heard that phase noise is the frequency domain equivalent of jitter in the time domain. That is essentially correct except for what would appear to be a somewhat arbitrary dividing line. Phase noise below 10 Hz offset frequency is generally considered wander as opposed to jitter.  

        Consider the screen capture below where I have measured phase noise down to 1 Hz minimum offset and explicitly noted the 10 Hz dividing line. Wander is on the left hand side and jitter is on the right hand side. The phase noise plot trends as one might expect right through the 10 Hz line.  So what’s different about wander as opposed to jitter and why do we care? From the perspective of someone who takes a lot of phase noise plots, I consider this the case of the really slow jitter. It’s both slow in terms of phase modulation and in how long it takes to measure. 

        The topic of wander covers a lot of material. Even introducing the highlights will take more than one blog article. In this first post, I will discuss the differences between wander and jitter and the motivation for understanding wander, and go into some detail regarding a primary wander metric: MTIE, or Maximum Time Interval Error. Next in this mini-series, I will discuss TDEV, or Time Deviation. Finally, I plan to wrap up with some example lab data.

        Some Formal Definitions

        The 10 Hz dividing line, in common use today, has been used in synchronous optical networking (SONET) and synchronous digital hierarchy (SDH) standards for years.  For example, ITU-T G.810 (08/96) Definitions and terminology for synchronization networks [1] defines jitter and wander as follows.

        4.1.12 (timing) jitter: The short-term variations of the significant instants of a timing signal from their ideal positions in time (where short-term implies that these variations are of frequency greater than or equal to 10 Hz).

        4.1.15 wander: The long-term variations of the significant instants of a digital signal from their ideal position in time (where long-term implies that these variations are of frequency less than 10 Hz).

        Similarly, the SONET standard Telcordia GR-253-CORE [2] states in a footnote

        “Short-term variations” implies phase oscillations of frequency greater than or equal to some demarcation frequency. Currently, 10 Hz is the demarcation between jitter and wander in the DS1 to DS3 North American Hierarchy.

        Wander and jitter are clearly very similar since they are both “variations of the significant instants of a timing signal from their ideal positions in time”. They are also both ways of looking at phase fluctuations or angle modulation (PM or FM). Their only difference would appear to be scale. However, that can be a significant practical difference.

        Consider by analogy the electromagnetic radiation spectrum, which is divided into several different bands such as infrared, visible light, radio waves, microwaves, and so forth.  In some sense, these are all “light”. However, the different types of EM radiation are generated and detected differently and interact with materials differently. So it has always made historical and practical sense to divide the spectrum into bands. This is roughly analogous to the wander versus jitter case in that these categories of phase fluctuations differ technologically.   

        Why 10 Hz?

        So, how did this 10 Hz demarcation frequency come about? Generally speaking, wander represented timing fluctuations that could not be attenuated by typical PLLs of the day. PLLs in the network elements would just track wander, and so it could accumulate.  Networks have to use other means such as buffers or pointer adjustments to accommodate or mitigate wander. Think of the phase noise offset region, 10 Hz and above, as “PLL Land”.

        Things have changed since these standards were written. Back in the day, it was uncommon or impractical to measure phase noise below 10 Hz offset. Now phase noise test equipment can go down to 1 Hz or below. Likewise, with digital and FW/SW PLLs, it is possible to build very narrowband PLLs that can provide some “wander attenuation”. Nonetheless, 10 Hz offset remains a useful dividing line and lives on in the standards.

        Wander Mechanisms

        Clock jitter is due to the relatively high frequency inherent or intrinsic jitter of an oscillator or other reference ultimately caused by flicker noise, shot noise, and thermal noise.  Post processing by succeeding devices such as clock buffers, clock generators, and jitter attenuators can contribute to or attenuate this random noise. Systemic or deterministic jitter components also can occur due to crosstalk, EMI, power supply noise, reflections etc.

        Wander, on the other hand, is caused by slower processes. These include lower frequency offset oscillator and clock device noise components, plus the following.

        • Slight initial differences in frequency and phase between clocks
        • Slow changes in frequency and phase between clocks due to environmental differences such as temperature or vibration     
        • Frequency and phase transients caused by switching clocks

        For a good discussion of some of these wander mechanisms and their impact on a network, see [3]. 

        Since wander mechanisms are different, at least in scale, and networks tend to pass or accumulate wander, industry has focused on understanding and limiting wander through specifications and standards.  

        Wander Terminology and Metrics  

        You may recall the use of the terms jitter generation, jitter transfer, and jitter tolerance. These measurements can be summarized as follows.  

        • Jitter Generation - How much jitter is output if a jitter-free input clock is applied 
        • Jitter Tolerance - How much input jitter can be tolerated without affecting device performance
        • Jitter Transfer - How much jitter is transferred if a jittery input clock is applied

        These definitions generally apply to phase noise measurements made with frequency domain equipment such as phase noise analyzers or spectrum analyzers. They are useful when cascading network elements.    

        By contrast, wander is typically measured with time domain equipment. Counterpart definitions apply as listed below.   

        • Wander Generation - How much wander is output if a wander-free input clock is applied
        • Wander Tolerance - How much input wander can be tolerated without affecting device performance
        • Wander Transfer - How much wander is transferred if a wandering input clock is applied

        Wander has its own peculiar metrics too. In particular, standards bodies such as the ITU rely on masks that provide limits to wander generation, tolerance, and transfer based on one or both of the following two wander parameters. See, for example, ITU-T G.8262 [4].

        • MTIE (Maximum Time Interval Error)
        • TDEV (Time Deviation)

        Very briefly, MTIE looks at peak-peak clock noise over intervals of time as we will discuss below. TDEV is a sort of standard deviation of the clock noise after some filtering. We will discuss TDEV next time. 

        Before going into detail about MTIE, let’s discuss the foundational measurements Time Error and TIE (Time Interval Error).  These are both defined in the previously cited ITU-T G.810.

        Time Error (TE)

        The Time Error function x(t) is defined as follows for a measured clock generating time T(t) versus a reference clock generating time Tref(t). The frequency standard Tref(t) can be regarded as ideal, i.e., Tref(t) = t. 

        x(t) = T(t) - Tref(t)

        Time Interval Error (TIE)

        Similarly, the Time Interval Error function is then defined as follows, where the lower case Greek letter "tau" is the time interval or observation interval.

        TIE(t;tau) = [T(t+tau) - T(t)] - [Tref(t + tau) - Tref(t)] = x(t + tau) - x(t)
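        Both definitions translate directly into a few lines of code over sampled timestamp data. Here is a minimal Python sketch (the function names are mine, not from the standard; the observation interval tau = n*tau0 is expressed as an integer sample count n):

```python
def time_error(T, Tref):
    """x(t) = T(t) - Tref(t), computed sample by sample."""
    return [t - tr for t, tr in zip(T, Tref)]

def tie(x, n):
    """TIE(t; tau) = x(t + tau) - x(t) for tau = n*tau0,
    evaluated at every valid start sample."""
    return [x[i + n] - x[i] for i in range(len(x) - n)]
```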

        Maximum Time Interval Error (MTIE)

        MTIE measures the maximum peak-peak variation of TIE for all observation times of length tau = n*tau0 within measurement period T. ITU-T G.810 gives the following formula for estimating MTIE. (Note: I am restricted to plain text in the formula below so please interpret "_" as preceding a subscript and "<=" as "less than or equal to".) 

        MTIE(n*tau0) = max_1<=k<=N-n [max_k<=i<=k+n (x_i) - min_k<=i<=k+n (x_i)],
        n = 1, 2, ..., N-1

        Where
        tau0 = sample period
        tau = observation time
        T = measurement period or (N-1)*tau0
        x_i = i-th time error sample
        MTIE (tau) = maximum xpp for all observations of length tau within T
        xpp or xppk = peak-to-peak x_i within the k-th observation

        The sampling period represents the minimum measurement interval or observation interval. There are many terms used in the industry that are synonymous and should be recognizable in context: averaging time, sampling interval, sampling time, etc. This could mean every nominal period if you are using an oscilloscope to capture TIE data. However, most practical measurements over long periods of time are only sampling clocks. This would correspond to a frequency counter’s “gate time”, for example, if post-processing frequency data to obtain phase data.       

        An MTIE Example

        It’s better to show you the general idea at this point. Below, I have modified an illustration after ITU-T G.810 Figure II.1 and indicated a tau=1*tau0 observation interval or window as it is moved across the data.  (The data are for example only and do not come from the standard. I have also started at 0 as is customary to show changes in Time Error or phase since the start of the measurement.) The initial xppk peak-peak value at the location shown is about 1.1 ns – 0 ns = 1.1 ns.

         

        Now slide the tau=1*tau0 observation interval right and the next xppk peak-peak value is 1.4 ns – 1.1 ns = 0.3 ns.

        If we continue in this vein to the end of the data, we will find the worst case to be between 17*tau0 and 18*tau0 and the value is 7.0 ns – 4.0 ns = 3.0 ns. Therefore, the MTIE for tau=1*tau0 is 3.0 ns.

        I have calculated the MTIE plot for this dataset in the attached Excel spreadsheet Example_MTIE_Calcs.xlsx. Note that the first value in the plot is 3 ns as just mentioned. This is a relatively simple example for illustration only.  MTIE data typically spans many decades and are plotted against masks on logarithmic scales.   

        However, even this simple example suggests a couple of items to note about MTIE plots:

        1. MTIE plots always increase monotonically.
          This is because MTIE acts as a max peak detector over an interval.  Larger variations in the data are encompassed as the observation interval increases.
        2. Large transients will mask smaller transients.
          Again, a max peak detector will not reveal smaller variations.

        Why is MTIE Useful?

        MTIE is a relatively computation-intensive measurement. So what good is this type of plot? There are at least two good reasons besides standards compliance:

        1. MTIE can be used to size buffers, as noted in [3].

        2. MTIE indicates phase behavior [4]:
          • Plateaus correspond to a phase shift.
          • Ramps correspond to a phase slope (frequency offset).

        Conclusion

        In this post, I have discussed the differences between wander and jitter and the motivation for understanding wander, and delved into MTIE, a wander metric important to standards compliance and useful in sizing buffers.

        I hope you have enjoyed this Timing 201 article. In the Part 2 follow-up post, I will discuss another important wander metric: TDEV or Time Deviation.   

        As always, if you have topic suggestions or questions appropriate for this blog, please send them to kevin.smith@silabs.com with the words Timing 201 in the subject line. I will give them consideration and see if I can fit them in. Thanks for reading. Keep calm and clock on.

        Cheers,
        Kevin

        References

        [1] ITU-T G.810 Definitions and terminology for synchronization networks
        https://www.itu.int/rec/T-REC-G.810-199608-I/en

        [2] Telcordia GR-253-CORE, Synchronous Optical Network (SONET) Transport Systems: Common Generic Criteria
        The official version is orderable but not free from
        https://telecom-info.njdepot.ericsson.net/site-cgi/ido/docs.cgi?ID=SEARCH&DOCUMENT=GR-253&.
        My old copy is Issue 3, September 2000 but the fundamentals have not changed with the newer issues.

        [3] Understanding Jitter and Wander Measurements and Standards, 2003
        http://literature.cdn.keysight.com/litweb/pdf/5988-6254EN.pdf
        This old Agilent (now Keysight) document remains a treasure, especially for SONET/SDH jitter and wander. See “Cause of wander” starting on p. 118.

        [4] ITU-T G.8262 Timing characteristics of a synchronous equipment slave clock
        https://www.itu.int/rec/T-REC-G.8262-201811-I/en

        [5] K. Shenoi, Clocks, Oscillators, and PLLs, An introduction to synchronization and timing in telecommunications, WSTS – 2013, San Jose, April 16-18, 2013
        https://tf.nist.gov/seminars/WSTS/PDFs/1-1_Qulsar_Shenoi_tutorial.pdf
        An excellent tutorial. See slide 12.

        [6] L. Cossart, Timing Measurement Fundamentals, ITSF November 2006.
        http://www.chronos.co.uk/files/pdfs/itsf/2006/workshop/05-Cosart.pdf
        Another excellent tutorial. See slides 40 – 41.

         

         

         

        Example_MTIE_Calcs.xlsx

      • Factories are Dirty: How to Protect 24 V Digital Inputs & Outputs in Industrial Environments

        Tracy Boyd | 03/17/2021 | 04:39 PM

        Industrial environments demand a lot from control systems. Devices such as programmable logic controllers (PLCs) must operate continuously alongside various components with as little maintenance and downtime as possible. However, a PLC is only as good as the input/output capabilities of the digital channels connected to the industrial ecosystem. Harsh, noisy environments and various unknown factors can all contribute to design challenges that affect digital channel reliability, resulting in possible circuit damage, downtime, and system failure. In the dual webinar sessions, Protecting 24 V Digital Outputs from the Unknown and Factories are Dirty – Protecting Industrial Digital Inputs, senior product manager Asa Kirby and applications engineers Travis Lenz and Kevin Huang describe the design challenges specific to industrial digital channels and how to mitigate them using Silicon Labs' Si834x and Si838x digital isolator devices.

        Environmental Challenges

         

        Figure 1. The Harsh Industrial Environment

         

        Industrial ecosystems present a multitude of conditions that can result in damage to digital input and output channels. The most common challenges include:

        • Limited design space. Devices are often inside small cabinets shared with other components.
        • Interoperability with components from multiple vendors due to longevity of systems.
        • Unknown sensors, actuators, and cables 
        • Wide range of power sources with varying quality. For example, undersized power sources or defective or failing power sources can cause brownout conditions.
        • Multiple noise types due to high voltage sources, shared cable trays, and long cable runs with poor shielding and grounding. 
        • Wide range of temperature and humidity conditions due to devices operating outdoors in extreme temperatures.
        • 24/7/365 operation. Controllers are expected to automate tasks with little interaction. 
        • Lifetimes measured in decades. Over time, multiple generations of components are added and serviced by various technicians.  

        Input/output-specific challenges include managing overload conditions and driving inductive loads on the output side, and ensuring device compatibility and assembly/installation protection on the input side. Industrial systems must be able to handle all these varied design challenges while operating in harsh environments.

        Silicon Labs’ Digital Isolator Solutions

        Silicon Labs' digital isolators provide optimal solutions to the unique challenges of industrial environments. Our Si834x isolated smart switches are ideal for driving resistive and inductive loads, including solenoids, relays, and lamps commonly found in industrial control systems. They are fully compliant with IEC61131-2, so they interoperate well with other channels. Each switch can detect an open circuit condition and is protected against over-current, over-voltage from demagnetization (inductive kick or flyback voltage), and over-temperature conditions. An innovative multi-voltage smart clamp can manage an unlimited amount of demagnetization energy (EAS). Si834x switches are available in Parallel or SPI input types and sourcing or sinking output types. With substantial power savings and a compact 9x9 DFN package, these switches reduce board space and design headache!

        Figure 2. Si834x Smart Clamp

         

        Our Si838x isolated multi-channel input isolators are high-density, highly flexible devices that are ideal replacements for traditional optocouplers. They offer eight channels of 24 V digital field interface in a single compact QSOP package with integrated safety rated isolation. With a few external components, this structure provides compliance to IEC 61131-2 switch types 1, 2, or 3. The input interface is built on our ground-breaking CMOS-based LED emulator technology, which means the devices can handle sourcing or sinking configurations without a power supply on the field side. By utilizing our proprietary silicon isolation technology, these devices support up to 2.5 kV RMS withstand voltage, enabling high-speed capability, high noise immunity of 25 kV/µs, reduced variation with temperature and age, and better part-to-part matching. One Si838x isolator can replace eight traditional optocouplers, making them ideal solutions for space-constrained industrial facilities.

        Figure 3: Si838x Legacy Optocoupler Replacement

         

        Watch these webinars to learn more about how our digital isolators provide optimal solutions to the unique challenges and harsh conditions of industrial environments: Protecting 24 V Digital Outputs and Factories are Dirty. To learn more about our Si834x and Si838x devices, contact your Silicon Labs sales representative.

      • Level 3 PSA Certification – What it is and Why it Matters

        Mike Dow | 03/75/2021 | 01:30 PM

Silicon Labs recently received the highest level of certification available (see press release) through the well-known Platform Security Architecture, or PSA. This Level 3 certification, which is designed to provide laboratory assessment of IoT chips with substantial security capabilities, represents a significant milestone for chip vendors targeting connected devices. We're actually the first silicon provider to achieve this, but what does it mean, and why should any device manufacturer care?

        What is Platform Security Architecture?

        Before Arm developed PSA Certified and shared it with the world, it was essentially left to each silicon vendor to develop its own security systems. Of course, this resulted in varying degrees of robustness and confusing terminology in describing the different solutions. Arm responded by spending several years talking to security experts in the semiconductor world and coming up with a universal architecture that took all of those good ideas and put them into a single security architecture specification they named the “Platform Security Architecture” with the mission of providing what they called a “Hardware Root of Trust” in a secure microcontroller.

         

Some tenets of this “Hardware Root of Trust” philosophy include the following functions:

        • Secure Boot to make sure the initial code running on the silicon can be trusted
        • Secure storage for things like secret keys
        • A secure method for updating the secure trusted code
        • A way to safely isolate secure code from non-secure code bases
        • Solid proven cryptography
        • Secure debug ports
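The first of these, Secure Boot, comes down to verifying the firmware image against an immutable key before executing it. A minimal sketch of the idea (using HMAC in place of the asymmetric signatures real bootloaders use; the key and names are illustrative, not any vendor's API):

```python
import hashlib
import hmac

# Illustrative secure-boot check: the ROM bootloader holds an immutable
# verification key and refuses to run firmware whose tag does not match.
ROM_KEY = b"immutable-key-burned-into-otp"  # hypothetical

def sign_image(image: bytes) -> bytes:
    """Tag an image at the factory (stand-in for a real signature)."""
    return hmac.new(ROM_KEY, image, hashlib.sha256).digest()

def secure_boot(image: bytes, tag: bytes) -> bool:
    """Return True only if the image is authentic; otherwise stay in ROM."""
    return hmac.compare_digest(sign_image(image), tag)

firmware = b"application v1.0"
tag = sign_image(firmware)
assert secure_boot(firmware, tag)        # trusted image is allowed to boot
assert not secure_boot(b"malware", tag)  # tampered image is rejected
```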

         

        Enter PSA Certified

If Arm had stopped there, customers would still be taking the word of silicon vendors about their PSA implementations. Arm recognized this and created the PSA Certification process, forming psacertified.org together with other heavy hitters in the security certification industry, including Brightsight, Riscure, UL Security Solutions, and TrustCB.

         

PSA Certified’s first priority was to write a simplified protection profile, starting with the PSA Architecture as a base requirement and then adding assurance levels on top of it. Protection Profiles define “what” security a vendor is claiming in a particular component. The assurance level simply indicates the extent to which the security features in the Protection Profile are evaluated and tested.

         

So PSA Certified set about creating three separate documents. The first was what they called a Level 1 questionnaire, a self-assessment of how a vendor meets the PSA “Root of Trust.” This questionnaire is submitted to TrustCB for scrutiny to prevent manipulation. The other two documents were Protection Profiles for two different levels of assurance against software and physical attacks.

         

By far the most common attacks are software attacks, which can be either local (the device is in your hands) or remote (you are connecting to the device, wired or wirelessly, via some communication medium). The PSA Level 2 Protection Profile specifically addresses scalable software attacks and details the security functions necessary in the silicon to prevent those types of attacks. PSA Level 2 is not simply a questionnaire; it also requires independent third-party labs to spend a specified amount of time using various methods to try to break the prescribed Level 2 security functions.

         

PSA Level 3 adds hardware attacks (again, either local or remote), which have historically required more time, more experience, and much more expensive equipment to execute. So, if local hardware attacks aren’t as common as software attacks, why would Silicon Labs, or any other vendor, go through the trouble of getting this high level of certification? The answer is that tools reaching the market effectively remove two of these barriers by bringing down both the experience required and the cost of equipment for a physical attack. For example, NewAE has a product called ChipWhisperer, and for a mere $3,800 you can get a starter kit that makes it possible to run some pretty effective side-channel analysis attacks, stealing secret keys from the device as they are used in crypto operations. The same company also sells a $3,300 tool called ChipShouter, an inexpensive EMF fault injection tool that can cause the software in a product to glitch (hence “glitch attacks”) and allow malware to be injected into the product, or do things like unlock a locked debug port. I am sure there are even more advanced, more deadly tools available on the dark web; these are just examples of tools that anyone can easily buy.
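To make the side-channel point concrete, here is the classic toy example such tools exploit: a naive early-exit comparison whose execution time (and power profile) depends on how many leading bytes of a secret the attacker has guessed correctly, contrasted with a constant-time check. This is a pure-Python sketch of the principle only; tools like ChipWhisperer measure power traces rather than counting loop iterations, and the secret value here is made up:

```python
import hmac

SECRET = b"\x13\x37\xc0\xde"  # hypothetical on-device key

def leaky_compare(guess: bytes) -> tuple[bool, int]:
    """Early-exit compare: the work done leaks how many bytes matched."""
    work = 0
    for a, b in zip(SECRET, guess):
        work += 1
        if a != b:
            return False, work
    return len(guess) == len(SECRET), work

def safe_compare(guess: bytes) -> bool:
    """Constant-time compare: no data-dependent early exit to observe."""
    return hmac.compare_digest(SECRET, guess)

# An attacker observes more "work" (time/power) as each guessed byte
# becomes correct, allowing the key to be recovered byte by byte.
_, w_bad = leaky_compare(b"\x00\x00\x00\x00")   # stops after 1 byte
_, w_good = leaky_compare(b"\x13\x37\x00\x00")  # stops after 3 bytes
assert w_good > w_bad
assert safe_compare(b"\x13\x37\xc0\xde")
```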

         

        The Growing Risks of Inaction Against Physical Attacks

With these relatively cheap tools, a criminal enterprise can pretty easily do some serious damage to a brand, ecosystem, or the bottom line of a company. An easy way to make money if you’re an organized cyber criminal is to steal a company’s intellectual property and sell it to someone who has the resources to produce knock-offs of those devices. It’s estimated that 10 percent of consumer electronic devices sold on the web are counterfeit, including sophisticated devices like Wi-Fi routers. Companies try to protect against IP theft by locking the debug port to prevent someone from simply dumping the whole contents of the product. With the ChipShouter tool, you can simply perform a glitch attack on the software that locks the debug port and boom, all the IP comes spilling out.

         

Another example: you have a sophisticated attestation procedure for your ecosystem to protect against rogue or fake devices joining your network. This requires a secure identity in the device and a secure handshake to verify the device is authentic. With ChipWhisperer and a real device in your hands, you can steal that secret identity and easily clone the device.
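The attestation handshake described above is typically a challenge-response: the server sends a fresh nonce and the device proves possession of its secret identity without ever transmitting it. A minimal HMAC-based sketch of the pattern (all names and keys are illustrative; real ecosystems generally use certificate-backed asymmetric keys):

```python
import hashlib
import hmac
import os

DEVICE_KEY = b"per-device-secret-identity"  # hypothetical, held in secure storage

def device_respond(nonce: bytes) -> bytes:
    """Device proves it holds DEVICE_KEY without revealing it."""
    return hmac.new(DEVICE_KEY, nonce, hashlib.sha256).digest()

def server_verify(nonce: bytes, response: bytes, expected_key: bytes) -> bool:
    """Server recomputes the expected response and compares in constant time."""
    expected = hmac.new(expected_key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = os.urandom(16)  # a fresh challenge each time defeats replay attacks
assert server_verify(nonce, device_respond(nonce), DEVICE_KEY)

# A clone without the key (which side-channel attacks try to extract)
# cannot produce a valid response:
fake = hmac.new(b"wrong-key", nonce, hashlib.sha256).digest()
assert not server_verify(nonce, fake, DEVICE_KEY)
```

This is exactly why extracting the secret identity with a side-channel attack is so damaging: once the key is out, the clone passes the handshake.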

         

        Silicon Labs is committed to anticipating our customers’ security needs and addressing them before they become an issue. That’s why we’ve adopted the PSA Architecture and achieved its highest level of certification - to create products that proactively stay ahead of this ‘cyber mafia’ rather than being forced to react to them after they’ve wreaked havoc.

        For more information on how Silicon Labs is securing the IoT, visit silabs.com/security.

      • Smart Medical Devices are Here to Stay – Securing them is Critical

        Emmanuel Sambuis | 03/64/2021 | 10:03 PM

        The healthcare industry is very focused on treating chronic diseases, providing effective aging-in-place support for an increasingly elderly population, and ensuring a smooth transition between inpatient hospital care and outpatient home care. The coronavirus and its impact on remote care have underscored and accelerated the importance of and demand for continuous patient monitoring provided by intelligent sensor solutions connected remotely to a cloud-based infrastructure. This has triggered the need to build secure, low-power wireless end-products that keep end-user data privacy at the core of their security architecture.

        That was the topic of discussion I had the pleasure of participating in during a recent Parks Associates Connected Health Summit panel discussion regarding smart medical devices.

        I encourage you to watch the discussion, which spanned a range of challenges and opportunities facing smart medical devices, perhaps most importantly the necessity to ensure healthcare data is kept private and secure.

        The rise of connected medical devices has caught the attention of hackers, who are launching more attacks on operational and infrastructure targets, typically using ransomware schemes to enrich organized crime groups. As highlighted at the RSA conference in early 2020, the level of sophistication of these ransomware attacks is growing exponentially, and – if left unprotected – vulnerable wireless devices are an effective means to compromise systems remotely using a wide variety of attacks. In order to combat the threat of cybercrime, it’s clear that the individual components being used in medical devices must have an enhanced level of security robustness that delivers security from chip to cloud.

Bluetooth® Low Energy (BLE) has become the most popular wireless connectivity solution for patient monitoring products, and the Bluetooth SIG began introducing protocol-level security features in 2015 with the ratification of BLE 4.2.

In addition to the BLE 4.2 security protocol, more stringent system-related security augmentations must be deployed to most effectively secure data and privacy. This is especially true for BLE, as end-user/patient information is often communicated to the cloud via a smartphone and a software application, which together present vulnerabilities for hackers attempting to gain control of medical sensors.

Additional security starts with identifying the end-product application and its silicon ICs the first time those ICs initiate a connection to the cloud infrastructure. It is also critical to understand that embedded systems assume the proper software is being executed. To achieve this, a Root of Trust (RoT) must be in place so that true software authentication is performed before any code execution. This ensures that malicious software can be detected and reported, and that additional measures can be deployed as needed, such as immediately cutting off the potentially infected medical product from the network.

Many medical products have long lifecycles and often remain available for purchase for several years after they are first produced. All the while, hacking techniques continue to evolve: new tools can help expose weaknesses, new hacks can occur, and new flaws can be discovered. It is therefore critical that connected medical devices can be updated remotely through secure over-the-air (OTA) updates.
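A secure OTA update, in essence, combines the same image authentication with a version check, so an attacker cannot push a tampered image or roll the device back to an old, vulnerable release. A simplified sketch of that logic (HMAC standing in for real signatures; key and version values are illustrative):

```python
import hashlib
import hmac

UPDATE_KEY = b"vendor-update-signing-key"  # hypothetical signing key
installed_version = 2                      # version currently on the device

def sign_update(version: int, image: bytes) -> bytes:
    """Tag version + image together so neither can be swapped independently."""
    return hmac.new(UPDATE_KEY, version.to_bytes(4, "big") + image,
                    hashlib.sha256).digest()

def apply_ota(version: int, image: bytes, tag: bytes) -> bool:
    """Accept only authentic images that are newer than what is installed."""
    if not hmac.compare_digest(sign_update(version, image), tag):
        return False  # tampered or unsigned image is rejected
    if version <= installed_version:
        return False  # rollback to an older, vulnerable version is blocked
    return True

good_tag = sign_update(3, b"fw v3")
assert apply_ota(3, b"fw v3", good_tag)               # valid upgrade accepted
assert not apply_ota(3, b"fw v3 hacked", good_tag)    # fails authentication
assert not apply_ota(1, b"fw v1", sign_update(1, b"fw v1"))  # rollback blocked
```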

Silicon Labs made a major announcement in 2020 with its Secure Vault technology on EFR32 Series 2. Secure Vault offers an impressive list of technical hardware and software features that can be used to develop extremely robust, secure IoT wireless solutions. These features include Secure Loader with Root of Trust, Secure Debug with lock and unlock capabilities, Secure Key generation and storage, and Advanced Hardware Cryptography with DPA countermeasures. Secure Vault has achieved tremendous recognition in the market and earned a gold medal in the 2020 LEAP (Leadership in Engineering Achievement Program) Awards Connectivity category.

PSA Certified – a respected security certification body for Internet of Things (IoT) hardware, software, and devices created by Arm Holdings – officially granted Level 3 status to Silicon Labs’ EFR32MG21 wireless SoCs with Secure Vault. Silicon Labs is the world’s first silicon innovator to achieve PSA Certified’s highest level of IoT hardware and software security protection.

        Secure Vault can help ensure that BLE-connected patient monitoring devices such as Continuous Glucose Meters and Pulse Oximeters remain secure, safeguarding private and confidential healthcare data.

      • Silicon Labs Details the Future of Embedded IoT Solutions at Embedded World 2021

        Mike Silverman | 02/57/2021 | 01:15 PM

As most in our industry are no doubt well aware, Embedded World 2021 is happening this week. Although this year’s event is virtual instead of in-person as it typically is in Nuremberg, Germany, embedded technology innovators from around the world will be logging in to participate, and Silicon Labs is no exception.

         

        In fact, Silicon Labs will be sharing our IoT expertise throughout Embedded World 2021, with presentations and papers focused on a variety of topics: the compelling advantages of Wi-SUN mesh technology for smart city utility applications, the most pressing IoT security issues the embedded industry faces today, and how to prevent bad actors from penetrating embedded hardware and software applications.

We’ll also be front and center in an expert panel discussion exploring the latest developments in wireless connectivity solutions for the IoT, ranging from interoperability to security and reliability, and providing an outlook on the future.

         

        Here’s the roster of the Silicon Labs experts presenting this week, what they’ll be presenting, and when. We hope you’ll join us for all of them!

         

        March 3rd:

        • Soumya Shyamasundar, IoT smart city product manager, and Abitzen Xavier, senior marketing manager, and Wi-SUN Alliance board member, will present “Wi-SUN – Key to Unlocking Massive IoT.” Wireless Smart Ubiquitous Network (Wi-SUN) is the leading IPv6 sub-GHz mesh technology for smart city and smart utility applications. Wi-SUN brings Smart Ubiquitous Networks to service providers, utilities, municipalities/local government, and other enterprises, by enabling interoperable, multi-service, and secure wireless mesh networks. Wi-SUN can be used for large-scale outdoor IoT wireless communication networks in a wide range of applications covering both line-powered and battery-powered nodes. This presentation aims to illustrate the benefits of Wi-SUN as a technology and discuss solutions to grow Wi-SUN to create smart connected sustainable cities.

         

        March 4th:

• Jakob Buron, IoT senior staff engineer, will present “Analyzing a Real-World Wireless IoT Encryption and Authentication Protocol Using a Threat Modeling Framework.” This presentation analyzes the security properties of the Z-Wave Security2 framework, considering the residential home environment it is designed to operate in. The analysis draws on 10 years of experience operating and maintaining real-world secure wireless residential networks. A selection of embedded mitigation techniques will be presented, and their effect on selected attacks will also be analyzed.

         

• Josh Norem, senior systems engineer, will present “Security is a System Level Problem: a Case Study.” Norem will present two case studies highlighting how the interaction between two functional entities can result in non-functional security. The first looks at a cross-discipline issue, showing how the work of one group (in this case, Product Test Engineering) can unintentionally bypass security processes and render device features such as secure boot or debug port locking ineffective. The second looks at an implementation of secure key storage that, when combined with an ECC accelerator, cannot store keys securely, despite both blocks being secure in isolation. These case studies will highlight the importance of conducting comprehensive security analyses and the types of issues that security engineers should be on the lookout for.

         

• DeWitt Seward, IoT principal engineer, will present “Fuzz Attacks for Embedded Network Devices” and “Sidechannel Analysis in Embedded Devices.” The Fuzz Attacks presentation covers fuzz testing, which is widely used to find bugs (and earn bug bounties) in operating systems and web protocols. As more deeply embedded devices become connected to the internet, they are subject to the same types of attacks as other internet-connected devices, so fuzz testing can and should be used to harden embedded networking stacks. Seward’s Sidechannel Analysis presentation will demonstrate how easy it can be to extract an AES key using side-channel analysis in the real world, and how to use this knowledge to select side-channel-hardened solutions for future embedded system product developments. As more connected devices are deployed, manufacturers need to make sure their devices are built on secure platforms.

         

        March 5th:

        • Anders Pettersson, IoT field marketing director, will participate in an expert panel entitled “Embedded Connectivity in IoT – Quo Vadis.” This panel will explore the latest developments in wireless connectivity solutions for IoT, touching on a range of topics from interoperability to security and reliability, as well as provide an outlook on the future of the incredibly dynamic IoT space.

        We hope you’ll join our embedded IoT experts online at Embedded World 2021. For more information on Silicon Labs’ state-of-the-art security solutions, visit silabs.com/security. For more information regarding the advantages of Wi-SUN for smart city mesh networking applications, we encourage you to read our recent guest blog Q&A with Wi-SUN president and CEO Phil Beecher.

      • A Trillion-Dollar Question for Industrial IoT

        Mikko Niemi | 02/49/2021 | 04:20 PM

Yes, the title of this post is correct. In 2017, ARC Advisory Group estimated that global downtime in the manufacturing industry costs in the range of one trillion dollars annually. That is a lot of money; to put it into perspective, global GDP in 2019, according to the World Bank, was 87.8 trillion dollars. It is not surprising that reducing downtime is one of the most attractive outcomes industrial IoT can provide.


Why does downtime cost so much, and how can it be reduced?

What options exist for reducing downtime? Predictive maintenance has proven to be a cost-efficient application that addresses downtime challenges and provides the ROI to justify projects. IoT Analytics forecasts that the predictive maintenance market will grow at a 39% CAGR to $23.5 billion in 2024. What makes predictive maintenance so attractive is that it addresses two key issues at the same time. If machinery or components like motors, pumps, and bearings are run until they fail, the failure can cause more costly damage to the equipment. In addition, there is the time spent by staff trying to get replacement parts on site and then working overtime to fix the issue. All of this adds to the final cost of an unplanned downtime event and contributes to lost production. On the other hand, if the equipment is over-serviced by changing wearing parts too often or too early, downtime also increases because of the too-frequent scheduled service breaks. In predictive maintenance, algorithms use sensor data collected from the machinery and components to warn the operator of a future failure condition ahead of time, allowing ample time to schedule and plan the maintenance before the failure occurs.

        Key care-abouts in predictive maintenance

Predictive maintenance solutions commonly rely on detecting anomalies in the vibration fingerprints of motors, pumps, bearings, and other devices that run industrial and commercial processes. Because cabling costs for adding vibration sensors are prohibitively high, these sensors typically leverage wireless communications and are battery-powered. We have some unique advantages for predictive maintenance solution developers: our products include industry-leading low-power wireless SoCs and modules. Using the built-in low-power modes, the sensors benefit from fast wakeup times and can balance time between sleep and active modes. This power optimization translates into longer battery life, which means a lower total cost of ownership (TCO) for the end customer because the sensors require less maintenance during their lifetime.
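The battery-life benefit of duty cycling is easy to quantify: the average current is the time-weighted mix of sleep and active current, and battery life is simply capacity divided by that average. A rough sketch with hypothetical figures (these are illustrative numbers, not Silicon Labs datasheet values):

```python
def battery_life_years(capacity_mah: float, sleep_ua: float,
                       active_ma: float, active_duty: float) -> float:
    """Estimate battery life from a sleep/active duty cycle.

    capacity_mah: battery capacity in mAh
    sleep_ua:     sleep-mode current in microamps
    active_ma:    active-mode (radio on) current in milliamps
    active_duty:  fraction of time spent active (0.0 - 1.0)
    """
    # Time-weighted average current in mA.
    avg_ma = active_ma * active_duty + (sleep_ua / 1000.0) * (1 - active_duty)
    # Hours of life, converted to years.
    return capacity_mah / avg_ma / (24 * 365)

# Hypothetical sensor: 1000 mAh cell, 2 uA sleep, 10 mA active 0.1% of the time.
life = battery_life_years(1000, 2.0, 10.0, 0.001)
print(f"~{life:.1f} years")
```

The same arithmetic shows why edge processing pays off: cutting the active duty cycle in half nearly doubles the estimated lifetime once sleep current is small.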

        Choosing the best-fit wireless technology for your application

The environments in which predictive maintenance solutions are deployed vary to a large degree. This is why the solution developer should partner with a communications expert like us that can support a wide range of wireless technologies in multiple frequency bands. For longer-range needs, technologies such as Wi-SUN, Mioty, or other sub-GHz options are more suitable. Local networks within a factory or a plant could benefit from Bluetooth and mesh technologies, or leverage existing dual-band Wi-Fi infrastructure to connect the sensors.

        Embedded AI/ML changing the landscape for predictive maintenance

Artificial intelligence and machine learning (AI/ML) have extended their reach from cloud-level applications requiring massive computing resources to workloads that can be efficiently executed on Cortex-M-class microcontrollers. Silicon Labs' AI/ML partners have built tools that allow predictive maintenance algorithms to run in just a few kilobytes of RAM. This edge pre-processing means the local radio can stay off until there is an anomaly that needs to be reported to the back-office system and the operator. This further conserves precious battery capacity and improves the TCO.
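As an illustration of how little such edge pre-processing can require, a basic vibration anomaly detector can be as simple as comparing a window's RMS energy against a healthy baseline and waking the radio only on deviation. A toy sketch (the threshold, window size, and sample values are arbitrary; the partner tools mentioned above use trained ML models rather than a fixed threshold):

```python
import math

def rms(window):
    """Root-mean-square energy of a window of vibration samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def is_anomalous(window, baseline_rms, threshold=1.5):
    """Report (and power up the radio) only when vibration energy jumps."""
    return rms(window) > threshold * baseline_rms

# Baseline learned from a healthy-motor vibration fingerprint (made-up data).
baseline = rms([0.9, -1.1, 1.0, -0.95])

assert not is_anomalous([1.0, -1.0, 1.05, -0.9], baseline)   # normal: stay asleep
assert is_anomalous([3.0, -2.8, 3.1, -2.9], baseline)        # fault: wake radio
```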

        How to get started?

If you want to take part in solving this trillion-dollar question, a good place to start is by exploring our Thunderboard Sense 2 Evaluation Kit. This kit integrates wireless communications with an array of sensors, including an accelerometer and a temperature sensor, which are the most common in predictive maintenance applications. You can also browse our Design Network for partners who can help you design solutions that run on our wireless SoCs and modules. Finally, take a look at our recent case study on Sensemore, which chose our pre-certified Bluetooth modules for its predictive maintenance sensor. This decision allowed them to fast-forward their development efforts and get to market more quickly.
         

      • IoT Hero Arrow Growhouse Helps Commercial Farmers Use Less Water, Space, and Pesticides

        Andrea Mencia | 02/42/2021 | 07:38 PM

        We recently had the opportunity to speak with Dave DeMona, Arrow Electronics’ engineering manager for lighting, about Arrow’s new smart horticulture platform: Arrow Growhouse. Concerns about global population growth, sustainability, and ecologically friendly farming are encouraging growers to adopt innovative technologies to improve farming practices. 

        Late last year, Arrow Electronics – one of the leading electronics distribution companies – introduced a new IoT platform with superior lighting controls. These controls help the commercial farming industry improve crop yield and gain better control of their indoor crops, decreasing water, space, and pesticide usage. The platform also equips growers with remote wireless control and monitoring of indoor farming operations and conditions. The demand for smart agriculture products such as this one has been growing rapidly. Dave explains below what prompted Arrow to build the scalable and smart horticulture system and how exactly it works.  

        Can you tell us about Growhouse? 

        The Arrow Growhouse platform is a flexible, scalable, smart agriculture solution for monitoring and controlling key aspects of a commercial growing environment. It combines environmental and plant-level monitoring and multichannel lighting control into a single, cloud-based user interface with both a web and mobile app. It's compatible with most of the horticulture luminaires currently in the market, and the underlying architecture allows for easy development of additional sensing and control modules based on a customer's individual needs. 

        What components are included in the platform? 

        The system can be bought either piecemeal or as a complete system, depending on what the user needs. The basic kit includes a gateway that communicates back to the cloud and a multichannel LED controller that connects to the horticulture luminaire itself, allowing the user to control the different color channels. The kit also includes a soil sensor to monitor the moisture level and the pH of the soil. Customers can add more sensors and controllers as needed.  

        The architecture of the system is customizable: if a farmer has unique needs and wants to monitor aspects of the system that the base package doesn't cover, it's easy for us to develop additional sensor modules to fit their needs. 

        What was the inspiration behind creating this smart horticulture solution? 

        Over the past few years, we've been involved with a number of different horticulture and horticulture-adjacent customers. We noticed that – although clients had great ideas on how to optimally grow plants – there was an underlying set of fundamental requirements. This client base is predominantly growers, not hardware and software experts, so we thought: What if we built a base platform that could be individualized and customized for their unique needs?  

        How long has the product been available? 

        The product was launched last year and was enabled by a combination of recent technology advancements:  

        • The maturation of LED technology enables practical implementation of controllable LED luminaires for horticulture. Suddenly, farmers could control the spectrums that a plant sees throughout its growth, which can trigger specific characteristics.  

        • In addition, advances and cost reductions in communication and sensing started to allow for better monitoring of what's happening at the plant level.  

These combined factors sparked a revolution a few years ago, and this is reflected in the feedback surrounding Growhouse to date. Systems have historically been disparate and manual (such as lighting, environmental controls, and fertigation), but Growhouse integrates all of the monitoring and control capability into a single, intuitive user interface.

        Why did you select Silicon Labs’ technology for your platform? 

Like many IoT platforms, Growhouse involves a gateway, end devices, and communication to a cloud and a user interface. Communication between our end devices is via Zigbee, and communication for commissioning is via Bluetooth. We chose Silicon Labs Zigbee modules for the radio because they offer a high-performing, integrated dual technology that tackles our needs.

        What are the primary market drivers of smart horticulture? 

        Growth in the market is due to a variety of needs: resource conservation, population growth, a desire for local production, reduced transport of produce and grown items, and the reduced use of pesticides and fertilizers. A lot of these needs tie back to the intent of creating an ecologically sustainable method of farming.  

        Smart agriculture also provides a highly controlled environment, so growers end up with not only faster-growing crop yields, but more consistent yields with less waste fallout. Adding control to different aspects of the growth environment allows the grower to ensure their crop is behaving the way they want it to, when they want it to.  

        There has been a boom in indoor horticulture in recent years. How is indoor farming better for the planet? 

        It really is all about the control of the plant environment. When you're growing outside, you're subject to the whims of the weather. With indoor horticulture, the grower has complete control over that environment, leading to significantly reduced water usage and needs for fertilizers and pesticides. Indoor agriculture also allows for farming in regions that may be unsuitable for certain outdoor crops. For example, in some areas in Africa where you really can’t grow certain crops in the ground, growing food within a warehouse or container allows people to cultivate locally. 

        How do you see IoT technology supporting sustainable agriculture in the future? 

        We look at the evolution of farming as the evolution of human history. Until recently, we haven't had a lot of insight and data into how to farm better. The direction I see IoT going in smart agriculture is in the implementation of AI: doing something with all the newly derived data now being gathered on a more and more granular level. I think we will see a continuation of automation from the time the seed is planted in the ground until it's ready to harvest.  

        Everything will be based on the sensors' data and the rules developed, enabling better quality and crop consistency, less fallout, and more locally grown crops. We'll start seeing smaller versions of these systems at a local level – whether that be for a small city or a college campus – all the way to the point where we may have these systems in our own homes, much like a micro-garden in your kitchen. Regardless of how green your thumbs are, you'll be able to create quality produce at home, and get rid of all the transportation needs and other external factors. 

      • 1
      • 2
      • 3
      • 4
      • 5
      Next

      Tags

      • Wireless
      • High Performance Jitter Attenuators
      • EFR32FG22 Series 2 SoCs
      • EFR32MG21 Series 2 SoCs
      • Security
      • Bluegiga Legacy Modules
      • Zigbee SDK
      • ZigBee and Thread
      • EFR32BG13 Series 1 Modules
      • Internet Infrastructure
      • Sensors
      • Wireless Xpress BGX13
      • Blue Gecko Bluetooth Low Energy SoCs
      • Z-Wave
      • Micrium OS
      • Blog Posts
      • Low Jitter Clock Generators
      • Bluetooth Classic
      • Makers
      • Flex SDK
      • Tips and Tricks
      • timing
      • Smart Cities
      • Smart Homes
      • IoT Heroes
      • Reviews
      • RAIL
      • Simplicity Studio
      • Tiny Gecko
      • EFR32MG22 Series 2 SoCs
      • Mighty Gecko SoCs
      • Timing
      • Temperature Sensors
      • Blue Gecko Bluetooth Low Energy Modules
      • Ultra Low Jitter Clock Generators
      • General Purpose Clock Generators
      • EFR32BG22 Series 2 SoCs
      • Industry 4.0
      • Giant Gecko
      • 32-bit MCUs
      • Bluetooth Low Energy
      • 32-bit MCU SDK
      • Gecko
      • Microcontrollers
      • Jitter Attenuators
      • EFR32BG21 Series 2 SoCs
      • News and Events
      • Wi-Fi
      • Bluetooth SDK
      • Community Spotlight
      • Clock Generators
      • Biometric Sensors
      • General Purpose Jitter Attenuators
      • Giant Gecko S1
      • WF200
      • Flex Gecko
      • Internet of Things
      • 8-bit MCUs
      • Oscillators
      • Wireless Jitter Attenuators
      • Isolation
      • Powered Devices
      • Power

