This week, we’ve introduced a Wireless Gecko software solution created to simplify industrial and commercial IoT applications using sub-GHz wireless connections by adding Bluetooth connectivity. The new hardware and software solution enables simultaneous sub-GHz and 2.4 GHz Bluetooth low energy connectivity for commercial and industrial IoT applications, such as smart metering, home and building automation, and commercial lighting.
This is important for the industrial and commercial sectors for several reasons – for one, it’ll make it much easier for people working in these environments to set up, control, and monitor sub-GHz IoT devices using Bluetooth low energy mobile apps.
Sub-GHz wireless protocols are used extensively in industrial and commercial settings because many of them require a combination of energy efficiency, long battery life, and extended range for remote sensor nodes. Proprietary sub-GHz protocols allow developers to optimize their wireless solution to their specific needs instead of conforming to a standard that might put additional constraints on network implementation. With our new software solution, sub-GHz protocols can still be utilized for their benefits, but users can also easily manage the system using Bluetooth mobile apps on a variety of devices, such as tablets or smart phones.
Sub-GHz networks typically run at low data rates, ranging from simple point-to-point connections to large mesh networks and low-power wide area networks (LPWAN). By adding Bluetooth Low Energy connectivity to wireless networks in the sub-GHz band, developers can deliver new capabilities such as faster over-the-air (OTA) updates and deploy scalable, location-based service infrastructure with Bluetooth beacons.
Single Chip Reduces Cost by 40 Percent
IoT developers stand to gain tremendous development benefits by avoiding the complexity of two-chip wireless architectures. By using a single chip with both sub-GHz and BLE connectivity, developers can simplify hardware and software development, which can speed time-to-market and reduce bill-of-materials (BOM) cost and size by up to 40 percent.
Accenture estimates industrial IoT could add $14.2 trillion to the global economy by 2030, making the deployment potential of this solution especially massive. Any new technology development, such as this one, that helps developers control and monitor industrial and commercial devices and data more easily leads to efficiency and economic gains for both businesses and users.
Mobile control applications are often a crucial piece of industrial and commercial automation, giving system operators a quick and easy way to control equipment. For instance, commercial lighting depends heavily on mobile devices, which control lighting on/off schedules, energy-efficient modes and rules, and dimming based on occupancy using ambient light sensors. Oftentimes, the mobile app may be the only control interface installers, designers, and site managers have for project commissioning and configuration.
Bluetooth connectivity allows the device apps and interface to be simple, which can make a difference in user adoption, as many lighting and commercial controls can be complex and difficult to manage.
Our new solution will clearly yield impressive benefits for both developers and the users of the industrial applications. Fortunately, the new multiprotocol software is now available using Silicon Labs’ EFR32MG and EFR32BG Wireless Gecko SoCs. Check out more details here if you’re working on a product that could benefit from the solution.
We recently had the chance to speak with Jean-Noel Paillard, advanced studies manager at Hager Group, a 62-year-old German-based company providing solutions and services for electrical installations in residential, commercial, and industrial buildings.
With an extensive history of making electronics work seamlessly within buildings, the family-owned Hager Group has a unique perspective on today’s connectivity issues. Hager Group regularly solves multiprotocol and interoperability issues for its global client base, and recently released a new smart home platform for building automation. Jean-Noel shared his insight on why the company developed the new platform and explained some of the current challenges associated with connectivity standards.
Tell me about the importance of multiprotocol connectivity and why it’s important to your customer base.
New applications and devices are coming out so quickly, making interoperability a key challenge in today’s technology landscape. There are numerous wireless protocols on the market, and each has its own connectivity strengths and weaknesses, depending on the application. So instead of building new connectivity protocols for each application, we use existing standards and figure out the best ones for each application. We work hard to find the right connections and build the bridge to create the right technology for each of our services and solutions.
Sometimes it seems as if the market sees existing wireless standards as a “standards battle” vs. everyone trying to work together. Do you think eventually one standard will emerge as the winner?
In my opinion, no connectivity protocol has emerged as the winner yet, and it’ll be an extremely long time before that happens - if it does at all. In the meantime, you have to be agile and willing to work with numerous technologies and standards. Hager Group has the right tools and technology on board to do this effectively, and it is one of the core values we provide to our customers.
Can you tell me about your new smart home platform and smart RF module? What was the impetus for creating the technology?
The first driver for us to create the platform was the size and growth potential of the smart home market. We built the platform to ensure we could serve our customers as successfully as possible as IoT adoption in buildings continues to grow. Depending on the country or region we are serving, the technologies and standards vary greatly, creating interoperability and wireless challenges. By building a new platform, we could overcome this challenge and address all kinds of services and solutions, regardless of region. But in order to do this, we needed a platform that could handle multiple frequencies and protocols.
We built the platform with our OEM customers in mind, as they have specific requirements and really need a platform addressing a variety of protocols. In addition to being multiprotocol, we knew the platform had to be as small as possible, require low power, and be able to address numerous applications.
Is that how Silicon Labs’ Wireless Gecko became involved - size and energy consumption were important?
Yes, exactly. The Gecko is tiny and great in terms of RF transmitting and receiving, plus the security and encryption elements on the SoC are ready to implement and best in class.
What was the technology evaluation process like?
We began the project in 2015 and we were originally looking at three companies, with Silicon Labs being one of them. We ended up rejecting one company early on based on its proposal specs, and the other two competing technologies were directly benchmarked on technical design and technical experimentation. We conducted a good amount of measurement and tests, and finally, after a 4-6-month process, we selected Silicon Labs as the best and the most evolutionary solution.
Tell me a little bit about the 2-year development process for the platform. What were your obstacles and/or surprises?
From a timing standpoint, we wanted to be aggressive, so we worked with Silicon Labs in a tight partnership to build the optimum design together. We provided the right specification needs to enable your team to adapt the design to our requirements. Technically speaking, the big challenge on our side was understanding the capabilities of your platform, because it’s a comprehensive platform. That’s why a solid partnership was so important in this design process - both teams reacted fast to changes and development hurdles and always figured out the right answer at the right moment. The multiprotocol management was difficult because each time we modified one protocol, we had to verify that the other protocol wasn’t affected. We were constantly checking to make sure one protocol wasn’t compromising the performance of the other, or of the platform as a whole.
I know it was just released in January, but what has response been like so far?
Yes, we implemented the platform for the first time in the Hager solution introduced in January: “hello,” which makes residential distribution boards connected, serviceable, and safe.
Hager Group has been protecting homes and families for many years thanks to its reliable and safe electrical installations. As an innovative industrial company, we constantly extend beyond our technological foundations to meet the growing demand for connected devices and smart solutions. An example of this is the breakthrough solution “hello,” a connected plug-in device for an existing electrical installation. It provides real-time alerts in case of electrical issues to guarantee peace of mind for end users. Away for the weekend? Got some special wines in the fridge? Meat or specific dishes in the freezer?
hello ensures power availability on important circuits and appliances and will let you know in case of any electrical issue. Your wine cellar can therefore stay at the right temperature. We are currently working on new implementations that will come to market soon.
Where do you see IoT and connectivity heading over the next 5-10 years?
Two big current trends requiring a lot of IoT connectivity are robotics and artificial intelligence. These new technologies will change IoT from an obedient system into an autonomous one, where you don’t have to think about your system – it works on its own. Today you still have to ask your system to do something, and I think tomorrow you won’t need to.
Moreover, I see IoT solutions, services, and applications being used more for mobility in the future, especially when we speak about transportation, such as electric vehicles. The challenge will be to connect electric cars and the smart home together in a secure, efficient, and eco-friendly way.
We want to be very clear: installed, previously paired Z-Wave devices are secure and not vulnerable to a downgrade attack. This represents practically all 100 million Z-Wave devices in homes today.
This type of attack would require physical proximity to the device during the pairing (inclusion) process. The pairing is done during initial installation or reinstallation. Pairing must be initiated by the homeowner (or installation professional), which means the homeowner is present at the time of the attempted attack. It would not be possible to execute an attack without the homeowner becoming aware that the link is running S0, as they would for any other S0 device added to the S2 controller.
We take what Pen Test Partners has reported very seriously and are taking steps to tighten the certification requirements regarding warnings presented to the user. We also believe any warning for a security step needs to be explicit. We are updating the specification to ensure that any user will not only get a warning during a downgrade to S0 but will have to acknowledge the warning and accept it to continue inclusion.
We believe it's important for all smart home devices to have the highest possible levels of security available, and our development team will continue to work with the security community to make improvements to the Z-Wave specification.
We have yet to see the full-fledged economic value of billions of new IoT devices entering multiple industries, though we can prepare for what we know will come along with them. As with any new innovation or market, malicious adversaries will lurk and invade for their own piece of the pie.
Despite the looming security threats, companies and developers designing new IoT products often like to focus their attention on the application itself versus proper security. Security slows the time-to-market and is often viewed as inconvenient because it increases cost.
But no one wants to design an application that’s prone to hacking or data theft. Undesirable events like high-profile hacks can lead to serious brand damage and loss of customer trust; the worst case is a slowdown or permanent reduction in the adoption of IoT.
When it comes to security, IoT is no different than previous technology innovations such as PCs, smartphones, and the Internet itself. If security is not addressed sufficiently by the creators of the technology – in this case, IoT product designers - the oversight could have devastating effects on the entire market, and it will no doubt have negative consequences for the individual companies opting to design irresponsibly.
Varying Degrees of Security
To avoid these scenarios, designers need to change how they view IoT security. Unfortunately, it’s not as simple as a “to have or not to have” decision. Security is not binary. The reality is that there are many different levels of security, and a device can only be considered secure in the context of a given attacker: it is secure when its level of security exceeds the capabilities of that attacker.
Moreover, the capabilities of the attacker are typically non-static, and therefore, the security level will change over time. The improved capabilities of the attacker can come about in several different ways, from the discovery and/or publication of issues and vulnerabilities to broader availability of equipment and tools.
History has taught us some valuable lessons about how fast security threats can change for a device. The typical lifetime of an IoT device depends on the application, but in industrial applications, 20 years is a common timeframe. A device launched in 1998, for example, was once only vulnerable to nation-state attacks; today it must withstand differential power analysis (DPA) attacks by hobbyists with $300 worth of tools, some spare time, and lots of coffee. Predicting the future capabilities of a class of adversaries is very difficult if not impossible, especially over a 20-year timespan. What will the adversary look like in 2040? One might speculate whether it will even be human.
The only reasonable way to counter future attack scenarios is for the security of the device to evolve with the increased capabilities of the adversary. This requires IoT security with upgradable software.
Of course, there is functionality requiring hardware primitives, which cannot be retrofitted via software updates. However, it is incredible what can be solved in software when the alternative is a truck roll. Still, it is impossible to predict and account for all future attacks.
Secure updates involve authenticating, integrity checking, and potentially encrypting the software for the device. The software handling such security updates is the bootloader, typically referred to as a secure bootloader. The secure bootloader itself, along with its corresponding cryptographic keys, constitutes the root-of-trust in the system and needs to have the highest level of security. A secure bootloader is functionality IoT vendors should expect to get from the IC manufacturers.
The authentication and integrity check should be implemented using asymmetric cryptography, with only public keys in the device. This way, it is not necessary to protect the signature-checking key in the devices. Since protecting keys in deployed devices is (or at least should be) harder than protecting keys in control of the device owner, it is also acceptable to use the same bootloader keys for many devices.
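As an illustration of that flow, here is a minimal Python sketch of signature-checked image acceptance. The toy textbook-RSA keypair, the tiny modulus, and the 4-byte signature field are purely hypothetical stand-ins for a real scheme such as ECDSA with a full-size key; the point is only the asymmetry — the signing exponent stays with the vendor, and the device stores nothing but the public values.

```python
import hashlib

# Toy textbook-RSA keypair (illustration only; far too small to be secure).
P, Q = 1009, 1013
N = P * Q                     # public modulus, stored on the device
E = 65537                     # public exponent, stored on the device
D = pow(E, -1, (P - 1) * (Q - 1))  # private exponent: stays with the vendor

def digest_int(payload: bytes) -> int:
    # Reduce the SHA-256 digest mod N so the toy key can sign it
    # (a real scheme signs the full digest with proper padding).
    return int.from_bytes(hashlib.sha256(payload).digest(), "big") % N

def sign_image(payload: bytes) -> bytes:
    # Vendor side: append a signature over the payload digest.
    sig = pow(digest_int(payload), D, N)
    return payload + sig.to_bytes(4, "big")

def verify_image(image: bytes) -> bool:
    # Device side (secure bootloader): check the signature using
    # only the public key (N, E) — no secret needs protecting here.
    payload, sig = image[:-4], int.from_bytes(image[-4:], "big")
    return pow(sig, E, N) == digest_int(payload)
```

Because verification needs only `(N, E)`, extracting the device’s flash contents gains an attacker nothing toward forging updates — which is exactly why the text recommends asymmetric cryptography for this step.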
Encrypting the Software
Encrypting the software running on the IoT device has two benefits. First, it protects what vendors consider to be intellectual property (IP) from both competitors and counterfeiting. Second, encryption makes it more difficult for adversaries to analyze the software for vulnerabilities. Encrypting the software for secure boot does, however, involve secret keys in the device, and protecting secret keys inside a device in the field is becoming increasingly harder. At the same time, newer devices have increased resistance to DPA attacks. Furthermore, a common countermeasure against DPA attacks is limiting the number of cryptographic operations that can take place, making it infeasible to collect enough data to leak the key. Even though protecting the key is difficult and motivated adversaries will likely extract it, key protection makes the attacker’s job harder.
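To make the operation-limiting countermeasure concrete, here is a hedged Python sketch. The class name, the XOR stand-in “cipher,” and the per-boot budget are all invented for illustration; the only idea taken from the text is refusing further key operations once a quota is spent, so an attacker cannot collect enough power traces for DPA.

```python
class RateLimitedDecryptor:
    """Sketch of a DPA countermeasure: cap the number of operations
    performed with the secret key in a single boot session."""

    def __init__(self, key: bytes, max_ops: int = 16):
        self._key = key
        self._ops_left = max_ops  # quota resets only on reboot

    def decrypt_block(self, block: bytes) -> bytes:
        if self._ops_left <= 0:
            # Quota exhausted: deny further traces to a would-be attacker.
            raise PermissionError("operation budget exhausted; reboot required")
        self._ops_left -= 1
        # Stand-in for a real cipher: XOR with the key (NOT secure).
        return bytes(b ^ k for b, k in zip(block, self._key))
```

A legitimate update consumes only a handful of operations, while gathering the thousands of traces DPA typically needs would require an implausible number of reboots.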
Another consequence of secure updates is the likely future need for more memory in the IoT device. This is a complicated trade-off for several reasons. First, software tends to expand to the memory available in the device. So, a larger memory device requires discipline from the software team to leave room for future updates. The other complication is the value of free memory in the future versus the device’s initial cost. More memory tends to increase the cost of the device. This cost must be justified both from the device maker and the consumer point of view.
Finally, it is important to have a plan for distributing the security updates. For most devices, these updates use the device’s existing Internet connection. But in some cases, this requires adding or using physical interfaces such as USB drives (i.e., sneakernet). It is also important to consider that the devices might be behind firewalls or in some cases disconnected from the Internet.
IoT device software is often fully owned and managed by the device maker, meaning the device maker should have proven processes in place to protect the signing keys internally and, in particular, to control who can issue updates.
Securing the Future
There is no such thing as a 100 percent secure device, especially over the entire duration of a product’s lifecycle.
Yet it is possible to understand and prepare for the most likely threats, and to safeguard against future ones, by designing in support for software updates. IoT developers must adopt this critical mindset of responsible security design. Otherwise, they are placing their innovations, and IoT’s market potential, into the hands of adversaries.
For more on upgradeable security, Silicon Labs’ senior director of product security Lars Lydersen hosted a webinar in which he provided the insight and background to help in evaluating what security functionality is necessary in an IoT design.
Today, we’ve announced the acquisition of Sigma Designs’ Z-Wave business. Adding Z-Wave to our wireless portfolio gives ecosystem providers and developers of smart home solutions access to the broadest range of wireless connectivity options available today.
Together, we’ll open the door to millions of potential users of smart home technologies by expanding access to a large and varied network of ecosystems and partners. Z-Wave’s reputation as a leading mesh networking technology for the smart home, with traction in more than 2,400 certified interoperable Z-Wave devices from more than 700 manufacturers and service providers worldwide, coupled with Silicon Labs’ position as the leader in silicon, software, and solutions for the IoT, makes this a great match.
Silicon Labs has been actively driving the IoT for years, and we recognize the large following Z-Wave has among developers and end customers in the smart home. With our experience in mesh technologies, we are uniquely positioned to advance the standard and grow adoption with input from the Z-Wave Alliance and partners.
Adding Z-Wave to Silicon Labs’ extensive portfolio of connectivity options allows us to create a unified vision for the technologies underpinning the smart home market: a secure, interoperable customer experience is at the heart of how smart home products are designed, deployed and managed. Our vision for the smart home is one where various technologies work securely together, where any device using any of our connectivity technologies easily joins the home network, and where security updates or feature upgrades occur automatically or on a pre-determined schedule.
Silicon Labs recently had the opportunity to speak with Larry Poon, chief operating officer of IMONT, a start-up software company taking a radical approach to connecting IoT devices by circumventing the cloud. Larry shared how IMONT’s interoperable software connects any type of device to other devices, regardless of the manufacturer. Graham Nice from Skelmir, one of IMONT’s key integration partners, joined our conversation to explain how companies are reacting to IMONT’s new IoT option for connectivity – and how he sees a potential move in the future away from the cloud.
So tell me about IMONT – what exactly do you offer?
We develop device connectivity software. If a company wants software to connect their devices to other devices, we can help them do so in a unique way.
We lower the barrier to entry and the ongoing operational costs of scaling out – we do this by being cloudless and hubless. We’re also much more secure, and we’re interoperable. For example, if a utility company wants to offer a smart home solution that includes devices from other manufacturers, they can connect them all using our software. Otherwise, they would have to use different apps to connect the different manufacturers’ products. By not using the cloud, we save a lot of money for certain customers, such as smart home operators. And obviously, if you don’t use the cloud, it’s more secure.
Can you tell me how your platform avoids using the cloud? And why is it more secure?
The software is mesh-based, and we do everything locally. So if we have to do any transaction or use analytics, we use the edge. That is a big advantage of our system - we never have to connect the device to the cloud. Also, when I say we have no hub, I mean any device in the configuration can be the hub – we don’t require a separate hub. All of the data is within each device itself; therefore, you don’t have to move anything to the cloud. But the cloud option is there because we have made it flexible enough with MQTT for cloud transmission, if a customer wants it.
You can offer this because of your software expertise, whereas a hardware company needs a hub, unless they write software for the edge?
That’s right. Let’s say Samsung, a device manufacturer, wants its products to connect to other devices in a smart home. Everyone wants choices, so it’s hard to find a home with all Samsung devices. In order for all of those devices to be connected, Samsung would typically create a hub, then use their cloud service to interoperate with the other manufacturers’ cloud services, which is not the most efficient way of doing it. But with our system, we’re already there, we’ve already written the code to connect manufacturers; therefore, we are able to avoid using the cloud and a hub.
How do you approach customers with your value proposition?
We’ve been around since August 2016 – so awareness is key right now. We’re a young company, small and lean. We’re knocking on the doors of anyone offering IoT systems, but we partner with companies like Silicon Labs to offer this solution to your customers, who could be looking for this type of solution. We also partner with implementation partners who can get this done for them.
Have you seen people searching for your type of solution, or are you educating people about the option?
It’s a little of both. Every time we talk to someone about it, they say exactly what you say – “oh, this is kind of novel, I never thought about it that way.” But then there’s a certain group of people who are beginning to say, “we don’t really need the cloud.” New articles are starting to crop up about cloudless approaches, but it’s just starting to get noticed. Anyone we end up talking to likes the idea once they hear it – but to go so far as say people are actively looking for a cloudless solution, we’re slowly getting there.
Is data an issue if you’re not using the cloud?
No, our customers can collect all of the data they want – we give them that flexibility, and they can move it to the cloud if they want.
So there’s no real drawback to moving away from the cloud?
No, we don’t think there is. People have no option but to move away from the cloud – data is too expensive.
Graham, tell me about the Java integration and how your companies work together?
Our company is turning 20 years old this year. We started out providing our virtual machine for running Java on set-top boxes in the German-speaking European pay-TV market. Since then, our customers have deployed over 120 million devices using various iterations of that core virtual machine. We have a history of deploying predominantly in the digital TV space around the world.
In the past six years, we’ve worked in the IoT market, supporting Java-based IoT industry standards and proprietary solutions. In the case of IMONT, we had worked with one of the founders previously and he reached out to us to use our VM to host his new solution.
Since IMONT’s software runs on Java, our role is to help IMONT’s customers get up and running extremely quickly on various platforms and devices.
As a close partner, what is your impression of the market reaction to IMONT?
IMONT has a disruptive approach to deploying IoT. Everybody is all about the cloud, but the cloud has some significant drawbacks. For one, it’s horrendously expensive, and you have vast amounts of data constantly feeding up to the cloud, chewing up bandwidth. You also still have privacy concerns - a lot of consumers have an issue with their personal data being moved to the cloud. All of that data incurs costs for operators. The reaction IMONT is getting from service providers is, first, that it can’t be done. But then IMONT proves them wrong. Yes, it can be done, and when operators see the cost benefits, it becomes a very compelling proposition. A lot of people are realizing that the cloud isn’t the way forward and edge computing makes more sense. IMONT provides the framework for edge computing, and hopefully we provide the vehicle to get their technology running on low-end devices, bringing the cost point down for service providers in the home. And it’s not just the home – industrial IoT deployments are a market for IMONT, as well.
Larry, how did you start using Silicon Labs’ products?
Our partnership with D-Link strengthened our ties with Silicon Labs. D-Link offers a lot of devices built with Silicon Labs’ technology, so we started making our software work with Silicon Labs.
Where do you see IoT going in the next 5-8 years?
From our perspective, we see devices getting smarter than they already are, yielding greater power efficiency and eventually operating independently of the cloud. We also expect the number and types of IoT device deployments to continue to explode, but consumers are pushing for greater security and seamless connectivity, so we will see significant improvements in those areas, as well.
If you’re planning to develop IoT applications for the EFM32 Giant Gecko or Pearl Gecko, you’re probably already thinking about using a real-time operating system.
It’s quite true that many embedded developers can get by with less sophisticated software based on a simple loop. But the latest EFM32 microcontrollers are packed with complex peripherals that require correspondingly complex application software. And designing IoT devices means dealing with both elevated user expectations and challenging design requirements. All this means that it’s become increasingly difficult for your projects to succeed without an operating system.
So how do you get started? It can be daunting to make the sudden jump from bare-metal programming to kernel-based application development. To help you overcome that hurdle, we’re producing a ten-episode video series to smooth the way: Getting Started with Micrium OS.
The series is hosted by Matt Gordon, who has spent more than 10 years helping developers learn how to maximize the potential of the Micrium real-time operating system. He helped establish the Micrium training program, and is currently RTOS Product Manager at Silicon Labs.
The first episodes in the series start with some basic information about what a kernel does and how kernel-based applications are structured. Matt covers initialization, how the kernel performs task scheduling, and how context switches pass control of the CPU from one task to another. Later in the series, Matt will discuss synchronization, resource protection, and inter-task communication. The series will leave you with a cohesive picture of real-time kernels and Micrium OS.
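As a taste of those concepts, here is a toy cooperative round-robin scheduler. This is not the Micrium OS API — Micrium is a preemptive C kernel — just a Python sketch in which each `yield` plays the role of a context switch handing the CPU to the next ready task.

```python
from collections import deque

def task(name, work_items):
    """A 'task' as a generator: each yield is a voluntary context switch."""
    for item in work_items:
        yield f"{name}:{item}"

def run(tasks):
    """Round-robin ready queue — the simplest scheduling policy a kernel offers."""
    ready = deque(tasks)
    log = []
    while ready:
        t = ready.popleft()          # pick the next ready task
        try:
            log.append(next(t))      # run it until it yields the CPU
            ready.append(t)          # still has work: back of the queue
        except StopIteration:
            pass                     # task finished: drop it from the queue
    return log
```

Running `run([task("A", [1, 2]), task("B", [1])])` interleaves the two tasks as `["A:1", "B:1", "A:2"]` — the same fairness idea a real kernel applies with hardware timer interrupts instead of `yield`.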
That’s not all: this series is supplemented with some of the best developer documentation ever produced for embedded systems programming. You can visit https://doc.micrium.com to learn much more about kernel-based application development and the networking and communication stacks that make up Micrium OS.
The Micrium OS kernel is available for free download through Simplicity Studio v4. To download and to find out more about Micrium OS, visit: https://www.silabs.com/support/getting-started/micrium-os
To find the series on YouTube, visit: https://goo.gl/JQ4UaV
And be sure to subscribe to the Silicon Labs YouTube channel to receive notifications of new episodes! https://www.youtube.com/user/ViralSilabs
Check out the first video in the series here:
Although not an entirely new concept, the smart meter market continues to be a major global growth market based on the device’s ability to greatly improve efficiencies for both utility companies and consumers. Markets and Markets estimates the smart meter market to be worth $12.79 billion (2017), and it is expected to grow at a CAGR of 9.34 percent from 2017-2022.
Interestingly, the first smart meter was developed pre-Internet, in the 1970s, and it wasn’t until the mid-nineties, after the U.S. National Energy Policy Act and similar utility deregulation efforts across the globe, that smart metering really took off. Widespread deregulation set up a market-driven pricing environment for utility companies, creating an immediate need for utilities to understand their customers’ energy consumption rates in order to keep costs down - and with it, a crucial need for smart meters.
Modern-day smart meters record and report, via a communications network, the consumption of electricity, gas, water, or heating/cooling. By obtaining this level of consumption detail in real time, utilities can simultaneously reduce costs while increasing customer satisfaction, making smart meter deployments a valuable investment for any type of utility company. Smart meters also play a key role in helping regions meet aggressive climate goals set by state and federal governments in many countries.
The benefits are obvious, but from a designer perspective, the types of metering technologies are vast and require detailed knowledge of the market.
The most common type of smart meter uses one-way, transmit-only communications and is called Automatic Meter Reading (AMR). These meters started out as walk-by or drive-by meters, but have since become fully automated with wireless capability, running on a Wide Area Network (WAN).
Advanced Metering Infrastructure (AMI) meters are two-way communications networks that not only produce a reading, but control the meter and equipment and allow the utility to connect or disconnect customers; monitor and anticipate usage changes, allowing for a smart grid operation; and enable software and security updates.
Traditional metrology equipment was used in the earliest smart meters, but today almost all new smart meter designs use electronic equipment, referred to in the industry as static meters.
Electricity meters are probably what most people think of when they hear the term smart meter, and there are two primary kinds. Current transformer (CT) meters were the original design, though now a wide range of MCU-based meters exist, which don’t have the problems associated with transformer-based meters, such as the tendency to saturate under heavy currents and the susceptibility to tampering.
One of the more popular types of smart meters, deployed extensively in Europe and in urban areas, is the Heat Cost Allocator (HCA). These devices are typically used in multi-tenant residential and commercial buildings and enable fair cost allocation of a shared heating system, giving tenants heating bills proportional to their usage. These meters are hailed by energy conservationists because they encourage users to reduce consumption, unlike a flat heating bill that doesn’t reward tenants for reduced energy consumption.
In-Home Displays (IHDs) are another desirable piece of metering equipment, and they are common in homes that are part of the GB Smart Energy program. These devices have direct wireless connections to the smart meters in the home and typically use a Zigbee mesh network to display cumulative and real-time usage rates for the various utilities.
To no embedded designer’s surprise, there are numerous communications technologies to choose from when designing a smart meter.
Typical installations use a sub-GHz Field Area Network (FAN) with a star or mesh topology, though another popular option is equipment with WAN capability built directly into the meter, making an M2M connection over 2G, 3G, or 4G. The new NarrowBand IoT (NB-IoT) standard has improved the power and cost performance of this approach, and numerous unlicensed-band Low Power Wide Area Network (LPWAN) technology providers have emerged as well. Another major communications network is the Zigbee-based Home Area Network (HAN), which is already deployed in more than 23 million homes in the U.K. The HAN meters have a built-in Zigbee radio and come with an IHD.
Wi-Fi, Bluetooth, and Z-Wave, on the other hand, are nowhere to be found in smart meter deployments, due primarily to power constraints. Bluetooth Low Energy, however, is a viable option when based on a 2.4 GHz radio using a multiprotocol SoC, such as a Silicon Labs Mighty Gecko.
The Power Play
Power is not an issue for electricity meters, which have their own power supply, but it becomes a pivotal issue for heating, gas, and water meters. Specialized lithium batteries lasting close to 20 years have been developed for smart meters in recent years, but not all markets embrace them. China is a good example, as it requires utility customers to replace their AA batteries every 12 to 18 months.
Maximizing battery life is an important part of smart meter designs, making the underlying technology components critical to creating a high-performance smart meter unburdened by power restrictions.
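As a rough illustration of why duty cycling dominates battery life in these designs, the sketch below estimates lifetime from a sleep current and a short daily active burst. All numbers (cell capacity, currents, on-time) are illustrative assumptions, not figures from this article or any datasheet.

```python
# Rough battery-life estimate for a duty-cycled wireless meter node.
# All parameter values below are illustrative assumptions.

def battery_life_years(capacity_mah, sleep_ua, active_ma, active_s_per_day):
    """Average current = time-weighted sleep + active draw; life = capacity / average."""
    seconds_per_day = 24 * 3600.0
    sleep_s = seconds_per_day - active_s_per_day
    avg_ma = (sleep_ua / 1000.0 * sleep_s + active_ma * active_s_per_day) / seconds_per_day
    return capacity_mah / avg_ma / (24 * 365.0)

# Example: a 2400 mAh lithium cell, 2 uA sleep current,
# and 60 seconds of 20 mA radio activity per day
life = battery_life_years(2400, 2.0, 20.0, 60.0)  # roughly 17 years
```

Even with these generous assumptions the result lands near the 20-year mark cited above, which shows how little headroom there is for a chatty radio or a leaky sleep mode.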
Whatever electronic design is pursued, smart meters will continue to prove their worth as a highly efficient way for utilities to compete and run more efficiently, for consumers to save money, and for societies at large to reduce their environmental footprint.
Morrie Altmejd, a senior staff engineer at Silicon Labs, wrote this article that recently appeared in Electronic Products Magazine.
Designing and implementing an optical heart rate monitoring (HRM) system, also known as photoplethysmography (PPG), is a complex, multidisciplinary project. Design factors include human ergonomics, signal processing and filtering, optical and mechanical design, low-noise signal receiving circuits and low-noise current pulse creation.
Wearable manufacturers are increasingly adding HRM capabilities to their health and fitness products. Integration is helping to drive down the cost of sensors used in HRM applications. Many HRM sensors now combine discrete components such as analog front ends (AFE), photodetectors and light-emitting diodes (LEDs) into highly integrated modules. These modules enable a simpler implementation that reduces the cost and complexity of adding HRM to wearable products.
Wearable form factors are steadily changing too. While chest straps have effectively served the health and fitness market for years, HRM is now migrating to wrist-based wearables. Advances in optical sensing technology and high-performance, low-power processors have enabled the wrist-based form factor to be viable for many designs. HRM algorithms also have reached a level of sophistication to be acceptable in wrist form factors. Other new wearable sensing form factors and locations are emerging, such as headbands, sport and fitness clothing, and earbuds. However, the majority of wearable biometric sensing will be done on the wrist.
No two HRM applications are alike. System developers must consider many design tradeoffs: end-user comfort, sensing accuracy, system cost, power consumption, sunlight rejection, how to deal with many skin types, motion rejection, development time and physical size. All of these design considerations impact system integration choices, whether to use highly integrated module-based solutions or architectures incorporating more discrete components.
Figure 1 shows the fundamentals of measuring heart rate signals, which depend on the heart rate pressure wave being optically extracted from tissue, and illustrates the travel path of the light entering the skin. The expansion and contraction of the capillaries, caused by the heart rate pressure wave, modulate the light signal injected into the tissue by the green LEDs. The received signal, greatly attenuated by its travel through the skin, is picked up by a photodiode and sent to the electronic subsystem for processing. The amplitude modulation due to the pulse is detected (filtering out motion noise), analyzed, and displayed.
Figure 1. Principles of operation for optical heart rate monitoring.
A fundamental approach to HRM system design uses a custom-programmed, off-the-shelf microcontroller (MCU) that controls the pulsing of external LED drivers and simultaneously reads the current output of a discrete photodiode. Note that the current output of the photodiode must be converted to voltage to drive most analog-to-digital (A/D) blocks. The Figure 2 schematic shows the outline of such a system. Note that the I-to-V converter creates a voltage equal to VREF at 0 photodiode current, and the voltage decreases with increasing current.
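The I-to-V relationship just described (output sitting at VREF for zero photodiode current and falling as current rises) is easy to invert in firmware to recover the photocurrent from a raw ADC code. The sketch below assumes hypothetical values for VREF, the feedback resistor, and the ADC's full scale and resolution; none of them come from this article.

```python
# Recover the photodiode current from the I-to-V converter output
# described above: Vout = VREF - I * RF, so I = (VREF - Vout) / RF.
# VREF, RF, and the ADC parameters are illustrative assumptions.

VREF = 1.65      # volts, mid-rail reference (assumed)
RF = 100_000.0   # ohms, transimpedance feedback resistance (assumed)

def adc_to_volts(code, vfs=3.3, bits=12):
    """Convert a raw ADC code to volts for a given full scale and resolution."""
    return code * vfs / ((1 << bits) - 1)

def photocurrent_amps(vout):
    """Invert the I-to-V transfer function of the converter."""
    return (VREF - vout) / RF

# Example: a 12-bit ADC code of 1024 on a 3.3 V scale maps to ~0.825 V,
# which corresponds to roughly 8.2 uA of photocurrent
i = photocurrent_amps(adc_to_volts(1024))
```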
The current pulses generally used in heart rate systems are between 2 mA and 300 mA, depending on the color of the subject’s skin and the intensity of the sunlight with which the desired signal must compete. The infrared (IR) radiation in sunlight passes through skin tissue with little attenuation, unlike the desired green LED light, and can swamp the desired signal unless the green light is very strong or an expensive IR-blocking filter is added. Generally speaking, the intensity of the green LED light where it enters the skin is between 0.1x and 3x the intensity of sunlight. Due to heavy attenuation by the tissue, the signal that arrives at the photodiode is quite weak and generates just enough current to allow for a reasonable signal-to-noise ratio (SNR) of 70 to 100 dB, limited by shot noise even in the presence of perfect, noise-free op amps and A/D converters. The shot noise is due to the finite number of electrons received for each reading, which occurs at 25 Hz. The photodiode sizes used in such designs are between 0.1 mm2 and 7 mm2, though above 1 mm2 there are diminishing returns due to the effect of sunlight.
Figure 2. The basic electronics required to capture optical heart rate.
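The shot-noise limit mentioned above can be sketched numerically: if N photoelectrons are collected per reading, the shot noise is sqrt(N), so the best achievable SNR is sqrt(N). The photocurrent and pulse width in the example are illustrative assumptions chosen to land in the 70 to 100 dB range the text cites.

```python
import math

# Shot-noise-limited SNR for a single PPG reading.
# With N photoelectrons collected, shot noise is sqrt(N) and SNR = sqrt(N).
# The photocurrent and integration time below are illustrative assumptions.

Q = 1.602e-19  # electron charge, coulombs

def shot_noise_snr_db(photocurrent_a, integration_s):
    """Best-case SNR in dB from counting statistics: N = I*t/q, SNR = sqrt(N)."""
    n_electrons = photocurrent_a * integration_s / Q
    return 20.0 * math.log10(math.sqrt(n_electrons))

# Example: 100 nA of signal current integrated over a 20 us LED pulse
snr = shot_noise_snr_db(100e-9, 20e-6)  # about 71 dB
```

This is why the received photocurrent, not the electronics, sets the noise floor: even ideal op amps and converters cannot beat the statistics of the electrons actually collected.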
The difficult and costly function blocks to implement in an optical heart rate system design, as shown in Figure 2, are the fast, high-current V-to-I converter that drives the LED, the current-to-voltage converter for the photodiode, and a reliable algorithm in the MCU that sequences the pulses under host control. A low-noise (75 to 100 dB SNR) 300 mA LED driver that can be set to currents as low as 2 mA while still creating light pulses as narrow as 10 µs is an expensive block to achieve with discrete op amps.
The narrow pulses of light down to 10 µs shown in Figure 2 allow the system to tolerate motion and sunlight. Typically two fast light measurements are made for each 25 Hz sample. One measurement is taken with the LEDs turned off and one with the LEDs turned on. The calculated difference removes the effect of ambient light and gives the desired raw optical signal measurement that is, most importantly, insensitive to flickering background light.
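The LED-off/LED-on subtraction described above can be sketched in a few lines; `read_adc` and `set_led` are hypothetical stand-ins for the real hardware interfaces, not functions from any particular part.

```python
# Ambient-light cancellation as described above: within one 25 Hz sample
# period, take one reading with the LED off and one with it on, then subtract.
# read_adc() and set_led() are hypothetical hardware-access stand-ins.

def ppg_sample(read_adc, set_led):
    """Return one ambient-cancelled raw PPG value."""
    set_led(False)
    ambient = read_adc()    # background light only
    set_led(True)
    total = read_adc()      # LED light plus the same background
    set_led(False)
    return total - ambient  # flicker-insensitive raw optical signal

# Example with stubbed hardware: ambient is 300 counts, the LED adds 450
readings = iter([300, 750])
value = ppg_sample(lambda: next(readings), lambda on: None)  # 450
```

Because both readings happen microseconds apart, slowly flickering background light (mains-powered lamps, for instance) is effectively constant across the pair and cancels in the difference.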
The short duration of the optical pulses both allows and requires a relatively strong light pulse. It is essential to stay brighter than any sunlight signal that may be present and not allow the PPG signal carrier to be dwarfed by it. Although a sunlight signal larger than the PPG carrier can be removed by subtraction, it can be so large that external modulation, such as swinging an arm in and out of shadow, creates difficult-to-remove artifacts. As a result, systems that use low-current LED drivers and large photodiodes to compensate suffer severely from motion artifacts in bright-light situations.
Much of the desired HRM sensing functionality is available pre-designed and integrated into a single device. Packing most of this functionality into one piece of silicon results in a relatively small 3 mm x 3 mm package that can even integrate the photodiode (PD) itself.
Figure 3 shows an example of a schematic with an Si118x optical sensor from Silicon Labs. This HRM design is relatively easy to implement. The engineer just needs to focus on the optical portion of the design, which includes optical blocking between the parts on the board and coupling the system to the skin.
Figure 3. An integrated heart rate sensor requiring only external LEDs.
While the approach shown in Figure 3 results in a high-performance HRM solution, it is not as small or power efficient as some designers would like. To achieve an even smaller solution, the LED die and the control silicon must be integrated into a single package that incorporates all essential functions including the optical blocking and the lenses that improve the LED output. Figure 4 illustrates this more integrated approach, based on a Silicon Labs Si117x optical sensor.
No external LEDs are required for this HRM design. The LEDs and photodiode are all internal to the module, which can be installed right below the optical ports at the back of a wearable product such as a smartwatch. This advantageous approach enables a shorter distance between the LEDs and the photodiode than is possible with a discrete design. The reduced distance allows operation at extremely low power due to lower optical losses traversing the skin.
Integrating the LEDs also addresses the issue of light leakage between the LEDs and the photodiode. As a result, the designer does not have to add optical blocking to the printed circuit board (PCB). The alternative to this approach is to handle the blocking with plastic or foam inserts and special copper layers on the PCB.
Figure 4. A highly integrated HRM sensor module incorporating all essential components.
There is one more part of an HRM design that the developer does not necessarily need to create: the HRM algorithm. This software block residing on the host processor is quite complex due to the signal corruption that occurs during exercise and motion in general. End-user motion often creates its own signal that spoofs the actual heart rate signal and is sometimes falsely recognized as the heart rate beat.
If a wearable developer or manufacturer does not have the resources to develop the algorithm, third-party vendors provide this software on a licensed basis. Silicon Labs also offers a heart rate algorithm for its Si117x/8x optical sensors that can be compiled to run on most host processors.
It is up to the designer to decide how much integration is right for the HRM application. The developer can simplify the design process and speed time to market by opting for a highly integrated module-based approach using a licensed algorithm. Developers with in-depth optical sensing expertise, time and resources may opt to use separate components (sensors, photodiodes, lenses, etc.) and do their own system integration, and even create their own HRM algorithm. Ultimately, when it comes to HRM system design, the developer has a choice of doing it all or purchasing it all.
This week we’re at APEC 2018, and we’ve just introduced two new PoE powered device families designed for best-in-class efficiency and integration for the IoT. Power over Ethernet (PoE) is ideally suited for applications that require both power and data at a device connected to an Ethernet switch. Its advantages include lower equipment and installation costs compared to running separate data and power cables. It also makes use of the massive installed base of UTP cabling for wired Ethernet networks and is part of IEEE’s 802.3at Ethernet standard, which specifies the technical requirements for the safe and reliable distribution of power over the same CAT-5 UTP cabling.
Our new Si3406x and Si3404 devices offer the highest level of integration available for high-voltage devices on a single power delivery chip and support IEEE 802.3at PoE+ power functionality, power conversion options with up to 90 percent efficiency, robust sleep/wake/LED support modes, and electromagnetic interference (EMI) performance. These features will help developers reduce system cost and help them get to market faster with high-power, high-efficiency PoE PD-powered applications.
Designers face a number of challenges in creating new devices, including low power conversion efficiency, electromagnetic interference problems, oversized PCBs with large bills of materials, and running out of headroom on power. The Si3406x and Si3404 can help relieve all of these through high efficiency, proven EMI results with suppression and control techniques, superior BOM integration, and 30 W of power headroom.
IP cameras are a good use case because they normally need two cables: one for power and one for data. With PoE, these two cables are combined into one. With a complete power supply built with Si3406x or Si3404 PD devices, designers can focus on the more value-added portions of an IP camera design.
The growth of the IoT is raising demand for PoE+ connectivity across application areas. With the increasing popularity of the PoE+ standard, coupled with the requirement to support 30 W designs, these parts represent the next step in PD interface solutions for homes, businesses, and industrial environments.
The Si3406x family integrates the control and power management functions needed for PoE+ PD applications, converting the high voltage supplied over a 10/100/1000BASE-T Ethernet connection to a regulated, low-voltage output supply. The highly integrated architecture minimizes printed circuit board (PCB) footprint and external BOM cost by enabling the use of economical external components while maintaining high performance.
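As a back-of-the-envelope check on the power budget: IEEE 802.3at Type 2 (PoE+) makes 25.5 W available at the PD input, and a conversion stage at the 90 percent efficiency quoted earlier leaves roughly 23 W for the application. A minimal sketch of that arithmetic:

```python
# PD-side power budget for a PoE+ design.
# 25.5 W is the IEEE 802.3at Type 2 allocation at the PD input;
# 0.90 is the conversion efficiency figure quoted in this article.

def pd_load_power_w(pd_input_w=25.5, converter_efficiency=0.90):
    """Power left for the application after the DC-DC conversion stage."""
    return pd_input_w * converter_efficiency

load_w = pd_load_power_w()  # about 23 W available at the load
```

A few points of converter efficiency translate directly into watts of headroom, which is why conversion efficiency is worth optimizing in pan/tilt/zoom cameras and other power-hungry PDs.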
Its high-power PoE+ capabilities also make it possible to develop advanced IoT products, including IP cameras with pan/tilt/zoom and heater elements, as well as wireless access points based on newer 802.11 protocols that demand much from their power supplies. The Si3406x family’s on-chip current-mode-controlled switching regulator supports multiple isolated and non-isolated power supply topologies. This flexibility, along with Silicon Labs’ comprehensive PoE/PD reference designs, makes it easier and faster for developers to deploy critical power supply subsystems.
The Si3406x and Si3404 families bring a large number of additional benefits over our previous single offering, the Si3402.