10 Tips for improving IoT design success
From wearable tech and home automation gadgets to intelligent industrial sensors, there are a number of design pitfalls awaiting the unwary design engineer
Developing a new IoT-ready device leads to challenges for even the most experienced design engineers. The successful integration of an IoT edge device calls for good design and manufacturing, adequate deployment, timely battery replacement if applicable, and an ability to accept and incorporate software/firmware updates as required.
Here are 10 ways to avoid potential pitfalls and ensure effectiveness of your IoT designs:
1. Robust project planning
According to a recent Cisco study, one third of all completed IoT projects are considered a failure, with 60% of IoT initiatives stalling at the Proof of Concept (PoC) stage. Businesses can reduce the potential for failure by ensuring their IoT project is well planned, scoped, and the proof of concept tested prior to rollout. The pace of IoT development makes it tempting to rush to scale to achieve a market lead, but teams can reap far greater benefits by testing use cases in live environments on a small scale, adopting learnings, and making changes before proceeding to full-scale deployment.
Additionally, project managers need to examine the breadth of the project to understand the competencies required – ensuring the team has the right skills, or partnerships where required.
2. Utilizing modular solutions
Some challenges can be reduced by building in modular solutions, simplifying the development process. Raspberry Pi has an ecosystem of products that allow designers to focus on their core expertise while delivering their end product quickly. The Human Vision Component (HVC) from Omron Electronic Components also enables design engineers to easily build in the latest capabilities without in-depth experience. These, and other modular solutions offered by Newark, reduce design times and help move projects forward quickly.
3. Considering protocol compatibility upfront
IoT edge installations can contain varied device types as well as router or gateway platforms that manage communications among all the devices and the wider IoT infrastructure. Project designers should choose a platform that supports an extensive mix of protocols for data ingestion, such as OPC UA, BACnet and Modbus, as well as newer ones like ZeroMQ, Zigbee, BLE and Thread. A platform incorporating modular support for protocols also allows design engineers to customize existing asset communication.
4. Choosing the right power capability
Most IoT products depend on having exactly the right level of processing power. Recent developments in edge technology, including AI at the edge, give design engineers options for where that processing power sits – in the cloud, at the gateway, or at the edge itself. Clearly, insufficient capability at any point will leave the device unable to handle its target application. However, if the processor is too powerful, it can also cause problems related to PCB real estate, cooling, power consumption and cost.
A broad approach to finding the right CPU for an application is to consider a wide-ranging processor family, such as Arm’s Cortex series, which is designed to cover IoT applications from sensors to servers and is scalable and backward compatible – providing maximum flexibility as your project develops.
5. Identifying the right memory option
There are many memory options for design engineers: traditional external flash memory, embedded flash memory, multichip package memory and multi-media cards. Making the right choice depends on your project’s priorities. Important considerations include:
- Cost – The more expensive the memory selection, the more expensive the final device.
- Size – The more silicon area the memory occupies, the higher the cost and the larger the footprint on the board.
- Power consumption – Most IoT devices either run on small batteries or rely on energy harvesting for recharging. Design engineers should choose the option that uses the least power and lowest voltage, both in use and during standby.
- Startup time – Memory must support a quick startup to ensure good device performance. An execute-in-place (XIP) option, which allows the device to run code directly from flash without first copying it from a separate EEPROM chip, reduces boot time and also chip cost, since substantial on-chip storage reduces the amount of RAM needed.
6. Getting the most out of your firmware
IoT devices are controlled by code implemented in firmware. Design engineers should take the following into consideration when designing their firmware:
- Ensure a stable firmware architecture that is scalable and well documented, using professional firmware toolchains and languages like C and C++.
- Design for constrained systems such as low-power MCUs with limited memory, no memory management, and no direct interfaces like keyboards or screens.
- Build in stability and error recovery including application watchdog timers, error correction, and auto-recovery from system faults.
- Pay attention to inputs and outputs, including sensor data gathering, digital signal processing, and local compression and storage of data.
- Minimize power consumption by writing firmware that allows devices to enter sleep mode and consume the bare minimum of energy required.
- Optimize bandwidth for cellular communication from your device to the cloud.
- Revise firmware continuously with OTA firmware updates to improve stability and functionality, adding value without needing to change hardware.
Note: there can be pitfalls associated with OTA. Don’t fall into the trap of mass deployment before you are ready – you can always update later – and consider the customer when deciding how often firmware updates should be pushed to the device.
7. Effective software management
In the IoT ecosystem, first to market is a huge competitive driver, which can mean that security, quality and dependability can be sacrificed for speed to release. There are four important software development practices for IoT:
- Review – Proper code review and repeated testing should be a priority, backed by strict software quality measures.
- Assessment – Continuous deployment is common in the connected world, with updates often pushed multiple times a day. The quality assurance burden on software that interacts with IoT devices is greater than ever; if that software isn’t continuously monitored and the code evaluated, failure is almost guaranteed.
- Responsibility – Management must take responsibility for quality assurance. Any manufacturer that doesn’t have a set of analytics to track its software risk is negligent in its responsibility to customers and other stakeholders.
- Advocacy – In addition to measurement and analytics, a cultural shift to include education needs to occur. Developers and management collectively must champion the need for standards.
8. Proper power management
Excessive power demand will drain an IoT device’s battery too quickly and can cause overheating. Minimizing an SoC’s power demand is key for IoT edge device designers, especially if the device is battery-operated or relies on energy harvesting. This can be done by reducing voltage, frequency or capacitance.
Design engineers could also consider reducing switching activity, sacrificing transistor density for higher frequencies, layering heat-conduction zones within the CPU framework, recycling at least some of the energy stored in the capacitors, or optimizing machine code by implementing compiler optimizations that schedule clusters of instructions using common components.
9. Prioritizing security
Security issues include protection of critical assets, safe crypto implementations, secure remote firmware updates, firmware IP protection and secure debugging. Such protection is available at the hardware CPU level. The Cortex-M33, Cortex-M23 and all of the Cortex-A processors, for example, include Arm TrustZone technology to provide a secure foundation in the SoC hardware. TrustZone is a widely deployed security technology, providing banking-class trust capability in devices such as premium smartphones. The Cortex-A32 can also be coupled with TrustZone CryptoCell-700 series products to enable enhanced cryptographic hardware acceleration and advanced root of trust.
10. Reducing electrostatic discharge (ESD)
Certain components used in electronic assemblies are sensitive to static electricity and can be damaged by electrostatic discharge (ESD). A circuit board assembly can be damaged when a discharge, caused by physical contact with a statically charged person, passes through the conductive pattern to a static-sensitive component. Note that discharges below about 3,000 V, which can still damage many components, cannot be felt by humans.
Problems can also be caused by electrical overstress (EOS), which results from voltage spikes produced by soldering irons, solder extractors, test instruments and other electrically operated equipment. Such equipment must be designed to prevent unwanted electrical discharges.
ESD/EOS safe work areas should protect sensitive components from damage by spikes and static discharges.
The IoT is still a relatively new phenomenon, and as such it continues to create plenty of opportunities for edge product and system designers in home automation, industrial, medical and many other applications. While this novelty is attractive, it can tempt some organizations into areas where they have limited experience, creating risk at many levels. By planning properly and leveraging the right technologies, these risks can be avoided.