
You don’t often hear the words “ASIC” and “IoT” in the same breath. The traditional volume requirements of ASICs don’t usually jibe with the fast-changing, tight time-to-market world of IoT applications. But that’s changing.

 

Don Dingee at Semiwiki wrote recently about companies such as ARM (with things like DesignStart) and Open Silicon (with offerings like Spec2Chip) that have laid the groundwork for “fit-for-purpose” designs. Check out his insights, and for a deeper dive, listen to a webinar from ARM and Open Silicon about how to succeed in the IoT market with custom SoC methodologies.

 

Related stories:

Design, simulate and test Cortex-M0 based systems for free!

Open-Silicon ARM based Spec2Chip Case Study

DesignStart

IoT Platform Cortex-M Series

 


Smart Farming with SPARKL

Posted by emilyhier Apr 27, 2016

By Dr Andrew Farrell, Lead Software & Research Engineer at SPARKL.

Global food production must increase by 70% in order to feed the world’s projected population of 10 billion people by 2050.

The farming industry is already under inexorable pressure to become more efficient, given the tighter margins dictated by the global market. Throw in ecological issues, such as river pollution and ecosystem disruption caused by intensive practices (of the kind needed to ‘feed the world’), and farming looks to be in a combustible state.

Sensors, Sensors, Sensors

The use of technology may go a long way to solving these issues.

The proposition of ‘Smart Farming’ recasts the industry as an optimization problem to be solved by analytics on data pooled from hundreds, if not thousands, of farm sensors. Sensors tracking movement of the cows, sensors in the land and in the air measuring temperature and moisture levels, sensors in the farm buildings and elsewhere all may produce data from which insights can be drawn, and with which the running of the farm may be optimized.

For example:

  • Is there an ill cow in your herd? Apply behavioural analytics to movement data sourced from location sensors on cows. On big cattle stations, this is not easily determined by a farmer with limited time on their hands. Sensors on animals can also be used to detect theft.
  • Where should the cows be grazed to get the best milk yields? Take a multifaceted approach that accounts for current weather forecast data, combined with localised models of the farmland and its pastures, whilst allowing for recent data from environment agencies regarding pollution levels (e.g. from dairy nitrates) which may be used to bias the decision on ecological as well as monetary criteria.
  • When should the farm gates be opened to allow cows to graze? When should sliding doors in cowsheds be opened to give more ventilation? Weather forecasting may again be used, but if cows subsequently move away from drafts caused by opening doors, then reverse the decision.
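The first of these checks can be sketched as a simple herd-relative anomaly test over movement data. This is a hypothetical illustration, not SPARKL's actual analytics; the z-score cutoff, the step-count data layout and the sample numbers are all assumptions:

```python
from statistics import mean, stdev

def flag_unusual_cows(daily_steps: dict[str, float], z_cutoff: float = 2.0) -> list[str]:
    """Flag cows whose daily movement is unusually low for the herd.

    daily_steps maps a cow's tag ID to its step count from a location
    sensor. A herd-relative z-score is a crude stand-in for the kind of
    behavioural analytics described above: an ill cow typically moves
    markedly less than the rest of the herd.
    """
    values = list(daily_steps.values())
    mu, sigma = mean(values), stdev(values)
    return [tag for tag, steps in daily_steps.items()
            if sigma > 0 and (steps - mu) / sigma < -z_cutoff]

# Hypothetical day of data: one cow is barely moving.
herd = {"cow-01": 5200, "cow-02": 4900, "cow-03": 5100, "cow-04": 5000,
        "cow-05": 5050, "cow-06": 4950, "cow-07": 1200}
```

On this sample, `flag_unusual_cows(herd)` singles out the near-stationary animal for the farmer to inspect, which is exactly the question a farmer on a big cattle station has no time to answer by eye.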

 

These are all smart decisions that can be made by smart technology. We propose SPARKL as a solution for all of the Smarts (cities, buildings, and so on), including Smart Farming.

Imagine the following scene. The farmer installs sensors on the cows, and elsewhere. They plug SPARKL on a stick into a Raspberry Pi. SPARKL automatically detects the network of farm sensors. As it has been pre-configured for a farming context, SPARKL is immediately able to start performing analytics over the data. This suits the farmer, who wants a solution based on zero configuration. The farmer is able to see what’s going on by logging into a web-based dashboard, either from a laptop or smartphone.

On the Edge

A key aspect of SPARKL is that it does analytics and makes decisions ‘on the edge’. It’s important to maintain the balance between ‘local’, where simple decisions should be taken locally, and ‘the cloud’ for more intensive number crunching. The decision to shut the cowshed door, based on local sensor readings and forecast data pulled (every 24 hours) by SPARKL over the farmer’s internet connection, can be taken locally without routing it via the cloud.
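The cowshed-door decision above can be sketched as a local rule running entirely on the edge device, with no cloud round-trip. This is an illustrative sketch, not SPARKL code; the temperature and wind thresholds are invented for the example:

```python
def cowshed_door_action(indoor_temp_c: float,
                        wind_speed_ms: float,
                        cows_moved_from_draft: bool) -> str:
    """Decide locally whether the sliding cowshed door should be open.

    Inputs come from local sensors plus forecast data cached by the edge
    device (pulled over the farm's internet connection, e.g. daily), so
    the decision works even when connectivity is patchy. Thresholds are
    hypothetical.
    """
    if cows_moved_from_draft:
        return "close"  # feedback loop: the cows voted, reverse the decision
    if indoor_temp_c > 24 and wind_speed_ms < 8:
        return "open"   # ventilate in warm, calm conditions
    return "close"
```

The feedback branch captures the reversal described earlier: if the cows move away from the draft after the door opens, the local rule simply undoes its own decision on the next evaluation.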

This is important, for example, if the internet connection is patchy, or, perhaps, for privacy, the farmer does not want to share all of their sensor data with the cloud. This is edge-based analytics and autonomics and its use is complementary to the cloud.

Technology is at the heart of these examples of advanced decision-making. In fact, the Internet of Things will bring a ‘decision support system’ flavour to farming. Smart Farming is inevitable, and once farmers see that technology is benefiting them in meeting production targets, they will trust it more. The hope also is that they will trust it to make decisions that are ecologically advantageous as well as giving them a good living in ‘feeding the world’.

An example of SPARKL in the Internet of Things, making things happen at the right time in the Virtual Factory.

From May 16-19, 2016, we will again be at the NXP Technology Forum.

Formerly called #FTF, it is now NXP FTF 2016.

 

We will be there to show our latest small modules and to chat with you about our software and where to get it.

Of course, we are also looking forward to seeing all the new cool things we are expecting NXP to show.

TechNexion_PICO_IMX6_1509.png

We are excited to have our tiny PICO module scaling from the Cortex-A7 based i.MX6 UltraLite, through the Cortex-A9 based i.MX6 Solo/DualLite/Quad, to the NXP i.MX7 Solo and Dual (both Cortex-A7), and we will also be introducing our other solutions.

 

I'm sure there are a lot of customers who will find our EDM solution the perfect fit for their needs:

TechNexion_EDM_1509.png

 

We are looking forward to meeting you and all the other amazing people there and exchanging ideas!

 

It will surely be a great event, even if now under a slightly different name.

 

FTF, here we come!!

 

Register here: http://www.nxp.com/support/classroom-training-events/nxp-ftf-tech-forum:NXP-FTF-TECH-FORUM-HOME

 


Introduction

Deep in the darkest depths of ARM®’s Cambridge campus resides a small team of engineers whose job it is to envision and create forward-thinking technology demonstrations based on ARM. That team goes by the name of Applied Systems Engineering, in which I am an engineer concentrating on the electronic elements of the demos that we create.

 

Last year we designed and developed a demonstration of an ARM-based, end-to-end Internet of Things (IoT) application to demo at some of the biggest electronics trade shows around the world. We called this demo Sensors to Servers. Over a three-part blog series, my colleague ericgowland talks about how we took the concept and made it a reality (part 1 with video, part 2, part 3).

 

To briefly recap, Sensors to Servers is a demo which collects and visualises live data from around our trade show booths. To make this happen we instrument our booths with a network of sensor nodes featuring ARM Cortex®-M4 based microcontrollers from NXP Semiconductors. These sensor nodes are strategically positioned around the booth and report their findings to an ARMv8-A based 64-bit server from Applied Micro®. The key to bringing this demo together is the ARM mbed IoT Device Platform. The sensor node application software was written on top of mbed “Classic” (v2.0) and the data was collected by our server using the mbed Device Server software.

 

Over the next two blogs I will show you how the demo evolved from a stack of development boards to a cloud hosted application which we deployed simultaneously at Mobile World Congress and Embedded World 2016. For this instalment we will concentrate on how we took the original sensor nodes and updated them with new custom hardware.

 

Custom Sensor Nodes

When we were originally creating this demo we selected one of the many available mbed development boards to use as the basis of our sensor nodes. For rapid prototyping (and other reasons) we selected the NXP FRDM-K64F, which gave us Arduino-compatible headers; from these we could stack shields together that would very quickly give us the functionality we required. This was great; the speed at which we could develop and test was instrumental in meeting our deadline for a stable demo. We went on to use this setup when we deployed Sensors to Servers in its first few outings.

 

Original sensor node:

S2S_devhw_stack.png

S2S_devhw_flat.png

 

Although brilliant for development and testing the stack of shields was far from an elegant solution and frankly an ugly addition to our skilfully crafted trade show booths. We therefore started working on custom sensor nodes which would give us the same functionality and be software compatible while being far more professional and pleasing on the eye.

 

So where did we start? Documentation: we had the schematic for our sensor shield, but we needed to know exactly what hardware we were using on the FRDM-K64F board and mbed 6LoWPAN shield so we could replicate it. We downloaded the design files for the FRDM-K64F from the NXP website (schematic, BOM and Gerber files). The mbed 6LoWPAN shield is simply a 2.4 GHz IEEE 802.15.4 transceiver module routed to Arduino R3 standard headers. We looked up the part number of the module and found the datasheet and schematic for it. Using these, a keen eye and a continuity tester, it didn't take us long to reverse engineer the shield. Now that we had all of the documentation we needed, we could start designing our custom four-layer PCB.

 

Design & Assembly

Using CadSoft EAGLE PCB design software we put together our schematic. We studied the documentation we had gathered and carefully worked out what components we needed, using the documentation as a reference and carrying over as many components from the original setup as we could. Once we were confident in our design we laid out and routed our PCB and sent the design files off to Eurocircuits, a PCB pooling company, for a quick-turnaround run of test boards.

 

The first revision of the board didn't quite work out as we'd have liked. After we hand-assembled some boards and corrected the inevitable mistake or two, we found that the Ethernet signal quality degraded significantly over only a short distance, well short of the 100 meter standard. We went back to the drawing board and re-worked the layout of the Ethernet PHY and jack, particularly focusing on minimising cross-talk by carefully sculpting the ground and power planes of the PCB. The revised spin of the PCB came back working perfectly.

 

We decided that 40 nodes would be enough for our needs, and none of our team fancied the painstaking job of assembling them by hand (unsurprisingly!), so we needed to find a company that would assemble them for us. After tonnes of phone calls and quotes, we luckily found a small assembly house with a pick-and-place machine a mere 4 miles away. After liaising with the assembly house, we ordered our final PCBs in panels of 12 so the pick-and-place machine could do its job. 48 hours after dropping off the PCBs and components, the completed nodes were back in our hands and working perfectly, much to our relief. One last touch was required: a laser-cut acrylic protective casing, and we were done!

 

Hardware

S2S_customhw.png

 

Power supply: We chose to power the nodes using a micro-USB connector with a standard 5V USB input. We required three voltage rails to power all of the various components: 3.3V, 5V and 12V. To step down the 5V input we selected a fixed 3.3V low-dropout (LDO) voltage regulator. To step up the 5V input to 12V we included a boost converter IC and supporting circuitry. Power consumption wasn't really a big concern for the nodes as they are powered by either USB chargers or mobile phone power banks.

 

Microcontroller (MCU): The target MCU was plucked straight from the FRDM-K64F board. The part in question is from the Kinetis K series from NXP Semiconductors. It features a Cortex-M4 processor with a digital signal processor (DSP) and floating-point unit (FPU), 1 MB of flash, 256 KB of SRAM, an Ethernet MAC and many other peripherals. Although capable of being clocked at a maximum of 120 MHz, the MCU is clocked at 50 MHz because the clock signal is derived from an external Ethernet PHY, as per the FRDM-K64F board.

 

Programming Interface: We opted to stray from the mbed Hardware Development Kit (HDK) specification and not include an on-board CMSIS-DAP interface, which provides drag-and-drop programming and a USB virtual COM port. We did this to save board space and BOM cost. So how did we program the nodes? We used the CMSIS-DAP interface on the FRDM-K64F board, which is ready to program and debug the target MCU. A simple modification to the FRDM-K64F board allowed us to program the nodes with a 10-pin ribbon cable:

 

S2S_programming.png

 

Connectivity: To communicate with our server we included wireless and wired options: 6LoWPAN and Ethernet respectively. Large trade shows are notoriously bad for having a crowded and noisy RF environment, so to ensure that the demo functions in all scenarios we made sure we could fall back on wires if need be. The 6LoWPAN functionality came from the same 2.4 GHz IEEE 802.15.4 transceiver module as on the mbed 6LoWPAN shield.

 

Sensors: We refined our choice of sensors from the original setup using the experience gained in the demo's early deployments. The PIR sensor was dropped in favour of an accelerometer; however, we kept the option to reinstate it. We found that measuring the vibrations of a table was a more effective way of determining presence in a meeting room than trying to position a small PIR sensor effectively. We swapped the temperature sensor for one which we could surface mount, and it was placed in a corner of the board within an isolated area for greater accuracy. The microphone was re-used, but we integrated it and its amplifier into the PCB instead of using the module. We added an optional 4-pin M12 connector to the board (number 11 below) which allows us to attach the same laser door trip sensor as we used previously. We only fitted the connector to the required nodes.

 

Components

The image below highlights all of the major components we used on our custom sensor nodes:

S2S_customhw_numbered.png

 

  1. Temperature sensor: Measurement Specialties HTU21D
  2. Microphone: CUI CMA-4544PF-W
  3. Boost converter, 12V: Texas Instruments LM2731
  4. USB UART interface: FTDI FT232R
  5. Low-dropout voltage regulator, 3.3V: Microchip® MCP1824T
  6. Micro-USB receptacle
  7. Accelerometer and magnetometer: NXP FXOS8700CQ
  8. 2.4 GHz IEEE 802.15.4 transceiver: Atmel® ATZB-RF-233-1-C
  9. Ultrasonic proximity sensor module: MaxBotix® MB1014
  10. MCU: NXP Kinetis K64 (MK64FN1M0VLL12)
  11. Door trip 4-pin M12 connector (fitted when needed)
  12. Programming interface, 10-pin Cortex debug connector
  13. Ethernet jack with magnetics: WIZnet RB1-125BAG1A
  14. Microphone amplifier: Maxim Integrated MAX9814
  15. Ethernet PHY: Microchip KSZ8081

 

Conclusion

I hope you enjoyed reading about some of the work that goes into bringing an ARM demonstration to life. Stay tuned for the next instalment, where I will talk about how we modified the demo for cloud hosting and deployed it simultaneously at Mobile World Congress and Embedded World 2016.

We've all experienced our fair share of sleepless nights. From snoring partners to noisy neighbors and city sounds, there’s a lot that can leave us counting sheep at night and craving coffee in the morning. That is why Hush, a new ARM®-based Kickstarter project, has created what it says is the world’s first smart earplugs.

 

As showcased in ARM’s Innovation Hub, Hush earbuds have been designed specifically for sleep, blocking out the unwanted noises that leave you tossing and turning at night. The device has sound-eliminating foam which provides passive noise reduction as a first sound barrier, while the in-ear speaker plays up to eight hours of soothing music to mask any residual noise, allowing users to sleep undisturbed by their surrounding environment.

 


 

To learn more about the technology behind the innovation and find out what the future holds for the smart device, we caught up with Daniel Lee, Cofounder and CEO of Hush.

 

Can you talk a bit about the device?

Hush creates the world's first smart earplugs that combine a soothing sound machine with earplugs to block out noise that keeps you awake at night. By connecting wirelessly with your smartphone, Hush lets you select which notifications you receive, such as your alarm clock or an emergency phone call. In essence, Hush lets you block out the world but still hear what you need for a truly peaceful sleep.

 

What tech is inside the device?

The device is enabled by a Bluetooth low energy chipset from NXP based on the ARM Cortex®-M0 processor and connects wirelessly to a smartphone. It also features parts from Micrel Inc., Texas Instruments and Atmel.

 

How did you go about deciding on the technology for the device?

We chose Bluetooth Low Energy (BLE) because we needed to create a very small device as miniaturization was crucial for creating an in-ear product suitable for sleep. BLE enabled us to use the smallest battery while simultaneously providing meaningful battery life. We then upload a sound file to the earplugs and play them back locally so we can maintain an extreme low duty cycle of Bluetooth connections which conserves power.

 

What was the most challenging hurdle you had to overcome in creating the device?

Creating a comfortable in-ear device that someone could sleep with all night was a particularly hard challenge to address. We're not going to be perfect for everyone as the variance in individual ear shapes and sizes is massive, but even so, we can provide earplugs for a large majority which is something we are very proud of.

 

What changes did you make before coming up with your final prototype?

We made it smaller and smaller. In fact, both the earplugs and charge case were shrunk twice before we arrived at a prototype we were satisfied with.

 

What are some of the smart devices that Hush can connect to?

Hush works with a variety of smart devices. These include devices running Android 4.3 or higher while enabled with BLE and all iPhone models from the iPhone 4s and newer.

 

What’s next for Hush? Do you have any other integrations or advancements in the pipeline?

We have lots of interesting plans on the road map. Expect there to be a Hush two that improves on all the key factors of the first generation Hush using the learnings we've taken on board from Hush one.

 

To find out more about the Hush, please see our article on the ARM Innovation Hub.


SANTA CLARA, Calif.—Flashlights are fantastic tools until they die and you find yourself in the dark, fumbling around for replacement batteries. That type of dynamic will constrain the growth of Internet of Things applications and devices unless we resolve the power problem.

 

That was the message from Prithi Ramakrishnan and Charles Dittmer at a Bluetooth World presentation March 15 at Levi’s Stadium here.

"How do we get to billions of devices?" asked Ramakrishnan, ARM wireless product manager. "Will we achieve these volumes if we have to change billions of batteries every year? Perhaps not. But billions of devices need not mean billions of batteries."

 

She and Dittmer, ARM wireless technical marketing manager, walked their Bluetooth World audience through elements of low-power design best practices, highlighting how the industry moves toward optimizing power-constrained devices and effective use of energy-harvesting systems.

 

The latter is still in its formative stages, but the former — systems using 1V and sub-volt designs — is within our grasp today, Dittmer said. In some cases, with a sub-volt design, battery life improvements of 60% on existing 1.2V alkaline batteries are possible.

 

Low-power landscape

The road to IoT ubiquity begins with a good wireless protocol, as well as a well-considered IP and systems design, such as low peak and sleep currents and low voltages.

 

As an example, Dittmer showed an oscilloscope capture of ARM Cordio wireless IP, noting the Tx (transmit) current was roughly 7 mA.

 

"You say, ‘gosh, 7 milliamps is not that bad,’ but others are doing 5-6 mA," Dittmer said. "But you … really have to compare apples to apples and talk about milliwatts."

Those solutions pulling 5-7 mA are at 3V which, according to Dittmer, equates roughly to 15-21 mW in Tx mode. ARM radio IP runs at 950 mV (sub-volt). Rounding up to 1V, 7 mA equates to 7 mW, he said. And the technology goes to sleep at 800 nW, he added.

"This enables true 1V solutions to take advantage of different 1-volt battery-size topologies," Dittmer said. "If you’re powering your radio and your SoC at 1V, you’re extending your battery life because as the battery decay curve goes down, you’re still operating at 1V."
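Dittmer's arithmetic is easy to check: transmit power is simply P = V × I, so the same current costs three times the power at 3V as at 1V. A quick sketch, using the figures reported above:

```python
def tx_power_mw(voltage_v: float, current_ma: float) -> float:
    """Transmit power in milliwatts: P = V * I (mA * V = mW)."""
    return voltage_v * current_ma

# 3V parts pulling 5-7 mA, as quoted in the talk:
competitor = (tx_power_mw(3.0, 5.0), tx_power_mw(3.0, 7.0))  # 15-21 mW
# ARM radio IP at 950 mV, rounded up to 1V for the comparison:
cordio = tx_power_mw(1.0, 7.0)                               # 7 mW
```

This is why comparing milliamps alone is "not apples to apples": the 7 mA sub-volt radio consumes a third or less of the power of a 3V part drawing the same current.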

 

Dittmer called out as an example an ARM Cordio BT4 Bluetooth Smart test chip that uses a Zinc/Air hearing aid battery. Running at 1V, the micro-beacon has a battery life of more than two years.

 

If 1V and sub-volt power unlocks system-design creativity, then hearables are becoming the new wearables, Dittmer noted. The Bragi Dash wireless smart earphones, which can serve as a small MP3 player, just might be the poster child for hearables, he noted.

"Beyond audio and hearing, the ear is also a great place for biometrics. This is enabled by moving to 1V technology and the associated small batteries," Dittmer said.

  

Harvesting energy

If optimizing low-voltage battery-backed designs is possible today, designing systems that really take advantage of energy harvesting is very close.

 

1V and sub-volt technologies mean that pulling in energy from mechanical, thermal, vibration, RF, natural and other sources is much more realistic.

 

Solar-powered devices deployed by Fraunhofer, between panes of glass, can enable alarm and temperature-sensing systems, storing enough solar energy to run overnight.

Dittmer closed with a call to action, noting that there are some sticking points within this ecosystem.

 

The processor and radio technologies are at 1V, but "a big problem for IoT nodes is many of these are sensors which typically run at 3.5V, 2V and 1.8V today," he said. "Any sensor people in here? We need 1V sensors! That's my call to action. Other pieces of the ecosystem have to follow this low-power, low-voltage trend."

 

Related stories:

Bluetooth World Panel: Is the IoT hype or hope?

Bluetooth Smart IP from ARM - what a difference a year makes . . .

The waves of Bluetooth Smart Applications

Choosing IoT connectivity technology needs careful consideration of multiple technical and commercial factors linked closely to individual use cases. Different applications favour different technologies, so what the product is designed to do will be a major factor in the decision. Last time out we took a brisk gander through the seemingly endless technology options in order to establish a framework. If you’re following this series you’ll recall that I grouped the technologies into three categories: LAN/PAN at one end, 3GPP options at the other, and an increasingly busy centre ground roughly described as LPWAN. We are seeing the emergence of new players and convergence from incumbents, in particular from the 3GPP stable. LPWAN is going to be the focus of the rest of this series, and my goal is to provide guidance to help you make decisions.

 

trafficjam.jpg

What matters?

 

Choosing connectivity is complex – there are a lot of features and benefits, often conflicting, to weigh in the balance. So let’s distil some of the key characteristics that will define an IoT connectivity technology and from there we can more easily make decisions about what is important to our particular use cases. We think that the strength of an IoT connectivity technology can be defined in terms of the following eight parameters.

 

  • Capacity
  • Quality of Service
  • Range
  • Reliability
  • Battery life
  • Security
  • Cost
  • Proprietary vs Standard

 

 

Random timing

 

Most cellphone interactions start at a random time – the point when someone wants to call a user, or when they decide to instigate a search on a smartphone. The device then goes through a “random access” phase to initiate communications with the network, after which the network provides dedicated resource for the duration of its communications or “session”. Random access is great for a user with a mobile phone. For a network, it is inefficient. The larger the network, or the more connections on it, the higher the probability that multiple users will attempt to access the network resource at the same time and clash. When this happens, all of the colliding communications are often lost. The users then repeat their transmissions in order to increase the probability of a successful connection. The efficiency of such channels is well defined in “Aloha access” theory, which tells us that at best they achieve around 30%. Above these levels there are so many message collisions, and re-transmissions that then collide again multiple times, that the channel capacity spirals downwards and a reset is needed.

In the cellular world the random access phase is only a tiny fraction of the total data transmitted, so its inefficiency is of little relevance. In a typical IoT deployment, by contrast, all of the data can be encapsulated in the first message, so virtually all transmissions are random access. In this case efficiency drops to 1/3 at best. If devices could be told when to transmit next – for example, thermostats given periodic slots – then 3X efficiency improvements can be made.
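The "around 30%" and "1/3 at best" ceilings can be reproduced from the classical slotted-Aloha throughput formula, S = G·e^(−G), where G is the offered load in transmission attempts per slot. A textbook sketch, not a simulation of any particular LPWAN:

```python
import math

def slotted_aloha_throughput(g: float) -> float:
    """Expected fraction of slots carrying exactly one (successful)
    transmission, assuming Poisson-distributed attempts at an offered
    load of g attempts per slot: S = g * e^(-g)."""
    return g * math.exp(-g)

# Throughput peaks at exactly one attempted transmission per slot,
# giving 1/e (about 36.8%) - the "roughly a third at best" ceiling.
peak = slotted_aloha_throughput(1.0)

# Push the load higher and collisions dominate: capacity falls away,
# which is the downward spiral described above.
overloaded = slotted_aloha_throughput(3.0)

# A scheduled regime (e.g. thermostats assigned periodic slots) can
# approach full slot utilisation, hence the ~3X improvement over ~1/3.
```

Note the asymmetry: below the peak, adding load adds throughput; above it, every extra attempt costs more in collisions than it delivers, so the channel must be throttled back rather than pushed harder.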

 

Power adjustment

 

In cellular systems handsets are tightly controlled by the network to use the optimal type of modulation and power levels, with these varying dynamically, often second-by-second. In typical IoT implementations transmissions are so short that there is little time for the network to adjust the device. Hence, the device will typically use higher Tx power than needed resulting in more interference. Networks need to be designed both with clever ways to adjust device power based on knowledge such as whether the device is static (and so transmit power can be steadily adjusted over time) and other cues from the network.

Multiple overlapping networks

 

Cellular operators have their own spectrum and can design networks free of interference from others. Conversely, most IoT networks are deployed in unlicensed spectrum where there can be interference from other IoT networks using the same technology, other IoT networks using different technology and other users. To date, this has not been a key issue but as competition grows and more networks are deployed it could become a constraining factor. Some techniques, such as code-division access (CDMA and similar) rely on orthogonality between users which is only effective where users are controlled in time and power. With a single network this is possible, but with multiple networks there is rarely coordination between them and the impact of interference can be severe. Instead, techniques such as frequency hopping and message acknowledgements are much more important as are networks that can adapt to their interference environment.

 

Flexible channel assignment

 

This further enhances network capacity by enabling frequency reuse in large scale deployments and adaptive data rates permit optimal radio resource usage to maximise capacity. Time synchronised base stations allow for radio resource scheduling and utilisation.

 

For all of these reasons and more, the efficiency of an IoT network should not be measured in the classical manner. A network could have apparently worse modulation but simply through smaller message sizing be 10 times more efficient.

 

There are many technologies that are sub-optimal and have the potential to suffer severe capacity constraints. For example, UNB technologies will typically resend messages multiple times to increase the probability of successful transmission. This is clearly inefficient and has limited or no ability to take any action once a cell is overloaded. Wide band systems rely on orthogonality between transmissions which could suffer badly when multiple overlapping networks are deployed in the same spectrum. 3GPP solutions are still in definition but often have large minimum packet sizes. Issues that may not become apparent during a trial where network capacity is not stressed may only emerge when tens of thousands of devices are deployed. At this point changing the technology is very expensive.

 

Narrow band modulation regimes offer a compromise between the benefits of UNB and wide band. They are optimised for uplink-dominated traffic with moderate payload sizes and moderate duty cycles. A carefully designed narrow band IoT regime is optimised for high network capacity and support for the networks necessary to enable the tens of billions of predicted connections. An optimised technology will offer very short message sizes, frequency hopping, adaptable radios, group and multicast messages, minimal use of random access through flexible scheduling and much more. Although its bits/Hz may not be materially different from other solutions, in practical situations it is potentially orders of magnitude more efficient. If IoT devices were like handsets and replaced every two years, that might not matter, but with some being 10-20 year deployments, getting it right from the start is critical.

SANTA CLARA, Calif.—The Internet of Things is neither hype nor hope, but moving to a world of billions of interconnected, secure, easy-to-use devices with a profitable business model is anything but simple.

 

That was the consensus of a panel here (March 14) at Bluetooth World that explored that very "hype or hope" question.

“Why not both and why not neither?” panelist David McCall, chair of the Liaison Task Group of the Open Interconnect Consortium (OIC), responded.

 

The five panelists, led by moderator Mark Powell, executive director of the Bluetooth SIG, dived into IoT evolution aspects such as security, user interface, standards (or lack thereof), scalability and notion of device and network orchestration.

 

“There is a lot of hard work, a lot of difficult problems that need to be solved,” McCall said. “Yes, there is hype, but if you think this thing isn’t going to happen, you’re wrong.”

 

Scaling new heights

The fragmented nature of IoT applications and development is a key area that needs to be addressed.

Charlene Marini, vice president of segment marketing at ARM, said scale will only come if IoT thinking becomes less siloed.

 

“We are seeing pockets of connected systems, connected back to the cloud, and an end user is analyzing the data and making real-time decisions based on the information being sensed in some environment. A lot more needs to be done to get the scale we're looking for.”

 

The industry also needs to offer the consumer an environment in which she doesn’t have to download an app for every point task, such as controlling a lighting system in the home.

To this point, Wayne Piekarski, Google’s senior developer advocate, said we need to think more about orchestration and how to serve consumer interests. Rather than have a consumer walk into his home opening different apps to trigger the lights, washing machine and television, orchestration would occur the moment the homeowner walks inside, automatically or programmatically activating or deactivating connected technologies based on experience, time of day, and so on.

 

“You want to have these generic schemas so an app developer can write the app and the OEM makes the devices,” he said. “We want to try to decouple that because it allows for interesting orchestration possibilities.”
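Piekarski's generic-schema idea can be sketched in a few lines. This is a hypothetical illustration only: the device names, schema fields and rule API below are invented for the example, not part of any Google or OEM interface.

```python
# Toy orchestration sketch: apps register rules against generic device
# schemas instead of talking to each vendor's own app. All names invented.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    kind: str          # generic schema type, e.g. "light", "tv"
    on: bool = False

class Orchestrator:
    def __init__(self):
        self.devices = {}
        self.rules = []      # (event_name, action) pairs

    def add_device(self, device):
        self.devices[device.name] = device

    def on_event(self, event, action):
        self.rules.append((event, action))

    def trigger(self, event):
        # Run every action registered for this event.
        for name, action in self.rules:
            if name == event:
                action(self.devices)

home = Orchestrator()
home.add_device(Device("hall-light", "light"))
home.add_device(Device("living-room-tv", "tv"))

def arrive_home(devices):
    # One rule can span devices from different OEMs, because the
    # orchestrator only sees the generic schema, not vendor apps.
    devices["hall-light"].on = True
    devices["living-room-tv"].on = True

home.on_event("owner-arrived", arrive_home)
home.trigger("owner-arrived")
print(home.devices["hall-light"].on)  # True
```

The point of decoupling, as the panel noted, is that the app developer writes against the schema ("light") while the OEM ships the hardware, so one arrival event can orchestrate devices from any vendor.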

 

Wanted: Holistic security solutions

Silos are also what the industry needs to address to nail down end-to-end security for IoT, panelists agreed.

 

Security is “everyone’s responsibility,” said Adnan Nishat, senior product manager of IoT wireless products at Silicon Labs.

 

When a security attack happens, it may damage a particular brand or company but it certainly weakens consumer trust, which impacts everyone, he said.

 

Marini noted that IoT is an outgrowth of embedded systems design, which originally were vertically conceived and implemented in “very siloed value chains.”

 

“IoT means all of that is broken apart,” she said. So how do you have security in an open environment — an environment where you’re mixing and matching value chains?

 

“We need to enable security end to end, to have trust, integrity of the device, to ensure that even tiny bits of data coming from good known devices are secure and to have security all the way to the edge.”

 

The security solutions need to be simple enough that any part of the value chain can implement them. The industry's working on that at every level, Marini added.

 

Related stories:

-- ARM at Bluetooth World 2016: A guide

-- Bluetooth World: Wireless connected microcontrollers are changing the rules

-- Bluetooth Smart IP from ARM - what a difference a year

Let us start with the first decade, 2000 to 2010, the Triassic period of Bluetooth devices. Bluetooth was introduced in 1999 with the goal of being a wire-replacement technology, and that it achieved: the ubiquitous Bluetooth headset, which took off after government mandates for hands-free driving, wireless keyboards and mice, and even Bluetooth speakers. This was the black-and-white decade; however, make no mistake, the consumer electronics market, including PC peripherals, is and will remain a huge market for Bluetooth, with 600M devices today and close to 1B by 2020.

 

We are now in the second decade, the more colorful one, the one with apps and appcessories: accessories that exist because they can talk to your smartphone.

These devices share some common characteristics: they are battery operated and low power, they are not on all the time, and they wake up periodically to send data before going back to sleep. All of this is possible only if the underlying wireless protocol supports low-power operation. Bluetooth evolved in anticipation of these needs, and in 2011 Bluetooth 4.0 was introduced with a low energy specification, called Bluetooth Smart, Bluetooth low energy or simply BLE. It was designed for IoT node devices and the appcessories market: the sensors and the beacons, plus a new class of devices like the toy that interacts with the story app on your phone, the Parrot flower power device that measures the humidity in your flower pots and alerts your smartphone when it is time to water them, or smart light bulbs that offer different lighting for different moods and turn on and off based on whether you are in the room.

 

Lots of neat little applications, but investment follows the volumes and the profits. 2015 was the year in which wearable technology arrived: Fitbit’s successful IPO, Pebble’s move from crowd-funded startup to legitimate smartwatch company and, of course, the introduction of the Apple Watch.

 

The next few years could belong to smart and connected homes, witness Google’s acquisition of Nest and the announcement of Thread, a wireless protocol specifically targeting the home connectivity market. With big names like Google, ARM, Freescale and Silicon Labs behind Thread, this market is gaining credibility. The next generation of the Bluetooth Smart standard plans to support long range and mesh capabilities to address this market better. Will smart homes be the trend that tunnels through the trough of disillusionment and makes it to the slope of enlightenment?

 

Join ARM and other industry leaders as we explore what’s next and what’s possible at Bluetooth World, Santa Clara. https://bluetoothworldevent.com/

Within a few days, if all goes well and the weather holds, ARM software engineer mattdupuy will stand atop one of the world’s tallest peaks, kitted out with ARM-powered wearables and mobile devices and sporting a big grin on his bearded face.

Du Puy (pictured right), who’s been with ARM for four years and is based in San Diego, Calif., is scaling Annapurna (26,545 feet) with a small team. After the team tackles Annapurna, it will head across the Kaligandaki River valley and attempt Dhaulagiri (26,795 feet) on the same trip.

It’s a testament to his passion for mountaineering, electronics and ARM that he reached out and offered not only to take along some ARM patches and flags and additional wearable and mobile technologies but to share dispatches along the way.

And he’s doing this on his sabbatical.

And he’s doing it on a mountain that his climbing partner, Christine Burke of New Zealand, describes as one of the most dangerous of the 8000-meter mountains: The fatality to ascent ratio is 32 percent.

“Mountain climbing is equal parts preparation, problem solving and will power,” he said. “It is also a lovely excuse to travel and make amazing, passionate friends along the way. And each time we get more adept at kitting ourselves out, gear gets lighter and, in the case of electronics, more interesting and useful for climbers.”

Du Puy took off from Southern California March 9 and landed in Kathmandu to start the preparation and acclimatization process. Sunday (March 13) he flew to Pokhara, 120 miles from Kathmandu.

“Today, as we flew in to Pokhara, we caught small glimpses of the Annapurna range through the clouds,” Du Puy wrote in an email over the weekend. “I’m in awe of the explorers who came here a decade ago and even considered setting foot on the flanks of these giants and humbled by the fact that our plane was cruising around 20,000 feet and we were still looking up at the peaks.”

Tuesday he’ll fly to Jomsom and drive to Muktinath for additional acclimatizing until the 16th, when he’ll drive back to Jomsom early to catch a helicopter to base camp.

Du Puy said that on his GPS watch he’s cached terrain data and plotted some waypoints for the team’s summits. That’ll help optimize their route and aid in case of low-visibility conditions.

He added:

“The sat modem is up and running with a new SIM card so I’ll be able to post updates and get weather regularly. That is no less than four satellite systems (Thuraya, Iridium, GPS, GLONASS) we’re using with four different devices. I’m glad we have all of these satellites whizzing above our heads and gadgets to talk to them so I can focus on what I do best; putting one foot in front of the other and repeating. A lot.”

“We also have questionable taste in movies and shows so I’m making sure we have Zoolander 2 and The Expanse on a portable WiDi disk station I’m taking to base camp,” he said.

He wrote a kickoff blog on Hexus.net where he described the incredible technologies he’s bringing with him, and I described the project for the audience over on Semiconductor Engineering. You’ll be able to follow his progress on his DeLorme GPS site, here and on social media.

It sounds almost like a fairy tale: every year, all the experts of the embedded market meet in a little old town in the forests of Bavaria. They come from all nations in flying machines to see the latest innovations and talk to each other in a language filled with buzzwords that few could understand unless they belong to this thoroughly engineering crowd...

 

This year, Embedded World took place as always at the end of February.

I have been to Embedded World a few times now, but it is always great to go again.

Every year there is something new to spot, and I also enjoy meeting old friends and silicon vendors to discuss the latest evolution/revolution.

 

This year I attended the show for TechNexion, and we showed our latest products based on NXP's ARM chips, like our tiny PICO modules, now also available with the NXP i.MX6UL and an ideal choice for multimedia applications or wherever space is tight. With i.MX6 and i.MX7 options it offers nice scalability, and at just 36mm x 40mm it's also small enough to fit almost anywhere.

Not to forget the matching baseboard, which makes us the hardware provider of choice for NXP and Google for the Brillo OS.

 

Of course we also showed our EDM modules, the workhorse of the industry: reliable, rugged and built on an open-source standard.

At 82mm x 60mm it's a small form factor with great scalability from Single to DualLite or Quad core, and of course it is also available with i.MX7.

 

New was our TEK-Series, a series of ruggedized, fanless, cable-free box PCs with a modular approach, so customers can choose the right configuration from off-the-shelf components.

 

Also new was our TEP-Series, a series of HMIs ranging from 7" to 10" and 15", based on the same modular, cable-free, fanless approach as the TEK-Series.

 

Embedded World was very busy for us, but nevertheless we found the time to record a video introducing our company (for the few who don't know about us yet), with a detailed introduction of our new products in the second half.

You can watch the video here:

TechNexion Company Introduction (@Embedded-News.TV)

or directly on YouTube here:

TechNexion Company Introduction and Product presentation (@YouTube)

 

Thanks to all our visitors, and we look forward to seeing you again next year!

Imagine monitoring and controlling your garage door remotely from your phone for peace of mind, setting your home thermostat 10 minutes before you reach home, or getting a text message from your washing machine when your clothes are washed and ready for the dryer. Internet of Things applications have made all of this a reality.

 

The Internet of Things has become a buzzword in recent years. The term “Internet of Things” was first coined by Kevin Ashton in 1999 to describe a system in which a computer collects data from the physical world via sensors without any human intervention. Today, IoT refers to providing internet connectivity to physical devices, enabling us to monitor and control them. This article focuses on how IoT can benefit personal and home use.

 

In the Internet of Things paradigm, devices with sensors are connected to the internet as shown in Figure 1. These devices need not have built-in Wi-Fi support to connect to a wireless router, as you would expect of a smartphone or tablet. They can use different protocols such as Bluetooth LE, ZigBee, Z-Wave or Wi-Fi to communicate with a gateway. The gateway supports these radio protocols to communicate with the devices on one end and hooks into the home internet router on the other. One such gateway, or hub, called “Revolve” was showcased at CES 2014. The use of low-power radio transceivers results in longer battery life for battery-operated devices like smoke detectors and thermostats.

 

The data captured by these devices is uploaded to the cloud. The cloud hides Z-Wave, ZigBee, Wi-Fi and the other protocols from the user application: any product using any of these protocols can be controlled from the same smartphone, which, in turn, makes heterogeneous home networks a reality. Each device can be uniquely identified in the cloud, and a smartphone app or web application can access its data as well as send it commands. Appliances like washing machines, refrigerators, microwaves and air-conditioners, as well as simple devices like light bulbs, can thus be monitored and controlled, resulting in better home and energy management.
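As a rough illustration of the abstraction described above, here is a toy sketch of a cloud-side registry that hides the radio protocol behind a uniform device interface. The class, method and device names are invented for this example and do not correspond to any real cloud API.

```python
# Hypothetical cloud registry: the app reads state and sends commands by
# device id; the underlying radio protocol (ZigBee, Z-Wave, BLE, Wi-Fi)
# is an internal detail known only to the gateway/cloud side.
class CloudRegistry:
    def __init__(self):
        self._devices = {}   # unique device id -> protocol + latest state

    def register(self, device_id, protocol):
        self._devices[device_id] = {"protocol": protocol, "state": {}}

    def report(self, device_id, **readings):
        # Data uploaded via the gateway, whatever radio it arrived on.
        self._devices[device_id]["state"].update(readings)

    def read(self, device_id):
        # The app never sees which protocol the device speaks.
        return dict(self._devices[device_id]["state"])

    def send_command(self, device_id, **settings):
        # A command is routed down whichever protocol the device uses.
        self._devices[device_id]["state"].update(settings)
        return self._devices[device_id]["protocol"]

cloud = CloudRegistry()
cloud.register("bulb-42", "zigbee")
cloud.register("thermostat-7", "z-wave")
cloud.report("bulb-42", on=True, brightness=80)
cloud.report("thermostat-7", temperature_c=21.5)
proto = cloud.send_command("thermostat-7", target_c=19.0)
print(cloud.read("bulb-42"))   # {'on': True, 'brightness': 80}
```

The key design point is that a ZigBee bulb and a Z-Wave thermostat present the same read/command interface to the smartphone app, which is what makes a heterogeneous home network usable.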

 

This is just a glimpse of the Internet of Things in the field of personal and home usage. IoT products can be used to create home monitoring systems for children and the elderly, and doctors can monitor patients at home, thereby reducing hospitalization costs. There are endless possible applications of IoT which can improve our lives and help us manage ourselves and our homes more effectively.

 

Source: IoT | Personal & home applications - Volansys Blog

Last week saw the annual mobile industry pilgrimage to Mobile World Congress in Barcelona. One theme that remains strong is how the industry is working to evolve the traditional smartphone into something far more connected, interacting with your everyday life. Exhibitors were keen to show connected products alongside the smartphone, from connected cars and bicycles through wearables and into health and wellbeing. In short, everything will need to get connected if this vision is to be realized.

 

 

At ARM we are working hard with our partners to help deliver that connected world. As well as enabling the Cortex-A applications processors that power so many of our devices, ARM is also working ‘under the hood’ with Cortex-R and Cortex-M processors to bring efficient, right-sized connectivity to billions of devices.

 

In a world dominated today by Bluetooth, Wi-Fi and cellular, what can we expect in the future to help connect the next billion devices, and what glimpses did we see at MWC to show we are on that path?

 

First of all, let’s consider cellular. With the major focus over the last decade on solving mobile broadband, we don’t really think of cellular standards such as LTE connecting anything other than our smartphones. This is set to change: the industry is working hard to deliver lower-power connectivity that works over long range, promises indoor coverage and sits comfortably alongside existing cellular standards. There were many demonstrations of so-called LTE Cat-M at Mobile World Congress, and the inaugural NB-IoT Forum meeting was held, attracting representatives from right across the industry to begin discussing how to deliver the emerging technology. You can read more about these technologies in the recent ARM Whitepaper.

 

What about beyond cellular? Several unlicensed-band technologies were on show, including an ARM mbed based demonstrator from the LoRa Alliance. LoRa is a low-power wide-area technology that is gaining increasing interest as a low-cost access technology for connecting devices.

 

Not to be left out, the Wi-Fi Alliance recently announced HaLow, based on the IEEE 802.11ah standard. 802.11ah is a low-power, wide-area standard that aims to integrate into traditional Wi-Fi access points, giving consumers an upgrade path to enable connected devices within their homes. We were excited to see the first public demo of 802.11ah from Newracom at MWC, showing how this technology can be used to stream hi-fi quality audio to low-power endpoints.

 

Newracom.jpg

Newracom prototype 802.11ah demonstrator at MWC16

 

So what about 5G? As expected, there were plenty of early proof-of-concept systems on show at MWC, bringing examples of Gigabit connectivity. These early demonstrators are based on proprietary technology, which forms an important part of the pre-standards proving ground. Just ahead of MWC, ARM announced the Cortex-R8 processor in readiness to support the new wave of multi-Gigabit LTE-Advanced Pro handsets, paving the way to 5G. You can read more on 5G in the recent ARM Whitepaper.

 

5G demo.jpg

Korea Telecom 5G demo at MWC16

 

There is still a long way to go before the industry delivers on the connected-devices vision. What we saw at MWC16 last week is that the industry recognises this opportunity and is working on a number of key enabling technologies that will ultimately form the foundation of the future internet of smart connected devices. Once we have those standards in our hands, we can look forward to a plethora of new device classes, use cases and services at a future MWC!

Zach Shelby

See you at Embedded World

Posted by Zach Shelby Feb 22, 2016

Like many of you in the ARM community, I am excited to be headed to Germany for Embedded World 2016. It is amazing to be part of ARM, the world's #1 embedded ecosystem, which will be ubiquitous across the show. This year I expect to see a focus on secure and connected offerings that will simplify embedded intelligence, taking advantage of the ARM architecture and mbed IoT Device Platform.

 

We are seeing mbed widely embraced by our nearly 60 partners and over 170,000 developers, solving real problems in key IoT market segments including smart city and wearable applications. To help product developers make the most of mbed, we are launching a series of application pages, starting with Wearable and Smart City, with Smart Home to follow. These pages will showcase how mbed is being used in world-changing IoT solutions, and the partner reference designs to get you started.

 

At the ARM booth (Hall 5, 338) we will show a Smart City product from Converge for smart concrete monitoring at construction sites, using the mbed Smart City Reference Design. We are also showing a system from MultiTech and IBM using LoRa for real-time worker safety, and a LoRa-based smart city taxi application running mbed OS is being shown by Semtech (Hall 2, 631).

 

Wearable products benefit from mbed OS through improvements in battery life, time to market and security, all of which are useful in enterprise, health and consumer applications. During EW we will be releasing the hardware and software designs for a complete wearable device, which I have had the pleasure of wearing daily for the last 7 months [Wearable Reference Design]. Together with our partners, we have achieved an incredible battery life of over 8 weeks. My last charge lasted just over 11 weeks, the biggest problem now being how to find the charger!

 

Zach Shelby

Introduction

ARM and its Partners share a vision in which the creation and deployment of commercial, standards-based IoT devices at scale is as easy as possible. The ARM® mbed™ IoT Device Platform supports that vision with a software ecosystem built on a common platform for developing connected IoT devices.

 

The ARM IoT subsystem for Cortex®-M processors allows design teams to create IoT endpoints faster and with lower risk. ARM’s scalable IP solutions are designed to target the whole value chain, from sensors to servers. ARM’s IoT subsystem with mbed OS is a complete reference system that reduces the complexity and risk of an SoC design for IoT endpoints. The subsystem features a range of peripherals and interfaces and is specifically designed for use with Cortex-M processors and the Cordio® Bluetooth® Smart radio. ARM has taken this subsystem and built a proof-of-concept platform called Beetle.

 

Beetle proof of concept

 

The test chip used on the Beetle platform gives partners a low-risk proof-of-concept methodology, showcasing how designers can rapidly move from Register Transfer Level (RTL), the hardware description abstraction used to define digital circuits, to silicon with minimum engineering effort. The central element of the Beetle test chip is the IoT subsystem, which is pre-validated, allowing the user to hit the ground running. We built on the IoT subsystem, attaching the Cortex-M3 processor, ARM’s Cordio BLE 4.2 radio, TSMC embedded flash and a host of complementary peripherals from third-party vendors. The test chip was also built using the ARM Artisan® physical IP platform, specifically tailored for IoT applications. The design is fully compliant with ARM’s mbed IoT Device Platform to enable rapid development and prototyping.

IoT subsystem.png

Beetle FPGA prototyping

 

As part of the Beetle pre-silicon validation plan, the RTL was ported to the Cortex-M Prototyping System (MPS2), a low-cost FPGA development board from ARM that is ideally suited to prototyping IoT endpoints. We synthesised the design for FPGA, replaced the embedded flash with FPGA block RAM and used an external ARM Cordio evaluation board instead of the ASIC macro. This gave us a platform on which to perform functional testing of all the peripherals and to develop peripheral drivers ahead of silicon.

AN491.png

 

We demonstrated the ARM IoT subsystem at Computex in Taipei in June 2015, using our platform and an external ARM Cordio radio to send sensor values to an ARM server over Bluetooth.

 

Beetle FPGA.pngcomputex demo.png

 

The Beetle test chip was sent for fabrication using TSMC’s 55nm process technology. While waiting for silicon to arrive, we developed the drivers on the FPGA prototype.

 

Software Frameworks

 

Our initial development work started when mbed OS was still at an early alpha stage. We decided to proceed with an mbed SDK (mbed Classic) driver port so that we could create a stable demo for TechCon. Meanwhile, mbed OS development moved from alpha to beta, and then to the mbed OS 2015.11 technology preview release. Our ultimate goal was to have a complete IoT software framework ported to the device, enabling rapid IoT application development.

 

Showtime!

 

With ARM TechCon just around the corner, we received our Beetle chips back in mid-October.

Beetle TC.png

Then it was all hands on deck to bring up the chip and integrate the Cordio firmware with mbed. The demo comprised the Beetle platform with an external Cordio evaluation board sending sensor values over BLE to a nearby phone, which displayed the data. In parallel with the BLE transfers, data was also sent over Wi-Fi to an ARM-based server, which collated it with the data from the other sensor nodes around the ARM booth.

 

TechCon demo.png

 

Activities after ARM TechCon

 

After TechCon, we started the port of mbed OS, updated the firmware, improved the power consumption, performed some benchmarking and added further features.

 

EW demo.pngBeetle board.png

 

So what have we been working on for the last few months?

 

Benchmarking

 

With Beetle intended as a proof of concept, we were interested in some initial benchmarking. We ported CoreMark to the platform with good results: a score of 137 (running from flash at 48MHz with one wait state), comparable to other Cortex-M3 based SoCs.

 

Firmware updates

 

While porting mbed OS, we also began a series of firmware updates which included:

  • CMSIS-DAP support over USB with drag and drop programming of Beetle software binaries to the embedded eFlash or external QSPI.
  • Serial wire debugging (SWD) for use with ARM Keil uVision.
  • Virtual serial port communication.

 

Board improvements

 

We put some effort into reducing the energy consumption of the Beetle board. Among the changes we made:

 

  • The largest single power saving came from ensuring that the CMSIS-DAP microcontroller on the board was put into deep power-down mode when not required. This alone saved 12mA. In this microcontroller, when the ARM Cortex-M0 is put into its Wait for Interrupt (WFI) low-power state, the surrounding logic detects this and turns off power to the majority of the chip, including stopping the clocks. The CMSIS-DAP is put into this mode by default when the board is battery powered, and is reawakened when USB power is applied, allowing the use of USB debug and virtual serial port connections.
  • We added pull-ups to the QSPI flash and SPI ADC chip selects and clocks. This ensured that these devices were held in their lowest-power modes even when the I/O was not explicitly configured by the software.
  • We found that the ORing diodes used to power the ARDUINO® shield’s supplies had a fairly high reverse leakage current, which was wasting energy. The solution was to utilise FETs rather than diodes, slightly more complex but less wasteful.

 

Migrating from mbed Classic to mbed OS

 

mbed OS differs from mbed Classic in introducing a new lifecycle tool called yotta, which makes it easy to reuse software modules in C and C++. This requires new structure definitions and a new organization of modules and targets. Supporting the new strategy meant understanding the object-oriented design of the new operating system. From the board support package (BSP) standpoint, this meant refining the driver architecture to fit the new design and comply with the new model defined through JSON, a lightweight data-interchange format.

 

Once the definition was complete it was time to start adding some code. All the drivers are split between CMSIS and HAL and built through the CMake infrastructure. One important thing to keep in mind is the introduction of the new MINAR scheduler, which requires an lp_ticker and a sleep driver, neither of which is present in mbed Classic (take a look at the advantages of the MINAR scheduler). After this addition, with only a couple of changes to the scatter file and an understanding of memory allocation in mbed OS, our Beetle was ready to boot. And of course “Welcome to mbed OS” was the first message we saw on the serial port.

 

mbed OS Cordio BLE Integration

 

After completing the base port we moved on to our next activity: bringing up the ARM Cordio BLE radio. We needed a BLE infrastructure in our mbed port. mbed OS already provides a BLE Object, the main interface between the OS and the BLE world, which is responsible for providing a set of compliant BLE services that can be instantiated directly from the main application. To talk to the low-level IP, the BLE Object interfaces with a lower-level BLE stack, which in our case was the ARM Bluetooth Stack (formerly WiCentric).

 

Next we had to integrate the Cordio firmware and Cordio drivers into the rest of the system. What we noticed immediately was that we could no longer use the SysTick as the main timer, because it made synchronizing with the Cordio macro difficult. The main challenge was that we had to re-architect our timer strategy, because the various sensors and IP had specific timing requirements. After some investigation, we used the Dual Timer for the lp_ticker and the Cordio IP, and Timer0 for the us_timer, which worked a treat.

 

mbed OS sensors integration

 

Now that we had a functional system, it was time to transmit some real data over BLE. For a previous demo we had created a shield containing proximity, microphone, humidity and temperature sensors, and we decided to reuse it. Together with the internal True Random Number Generator (TRNG), we had enough data to stream over BLE.
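The actual demo used the mbed BLE service classes; purely to illustrate the data-side problem, the sketch below packs mixed sensor readings plus a TRNG byte into a compact little-endian payload of the sort a BLE characteristic could carry. The field layout, units and function name are assumptions for this example, not the format Beetle used.

```python
# Hypothetical payload packing for mixed sensor readings.
# Layout (little-endian): uint16 proximity in mm, uint8 humidity in %,
# int16 temperature in centi-degrees C, uint8 raw TRNG byte = 6 bytes.
import struct

def pack_readings(proximity_mm, humidity_pct, temp_c, rng_byte):
    return struct.pack("<HBhB",
                       proximity_mm,
                       humidity_pct,
                       int(temp_c * 100),   # centi-degrees keeps it integral
                       rng_byte)

payload = pack_readings(120, 55, 21.75, 0xA7)
print(len(payload))  # 6 — comfortably inside a 20-byte BLE 4.x payload
```

Keeping the payload to a handful of bytes matters here because a Bluetooth 4.x characteristic write carries only around 20 bytes of application data per packet, so compact fixed layouts avoid fragmenting each sensor update.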

 

Android demo app

 

ARM’s demo team had already created an Android app for the TechCon demo, which we used for the final phase: actually transmitting the data over BLE. The app’s services needed a minor modification to recognize the new sensor data coming in over Bluetooth. Then it was time to click “connect” and watch the application read and display sensor data from the board.

 

Conclusion

 

The experience of taking something from prototype to silicon and pulling all the software together was amazing, and a rewarding and interesting challenge for engineers to get involved in. We learned the value of FPGA prototyping and of software development ahead of silicon, found ways to reduce the board’s power consumption, and learned how to port mbed OS, integrate the Cordio BLE software and add sensors. We plan to release the source code for Beetle shortly so you can benefit from everything we’ve learned, shorten your development time and focus on the differentiating parts of your design.
