
Internet of Things


If you've been reading my blogs (here and on EETimes), you know how excited I am about the wave of innovative products coming out of Kickstarter, Indiegogo, and incubators like Y Combinator, so when I came across SoundFocus, I had to call them. I actually found SoundFocus and their AMP audio product through ARM partner and audio engineer Paul Beckmann, founder of DSP Concepts, who has an interesting take on the future of DSPs in applications like this (follow me in the community to get notified when I publish that interview). Paul connected me with SoundFocus co-founders Alex Selig and Varun Srinivasan, and they have a really interesting story to tell. Alex and Varun worked together at Microsoft on the Office suite of products but had an itch to do something entrepreneurial, so when Alex wanted to modify his smartphone to compensate for his hearing loss, a company was born.


Alex wanted to know why, in a world of increasingly smart devices, there can't be better audio for people with hearing loss. One in five people in the US over the age of 12 has some form of hearing loss, and we all know that smartphone audio can be a challenge in some environments. Their first idea was to figure out whether a smartphone could become a hearing aid, so they developed a music companion app that compensates for hearing loss. The SoundFocus app has been downloaded over 150,000 times, and users customize the sound of their music to their own hearing profile and taste. Go and check out the SoundFocus app on iTunes.
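The idea of tailoring playback to a hearing profile can be illustrated with a toy per-band gain sketch. To be clear, this is not SoundFocus's actual algorithm: the audiogram values below are hypothetical, and the half-gain rule is just a classic hearing-aid fitting rule of thumb used here for illustration.

```python
# Minimal sketch of per-band loudness compensation; NOT SoundFocus's
# actual algorithm. The audiogram values are hypothetical, and the
# half-gain rule is a classic hearing-aid fitting starting point.

# Hypothetical audiogram: hearing loss in dB at standard test frequencies (Hz).
audiogram = {250: 10, 500: 15, 1000: 20, 2000: 35, 4000: 45, 8000: 50}

def band_gains(audiogram, compensation=0.5, max_gain_db=20.0):
    """Map per-band hearing loss to an EQ boost, capped to limit distortion."""
    return {freq: min(loss * compensation, max_gain_db)
            for freq, loss in audiogram.items()}

gains = band_gains(audiogram)
print(gains[2000])  # 17.5 dB boost for the 35 dB loss at 2 kHz
print(gains[8000])  # 20.0 (22.5 dB would be needed, capped at 20)
```

A real product would of course apply these gains with proper filter banks and psychoacoustic care; the point is only that a per-band profile maps naturally to a per-band EQ.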




The next logical step for them was to build a hardware version to improve the listening experience for iPhone users, and they quickly built a prototype (they claim in a day!) using a TI-based BeagleBone board in a Tupperware container (if anyone has a picture of that, please post). In their defense, anyone who has had hands-on experience with a Raspberry Pi or BeagleBone knows how fast you can get the lights blinking and something happening. What was key in this case was that Alex and Varun found the Audio Weaver algorithm builder from Paul Beckmann's company, and they were off and running with the software. Now they had a working prototype, but how would they transform it into the final product they are showing us below?

AMP founders.jpg


This is where the design tradeoffs start, and choosing your partners is key. The team knew they needed a very low-power processor (with a small footprint) that could run bare-metal code. They also needed integrated USB connectivity and embedded flash memory, so they turned to Future Electronics (a major electronics distributor), who recommended they look at STMicroelectronics' STM32F401 chip, which has an ARM Cortex-M4 processor on board and many of the integrated peripherals they needed. The fact that the chip is only 3 mm square and delivers the DSP performance they need at very low power, without a separate DSP, cannot be overstated. For the technically minded, I will explore this in an interview with Paul Beckmann next week.


Now they knew the form factor and power would work, but they needed a physical prototype with a custom PCB, and at this point many companies go to Shenzhen and try to get it built. I saw this happening in real time two years ago and saw firsthand the toll it takes on a team. Yet again the SoundFocus team took a different approach, partnering with San Francisco-based 3D printing company Fictiv (https://www.fictiv.com/) and local quick-turn PCB supplier Sierra Circuits. With key partners so close, those inevitable changes and redesign meetings could happen whenever needed, which meant they could get the product to market faster than taking the Shenzhen route. At some point they may take manufacturing offshore, but when that day comes they will have a stable and successful design to replicate.

Just as interesting to me as the design is how the original team got the company funded. Alex and Varun won a place in a Y Combinator cohort in 2012 and "graduated" in the summer of 2013; they then raised money to develop the prototype and finalize the design. Alex told me their experience at Y Combinator was a huge help: it was unstructured, but the advice was incredibly valuable. Y Combinator is perhaps the most visible of the incubators, and I mistakenly thought it was mainly for software companies, but Alex and his team pointed to some very cool Y Combinator hardware companies to check out, such as Coin, Gbatteries, Estimote, Graft Concepts, Meta, Senic, Lockitron, Airware, Beep, TerrAvion, iCracked, Cruise, Boosted Board, Double Robotics, and let's not forget Pebble (got mine!).

You can probably tell I got very excited about this product because it brings a number of major forces in technology together. First is idealistic founders getting funding and doing something useful for humanity with passion; second is the new wave of ARM processors like the Cortex-M4, which can operate at such low power that they will enable devices like the AMP and the billions of new devices that will bring us the Internet of Things. And let's not forget 3D printing: this is a perfect example of how useful it can be.

If you want to check out the AMP or order one, go to the AMP website; mine is on order, and now I have to go out and buy that iPhone 6. Have you seen something as cool as this recently? If you have, please comment below.

Omate, the Kickstarter-funded smartwatch company, launched a new smartwatch yesterday. Building on the success of the original Omate TrueSmart and the recently launched Omate-X, the new device, 'Lutetia', is targeted solely at women. A common complaint about smartwatches is that the often bulky devices are not suitable for the more slender female wrist; Omate has a solution for that with this new device. And a quick history lesson from Omate: Lutetia was a town in pre-Roman and Roman Gaul and is the origin of present-day Paris.





Omate Lutetia & Omate-X
David & Laurent discussing Omate's latest smartwatch: 'Omate Lutetia'


I was fortunate enough to catch up with Laurent Le Pen, CEO of Omate, this week in London, hot on the heels of his latest product release. Based in Shenzhen, China, Omate is striving to make waves in the fledgling smartwatch space. The original Omate TrueSmart is based on the MediaTek Cortex-A7 platform and runs Android. This watch offers full cellular and Wi-Fi connectivity, eliminating the need for tethering to a smartphone.


Omate-X marked a new direction for the company: an ultra-low-power tethered watch with fashion and design style at its heart. The Omate-X was launched in August and is based on the ARM-powered MediaTek Aster chipset. The combination of BLE tethering and an ultra-low-power screen gives over a week's battery life.


'Lutetia' builds on the Omate-X platform but with a focus on design for the female user. A round screen, a smaller form factor, and a choice of stylish finishes show how the look and feel of a wearable device is becoming every bit as important as the technology inside. The watch was designed by women for women, something hard to find elsewhere in this space at the moment.


So what's next for Omate? Laurent was keeping his cards close to his chest on that, but he was keen to emphasize that a focus on design, style, and ease of use would be central to the business. Laurent has been very vocal in his support for ARM and MediaTek-based wearables, and we look forward to working closely with Omate in the future on exciting new devices.

With my smartphone staying in my pocket and my ARM-powered Android Wear LG G Watch pointing the way, I navigated across London and joined the Cambridge Wireless Special Interest Group meeting yesterday at the London HQ of Deloitte. The focus of the meeting was wearables.


Cambridge Wireless is a network of nearly 400 companies across the globe interested in the development and application of wireless and mobile technologies to solve business problems. Cambridge Wireless connects those companies and stimulates collaborative innovation through a range of thought-provoking high-profile networking events.


I spend a lot of my time looking at and talking about the exciting wearable space and am constantly surprised at the level of innovation taking place on ARM powered wearable devices. It was nice to sit back and listen to some other views and gauge the opinion of where this exciting technology space is headed.


So what was the main takeaway? What did I learn? For me, the most fascinating theme was how wearables have the potential to deliver new services to consumers and redefine business models, and how that in turn will challenge the market to adapt. Rather than talking only of devices and technology, in wearables you need to turn it around and ask: what benefit does this bring to the consumer?


For wearables to break out to the mainstream they need to appeal to the wider market. Traditionally in wearables this appeal comes from areas such as desirability and status, but increasingly it is switching to usefulness, i.e. fulfilling a purpose that somehow enriches our lives. The omnipresent smartphone has been successful in many such areas: social media, navigation, music, and video, to name but a few.


Wearables need to deliver something new: this could be aiding a better and healthier lifestyle, monitoring our health, or ensuring loved ones are kept safe. The fact that we wear these devices has the potential to integrate them right into the heart of our daily lives and make them very personal to us. To achieve this they need to be small and unobtrusive, low power, and easy for the consumer to use. At ARM we are working tirelessly with our partners to enable those attributes.


As an industry we clearly have some work to do in order to address all these points and bring wearables to a truly mass market, but I do believe all the building blocks are there to make it happen. It has been an exciting journey so far, and I can't wait to see how this fast-paced market continues to evolve.

Soshun Arai

ADAS Mapping by ARM

Posted by Soshun Arai Oct 8, 2014

Towards 2020


Advanced Driver Assistance Systems (ADAS) are a key application on the road to autonomous driving in the not-too-distant future. Almost all car OEMs and tier 1s are developing and competing with the aim of introducing semi-autonomous vehicles to market by 2020. ADAS requires much higher computing performance than other existing automotive applications. It needs sophisticated technologies from mobile and consumer electronics, tailored to automotive qualification requirements, which is a huge challenge for today's automotive industry.


Under these circumstances, a great deal of attention is being paid to ARM processors for their unparalleled performance, high power efficiency, and broad ecosystem, all of which will help a lot in the automotive space.


ADAS actually covers a very wide range of applications, but here we want to focus on camera, sensor, and radar related applications and discuss how ARM processors will be used in the near future.


A simplified ADAS application workflow is described below: sensing, perception, decision making (sometimes displayed on the monitor/dashboard), and then actuation by a chassis application such as the braking or steering system.


ADAS Fig 1.png
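The sense, perceive, decide, actuate stages can be illustrated with a deliberately simplified sequential sketch. A real ADAS stack distributes these stages across dedicated processors with hard real-time and functional-safety guarantees; the sensor values and braking threshold below are purely hypothetical.

```python
# Toy sketch of the sense -> perceive -> decide -> actuate loop;
# all numbers are hypothetical and the stages run sequentially here,
# unlike a real distributed, hard-real-time ADAS pipeline.

def sense():
    # e.g. a radar range reading and the vehicle's own speed
    return {"radar_range_m": 8.0, "ego_speed_mps": 20.0}

def perceive(raw):
    # derive time-to-collision from range and closing speed
    return {"ttc_s": raw["radar_range_m"] / raw["ego_speed_mps"]}

def decide(world, brake_threshold_s=1.0):
    # simple policy: brake if a collision is imminent
    return "BRAKE" if world["ttc_s"] < brake_threshold_s else "CRUISE"

def actuate(command):
    return f"chassis command: {command}"

result = actuate(decide(perceive(sense())))
print(result)  # prints "chassis command: BRAKE" (ttc = 0.4 s < 1.0 s)
```

Even this toy version shows why the perception stage dominates compute requirements: in practice it runs vision and fusion algorithms over high-rate sensor streams rather than a single division.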


For sensing, a multitude of sensors is used: radar, lidar, cameras, MEMS devices, and ultrasonic sensors.

  • Radar/Lidar: Already widely used and well established in automotive, these technologies are good at measuring distance and detecting objects even in bad weather or at night, and price points have been decreasing with mass production. This application requires high-performance CPUs with real-time response to safety-critical events, such as the ARM® Cortex®-R4, Cortex-R5, and Cortex-M7.


  • Front/360° view camera: This emerging application requires the highest computational power ever for an SoC. There are several approaches to meeting the requirements, and many algorithms are in development. It is becoming more common to use not only a CPU but also an additional hardware accelerator such as a DSP, FPGA, or GPU to achieve higher performance within constrained power and space. The Cortex-A15 and 64-bit ARMv8-A processors such as the Cortex-A57 and Cortex-A53 will be mainstream in this space, and the ARM Mali GPU can also be used for GPU compute acceleration.


  • MEMS/Ultrasonic sensor: A huge number of sensors are used in vehicles today, from affordable cars to premium cars. Ultrasonic sensors are mainly used in parking-assist systems to measure distance, and MEMS devices such as G-sensors detect acceleration and collisions. The number of these sensors per vehicle will increase to obtain more precise data, while their cost will decrease, as is the common trend in automotive semiconductors. To achieve higher performance and power efficiency at an affordable price compared with 8- and 16-bit MCUs or current 32-bit MCUs, the ARM Cortex-M0+, Cortex-M3, and Cortex-M4 are very suitable CPUs here.


ADAS Fig 2.png


The Possibility of Data Fusion in the Vehicle

ADAS will bring a lot of extra MCUs and SoCs into vehicles. They will start communicating and cooperating with chassis applications such as the braking or steering system via domain controllers. With communication over automotive Ethernet, data fusion seems inevitable in order to handle the enormous amount of data, but it is not yet clear what kind of new SoC will be needed, or whether an improved graphics SoC (for GPU compute) can handle the extra job. Real-time CPUs supporting functional safety under ISO 26262 will probably be needed to make decisions. In any case, the performance requirements for a data-fusion SoC should be higher than ever. If heterogeneous computing is going to happen in the automotive space, I think this is where it will be. We offer 64-bit ARMv8-A processors, the ARMv8-R architecture supporting hardware virtualization with real-time response, the Mali GPU for GPU compute, and ARM's announced OpenCL support for NEON, enabling heterogeneous multiprocessing and parallel computation for performance and efficiency.



ADAS must support automotive quality and safety, yet at the same time it is very similar to mobile and consumer technologies with their rapid development cycles. ARM's solutions, in terms of both technology and ecosystem, address these complicated requirements. ADAS is a unique domain where cooperation or co-development with third-party software companies is needed to remain competitive and to develop new algorithms for fast time to market. ARM's ecosystem, with its breadth and diversity, enables the automotive industry to overcome the difficulties it faces today by providing new partnerships, and to leap to the next stage.


ADAS Fig 3.png

IPSO Smart Objects: Data Interoperability for the Internet of Things


On September 30th, the Internet Protocol for Smart Objects (IPSO) Alliance published the Smart Object Starter Pack Guideline. This document, while not a formal specification, provides a missing piece of the interoperability puzzle for the Internet of Things (IoT).


It’s generally acknowledged that there are plenty of emerging standards for communication protocols that run the IoT. Web technologies and architecture like the REST design pattern, URIs and hypermedia, provide a sound foundation for scalability, interoperability, and ease of use.


Internet Protocols from edge devices to the cloud


Internet protocols such as CoAP (IETF RFC 7252) have been developed that can run on very small, low-power constrained devices, down to the order of 16 KB of RAM and 128 KB of flash. CoAP implements the REST design pattern, with specific extensions for machine-to-machine (M2M) communication use cases. This enables a new level of IP interoperability, turning constrained devices into tiny web servers that can interact with other devices and systems using internet and web protocols.


At the same time, low-power communication technology is consolidating around standards that are deploying IP networking capability. The Thread Group, ZigBee Alliance, and Bluetooth SIG are all planning to deliver standard IP network interfaces in their communication stacks.


These developments will create a system of layered interoperability at several levels. With IP networking, many standard application layers can be accommodated easily. Using CoAP, devices communicate in a standard way using a 2-way client-server REST pattern.


Interoperability needs more than protocol standards


Even so, adherence to these standards does not guarantee the kind of interoperability expected from the Internet of Things. For example, we would like any sensor or actuator, out of the box, to work with any service, application, or other compatible device.


What is missing is a layer of interoperability around the data and metadata exchanged between devices and application software. API structures, data formats, data types, and data meanings all need to be harmonized and normalized in order to achieve this level of interoperability.


The IPSO Smart Object Guidelines are meant to solve this "last data mile" interoperability problem. Using the standardized layers of IP and the REST design pattern, including CoAP, IPSO Smart Objects provide a common object model, consisting of a URL template and a set of data types, that enables application software to meaningfully interact with devices, including software in other devices.


IPSO Smart Objects are analogous to web pages for devices: a uniform data layout for RESTful web objects that enables discovery, interoperability, and reuse of client and server software. Device software can use a simple library to create instances of diverse sensors and actuators. Common server and application middleware code can be used with any IPSO-compliant sensor of any definition, current or future. Using a common set of object definitions, any application software can interact with any device, subject of course to the provided authorization and permission controls.
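As a rough illustration of that uniform layout, here is a minimal sketch of the /{object_id}/{instance_id}/{resource_id} addressing scheme the model uses. Object 3303 (Temperature) and resource 5700 (Sensor Value) are real IDs from the Starter Pack; the class itself is illustrative and not any library's actual API.

```python
# Illustrative model of IPSO Smart Object addressing: every resource is
# reachable at /{object_id}/{instance_id}/{resource_id}. The IDs used
# below (3303, 5700) come from the Starter Pack; the class is a sketch,
# not a real library API.

class SmartObject:
    def __init__(self, object_id, instance_id):
        self.object_id = object_id
        self.instance_id = instance_id
        self.resources = {}

    def set(self, resource_id, value):
        self.resources[resource_id] = value

    def path(self, resource_id):
        # the uniform URL template that makes discovery and reuse possible
        return f"/{self.object_id}/{self.instance_id}/{resource_id}"

temp = SmartObject(3303, 0)   # IPSO Temperature object, first instance
temp.set(5700, 21.5)          # resource 5700 = Sensor Value
print(temp.path(5700))        # /3303/0/5700
```

Because the template is uniform, a client that understands object 3303 can read any vendor's temperature sensor at the same path, which is exactly the reuse the guideline is after.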


Smart Objects available for use today


IPSO Smart Objects work with existing server software and middleware, including the ARM® mbed™ Device Server. The guideline is based on OMA Lightweight M2M (LWM2M), which is a device management and device server specification that defines how to use CoAP and related IETF standards in a consistent way for interoperability. IPSO Smart Objects can also be used with any protocol or server that supports the REST pattern and a few basic content types, including HTTP.


LWM2M provides standardized device management, including onboarding, security provisioning, firmware management, network instrumentation, and resource-level access control. IPSO Smart Objects work with any LWM2M server, effectively turning it into an Internet of Things API server in addition to performing device management. LWM2M servers are lightweight and can run in small gateways and devices as well as cloud-scale deployments.
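For a flavour of what onboarding looks like on the wire, here is an illustrative sketch of an LWM2M Register request: per the OMA specification, the client sends a CoAP POST to /rd with its endpoint name and lifetime as query parameters, and lists its object instances in CoRE Link Format in the payload. The endpoint name and lifetime below are made-up example values.

```python
# Sketch of an LWM2M Register request (CoAP POST to /rd).
# Query parameters and payload format follow the OMA LWM2M spec;
# the endpoint name and lifetime here are hypothetical examples.

def registration_request(endpoint, lifetime_s, object_instances):
    """Build the /rd query string and CoRE Link Format payload."""
    query = f"/rd?ep={endpoint}&lt={lifetime_s}"
    payload = ",".join(f"</{obj}/{inst}>" for obj, inst in object_instances)
    return query, payload

# A device exposing an IPSO temperature (3303) and light control (3311) object:
query, payload = registration_request("node-42", 86400, [(3303, 0), (3311, 0)])
print(query)    # /rd?ep=node-42&lt=86400
print(payload)  # </3303/0>,</3311/0>
```

After registration, the server knows exactly which IPSO objects the device exposes and can manage or query them without any device-specific code.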


The Smart Object Starter Pack


Today's document is the Smart Object Starter Pack, a definition of 18 objects that can be used in common application use cases across smart home, energy, and wearables. Some examples are a temperature sensor, a barometer, lighting control, load control, and an accelerometer.


The Starter Pack is published as a way to acquaint developers and users with the object model and provide some example objects to get started with. The model allows new objects and resources to easily be created, deployed, and tested in new application areas and new domains. IPSO and OMA are developing tools and processes to enable new object sets to be created for new application domains, and for registering these as open standards. Of course, private object sets can also be created for private enterprise domains.


For more information, and to obtain a copy of the Smart Object Starter Pack, please see the press release and IPSO web page at:

IPSO Alliance Publishes Smart Objects Guideline – Starter Pack 1.0 | IPSO Alliance


CoAP, LWM2M, and IPSO Smart Objects are supported on ARM mbed. For further information see:

mbed | welcome


Standards mentioned in this article:

IETF CoAP (RFC 7252)

CoAP — Constrained Application Protocol | Overview


OMA Lightweight M2M (LWM2M)


Pre conference resources and post conference discussion:

ARM TechCon 2014 Software Developers Workshop Sneak Peek

It's almost time for the Software Developers Workshop's Code Jam here at ARM TechCon!


If you are planning on participating on the IoT side of the workshop, I highly recommend you take a look at the examples for the peripherals we will have available:


DHT11 Temperature and Humidity Sensor: Grove - Temp&Humi Sensor | mbed


Galvanic Skin Sensor (GSR): Seeed Grove GSR Sensor | mbed


Light Sensor: Grove - Seeed Light Sensor | mbed


Ear-Clip Heart Rate Sensor: Grove - Ear-clip Heart Rate Sensor | mbed


Also, if you want to use a terminal emulator, you can download PuTTY here: PuTTY Download Page

If you are coming to the workshop today and want to take full advantage of these peripherals, please make sure your Eclipse SDK and mbed compiler are up and working ahead of time. The set up process is outlined here: ARM TechCon 2014 Software Developers Workshop Sneak Peek

Today we announced an exciting new software ecosystem for Internet connected embedded devices, the ARM® mbed™ IoT Device Platform. This includes a free operating system, mbed OS, and server software, mbed Device Server, designed to simplify and speed up the creation and deployment of IoT products. This builds on ARM’s industry-leading technology and expertise in connected devices by integrating ARM, mbed, and Sensinode in one solution. We believe that having a widespread software ecosystem that enables the use of open Internet standards is key to accelerating the Internet of Things (IoT) market. The expanded mbed Ecosystem already includes 26 partners, from silicon and devices to the cloud.


So why are we building a new software ecosystem for IoT? Simple: our partners and their customers have expressed the need to enable more innovation, get products to market more quickly, and enable the widespread use of open standards. Together with our ecosystem partners, we are going to make that happen by building software that does exactly that and making it easily available.



mbed OS is a free operating system for ARM Cortex®-M based devices that consolidates the fundamental building blocks of IoT in one integrated set of software components. It contains security, communication and device management features to enable the development of production-grade, energy-efficient IoT devices. mbed OS complements existing operating systems such as Linux or Android that are suitable for more capable processors, or RTOS solutions that are ideal for real-time industrial applications. The first public alpha release of mbed OS is due in 1Q/2015, and we expect the first production devices integrating mbed OS in 2015.




mbed Device Server is software technology that provides the required server-side technologies to connect and manage devices in a secure way. It also provides a bridge between the protocols designed for use on IoT devices and the APIs that are used by web developers. This simplifies the integration of IoT devices that provide ‘little data’ into cloud frameworks that deploy ‘big data’ analytics on the aggregated information. Built around open standards, the device server scales to handle the connections and management of millions of devices. mbed Device Server is available now in both free and commercial versions. 




Learn more about mbed by visiting mbed.com.


At ARM, we've had the opportunity to work with our partners to define the mobile revolution, and now we're hard at work doing the same for the growing wearable devices category. We're only scratching the surface of wearable technology, a market set to expand to $30 billion a year by 2018 according to market research firm IHS.


And though we're in the early days, we're already coming to some important conclusions on how best to build a wearable device that's efficient, accurate, powerful and useful. At its core, the key challenge in wearable devices is achieving an all-important balance between processing power and energy efficiency, and it's here that ARM's expertise and pedigree have helped us lead the way in architecting wearable technology.


A deep dive into our position on wearable technology can be found in a new ARM whitepaper but let’s look at a few of our findings on how we can help build the best wearable devices.


One of the most interesting and challenging aspects of the wearable technology sector is just how diverse the devices can be, encompassing everything from simple fitness trackers to high-end smartwatches and everything in between. They need to take in and process data streams from any number of sensors, including accelerometers, gyroscopes, GPS antennas, and temperature and pressure sensors. They can run anything from a simple real-time OS to a rich OS implementation such as Linux or Android. And across this diverse landscape there is a common set of challenges: the need to be "always-on, always-aware" while also being lightweight and highly power efficient.


ARM has been a major innovator in this space, and processors like the Cortex-M series are designed to maximize power efficiency. These chips are perfect for fitness trackers or simple smartwatches, providing just enough processing power with incredible power efficiency. And if you need a bit more performance, the Cortex-A series can deliver it without adding a large drain on energy. Coupled with a Mali GPU, these chips can bring a rich graphical user experience to a smartwatch or similar device.


Wearables bring a whole new power-class requirement compared with a traditional smartphone. For example, the average smartphone requires a daily charge of around 3,000 mAh, but a wearable device needs to run on around 300 mAh per week. For wearables to make sense for end users, they need to work often and be charged infrequently. Even so, we can use the knowledge gained from our experience with smartphones, tablets, and other mobile devices as a jumping-off point. At ARM, we have been working on optimizing SoCs for use in wearable devices. This includes using smaller memory caches to save on both die area and power consumption, making our chips smaller and more energy efficient. And these changes are meaningful: we have been able to halve the L1 cache size from 32 KB to 16 KB while seeing only a 10 percent impact on performance.
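A quick back-of-envelope calculation with the rough figures quoted above shows just how wide the gap between the two power classes is:

```python
# Back-of-envelope arithmetic for the power-class gap, using the rough
# figures from the text: ~3,000 mAh per day for a smartphone versus
# ~300 mAh per week for a wearable.

smartphone_ma = 3000 / 24        # average current draw in mA: 125.0
wearable_ma = 300 / (7 * 24)     # ~1.79 mA average
ratio = smartphone_ma / wearable_ma

print(round(wearable_ma, 2))     # 1.79
print(round(ratio))              # 70, i.e. roughly a 70x smaller budget
```

In other words, a wearable SoC has to live on roughly a seventieth of a smartphone's average current budget, which is why savings like the halved L1 cache matter so much.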


One of ARM’s greatest strengths is our vibrant and robust partner ecosystem. It means that we have the privilege of tackling the challenges of wearable devices from all angles and across the diverse spectrum of wearable technology.  We fully expect to see this category of devices blossom in the coming months and years – and with it, we’ll discover new and better ways to address the specific needs of wearables. We look forward to seeing where this path takes us and fully expect to see uses and form factors that we’ve not yet even imagined. And throughout the journey, we’re confident that ARM offers the best platform – providing an industry leading balance of processing power and power efficiency – on which to build the next generation of wearable devices.

ARM Step Challenge Photo.jpg


Catch me if you can!


The "infamous" competition returns for ARM TechCon 2014, with 21 competitors (representatives) from the event's medalist sponsors, as well as some folks from ARM, vying to be the grand champion of the ARM Step Challenge. Armed with an ARM-powered Fitbit (see the Fitbit Flex Teardown by iFixit), each competitor's steps will be precisely tracked from 12:01 AM on October 1st to the end of October 2nd and projected on a live leaderboard via ARMStepChallenge.com (note: the website will go live once the expo floor opens on October 1st). Each competitor has a daily goal of 10,000 steps, but the winner of the ARM Step Challenge at the 51st DAC averaged nearly 20,000 a day, which equates to about 10 miles a day! So in short, you won't see competitors taking elevators, escalators, or taxis to their next engagement.


For everyone else: if you don't yet own a Fitbit, I encourage you to jump on the wearable fitness bandwagon. Besides tracking your steps, the wristband can also track calories burned and your sleep quality. Personally, it quickly made me realize how few steps I was walking, since I'm in an office for the majority of the day, and it has pushed me to take a nightly run (the only way I can keep up with my wife's step count; she is busy chasing two young toddlers all day).


Now let me introduce you to the competitors:

From ARM, we have Ian Drew, CMO and Executive VP Marketing and Business Development, and John Heinlein, VP Corporate Marketing


Altium: Sara Hosely, Marketing Projects Coordinator

AMD: Sumit Agarwal, Senior Manager, Software Engineering

AppliedMicro: Gaurav Singh, VP Technical Strategy

Cadence: Brian Fuller, Editor-in-Chief

Cavium: Gopal Hegde, VP and GM, Data Center Processor Group

Express Logic: Scott Larson, Senior Software Engineer

Freescale: John Dixon, Director, Corporate Marketing

HP: Larry Kelmar, Senior Director, HP/ARM Alliance

IAR Systems: Marie Gylldorff, Web Director

Lauterbach: Jerry Flake, Western Sales Manager, USA

Mentor Graphics Corporation: Colin Walls, Embedded Software Technologist

Qualcomm: Adam Kerin, Senior Manager, Marketing

Rambus: Carolyn Robinson, Senior Global Corporate Communications Manager

Samsung: Kelvin Low, Senior Director, Foundry Marketing

STMicroelectronics: Alec Bath, Field Applications Engineer

Synopsys: Phil Dworsky, Director, Strategic Alliances and Publisher, Synopsys Press

TSMC: Lluis Paris, Deputy Director, Worldwide IP Alliance

Xilinx: Dave Tokic, Senior Director, Partner Ecosystems and Alliances

Xively: Fraser MacDonald, Program Manager


UBM, the ARM TechCon Event Organizer, has Brian Gillooly (VP and Editor-in-Chief, Events) as their representative.


Best of luck to everyone and I look forward to seeing all the friendly banter via the community and social media (#ARMTechCon on Twitter)!

With wearable products such as fitness bands, watches, and glasses gaining traction, we are excited to be working with ARM partners in building the wearables revolution. A joint whitepaper between ARM and Freescale has recently been published and gives fascinating insight into the enablement path of wearables from a silicon partner's perspective.


Taking the reader through the market dynamics, this paper also showcases many of the exciting wearable devices already launched on ARM-powered Freescale MCUs and processors. As well as existing products, key design challenges are examined, including the need to balance ultra-low power, small form factor, and always-on functionality.


Getting the technology into the hands of innovators is seen as a critical enabler of wearables, and the paper finishes with a look at the newly launched Freescale WaRP board (Wearable Reference Platform), which combines an ultra-low-power Cortex-M-based Kinetis KL16 MCU with a Cortex-A-based i.MX 6SoloLite applications processor running the Android operating system.


The whitepaper 'Overcoming the size and power trade off in wearable designs' is available to download now from Freescale.

At the HPC User Forum in Seattle this week, PayPal, along with HP and Texas Instruments, talked about making order out of chaos using the HP Moonshot system's ARM Cortex-A15-based TI cartridge.



PayPal engineers described how they need to make order out of chaotic data streams: 3 million events per second, 25 terabits of data ingested per hour, and 20 MB/second of machine data from thousands of servers. They need to process all this disparate data in real time, and with creative processing they can now correlate events in seconds rather than hours.


PayPal has been watching the HPC space because their use case falls into the grey area between enterprise servers and HPC, and they were not able to scale to meet their processing needs using existing methods and approaches. It was exciting to hear PayPal engineer Ryan Quick describe his "aha" moment when he saw an HP presentation on the TI Moonshot cartridge and realized that, with the powerful combination of Cortex-A15 processors, eight TI DSPs, internal fabric and networking capabilities, TI had effectively built an HPC cluster on an SoC. The proof point came when he measured the power while running his application and found he was getting 11 GigaFlops/Watt of processing throughput.


You can see his presentation at the HPC forum here - HPC at Paypal: Leveraging DSPs for Systems Intelligence - insideHPC. He talks in detail about the SoC capabilities he uses on the TI Cartridge in this blog from our partner TI here - Creating order out of chaos – in real time - Multicore Mix - Blogs - TI E2E Community.

As I was watching the video, the statement about 11 GigaFlops/Watt piqued my interest and got me thinking about how this would compare to the world's Top 500 supercomputers. So I checked the data on the Green500 website (The Green500 List - June 2014 | The Green500) from June of this year. To be fair, your mileage will vary based on the HPC workload being executed, and some configurations are better suited to a broad set of HPC workloads than the example discussed in this blog. But I looked for a supercomputer that was in the Top 10 on both the Green500 list and the Top500 list, and the best number I found was 3.2 GigaFlops/Watt. That is certainly something engineers will sit up and take notice of.
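As a rough back-of-the-envelope comparison, here is the arithmetic using only the two efficiency figures quoted above (workload differences aside, as noted):

```python
# Efficiency figures quoted above (GigaFlops per Watt).
paypal_ti = 11.0       # measured on the TI Moonshot cartridge
green500_best = 3.2    # best Top-10 overlap of Green500/Top500, June 2014

ratio = paypal_ti / green500_best
print(f"~{ratio:.1f}x the energy efficiency")  # ~3.4x
```

In other words, on this particular workload the cartridge delivered roughly three and a half times the GigaFlops/Watt of the most efficient general-purpose supercomputers of the day.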

ARM TechCon is rapidly approaching and this year is shaping up to be even bigger than in the past! The 2014 event, which is now just a couple of weeks away, will offer a lengthy list of technical sessions addressing some of the biggest challenges facing embedded systems developers today.


For example, Christian Légaré, our (Micrium) EVP and CTO, will present “Embedded Systems, Catching the IoT Wave” in Location G on Wednesday, October 1, at 4:00 pm. As part of his presentation, Christian will review and analyze IoT design challenges related to code size and RAM requirements for the major networking stacks, optimizing TCP/IP resources versus performance, Java from Oracle or other vendors, Wi-Fi (radio-only or integrated module), Bluetooth (classic vs. LE), and IoT protocols.


This relates to our recent news announcing our latest µC/TCP-IP driver for Qualcomm's low-power QCA4002 Wi-Fi chip, used in Netcom's GT202 module, a ready-to-use module that rounds out the hardware and software ecosystem needed to achieve IoT connectivity.


But perhaps most importantly, we will roll out our IoT strategy at this year's show. Our approach and product set are geared towards helping developers connect their real-time “thing” with the cloud and bringing that power to your embedded device. Watch for news the week of September 22, and be sure to come by and learn more during the show; we'll be in Booth #801.


As the leading provider of commercial operating systems for ARM-based processors, we welcome the opportunity to meet, discuss the changing face of embedded software, and help you understand how to accelerate your projects and meet their most pressing deadlines through the efficient use of a professional-grade RTOS.


Mike Kaskowitz

Vice President Sales and Marketing


I have been involved with the Internet-connected embedded device industry since the late 90s, and have always been involved with creating open standards. Anyone who has done this knows how slow, and at times painful, it can be. So why in the world have I gone through all that trouble? Thinking back, what really got me excited about Internet technology was how it completely transformed the world. A fairly simple set of Web and Internet standards, with clear layering and just enough interoperability, enabled innovation across our entire society. Today people use the Web in just about every function of life and society. What has always really intrigued me is whether we can do the same for Things. I strongly believe we can, and the ingredients are the same.

There are three major ingredients needed to make the Internet of Things a really successful platform for innovation:


  • Internet connectivity,
  • Secure data transportation,
  • and Data semantics.


Why does Internet connectivity matter for things? Can't we just gateway everything within their specific application and "hide" the embedded devices from the Internet? Internet connectivity for embedded devices really does matter, as it enables us to leverage the Internet as a platform for innovation, rather than using technology as a control point. Beyond that, there are several technical advantages:


  • Clean layering for connectivity allows us to build generic, high-volume, inexpensive network standards that are re-usable even in a fast moving innovative industry.
  • Networks are easier to build and maintain than application specific gateway infrastructure.
  • The Cortex-M architecture enables advanced cryptography on any embedded device, IP to the edge means strong security to the edge.
  • IP-based networking allows us to deal with mobility on, and choice of, networks.


We're excited about all the positive developments in this space such as WiFi and Thread (see Thread: What makes it different?), but also the roadmap for low-power LTE and the new standard for IP over BT Smart.


The real goal of the Internet of Things is to leverage devices in services, and to do that we universally need to move data around securely. This is achieved using application layer protocols (yes, many) and secure transports (yes, always). Let me be the first to say it: there is no single protocol for the Internet of Things. Instead, there is a set of Internet protocols that together cover the needs of most IoT applications: CoAP, HTTP and MQTT. I helped develop the Constrained Application Protocol (CoAP) to enable Internet communication for constrained microcontroller-based devices over low-bandwidth networks. It provides most of the REST features found in HTTP, but with minimal overhead, simple processing and asynchronous transactions. Using CoAP is like riding a bike: it gets you exactly where you need to go with little energy. Although HTTP is mainly known for transferring media, it is also the most widely used protocol in IoT today. HTTP is like flying a plane: it suits less constrained devices on networks where synchronous transactions between two end-points are appropriate. Finally, the publish-subscribe paradigm sometimes makes sense in applications where a single-to-many relationship exists and a broker can be used. MQTT is like a freight train: the publish-subscribe alternative to HTTP, with all goods going to a central station and then transferred to points beyond. Recently we published a solution called CoAP-MQ that combines the best of CoAP and publish-subscribe. All of these protocols use Transport Layer Security (TLS), or in the case of CoAP, Datagram TLS (DTLS), to ensure end-to-end integrity and confidentiality of data. We are seeing great progress both in security protocol support for the needs of IoT applications and in the ability to leverage public-key cryptography, thanks to Cortex-M microcontrollers.
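To make the "minimal overhead" point concrete, here is a small sketch (pure Python, no CoAP library) that packs the fixed 4-byte CoAP message header defined in RFC 7252; compare those four bytes of framing with the dozens of bytes a typical HTTP request line and headers occupy:

```python
import struct

def coap_header(version=1, msg_type=0, token_len=0, code=0x01, message_id=0x1234):
    """Pack the fixed 4-byte CoAP header (RFC 7252).

    msg_type 0 = Confirmable (CON); code 0x01 = GET (method code 0.01).
    The message ID here is an arbitrary example value.
    """
    first = (version << 6) | (msg_type << 4) | token_len
    return struct.pack("!BBH", first, code, message_id)

hdr = coap_header()   # a Confirmable GET
print(hdr.hex())      # 40011234 -> all the fixed framing CoAP needs
print(len(hdr), "bytes, vs 16+ bytes for just the 'GET / HTTP/1.1' request line")
```

Options and payload follow this header when needed, but for a bare request this is the entire mandatory overhead, which is exactly why CoAP fits low-bandwidth, constrained networks.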


In the end, though, what really matters is the data and the relationships we create between the services that use it. Simply by providing media types, simple REST interfaces and a markup format, the Web enabled endless innovation. Application layer standards for IoT, combined with suitable data formats for IoT applications, are the key to enabling a similar wave of innovation for the Internet of Things. Device management is a key requirement, as embedded devices typically need their security, provisioning, updates and configuration managed without a human in the loop. Two promising standards provide a way to achieve both standard application data formats and device management using the same underlying protocols. The OMA Lightweight M2M standard provides a secure system specification for both application data and device management flows, defining a re-usable Object model and basic device management objects. These objects are simple media types in either binary or JSON format. The IPSO Alliance will soon announce its first set of Smart Objects compatible with these standard formats for application data.
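As an illustration of what such a JSON media type looks like on the wire, here is a hypothetical temperature reading in the LwM2M JSON style. The object and resource IDs used (3303 for temperature, 5700 for sensor value, 5701 for units) follow the IPSO conventions and are purely an example, not an official object definition:

```python
import json

# Hypothetical LwM2M-style JSON payload for one temperature sensor reading.
# IDs (3303, 5700, 5701) follow IPSO temperature conventions; illustrative only.
payload = {
    "bn": "/3303/0/",                # base name: Object 3303, Instance 0
    "e": [
        {"n": "5700", "v": 22.5},    # Sensor Value resource (numeric)
        {"n": "5701", "sv": "Cel"},  # Units resource (string value)
    ],
}

wire = json.dumps(payload, separators=(",", ":"))
print(wire)
print(len(wire), "bytes")  # compact enough for constrained links
```

Because the object model is shared between devices and services, a cloud application can interpret this reading without any device-specific parsing code: the media type carries the semantics.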


This is an exciting time for the Internet of Things, a great toolkit of standards is available to let smart people innovate. For that to happen though, we also need to provide a software ecosystem to make standards easily available to developers. We will be doing what it takes to make that a reality.


Come join us on October 1-3 at ARM TechCon 2014 in Santa Clara, CA to hear more about our plans!



ARM has long been credited with having brought the mobile revolution to the world. Its adoption helped create devices which brought in a level of connectivity and mobility unseen in the past.


But it’s clear now that what we've seen so far is only the beginning of something even bigger. The mobile revolution has seeded a revolution of its own, and we’ll soon be presented with the next level of possibilities in connectivity and convenience.


Though yet to reach mainstream adoption, wearables are gaining a lot of traction. These devices demand a much higher level of efficiency than mobile devices, due to tighter constraints on area and weight and the need to be ‘always-on and always-aware’.


A typical wearable consists primarily of a CPU, a connectivity device, a battery and, last but not least, sensors.


At Maker Faire Bay Area 2014, the iFixit team did teardowns of some of the popular wearables on the market. Here’s a video showcase.




Wearables, however, are only a subset of a larger class of devices.  This larger class is composed of devices that can interface and interact with you through an app on your smartphone, with communication enabled by Bluetooth Low Energy.  This class of devices can simply be titled ‘Appcessories’ – accessories that use a smartphone app for their interface, allowing the devices themselves to stick to the bare basics of data collection and communication. Here’s Simon Ford, Director of Platforms at ARM, introducing you to Appcessories.



Products like the Scanadu Scout and Estimote are perfect examples of appcessories. Both are based on the ARM® Cortex®-M0 CPU. I am excited to see what appcessories lie in store.


The various development platforms made available by ARM and its partners bring affordable innovation and exploration possibilities, so you can bring your clever ideas to life.


There is a wealth of knowledge available for you to get started. So, visit the following links and your road to innovation can begin right away.


Getting Started | mbed
