
Internet of Things


The dawn of cable-TV systems came over 30 years ago, marked by the need for an electronic tuning device to receive TV channels broadcast on frequencies that standard television sets could not tune. These “cable converter boxes,” as they were called back then, were large rectangular boxes which sat on top of, you guessed it, the TV set. I still recall the old 3-foot-high Zenith TV set of the early 1980s, nicely recessed in a wooden enclosure, with the black box sitting on top through which we received cable TV. These gradually evolved into analogue descrambling devices for pay-TV systems, which eventually became digital, then grew smaller and smaller, resulting in what we have today: very small form factor “pucks” or HDMI “sticks” that decode TV. As content moves to OTT, this functionality has migrated into the television itself, with smart TVs offering Netflix, Roku, apps, and even the ability to view operator premium content through standards like VidiPath (formerly known as DLNA CVP-2), the majority of which have been written for, and have extensive support on, the ARM® architecture today.

The majority of these STBs and TVs, however, have historically revolved around reasonably complex software/client middleware running on the device to allow the user to navigate to the content they want to see. For linear content we are all familiar with the Electronic Program Guide (EPG). This piece of software has seen many evolutions, from native clients to JAVA, Android, and HTML variants, to name a few, running inside the STB. This client software is still required, but one major shift is happening in the industry: this software is now on the verge of migrating completely out of the client and into the cloud. Through the advent of high-bandwidth, low-latency networks made possible by the rollout of fiber and the latest DOCSIS standards, wireline operators can offer cloud virtualization of key applications in the home, and one ripe, low-hanging-fruit application is virtualization of the set-top box.

The driving principle behind STB virtualization is that the majority of the compute (application execution and graphics rendering) moves into the cloud, while the client requires only minimal compute with solid video decoding capability. All application execution is converted to a standard video stream (e.g. H.264 or HEVC at the relevant resolution, 1080p or UHD) and sent to the client, where it is decoded. Any “remote clicks” are received by the client and sent back to the server, where they are processed within the application; the application itself is delivered to the client only as a running video stream. There are several benefits to this kind of architecture:

  • It greatly simplifies the STB; the components required for app streaming have the potential to drastically reduce CPE complexity and cost.
  • The EPG can be changed at will, even with different versions for different customers.
  • An unlimited number of additional apps can be made available to customers, anything from simple weather widgets to the most powerful interactive 3D games.
  • The server app environment is componentized and secure, allowing third parties to stream apps as well.
  • Apps from multiple different operating systems can run on the same client hardware.
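The split described above can be sketched in a few lines: the "server" owns all application state and rendering, while the "client" does nothing but forward remote clicks upstream and decode the returned stream. This is a toy Python simulation of the idea; the class names and the string standing in for an encoded frame are illustrative, not any real vSTB API.

```python
# Minimal sketch of the vSTB split: the server runs the app and emits
# encoded frames; the client only decodes frames and forwards key presses.

class VstbServer:
    """Runs the EPG/app in the cloud and renders it to a frame per update."""
    def __init__(self):
        self.channel = 1

    def handle_key(self, key):
        # Remote clicks arrive from the client and mutate app state here.
        if key == "CH_UP":
            self.channel += 1
        elif key == "CH_DOWN":
            self.channel = max(1, self.channel - 1)

    def render_frame(self):
        # Stand-in for H.264/HEVC encoding of the rendered UI.
        return f"frame(channel={self.channel})"

class VstbClient:
    """Thin client: forward remote events upstream, decode the stream."""
    def __init__(self, server):
        self.server = server

    def press(self, key):
        self.server.handle_key(key)        # uplink: remote click
        return self.server.render_frame()  # downlink: updated video

client = VstbClient(VstbServer())
client.press("CH_UP")
print(client.press("CH_UP"))  # frame(channel=3)
```

Note that all the "intelligence" lives in `VstbServer`; swapping in a new EPG is a server-side change that never touches the client, which is exactly the operational benefit listed above.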


ARM has partnered with Netzyn, a leading software provider for virtual STBs, to demonstrate this on two different ARM-based server hardware platforms:

1.  Applied Micro X-Gene
Based on the ARMv8 architecture, the AppliedMicro X-Gene® processor was designed with scale-out cloud applications in mind, balancing CPU performance; robust memory capacity and bandwidth; and high-speed I/O. The X-Gene processor’s core design and power efficiency in a high-density scale-out server platform are ideally suited to a variety of NFV applications, and it fits very well into the NFV definition of a standard server.

2.  Samsung Exynos-based Microserver

Based on the ARMv7 architecture, this product is built around a high-volume mobile phone SoC used in popular phones like the Samsung Galaxy Note 3, utilizing an octa-core big.LITTLE configuration with 4x Cortex-A15 and 4x Cortex-A7. What makes this solution unique is that it also contains a GPU, the ARM Mali-T628 MP6, along with hardware video accelerators. These types of SoCs are especially well suited to vSTB deployments because applications such as 2D/3D games can take full advantage of the GPU, enabling better performance and density for the gaming workloads that vSTB makes possible.

The client leverages the CuBox-i2eX, based on a Freescale i.MX6 dual-core Cortex-A9, controlled using a Roku IR remote control, with gaming enabled through an Xbox 360 USB controller. The applications shown using this vSTB architecture include XBMC, Angry Birds, CNBC, YouTube, Frogger, and TL Racing.

Here is a diagram of the demo setup:


Both demonstrations can be seen firsthand at the ARM booth (#48) at NFV World Congress in San Jose, May 6-8, 2015.

There are predictions of a trillion sensors by 2025. Will sensors supporting the Internet of Things stop at being just discrete devices? The Internet of *Embedded* Things will utilise on-chip sensors (for example, temperature, process and voltage supply monitors). The opportunity to internet-link embedded sensors for multiple reasons is on the horizon.


SemiWiki.com - How is Trillion Sensors by 2025 Panning Out?

This is what happens when Zebra's Zatar team and ARM's mbed team get together, they build a connected wine rack! The IoT wine rack is now being featured in the mbed demo room at ARM's headquarters in Cambridge, England.


Find out more here: Zatar and ARM: collaborating on the future of IoT




You’ve all been there: Upon arriving back home from the store, you find that there’s not enough milk to get through breakfast in the morning. Or, while strolling through an aisle, you can’t seem to recall if there’s enough jarred sauce for pasta tomorrow night. Wouldn’t it be great to know the answer simply by checking your smartphone? That’s the idea behind SmartQsine, a smart inventory system developed by the team at NES Italia.


The system comprises several small pads, which are placed beneath the items that a user would like to monitor, and an accompanying smartphone app that lets them know when they are about to run out of something. Measuring just 8cm x 8cm x 1.8cm, the intelligent pads are compact enough to easily fit inside any drawer, on any shelf or atop any counter.

How it works is relatively simple: To get started, a user simply places an item on the pad and sets its current level. From there on, the pad communicates with its paired mobile device, continuously monitoring and reporting the remaining quantity.
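The calibrate-then-monitor flow just described can be captured in a few lines. This is a hypothetical sketch of the pad's logic, not the actual SmartQsine firmware: the class name, the 20% restock threshold, and the gram-based readings are all my assumptions for illustration.

```python
# Toy model of a smart inventory pad: the user sets the "full" level once,
# then each weight reading becomes a remaining fraction plus a restock flag.

class SmartPad:
    def __init__(self, alert_fraction=0.2):
        self.full_weight = None            # set during calibration
        self.alert_fraction = alert_fraction

    def calibrate(self, weight_grams):
        """User places the item on the pad and sets its current level."""
        self.full_weight = weight_grams

    def reading(self, weight_grams):
        """Return (remaining_fraction, needs_restock) for one measurement."""
        fraction = weight_grams / self.full_weight
        return fraction, fraction <= self.alert_fraction

pad = SmartPad()
pad.calibrate(1000)        # a full 1 kg jar of sauce
print(pad.reading(150))    # (0.15, True) -> time to add it to the list
```

In the real product the `needs_restock` flag would travel over Bluetooth Low Energy to the paired phone rather than being printed.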


Through its app, a user can seamlessly access their fridge or pantry data to see if they are in need of an item. Beyond that, they can set an alarm that will notify them when something reaches a certain level, send a message to a person of their choice when something is nearing its end, and make real-time edits to the shopping list.

The system appears to have been built around the Nordic nRF51822 mbed dev kit (ATSAM3U2C), and is equipped with Bluetooth Low Energy connectivity. The pads are powered by standard coin-cell batteries with a life of around six months.


Users can choose between two different lines of pads: gold and silver. A gold pad communicates with the accompanying mobile app and monitors not only what the user places on it, but also the data coming from other connected pads. A silver pad lacks communication capabilities and can only be read in the app after being linked to a gold pad.

Moving ahead, the team is entirely open to integrating SmartQsine into existing and future home automation systems on the market. Developers will soon be able to devise and integrate apps of their own as well.

Interested in the system for your home? Head over to its official Indiegogo page, where the team is currently seeking $80,000. Shipment is expected to kick off in August 2015.

This blog originally appeared on Atmel Bits & Pieces.

Cypress and Arrow Electronics are pleased to announce the ten excellent design submissions received for the PSoC Pioneer Challenge: Maker Faire Edition.

You now have a chance to vote for your favorite #PSoCMaker to narrow down to the five finalists of this IoT-based design competition, featuring the Cypress PSoC BLE Pioneer Kit.

The winner and a runner-up will be selected by a panel of judges and announced via makezine.com on April 24th.

The winner of this IoT-based design competition will be showcased at this year's Bay Area Maker Faire, in San Mateo, CA.


VOTE NOW for your favorite #PSoCMaker. Hurry, voting closes April 15th!




Clunky, noisy and inelegant, the ceiling fan hadn’t changed for more than a century. Big Ass Fans created Haiku with SenseME to reinvent the ceiling fan by automating comfort and home energy savings.


The technology inside — a series of precise environmental sensors and microcontrollers — may be complex, but the result is simple: Haiku with SenseME automatically changes speed as the room heats up and cools down, and it learns your preferences to suit your unique needs. Motion sensors turn the fan on and off as you come and go, and a number of other special features make this a one-of-a-kind ceiling fan.
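The control behaviour described above, speed tracking temperature and motion gating the fan, can be illustrated with a toy policy. The actual SenseME logic (and its learned preferences) is proprietary; the comfort setpoint, the one-step-per-degree rule, and the 7-speed cap below are purely my assumptions.

```python
# Illustrative fan policy: off when the room is empty, otherwise one speed
# step per degree above a comfort setpoint, capped at the fan's top speed.

def fan_speed(temp_c, motion_detected, comfort_c=22.0, max_speed=7):
    """Return a fan speed from 0 (off) to max_speed."""
    if not motion_detected:
        return 0                          # nobody home: save energy
    degrees_over = max(0.0, temp_c - comfort_c)
    return min(max_speed, round(degrees_over))

print(fan_speed(26.0, True))    # 4 -> warm, occupied room
print(fan_speed(26.0, False))   # 0 -> same room, nobody in it
```

A learning version would adjust `comfort_c` over time from the user's manual overrides, which is roughly what "it learns your preferences" implies.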


Energy use and conservation were key drivers behind the development of Haiku with SenseME. There are 300 million ceiling fans spinning in American homes, yet few people use them strategically to reduce the load on their HVAC systems. Big Ass Fans estimates users can save up to 30 percent on summer cooling and 25 percent on winter heating by using Haiku with SenseME.


Pairing Haiku with SenseME and the Nest Learning Thermostat makes it even easier to save energy without sacrificing comfort. This integration of a connected fan and thermostat stands out because of its potential impact on home heating and cooling costs—and, of course, the underlying reduction in energy use.


Big Ass Fans deliberately takes a long-term approach to product development, focusing on purposeful and innovative integrations and connections. In 2014, Big Ass Fans worked with industry leaders including Nest Labs, Samsung, Freescale and Silicon Labs to launch the nonprofit Thread Group, a revolutionary way to connect hundreds of smart home devices.


Carey Smith, founder and Chief Big Ass of Big Ass Fans, explained the company’s interest in smart home technology. “All of our work over the past 15 years has focused on efficiency, and homes are our country’s biggest opportunity to dramatically lower energy usage.”

For Big Ass Fans, Haiku with SenseME is driving the company’s commitment to efficiency. The expansion of the Haiku product line will continue to push the potential of the smart home to maximize the habits of modern human life.


Inside Haiku with SenseME technology - YouTube


More about the Big Ass Fan Haiku with SenseME fan in the ARM Innovation Hub

If you've ever wanted to make a computer program, but always thought it's too complicated to set up the tools and environments that you would need, this might be for you.

Do you like games? Most people do; especially programmers.

So if you are not going to set up a toolchain, and you don't need special hardware, what do you need?

The answer is simple. Click the following link, and start learning: Light Bot.

Once completed, try again; this time, try to make your programs as short as possible.

And finally, try again, making your program as quick as possible.

When you've exhausted the tutorial, you can proceed the same way on Light Bot 2 (to play a level, click the small white square).

How does this relate to ARM?

On Light Bot's home page, you can download other versions for your particular device.

ARM and Silicon Labs today announced a new set of APIs for the mbed™ platform. Most importantly, these new low power APIs will give developers an easier road to reducing the power consumption of their applications, and will be first introduced on Silicon Labs' EFM32™ platforms.


Ahead of the availability of Silicon Labs' mbed-enabled kits and software, which are currently scheduled for an April launch, we wanted to show you how these new APIs will improve power consumption in a realistic scenario. By combining automatic sleep mode selection and background I/O operations, we have managed a reduction in current consumption by an order of magnitude.


Average current consumption of the demo: 1.03 mA without the Low Power APIs, 0.100 mA with the Low Power APIs.


The demo application drives a memory LCD through a unidirectional SPI interface and runs on an EFM32 Zero Gecko MCU. The memory LCD displays the mbed logo and a clock face which is updated once every second. The LCD display furthermore requires a 64Hz external square wave signal input, which also needs to be generated by the application.


This demo was first developed using the previous version of the mbed APIs, using all the standard peripherals (timer, SPI, DigitalOut) available through mbed and programming techniques often used in mbed’s community-driven drivers. At an average current consumption of 1.03 mA, this application would only be able to operate for 194 hours (8 days) on a standard 200 mAh coin cell battery. The power profile of this application is displayed in the figure below.
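The battery-life figures quoted here follow directly from dividing capacity by average draw; a quick sanity check (the 200 mAh capacity and the two current readings are from the text, the rounding is mine):

```python
# Battery life in hours = capacity (mAh) / average current draw (mA).

def battery_hours(capacity_mah, avg_current_ma):
    return capacity_mah / avg_current_ma

before = battery_hours(200, 1.03)    # without low power APIs
after = battery_hours(200, 0.100)    # with low power APIs
print(round(before), round(after))   # 194 2000
```

So the order-of-magnitude drop in current translates into roughly 83 days of operation on the same coin cell instead of 8.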

[Figure: power profile of the demo without the low power APIs]

The application was then upgraded using the new low power APIs, which also introduce an asynchronous programming model to mbed. This means that instead of waiting for a long-running I/O operation to complete, a programmer can now register a callback to be notified of the operation’s completion. The processing time which has been freed up can then be used to either sleep and reduce power consumption, or do other processing in parallel.

Additionally, the new sleeping API dynamically determines the best sleep strategy based on the application’s state. Keeping true to mbed’s methods of enabling extremely rapid prototyping, this will give a more accurate idea of the power profile the application could exhibit with some more tweaking, while retaining a very simple interface everyone can use.
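To make the asynchronous pattern concrete, here is a plain-Python model of the callback flow: the application kicks off a transfer, is immediately free to sleep, and is woken by a completion interrupt. The `AsyncSpi` class below only mimics the shape of an mbed-style asynchronous API; it is not the real mbed interface, and the "interrupt" is simulated by an explicit method call.

```python
# Simulated asynchronous SPI: transfer() returns immediately, and the
# registered callback fires when the (simulated) hardware interrupt arrives.

class AsyncSpi:
    def __init__(self):
        self._pending = None

    def transfer(self, data, callback):
        # Hand the transfer to "hardware" and return without blocking.
        self._pending = (data, callback)

    def irq(self):
        # Completion interrupt: invoke the callback with bytes transferred.
        data, callback = self._pending
        self._pending = None
        callback(len(data))

events = []
spi = AsyncSpi()
spi.transfer(b"\x01\x02\x03", lambda n: events.append(n))
events.append("sleeping")   # CPU is free to enter a deep sleep mode here
spi.irq()                   # wake on completion
print(events)               # ['sleeping', 3]
```

The point of the trace is the ordering: "sleeping" happens before the transfer completes, which is exactly the window in which a dynamic sleep API can drop the core into its deepest permissible mode.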


These optimizations ended up decreasing current consumption by a factor of ten for the exact same application. As can be seen in the figure below, the sleeping API selects the best sleep mode in between the one-second cycles, and the processor only wakes up sporadically to generate the required 64Hz output.

[Figure: power profile of the demo with the low power APIs]

We've now increased the battery life of this application tenfold!


If you're in the Austin area for South by Southwest Create this weekend (Friday, March 13th through Sunday, March 15th), feel free to drop by the Silicon Labs booth to get a live demo of the low power APIs. Otherwise, look forward to more information on the new API set as we get closer to the launch!

About thirty years ago, Acorn partnered with the BBC with the purpose of putting at least one computer into every school in the UK. The goal was to get children interested in writing code, and it was probably the single biggest contributor to the growth and success of the computer and electronics industries in the UK. Many of the engineers in ARM over a certain age (ahem) will have been exposed to computing for the first time through that programme.


Today a similar initiative was launched by the BBC in London to get a new generation into coding with their “Make it Digital” campaign. There are a number of elements to the initiative but the one that ARM is most excited about is the Microbit project that will build on the success of the original “BBC Micro” idea and take it even further. In early September, every child in year 7 at a school in the UK will be given a small ARM based development board that they can program using a choice of software editor. The teachers will be trained and there will be a full suite of training materials and tutorials for every child, at any level of ability, to program their first Internet of Things (IoT) device.


The board has BLE on board, so it can be connected to a phone or tablet, and will support Firmware Over The Air (FOTA), so it can be reprogrammed using a mobile device rather than being limited to a USB cable connection to a PC. There is a 5x5 LED array on the board that can be programmed to scroll text or display simple images, alongside other soon-to-be-announced capabilities, so that kids can have fun experimenting. Both Freescale and Nordic Semiconductor are working with us on making the initial 1 million devices a reality.
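Scrolling on a 5x5 LED array like the one described is conceptually just sliding a 5-column window over a wider bitmap. The sketch below shows that idea in Python; the arrow glyph is made up for illustration and is not the Microbit's actual font or display API.

```python
# Scroll a bitmap across a 5x5 LED matrix by sliding a 5-column window.
# '1' = LED on, '.' = LED off.

GLYPH = [
    "..1..",
    ".11..",
    "11111",
    ".11..",
    "..1..",
]

def scroll_frames(rows, width=5):
    """Yield successive width-column windows, scrolling the bitmap left."""
    padded = [("." * width) + r + ("." * width) for r in rows]
    total = len(padded[0])
    for offset in range(total - width + 1):
        yield [r[offset:offset + width] for r in padded]

frames = list(scroll_frames(GLYPH))
print(len(frames))    # 11 frames: glyph enters, crosses, and leaves
print(frames[5][2])   # '11111' -> the arrow is centred on screen
```

On real hardware each yielded frame would be written to the LED driver with a short delay between frames; here the frames are just printed.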


ARM is particularly proud of this device as it is being built on top of ARM’s mbed platform to ensure flexibility for future generations of the device without breaking compatibility. We would like to see this become a yearly event in the UK so that every child that moves up to secondary school gets a Microbit of their own. Clearly, this programme should also be pushed out beyond the UK and we are investigating how best to do that. A crucial element to enable further proliferation is that all the pieces of the Microbit project will be open sourced and freely available for others to use and replicate.


More information can be found here: http://www.bbc.co.uk/makeitdigital and the BBC announcement is here: BBC - Make It Digital - About Make It Digital


I would like to recognise a few people in ARM who have been instrumental in getting us to this point: Stephen Pattison, for ensuring that we were involved from a very early stage; Kris Flautner and Simon Ford from the IoT BU, for providing the resources and support to ensure mbed was the platform to build on; and Jonathan Austin and Chris Styles, for all their contributions to the hardware and software designs. Just this morning, Jenny Duvalier stood on stage with the Director General of the BBC, Tony Hall, and voiced ARM’s commitment to the Make It Digital initiative and the impact it will have on the talent pool of the future.


We think it is crucial that we inspire a new generation of engineers to get interested in computing and technology. The continued success of UK plc as a leader in ICT depends on how successful we are in encouraging boys and girls to embrace technology and choose to build a career for themselves in our industry. This will make a massive contribution to that endeavour, and everyone at ARM should be proud to be part of it.



As we embark on the voyage of true hybrid clouds it’s important that we don’t forget about the waters we’ve traveled and the waters ahead.  A familiar ocean with ports and established routes has been sailed for years.  It’s a vast and rich environment though a new ocean has been discovered and this one looks nothing like the one we know.


The public / private / managed cloud experience has existed for years and we all have a good understanding of how to consume these services to suit our specific needs. For security-constrained data we build Tier 4 data centers.  For general-purpose environments we throw them over to public cloud and as we grow we consume managed hosting where it makes sense. But what happens when our need for power consumes us and the lines between cloud and connected start to blur?  What happens when software and data are designed hand in hand to be portable and delivered extremely close to the user?

As the pendulum of IT swings back and forth, we have traditionally moved from centralized, highly integrated environments and solutions to highly decentralized and disaggregated ones. Think about your old mainframe, 25K or Superdome moving to a client-server model. Now think of that client-server model moving to blade infrastructure. Software follows the same paradigm: massive, vertically integrated software suites with everything tightly controlled give way to highly decoupled microservices offering like functionality to anyone that needs it. Remember SOA? Remember SAP?

The truth is that the Data Center and the thought process behind it ebbs and flows the exact same way.


Enter IOT / IOE


With the introduction of hybrid environments and containers, we stand, literally, on the edge of a new cloud: an embedded cloud that must take its environment into account and must be delivered close to its user. This cloud must be highly distributed, highly decentralized, and powered by technology that understands its environment, a key head start I believe ARM has in this new world.


Welcome to the edge. Do you recall the once-great Sun Microsystems quote, “The Network is the Computer”? They were right; they were just a decade too soon. With the Internet of Things, and ultimately the Internet of Everything, we need technology that lets us use our urban infrastructure to deliver the next generation of connected-device user experiences.


In 2007 the world hit a major milestone – for the first time the world’s population became more urban than rural and that trend is expected to increase:


It’s estimated that by 2020 there will be 40 billion connected devices on the planet and over 40 zettabytes of data. With the reality that this experience and this data will be served, collected, and analyzed at the edge, the data center and the silicon in it will have to evolve. We’re probably not going to see today’s hyperscale data centers, with their requirements for massive power, heavy silicon, and space, power this new cloud. A lighter-weight solution must be deployed, and while there are several companies capable of building this technology, ARM, especially with its ARMv8-based Cortex cores, is primed to take an early lead.


The IOT Enabled Data Center

So what to do?  The only clear choice is that we must get smarter about what runs where and what powers our new, more connected lives. Logic implies that we need to do more with less while at the same time doing more with more, let me elaborate:

We’ll have less power to offer these new data centers, meaning we need lightweight silicon that consumes less. We’ll have less space, which means we need to maximize density and house our data centers in smaller places, embedded into our urban infrastructure. But we’ll also have more, lots more! Instead of your one, two or three big 100+ remote data centers, we’ll have a disaggregated data center, and we’ll have lots of them, all around us.


The silicon running in this data center will be low power, with optimizations built in. Managing these data centers will require us to look across many locations and ensure application availability is built into the app, with infrastructure fault tolerance and high availability replaced by infrastructure resilience and design-for-failure rule sets.


The journey we’re about to embark on is going to be long, and at times the seas will be rough. What’s clear, though, is that what ARM and its partners are doing with mbed is a sign that ARM has built the ARMed forces ready to take on the future.


Cole Crawford
Founding Executive Director @ Open Compute
CEO at vapor.io

How many of you are fans of the CBS hit sitcom series, Big Bang Theory? Well, you’re in luck. If you recall an episode from the show’s first season, entitled “The Cooper-Hofstadter Polarization,” the team of Sheldon Cooper, Leonard Hofstadter, Howard Wolowitz and Raj Koothrappali successfully turned on a lamp via the Internet using an X-10 system.

The gang was able to send signals across the web and around the world from their apartment to connect not only their lights, but other electronics like their stereo and remote control cars as well.

“Gentlemen, I am now about to send a signal from this laptop through our local ISP, racing down fiber-optic cable at the speed of light to San Francisco, bouncing off a satellite in geosynchronous orbit to Lisbon, Portugal, where the data packets will be handed off to submerged transatlantic cables terminating in Halifax, Nova Scotia and transferred across the continent via microwave relays back to our ISP and the external receiver attached to this…lamp,” Wolowitz excitedly prefaced.


What’s funny is, the technology that the group of sitcom scientists was simulating could have just as well been done using a Wi-Fi network controller, like the WINC1500 module. However, at the time of airing back in March of 2008, open access for Internet users looking to control “things” around the house was seemingly something only engineers and super geeks thought possible.

In an effort to generate awareness around the upcoming IoT Secure Hello World training series, a team of Atmel Norway engineers decided to make their own rendition of the Big Bang Theory lamp scene using the ATWINC1500 IEEE 802.11b/g/n network controller and an Atmel | SMART SAM D21 Xplained Pro board, all secured by Atmel CryptoAuthentication devices.

After watching the Trondheim-based crew’s Cooper-Hofstadter IoT experiment above, be sure to check out a detailed description of the technology behind the project below.

Mobile World Congress (MWC) isn’t just mobile anymore!  Similar to CES, MWC seems to have more and more ARM-based devices.  It makes me quite proud to see ARM everywhere, with the ARM ecosystem's expansive solutions collectively demonstrating how it is expanding the connected experience.  In my first blog, #MWC15: New Mobile Devices and Security Offerings to Make Our Lives Better, I reviewed some mobile highlights as well as the keynote by Simon Segars, ARM’s CEO.  What I enjoyed, perhaps a little more than the mobile highlights, was all the other non-mobile solutions that I found at MWC15.  In this review, I’ll cover the ultimate mobile device (automotive), ARM-based server solutions supporting the Intelligent Flexible Cloud (see The Intelligent Flexible Cloud White Paper) and a compelling new wearable.


Automotive at MWC?


I expected to see many automotive solutions at Embedded World, and I did.  However, I’ve been delightfully surprised to see many automotive solutions (and full-blown cars) here in Barcelona.  On Monday, Freescale announced a new automotive vision processor, the S32V, which features sensor fusion capabilities to support advanced driver assistance systems (ADAS).  This is a new automotive-grade, quad-core ARM Cortex-A53 based SoC, taking us one step closer to the vision of a self-driving car.  They also have a cool BMW that shows all of the different automotive solutions they provide for the car.


Qualcomm in Automotive

Qualcomm has a beautiful Maserati in their booth with a decked-out concept demo they jointly developed with QNX Software Systems.  The platform shown below is built on the Qualcomm Snapdragon 600 processor, based on the ARMv7-A architecture.  The platform has digital mirrors and an instrument cluster displaying safety readings, integrated with a fully featured infotainment system.  Safety and multimedia combined!

[Image: Qualcomm automotive concept demo]

64-bit Servers and Intelligent Flexible Cloud

From high-performance automobiles to high-performance computing, I saw quite a diversity of solutions on the show floor.  I saw the Cavium ThunderX demo, with up to 48 custom ARMv8 cores running at 2.5GHz, and I just said, Wow!  Review the features of the ThunderX solution and see how ARM-based servers are meeting networking challenges.

This demo is an example of a working solution supporting the recently announced Intelligent Flexible Cloud (IFC).

[Images: Cavium ThunderX demo and server]



More diversity at MWC! Another theme of the show was the variety of new wearables coming onto the market.  The LG Watch Urbane LTE is based on Qualcomm’s Snapdragon 400 processor, built on the ARMv7-A architecture, with 4G LTE.  This watch looks like a sports watch, makes its own phone calls over LTE, makes mobile payments over NFC and runs its own new operating system. Is that a watch or a phone?  It’s interesting to watch (couldn’t resist the pun) how the use cases for smart watches are developing.  It's coming soon to the US, with AT&T as the first major carrier.

All the information I shared in this and my first MWC blog I saw in the first 6 hours of MWC!  It’s no longer a mobile only show by any stretch of the imagination.  Innovation continues to be the major output of our industry and it makes me proud to be in technology.  I can’t imagine how much will change before Barcelona 2016!


Interested in a few more highlights of #MWC15?  Please follow me @lorikate.

From my first Embedded World (EW) in Nuremberg (ARM Booth was Connected at Embedded World: From Sensors to Servers, Thread and Tools) to my first Mobile World Congress (MWC) in Barcelona, it’s been an exciting and exhausting few days!  Similar to EW, #MWC was about more than just one theme. Walking through the Partner booths, in addition to the expected new smartphones and tablets, there were new security-based technologies, full cars, high-performance server demos supporting the recently announced Intelligent Flexible Cloud (IFC) White Paper, and new compelling wearables.  In this first blog, I’ll cover new mobile devices and security offerings.  Check out my next blog for the continuation of the discussion.  Here are some video highlights covering some of what I saw.

Simon Segars keynote: Driving Innovation

I hope that you were one of the lucky ones who had a full pass and were able to attend Simon Segars' (ARM CEO) keynote on Tuesday March 3rd during Keynote 4: Innovating For Inclusion from 11:15 – 12:45 pm.   A few of the topics that Simon covered:

  • The importance of mobile lies not only in the devices it brings to consumers but in its role as an innovation platform. Driving innovation in the phones themselves drives innovation in adjacent markets and enables new markets to utilize the technology itself, the screens as the interface, and of course the apps and services.  As an innovation platform, mobile enables global innovations and, perhaps even more importantly, local innovations.
  • Infrastructure challenges are universal and can be addressed by the concepts of an Intelligent Flexible Cloud where the intelligence is distributed throughout the network closer to where it is needed.  This distributed intelligence can then not only manage the data better and provide for more power optimized deployments, it also creates a new platform for innovation. Now services can be created to run locally to address local needs in a manner that best suits that region.

Several of his themes align nicely with what I’ve been seeing on the show floor, which I'll cover in my next blog.


Exciting developments in Mobile

Of course, MWC has the latest devices and technology trends to tempt you towards your next smartphone and tablet purchase. It did not disappoint, and I don’t know which device to choose.


More Cortex-A57 smartphones launched

Qualcomm’s booth was packed with many new gadgets to experience.  At the high end were three new devices based on the Snapdragon 810 processor which are packed with quad ARM Cortex-A57 and quad ARM Cortex-A53 CPUs  in an ARM big.LITTLE configuration with 64-bit support.  The HTC One (M9), LG G Flex 2, and Sony Xperia Z4 all feature X10 LTE and Qualcomm’s Quick Charge 2.0 technology.

[Images: LG G Flex 2 and HTC One (M9)]

New ARM® TrustZone®-based security technologies launched by Qualcomm and Samsung

MWC is more than just new devices; it’s about new ways to use your smartphone.  On Sunday, Samsung announced Samsung Pay.  I received a demo from the recently acquired company that created the technology.  They demoed how multiple cards can be ‘stored’ on your mobile device and transactions carried out.  On the left is the example of the Starbucks card; other examples include American Express, Target and others.  On the right below is an image of the Samsung Galaxy S6 and a description of the upcoming Samsung Pay.  The service will launch on April 10th in Europe and the US.  Quoting the press release: “Samsung’s payment security is enhanced by its own mobile security platform (Samsung KNOX) and ARM TrustZone working together to protect transaction information from fraudsters and data attacks.”

[Photos: Samsung Pay demo with a Starbucks card; Samsung Galaxy S6]


Qualcomm’s 3D fingerprint authentication with Snapdragon Sense ID

New services that make my life easier are always interesting to me. So far I haven’t had much success using fingerprint technology for passwords on a regular basis. Qualcomm’s new Snapdragon Sense ID, built on ARM TrustZone, provides leading biometric fingerprint authentication. Its ultrasonic-based technology is engineered to capture three-dimensional details and can scan through common barriers like sweat, lotion, and condensation. The first processors to support this technology are the Snapdragon 810 and 425. As you can see from the image below, the team won’t even show a full fingerprint in the demo, so no one can capture that person’s print. I can imagine fingerprint ID scanning becoming the norm in the near future.

[Photo: Qualcomm Snapdragon Sense ID demo]

So MWC has lived up to my expectations: new smartphones, new tablets, and mobile security offerings that will soon make our devices even more compelling to use.


For more MWC updates, follow me at @lorikate and check out my next blog on servers, IFC, automotive and wearables. So much to see and do at MWC!

The mbed Team has been busy the last few weeks getting ready for Embedded World (EW) in Nuremberg, Germany, and now we are looking forward to Mobile World Congress (MWC) in Barcelona, Spain this coming week. There has been much to celebrate, from the collaboration with IBM on the IoT Starter Kit – Ethernet Edition to the ARM mbed Device Server 2.3 release.


At EW we announced the IoT Starter Kit – Ethernet Edition, developed jointly with IBM. This kit is intended to help developers build devices that connect to the IBM cloud (the IBM BlueMix platform) and further build out the IoT ecosystem. Additionally, at EW we showcased two demos: our ARM mbed Bluetooth Low Energy solutions and the Thread stack running with mbed OS. Check out the videos below to see the demos in action.



At MWC this coming week, we will be featuring our mbed Device Server. We will be showing our Smart City demo, developed with wot.io, MultiTech and Stream Technologies. In this demo you will see a real-life scenario where delivery vehicles are tracked through London, augmented with real-time traffic information provided by London Open Data and traffic cameras. Not to be missed, we will also be measuring live statistics from the ARM booth (temperature, average height of attendees, noise levels, etc.) to showcase how mbed enabled devices can collect live data from the edge of a network and feed it to an ARM powered server running the mbed Device Server. If you’re at MWC, be sure to stop by our stand (Hall 6, Stand 6C10) to see both demos in action.
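The edge-to-server flow in the booth demo can be sketched in a few lines. This is only a minimal illustration of the pattern, not the actual mbed Device Server API: the endpoint URL, field names and payload shape below are all hypothetical, and a real deployment would use the server's own device-management protocol and security.

```python
import json
import time
import urllib.request

# Hypothetical collection endpoint -- a real mbed Device Server
# deployment defines its own resource paths and protocol.
DEVICE_SERVER_URL = "http://example.com/readings"

def build_reading(sensor_id, value, unit):
    """Package one edge-sensor sample as a JSON-serializable payload.

    The field names here are illustrative, not a defined schema.
    """
    return {
        "sensor": sensor_id,
        "value": value,
        "unit": unit,
        "timestamp": int(time.time()),
    }

def post_reading(reading, url=DEVICE_SERVER_URL):
    """POST one reading to the collection server (stdlib only)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example: a booth temperature sample like those shown at the stand.
sample = build_reading("booth-temp-1", 22.5, "celsius")
print(json.dumps(sample, sort_keys=True))
```

In a real device the `build_reading` step would run on the mbed board itself, with the sensor value read from hardware rather than hard-coded.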


Last but certainly not least, we have released ARM mbed Device Server 2.3. This release adds key functionality that allows us to better serve the needs of mbed Device Server users. To find out more, see Neil Jackson’s blog post here.

I am very excited to be able to introduce our Intelligent, Flexible Cloud vision to the community today. This is a vision the team here at ARM has been working on for a number of months, and really is an evolution of the ARM ecosystem’s vision and opportunity for networking infrastructure.


The core of this vision is that the network will become a platform: a platform for traditional network functions as well as a platform for application development. Combining scalable, highly integrated systems-on-chip with heterogeneous compute capabilities and a common software framework enables the deployment of applications and services at cloud scale, while meeting the very real demands of a diverse network environment. As a platform, the network will be able to scale to meet not only data bandwidth and capacity demands, but also to address power efficiency, data diversity handling and data density.


The network is not homogeneous. Various factors come into play, most notably the constraints of power, form factor and latency. If a small cell platform can’t fit within a Power over Ethernet budget, it can’t be deployed. If it takes too long to process a packet, the packet is dropped and the network becomes highly inefficient. The balance between meeting these real-world networking demands and enabling highly configurable intelligence in the network is the key foundational principle of the intelligent, flexible cloud. It is with the right balance that we can truly achieve distributed intelligence from data center through to end device.


The scalable system-on-chip frameworks ARM provides are the foundation upon which our partners’ innovation and market know-how are added to address needs from end to end. Whether it is the ability to process L1 line rates in access nodes or the capability of high-throughput storage access in the data center, ARM partners add networking acceleration, IO and storage capabilities to unified silicon systems. With this scalability, nodes across the network become more configurable and flexible. Combined with a common software layer, comprised of a mix of embedded and cloud IT technologies, these nodes become accessible intelligence upon which software networking functions and diverse system, business and services applications can be deployed.


The common software layer enabling distributed intelligence from cloud to edge is an open source software stack based on Linux, OpenDataPlane, virtualization and containerization technologies, and management and orchestration technologies. Industry initiatives like the Linux Foundation’s OPNFV are building and testing real-world platforms and will be significant contributors to consolidating requirements and bringing together standard software platforms.


We view this launch of the intelligent, flexible cloud framework as the start of a discussion, and we look forward to continuing it with ARM partners, our software partners and the broad ecosystem of service providers and OEMs. It is going to be an exciting decade ahead for networking. I look forward to the possibilities and opportunities we can enable together, not only for the industry but also for the consumers and businesses who will drive new services and applications for IoT, mobile and more.


To dive deeper into the industry trends, I recommend this brief from Moor Insights.  For more on the intelligent flexible cloud framework, an in-depth white paper can be found here.


