
We are delighted to announce that the ARM Cortex-M7 processor has been selected by the Linley Group as the “Best Processor IP” for 2014.



Linley Group Analysts Choice Logo.png

Each year The Linley Group (one of the leading semiconductor analyst firms) presents its “Analysts’ Choice Awards”, recognising the top semiconductor offerings of the year. Winners are selected on the basis of performance, power efficiency, features, and cost for their target applications.

 

The Best Processor IP award is based on superiority in performance and power efficiency as well as die area for the intended application, and the ARM Cortex-M7 was recognised as “excelling in each of these measures for high-end microcontrollers”.

The Linley Group noted that the “ARM Cortex-M7’s dual-issue superscalar architecture extends ARM’s digital-signal-controller (DSC) concept, boosting compute efficiency by 47% over Cortex-M4. It adds a second integer execution unit with 32-bit SIMD capability, parallel execution of load/stores and MACs, and a double-precision FPU, which all help double DSP performance on some benchmarks.  In ST’s 90nm design, the M7 achieves 1,000 CoreMarks on the Embedded Microprocessor Benchmark Consortium (EEMBC) test, beating its M4-powered predecessor by 64%”.
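A quick back-of-the-envelope check of those figures (my arithmetic, not the report's):

```python
# If the Cortex-M7 scores 1,000 CoreMarks and beats its Cortex-M4-powered
# predecessor by 64%, the predecessor's implied score is:
m7_coremarks = 1000
m4_coremarks = m7_coremarks / 1.64

print(round(m4_coremarks))  # roughly 610 CoreMarks for the M4 predecessor
```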

 

As The Linley Group notes, the Cortex-M7 “achieves the same performance per milliwatt [as the Cortex-M4] while delivering much greater overall performance”.

 

The Cortex-M7 was announced in September 2014 and expands ARM’s Cortex-M product series, addressing smart and connected embedded applications. The Cortex-M7 enables applications requiring higher levels of general-purpose processing, DSP and floating point performance than achievable with existing microcontroller cores, all within a highly efficient power envelope. The Cortex-M7 also has a variety of powerful system interfaces, including a 64-bit AMBA 4 AXI memory bus for access to sophisticated peripherals and external memories, as well as tightly-coupled memory interfaces for efficient, real-time access to critical code and data. The processor can also support up to 64kB of separate instruction and data caches for further improved performance.

 

Applications include Internet of Things (where data can be pre-processed and/or compressed prior to transmission), voice and image processing applications, advanced audio processing, advanced touch screen products, and advanced motor control applications.

 

Those familiar with the Cortex-M7 won’t be surprised to know that the Linley Group’s award isn’t the only recognition the Cortex-M7 has received: it won the NMI’s Low Power Design Innovation award, and STMicroelectronics' STM32F7 (one of the first announced devices using the Cortex-M7) was honoured by independent judges with the “Best-In-Show Award - Best Hardware Product” at ARM TechCon 2014. In addition to STMicroelectronics, Atmel, Freescale and Spansion have also publicly announced that they have licensed the Cortex-M7.

 

The Linley Group Analysts’ Choice Awards also included “Best Mobile Chip”, which went to ARM’s partner STMicroelectronics for its STM32F411 sensor hub, which is based around a Cortex-M4 and makes use of the Cortex-M4 DSP extensions. The Linley Group notes that these extensions “enable always-on monitoring of incoming audio and other sensor data, waking the AP [application processor] only when necessary”.

 

ARM’s CPUs also featured in the “Best Mobile Processor” category, which was awarded to ARM’s partner NVIDIA, and in “Best Networking Chip”, which went to Marvell.


As electronics become ever more pervasive in the automotive, industrial automation and medical device sectors, fault-tolerant electronics sub-systems are becoming a standard requirement. Designing these systems with Cortex-R series processors that have a high level of fault tolerance realizes benefits such as:

  • Improved reliability
  • Enhanced fault detection and coverage
  • Reduced cost of operation

 

Functional safety support is increasingly becoming an essential part of these systems. As the various functional safety standards continue to develop in complexity, ARM has developed the Cortex-R5 Safety Documentation Package to speed time to market, simplify the certification effort and enable higher levels of certification to be obtained.


Key technologies to support functional safety in the ARM Cortex-R series


The ARM Cortex-R series processors have been developed for applications that require high dependability and detection of any errors that arise in the processor or the system. The faults that can occur in any system include hardware faults (such as failures from aging memory or temperature-induced stresses) that cause erroneous values, and random faults (such as radiation hits to the silicon, which ‘flip’ a bit or gate or even cause permanent hardware damage). If the system has safety implications, where any failure could have serious consequences, then any error must be detected and handled in the way appropriate to the particular system.

To address this, two key strategies exist:

  • Detection of errors in memory: Error Correcting Code (ECC) bits are appended to every memory value and checked before the data is used. This enables automatic detection and correction of single-bit errors, and detection (but not correction) of multiple-bit errors. It requires wider memory, with extra bits to store the ECC, and is applied to all memories in the system, including caches and tightly coupled memory (TCM). The processor checks the ECC bits whenever data is read, corrects single-bit errors automatically, and signals uncorrectable errors to the system; on writes, it generates the ECC bits. The Cortex-R5 can also detect errors on all the buses that connect the processor to the system.

EEC Picture.jpg

ECC on Reading TCM
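To make the single-error-correct / double-error-detect behaviour concrete, here is a minimal software sketch of an extended Hamming (SECDED) code over four data bits. This is purely illustrative: real ECC is computed by hardware logic across much wider memory words, but the principle is the same.

```python
def encode(d1, d2, d3, d4):
    """Build an 8-bit SECDED codeword from four data bits."""
    p1 = d1 ^ d2 ^ d4          # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    cw = [p1, p2, d1, p3, d2, d3, d4]
    p0 = 0
    for b in cw:
        p0 ^= b                # overall parity bit enables double-error detection
    return cw + [p0]

def decode(cw):
    """Return (data_bits, status): status is 'ok', 'corrected' or 'uncorrectable'."""
    c = list(cw[:7])
    syndrome = 0
    for pos, bit in enumerate(c, start=1):
        if bit:
            syndrome ^= pos    # XOR of set-bit positions; 0 for a valid codeword
    parity = 0
    for b in cw:
        parity ^= b
    if syndrome == 0 and parity == 0:
        status = "ok"
    elif parity == 1:          # odd overall parity: a single-bit error, correctable
        if syndrome:
            c[syndrome - 1] ^= 1
        status = "corrected"
    else:                      # syndrome set but parity even: two errors, detect only
        status = "uncorrectable"
    return [c[2], c[4], c[5], c[6]], status

word = encode(1, 0, 1, 1)
word[4] ^= 1                   # flip one bit: automatically corrected
print(decode(word))            # ([1, 0, 1, 1], 'corrected')
word[1] ^= 1                   # flip a second bit: detected, not correctable
print(decode(word)[1])         # 'uncorrectable'
```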

 

  • Detection of errors in the processor: Radiation can hit any gate in a system, and if this causes an error in the logic itself rather than in memory, it must also be detected. Dual Core Lock Step (DCLS) implements two identical processors with identical inputs (one slightly delayed, so that events affecting the whole system at the same instant are still detected) and checks that the outputs from both processors are identical. If the compared outputs do not match, there must have been an error, and this is signalled so the system can take the appropriate action.

DCLS Diagram.jpg

Redundant Dual Core Lock Step
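The comparison logic can be sketched in a few lines. This is a software caricature of what is dedicated hardware in a real DCLS implementation, with the input delay deliberately left out for simplicity:

```python
def core(inputs):
    # Stand-in for the processor's deterministic computation: given the same
    # inputs, a fault-free core always produces the same output.
    acc = 0
    for x in inputs:
        acc = (acc * 31 + x) & 0xFFFFFFFF
    return acc

def lockstep_check(inputs, injected_fault=0):
    """Run two identical 'cores' on the same inputs and compare the results.

    In real DCLS the checker core runs a couple of cycles behind the main
    core, so a disturbance hitting both cores at the same instant still
    produces a mismatch; that timing skew is not modelled here.
    """
    main_out = core(inputs)
    checker_out = core(inputs) ^ injected_fault  # XOR in a simulated bit flip
    return main_out == checker_out               # False -> signal fault to system

print(lockstep_check([1, 2, 3]))                         # True: outputs agree
print(lockstep_check([1, 2, 3], injected_fault=1 << 7))  # False: error detected
```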

 

These key areas, when combined with many other features within the Cortex-R series, enable SoCs and wider systems to be developed that meet the requirements of many functional safety standards.

 

The Cortex-R series has been adopted by more than 70 partners, many of whom rely on the error detection features. The processors have shipped in more than 1.5 billion devices, and their reliability has been proven in many markets, such as automotive, industrial, storage and medical, where data integrity is critical.


However, just having a processor with these features is not sufficient to meet the needs of applications which have functional safety requirements.


How does ARM support functional safety for the Cortex-R5?


Functional safety standards such as ISO 26262 and IEC 61508 require evidence to demonstrate particular system or system component properties. The safety documentation package for Cortex-R5 has been designed to simplify certification, and helps SoC integrators develop and demonstrate the required level of functional safety.


In the context of functional safety standards, ISO 26262 in particular, semiconductor IP can be treated as a safety element out of context (SEooC). For such elements the actual use cases are not necessarily known during design time. This is of course exactly the case for Cortex-R5, which can be used in a huge number of real-time applications. The safety documentation package has been designed with this in mind, to allow SoC integrators to develop products for particular applications with safety requirements.

Safety Package Diagram.jpg

Cortex-R5 Safety Documentation Package for SoC integrators

 

The Cortex-R5 Safety Documentation Package contains information about the Cortex-R5 product itself, focusing on its fault detection and control mechanisms such as dual-core lock-step and memory protection options with ECC or parity. To facilitate integration of the Cortex-R5 into safety-related designs, an FMEA report with example failure rate distributions is also included.

 

The information is structured into a set of three documents: the Cortex-R5 Safety Manual, the Cortex-R5 FMEA Report, and a document describing the allocation of roles and responsibilities for functional safety in projects integrating the Cortex-R5 processor. The Safety Manual includes details on measures used to avoid and control systematic faults during the processor design and verification activities. It also includes details on the processor behavior when faults are detected. The FMEA Report includes a detailed analysis of the design, which can be used as a starting point for system-level safety concept definition and subsequent analyses.


This information helps SoC integrators create the required safety documentation for their products, reducing time to market for new products. The information can also be used to support functional safety assessment activities for SoC products with an integrated Cortex-R5 processor.


ARM is making this information available only to SoC integrators. If you are a system or software developer targeting safety-related designs, you should therefore refer to the safety documentation provided by your SoC vendor. The key reason is that the Cortex-R5 is highly configurable, and different configuration options can affect the processor's fault behavior. Since the ARM Safety Manual for Cortex-R5 describes all these configuration options, we want to ensure that any safety documentation available to system and software developers correctly reflects the actual feature set of your chosen SoC implementation.


It's worth remembering that, complementary to the Cortex-R5 Safety Documentation Package, the ARM Compiler toolchain has also been certified by TÜV SÜD, a recognized safety industry expert. The TÜV Certificate and the accompanying report confirm that ARM Compiler 5.04 fulfils the requirements for development tools for safety-related applications. This enables you to use ARM Compiler 5.04 for safety-related development up to SIL 3 (IEC 61508) or ASIL D (ISO 26262) without further qualification activities, when following the recommendations and conditions documented in the Qualification Kit.


For related information please see the whitepaper "Safety standards in the ARM ecosystem". We will be expanding support for functional safety for our CPU products this year, so please keep an eye open for further announcements!

I am not a great fan of video, as a medium. If I want to learn something, I am more likely to turn to the printed word. It is only when something is intrinsically visual that I turn to YouTube. Like the time I unexpectedly needed to gut some fish – there I was, working in the sink with two slippery fish and a sharp knife, with my wife holding an iPad showing a video of a rugged looking man doing the job more proficiently than I aspired to.

I think, however, that video may feature more strongly in my life this year …

Although I spend considerably more time reading and listening to the radio than watching TV, I know that I am somewhat unusual in my preference. [Actually, even I do enjoy going out to see the odd movie.] As a result, I am intending to produce some video content in the coming months.

My first plan is to create some video blogs, which will be posted here. I am not sure yet to what extent the medium will help enrich the content. Will it be a better way to describe aspects of embedded software programming, for example? I would appreciate any feedback here, by comment, or on my social media accounts – are you likely to watch any video blogs, or do you [too] prefer the written word?

To read the rest of this entry, visit the Colin Walls blog on Mentor Embedded.

ScreenClip-520x351.png

During the heat of CES 2015, Atmel offered us a nice post-Christmas gift with the release of... Read more

Wearables Week on the ARM community last October was a huge hit, throwing off lots of interesting conversations and data to look at in 2015, but Atlas Wearables was so interesting I thought it worth a deeper dive. In December we ran Kickstarter Week on the ARM community, and we saw how much that funding platform is changing the way hardware is designed and comes to market, but Kickstarter isn't the only player: Atlas came to market via Indiegogo. I spoke to Mike Kasparian, the CTO of Atlas, and first asked him why they chose to go with Indiegogo.

mike of Atlas.jpg

Mike told me that Kickstarter is a great platform, but they see a much more global market for Atlas, and Indiegogo has greater international reach. Interestingly, Atlas was part of a tech incubator called Techstars, which is similar to HAXLR8R (which I wrote about last month) in that it gives you around $100k for just under 10% equity in your company. Atlas launched on Indiegogo in early 2014 and hit 503% of its funding goal ($629k) on March 8, 2014. Then the hard work started.

 

Atlas is unlike other wearables because it aims to excel purely in the fitness and training market, with a big-data back end called the Motion Genome Project giving users a tremendous resource to compare themselves to. The device itself is also very different, and I think it's well thought out for its purpose:

Atlas_Product_3-4_View.png

Mike told me that serious fitness enthusiasts need a big display to see their progress, and smartphones aren't a good fit for these folks in the gym. Perhaps more significant is that the data the Atlas collects is very accurate, employing some intense bio-mechanical data processing. This wearable uses three separate ARM-based processors to gather and transmit data:

  1. A Cortex-M4 processor drives the heart-rate application, using a photodiode to measure the light coming through your wrist, from which your pulse is derived.
  2. Another Cortex-M4 measures motion and uses proprietary machine-learning algorithms to calculate the intensity of your workout and give you direction.
  3. A Cortex-M0 runs the Bluetooth connectivity application, as in many other wearables (Nordic Semiconductor again).
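The heart-rate channel in item 1 ultimately reduces to counting pulses in the photodiode signal. The sketch below is illustrative only (real PPG processing involves filtering and motion-artifact rejection); it counts rising threshold crossings in a clean synthetic signal:

```python
import math

def estimate_bpm(samples, sample_rate_hz, threshold=0.5):
    """Count rising threshold crossings (one per pulse), scale to beats/min."""
    beats = 0
    for prev, cur in zip(samples, samples[1:]):
        if prev < threshold <= cur:
            beats += 1
    duration_s = len(samples) / sample_rate_hz
    return beats * 60.0 / duration_s

# Synthetic stand-in for a photodiode signal: a clean 1.2 Hz pulse waveform
# sampled at 50 Hz for 10 seconds (1.2 Hz corresponds to 72 beats per minute).
rate = 50.0
ppg = [math.sin(2 * math.pi * 1.2 * (i / rate)) for i in range(500)]

print(estimate_bpm(ppg, rate))   # 72.0
```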

 

Mike and the Atlas team have three patents on the design, and I think their very focussed approach to the application, together with the big-data aspect, bodes well for their success. I asked Mike about charging, which is a bone of contention with many users, and he told me that the device charges via a simple micro-USB cable, but they will look at wireless charging in the future (wireless charging is still in limbo, in my opinion).

 

Atlas was a finalist in the TechCrunch Hardware Battlefield at CES 2014 and didn't win the $50k investment, but I don't think they need it. The $249 price seems reasonable for this level of product, and access to the Motion Genome Project is a major motivator for getting involved. Check out the whole system in this video:

 

If you would like to buy an Atlas for $249 you can preorder it here and they plan to ship in the summer of 2015.  What's your take on this wearable?

I always welcome contributions – guest blog posts – from my colleagues and associates. After all, that reduces the amount of work that I need to do – how could I refuse? But seriously, I think that a different “voice” from time to time is refreshing. My colleague Richard Vlamynck has appeared here before. He has an interesting perspective, sitting right on the cusp of hardware and software – definitely an embedded “Renaissance” man. Today Richard is musing on what actually constitutes a microcontroller …

This blog post started out on another blog, when someone offered the premise that anything that has an MMU (Memory Management Unit) can not be a microcontroller. That made me ask: why? Can’t I still call a device a microcontroller if it has an MMU? Can I still call a device a microcontroller if it has an embedded DSP block? For example, “everyone” knows that the Nest home thermostat has an ARM Cortex-A7, and it does have an MMU. The same Cortex-A7, in a similar chip, is used in Google Glass.

So what is it that distinguishes a microcontroller from a microprocessor or a CPU (Central Processing Unit)?

To answer that question, let’s set the way-back-machine to sometime near the beginning of digital computing for embedded systems.

To read the rest of this entry, visit the Colin Walls blog on Mentor Embedded.

microcontroller.png

I'm posting a guest blog from Matt Benes of the Tektronix Embedded Instrumentation Group on an important, if not vital, topic: power consumption of wireless systems. During Kickstarter Week we saw that the vast majority of new designs used wireless connectivity, and therefore power consumption and battery life were key factors for users. Enjoy.

 

With the continuing growth of the Internet of Things, being connected is more important than ever. Smart, connected electronics are now everywhere: from large scale smart grids to Wi-Fi enabled toasters. With this comes a push to become “unplugged” and eliminate the need for power cords.  For many devices, a power cord is simply not practical for a number of reasons. This means designers must come up with alternative methods of powering their devices, adding new challenges and complexity.  Without the seemingly unlimited source of power from being plugged in, designers must pay close attention to how much power their devices are consuming at any given time to maximize efficiency, whether using batteries, solar cells, or any other limited power source.

 

The most common method used to power wireless devices, of course, is with batteries. Batteries are a proven and reliable source of power that can be recharged and are available in a wide array of sizes and capabilities. Batteries can only supply a finite amount of power before they need recharging, so designers must weigh a host of tradeoffs such as battery size, system performance and how long the battery needs to last between charges. ARM's Cortex-M series of processors has become a favorite among designers working on embedded IoT applications, and it is not surprising: ARM designed the Cortex-M specifically to be a low-cost, low-power, high-performance processor, and those features directly align with the goals of IoT designers.


The Cortex-M series has some neat features built into the design that help reduce its power consumption while operating. Its power-efficient 32-bit architecture lets it complete tasks faster, and therefore at a lower clock frequency, than 8-bit and 16-bit designs, so the unit consumes less power. It also supports multiple sleep modes that allow designers to put the processor into a lower-power state when high performance is not necessary, allowing for longer battery life in devices.

ARM Cortex M family.jpg

 

The simplest way to determine battery life is to use this basic formula: T = Q/I, where T is the amount of time the battery will last (in hours), Q is the capacity of the battery (in amp-hours), and I is the current drawn from the battery. Using this formula, for example, a 2 Ah battery powering a device that draws 0.25 A will last for 8 hours.
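In code, the formula is a one-liner:

```python
def battery_life_hours(capacity_ah, current_a):
    """T = Q / I: battery life in hours from capacity (Ah) and load current (A)."""
    return capacity_ah / current_a

print(battery_life_hours(2.0, 0.25))   # 8.0 hours, matching the example above
```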

 

This is a great method for estimating battery life on devices that are drawing a constant current, but it gets more complicated when the device has multiple states of operation that each draw different amounts of power over time. These different states of operation can include putting the device in a power saving mode, enabling different connections such as Wi-Fi or Bluetooth, motors running, lights turning on, or any other task that the device may be designed to do. For those types of measurements you will need an oscilloscope with current probes capable of measuring the spikes in power consumption as well as a long enough record length to characterize the battery drain.
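Before reaching for the oscilloscope, a common first approximation is to weight each operating state's current draw by the fraction of time spent in it. The states and figures below are hypothetical, purely for illustration:

```python
# (current draw in amps, fraction of time in that state) -- hypothetical values
states = [
    (0.150,  0.01),   # radio transmitting
    (0.030,  0.04),   # CPU active
    (0.0005, 0.95),   # deep sleep
]

avg_current_a = sum(current * fraction for current, fraction in states)
life_hours = 2.0 / avg_current_a         # same 2 Ah battery as the example above

print(round(avg_current_a * 1000, 3))    # average draw in mA
print(round(life_hours))                 # estimated hours of battery life
```

The estimate is only as good as the duty-cycle figures, which is why scope measurements of the real current waveform matter.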

 

One great example of how modern test equipment helps the design of cutting-edge electronics is Istituto Italiano di Tecnologia's iCub robot. The iCub is a humanoid robot developed at IIT as part of the EU project RobotCub and subsequently adopted by more than 20 laboratories worldwide. It has 53 motors that move the head, arms and hands, waist, and legs. It can see and hear, and it has a sense of proprioception (body configuration) and movement (using accelerometers and gyroscopes).

iCub schematic.jpg      

iCub robot.png

 

The main challenge that IIT faced was troubleshooting a new battery pack system that allows iCub to operate without a long extension cord, and validating the CAN and I2C buses used in the battery pack's design. To make this possible, IIT turned to a Tektronix MSO4104B oscilloscope with a TDP1000 differential probe, a TCP0030 current probe and four TPP1000 probes, along with decoder modules, to measure analog signals, power characteristics and bus communications.

 

The MSO4104B oscilloscope features 1 GHz bandwidth with a sample rate of 5 GS/s. It supports up to 4 analog channels and 16 digital channels. Since the digital channels are fully integrated into the oscilloscope, users can trigger across all input channels, automatically time-correlating all analog, digital, and serial signals.

 

With the addition of the appropriate power probes, from the wide selection offered by Tektronix, the MSO4000 series oscilloscopes are well suited to power test applications such as the iCub battery backpack. For instance, the TCP0030 used by IIT is a high-performance, easy-to-use AC/DC current probe that provides greater than 120 MHz bandwidth with selectable 5 A and 30 A measurement ranges. It also provides low-current measurement capability and accuracy at levels as low as 1 mA.

 

The first application for this setup was to analyze the start-up transient, as shown below, to ensure that sensitive electronics were not damaged during the start-up of the robot.

Tek transients.png

Start-up transients were measured and reined in with the help of Tektronix test instrumentation.


The oscilloscope also proved useful in evaluating battery life under a variety of conditions. Interestingly, it proved difficult to exercise the robot to its fullest, with all 53 motors running simultaneously; in fact, the team was not able to produce a truly worst-case scenario. As with humans, rarely are all the possible combinations that go into creating movement used at the same time. After getting the robot as close to full movement as possible, the MSO4104B's long 20M-point record length was used to characterize battery discharge, as shown below. Under near worst-case scenarios, battery life came out to about 1.5 hours, but it would typically be much longer under more normal operation.

Tek battery discharge.png

The MSO4104B deep memory was used to characterize battery discharge.


With three boards and two bus technologies, another important challenge confronting the team was to validate and debug data communications, a tedious task if performed manually. That's where the DPO4AUTO and DPO4EMBD data decoder modules made it easy to read and validate the data communications between a master board, a hot-swap manager board and a third board used for monitoring. The hot-swap board communicates with the master through a 1 Mb/s CAN bus, while the monitor connects to the master through I2C. The master includes a Bluetooth interface so it can communicate battery status to a mobile device or the robot head. An example of the CAN and I2C communication signals, with the respective bytes decoded, is shown below.

Tek debugging.png

CAN and I2C decoding helped to speed up debug


For additional information about this iCub robot project, visit the Tektronix website for a free case study PDF: http://www.tek.com/document/case-study/icub-robot

During ARM Kickstarter Week we saw some amazing and inspiring new hardware companies coming to life through crowdfunding and you can see a list of the some of the most popular ARM powered projects here. One project that caught my eye is Kano which has such ambition and scope I had to speak to one of the founders; Yonatan Raz-Fridman to understand their vision.

yon at continuity forum 2014.jpg  

Kano bills itself as “a computer anyone can make”, and its target customers are kids who don’t currently have access to technology to create things. Kano wants computing to be as easy to use as Lego, where kids create rather than passively consume content. Kano is based on the Raspberry Pi board, which has itself been a game changer in bringing low-cost computing and programming to millions of people. Raspberry Pi is a great platform, and with over 3.8 million boards sold it’s a phenomenon in modern computing, but the Kano vision is even larger: to go beyond the hobbyist/maker market and serve kids globally. Yonatan and the Kano team knew they had to create a simple operating system (OS) and an online “world” for collaboration, both daunting tasks. Kano was self-funded prior to launching their Kickstarter campaign in 2013. They built prototypes and the OS (built on Debian Linux), and in the “Lego” spirit thought hard about the industrial design of the components so they could withstand the special love they will get from kids.

 

Interestingly, Yonatan and the Kano team chose not to join an incubator (see my blog on HAXLR8R for an example); instead they worked out of an apartment in London, made 200 units and sold them for $99 online (that was the alpha version). With the concept proven, they decided to launch on Kickstarter: their goal was $100k, but they reached a whopping $1.5m raised by December 2013. For the first nine months of 2014 they worked hard on development; they shipped the first units in September 2014 and have now shipped 35,000 units to 86 countries, so they are well on their way to delivering on the vision.

Kano Kickstarter.png

So far this story runs very much along the lines of most successful Kickstarter projects, but this is where I think Kano gets interesting. Yonatan is very clear that their ambition is to build a new global computer company that will bring the power of technology to those around the world who previously could not afford it. They plan to do this in two ways. First, with the open-source Kano OS, so there is no barrier to entry for software, and with Kano World, which might be described as a social network for tech kids, where the code, projects, games and community happen and get shared:

Kano World.png

Second, the Kano team has taken control of its supply chain in Shenzhen with logistics partner PCH International. This means they have control of quality and supply, making day-to-day management much more predictable. This reminds me of an earlier effort to democratize the PC: the ‘One Laptop per Child’ initiative driven by Nicholas Negroponte of the MIT Media Lab. It too was ambitious and global, with similar goals, but it might have been too early. OLPC is still operating and is now on the fourth version of its laptop (with a Marvell ARM-based SoC), but seems to have scaled back somewhat. Kano has embraced a widely used hardware platform (Raspberry Pi) plus an open-source community and social network, which in the world of Facebook may give them a leg up on getting to critical mass. Yonatan told me that the ultimate vision for Kano is a new kind of computer company, not based on a charity model: users get a real computer, at the right price, that people want to buy. Kano hopes to empower millions of creators around the world, and this approach, in a time of a new generation of connected kids, might just work.


This is a noble effort and deserves your support, even if you just want to play with the hardware! The $149 (£99 sterling, including free shipping) price is certainly accessible; you can buy yours here.

Happy New Year! Welcome to 2015, the year of hoverboards, flying DeLoreans and another opportunity to start anew with untold opportunity. As each new year approaches, the media bombardment of resolutions and statistics can be downright dizzying. A quick review of the top ten resolutions shows that they center either on vague concepts that lack the ability to be tracked, such as being happier or learning something exciting, or on breaking bad habits such as smoking, eating healthier, etc. Little mention is ever made of how to improve ourselves at work, in our careers, or in that which is near and dear to my heart: improving embedded software!


When developing new year resolutions, it seems that one of the most overlooked categories is embedded software development resolutions! Work tends to be one of the highest areas of stress in one's life, so shouldn't setting resolutions that decrease bugs, improve software quality and improve the design cycle be at the top of the list? After all, isn't a happy work life a bug-free life?


Any good resolution or goal, whether it is set for the new year, a new month or a week, has some common characteristics. The first and most important is that it is specific and not vague. For example, stating that the team is going to do more code reviews this year is doing nothing more than guaranteeing failure; however, stating that all code will be reviewed, with code reviews occurring every Thursday from 9 a.m. until 11 a.m., and putting it into the calendar, has a much higher chance of being successful. There is a plethora of data showing that code reviews are one of the least expensive ways to go about bug squashing!


The second characteristic of a good resolution is that it is traceable! Get out your favorite spreadsheet or database application, because metrics are exactly what is being ordered. Creating metrics that track code and the software development process is a critical step to ensuring that a resolution is actually being kept and that there is an improvement occurring that is reducing bugs, saving development costs, etc. Most engineers balk at metrics tracking, and it can be difficult to get into the habit, but as engineers who use data to prove our systems are working, we should also be creating data that shows our methodology and improvements are working as well!
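As a concrete example of a trackable metric, defect density (defects per thousand lines of code) is trivial to compute and easy to trend release over release. The numbers below are made up purely for illustration:

```python
def defects_per_kloc(defects_found, lines_of_code):
    """Defect density: defects per thousand lines of code (KLOC)."""
    return defects_found / (lines_of_code / 1000.0)

# Hypothetical release-over-release trend for one codebase
releases = [("v1.0", 42, 12_000), ("v1.1", 30, 14_500), ("v1.2", 19, 16_000)]
for name, defects, loc in releases:
    print(name, round(defects_per_kloc(defects, loc), 2))
```

A falling number across releases is the kind of evidence that makes the resolution's value visible.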


Not all resolutions necessarily need to be wrapped in a detailed metric. Some of the best resolutions that any engineer could undertake would be to learn a tool or technique that could be applied to the design cycle. For example, deciding to become a static analysis guru would have a drastic effect on decreasing bugs and creating more robust code. Learning the tool and techniques wouldn't initially show up in the metrics, but over time, as the technique is mastered, the metrics should show improvement!


These are just a few simple examples of embedded software resolutions. There are any number of goals that could be set ranging from improvements to software architecture, the design process, third party reviews, coding standards, development methodologies, etc. Some additional thoughts can be found in the associated video blog (Embedded Software Resolutions for 2015 - YouTube) and some additional tips for creating embedded software resolutions can be found at http://bit.ly/1BFFkia .



Everything really boils down to just two simple questions:  

  1. What do you need to improve in 2015? 

  2. How are you going to go about making these improvements?

 

 

Jacob Beningo is a Certified Software Development Professional (CSDP) whose expertise is in firmware for embedded systems. He works with companies to decrease costs and time to market while maintaining a quality and robust product. He is an avid tweeter, a tip and trick guru, a homebrew connoisseur and a fan of pineapple! Feel free to contact him at jacob@beningo.com, and at his website www.beningo.com

In part 1 and part 2 of this series we looked behind the CES show floor at the semiconductor announcements and what they might mean for CES 2016.  I also missed significant announcements from ARM partners Allwinner, MediaTek, and Texas Instruments, but there is simply so much to process that it's hard to get to it all.  One major trend I couldn't miss was the self-driving car, and thanks to ARM's "man in Detroit" Will Tu, I looked at NVIDIA's auto-pilot car computer, the DRIVE PX.  This product is absolutely amazing, and I quote:

 

"The DRIVE PX platform is based on the NVIDIA® Tegra® X1 processor, enabling smarter, more sophisticated advanced driver assistance systems (ADAS) and paving the way for the autonomous car. Tegra X1 delivers an astonishing 1.3 gigapixels/second throughput – enough to handle 12 two-Megapixel cameras at frame rates up to 60 fps for some cameras. It is equipped with 10 GB of DRAM memory and combines surround Computer Vision (CV) technology, extensive deep learning training, and over-the-air updates to transform how cars see, think, and learn"


The Tegra X1 utilises an eight-core CPU (four Cortex-A57 and four Cortex-A53 cores) in the ARM big.LITTLE configuration and a 256-core Maxwell GPU.  What this means for all of us is summed up in the picture below:

CES2015-Audi-self-drive-700x394_mid.jpg

The self-driving car is a reality, and NVIDIA wowed CES with a self-driving Audi that drove from Silicon Valley to Las Vegas. Exactly when this technology comes to market is another question; my guess is that we will go through a period of partial introductions so the driving public can become comfortable with the concept and the technology.  The automotive market moves slowly, with careful three-to-five-year cycles for new technology introductions. Remember anti-lock braking? It was revolutionary in its time but took a decade to be adopted in mainstream vehicles, and I think the same will be true for the autonomous car. Check back with me at CES 2025.

 

NVIDIA is showing us the high-end potential of autonomous vehicles, but even at the entry level of vehicle electronics we see projects like the San Jose State University senior class turning radio-controlled cars into autonomous vehicles.  You can read about their work here on the community and watch this video (and you should).

 

 

Thanks to ARM super intern (and SJSU undergrad) Carissa Labriola for pointing me to this project.  There is so much more to this story, like the professor designing his own board for the class (NXP Cortex-M3 based), but we will dig into this in a later post and have some fun.

 

So here's my larger point and prediction on autonomous vehicles: if the graduating EE class at San Jose State University is taking this technology and making it work today, then they will expect to see it in the cars they buy just a few years from now. Automakers need to understand their next customer; it's Marketing 101.

 

When do you think you will be comfortable buying a self driving car?

Holy moly, what a connected-everything explosion at International CES 2015! While the connected home products and autonomous auto announcements took the majority of the spotlight (as they should), you always have those quirky gadgets that make some noise too. Here are seven "real or unreal" examples of how companies are connecting the unconnected.

 

 

emiota-belty-smart-belt.jpg

Belty

Do you frequently overindulge at buffets? If so, this belt might be for you. Belty, by French start-up Emiota, is a smart belt that automatically loosens to give that expanding waistline comfortable room - and then, as you walk it off, the belt will gently tighten as your waistline trims down. Emiota has also included some activity-tracking sensors in the belt, which will notify you with a buzz if you've been sitting too long (just in case your aching back doesn't alert you).

 

 

 

babyglgl-bebe1-735x500.jpg

Baby GiGL

From Smart Control - another company out of France and the creator of one of last year's most popular CES gadgets, the HAPIfork (now called the 10S Fork) - comes the smart baby bottle. Baby GiGL monitors how much and how fast your baby is drinking and then sends the data to your smartphone (I guess the baby crying isn't a sign that you need to feed him/her more). Rest assured, it will also inform you of the optimal angle at which to hold the bottle to ensure your baby doesn't take in too much air.

 

 

Sketchers Game Kicks.png

Game Kicks

In case your kid's tablet runs out of batteries, the popular shoe company SKECHERS just announced Game Kicks. The interactive shoes light up and make sounds in a Simon-like 'match the pattern' memory game. I can't wait to see a classroom full of kids with a leg up on their laps playing the light-up game. The sneakers are ready for your purchase at $65 a pair.

 

 

melomind by mybrain.jpg

melomind

Having a tough day? myBrain (yet another French company - Alban and Jérôme should be impressed with all the gadgets coming from their homeland) has created melomind. Strap on this connected headset during a coffee break, and the company claims it will "improve your health and well-being with a 15-minute peaceful musical interaction experience." The headset connects via Bluetooth to your smartphone or tablet. You put your earphones on and start an audio journey (from a catalog of audio environments they provide) in which the music is modulated by your own brain activity, guiding you through the relaxation process. Sounds great - why wouldn't I wear this headset all day? melomind is available for a $299 pre-order via their website and is set to ship later this year.

 

edwintheduck.jpg

Edwin

Edwin the Duck claims to be the "World's First Interactive Rubber Ducky", and boy, has the ducky been revolutionized. The duck features an LED light, Bluetooth and Bluetooth Low Energy (BLE) wireless connectivity, a waterproof speaker and a thermometer. In the bath, it will play music streamed from your smartphone and will detect if the water is too hot - then, when you're ready for story time, Edwin joins in with lights that react to fully interactive animated adventure stories from the Edwin the Duck app. Available for pre-order at $99 - now the question is, would you pay that price tag for the next generation of the little rubber ducky?

 

Parrot H20.jpg

Flower Power H2O

Yes, Parrot, the company behind the drones (and yet annnooottthheerr company out of France), brings to you the Flower Power H2O. I'm guilty of being a consistent plant killer, and this new product hooked me with its press release headline: "the smart sensor that waters your plant while you are away." Really? But how does it store water? Well... you need to screw in a water bottle - and this is where I lost a bit of interest. I'm a bit OCD about things, and I'm not sure I'd be OK with ruining my plant's beauty with a plastic water bottle - but perhaps others won't mind. If you're able to look past that (since you'll be out of town anyway), it will do its job of keeping your plant alive by using a sensor to dictate how and when it needs water.

 

pacifi.jpg

Pacif-i

It seems like connecting your toddler was another popular theme this year - let it be known that parents aren't afraid to throw money at anything related to their child's safety. Pacif-i by Blue Maestro is the "World’s First Smart Pacifier": with a temperature sensor built into the pacifier’s silicone teat, it transmits temperature data via BLE to your smartphone or tablet. But the greatest feature, in my opinion, is the built-in proximity sensor - this would've been helpful the other day when my toddler threw her pacifier at Costco and I spent wasteful minutes trying to track it down.

 

 

Although these brilliant 'forward-thinking' innovations display the technology that is accessible today and are very good proving grounds for what can be done, I'm wondering: how many of these products do you actually "need to have" and spend your hard-earned dollars on? And what physical products that you interact with daily would you like to see connected?

On the first day of CES (in part 1) I blogged about imagining the conversations going on behind the scenes in the private meeting rooms and hotel suites by reading press releases on new semiconductors being introduced to potential customers.  These are the enabling technologies that will power the cool new products at CES 2016, so it's a lot of fun to try to look into the future.  In part 1 I looked at ultra low-power processors for Bluetooth Smart and quad-core 64-bit application processors for tablets - but in part 2, let's take a look at an under-reported but essential part of the Internet of Things: sensors.  This may be stating the obvious, but without sensors there will be no IoT, so it's worth a deeper dive.


There are over 3,500 exhibitors at CES, and 393 identify themselves as being in the sensor business in some way or other.  There are some obvious names that define the sensor business, like Bosch, TRW and Omron, but many names I don't recognize.  In a future blog I will delve into the world of sensors, but for today I took a look at one particular announcement that caught my eye.  Invensense is a relatively new company in the world of sensors - it has been delivering products for just nine years (Bosch has been around since 1886) - but it has been a pioneer in the new and wonderful world of MEMS technology. In my simple definition, MEMS are just physical sensors or systems implemented at very small scale using the magic of semiconductors.  Imagine a tiny microphone where the vibrating coil is replaced by a silicon membrane: the same physics, just incredibly small (and cheap).  The MEMS industry has quietly been innovating with tiny gyros, magnetometers, microphones, accelerometers, switches and even inkjet nozzles, and they appear in many mobile products without us even knowing (screen orientation on your mobile phone uses a MEMS accelerometer, for example).  So it was inevitable that MEMS innovation combined with semiconductor process shrinking would enable some very cool products, and Invensense proved the point today.  Invensense announced a sensor system-on-chip (an SSoC, I suppose) that combines an ARM Cortex-M0 processor with two motion co-processors giving 6-axis motion measurement, all in a 3mm x 3mm x 1mm package.  You can read all about the device on the Invensense site here.  This new device has its own RTOS and is Android Lollipop compatible right from the start, so it should ramp quickly.

chip-burst-with-pressuresensor (1).png

 

So, depending upon the price point, we have a totally integrated ultra low-power sensor system in a tiny package that could revolutionize several markets.  Invensense will inevitably have a lot of competition, but that means these devices are poised to go mass market very rapidly, and that's good for our industry. As an aside, but a significant one, Invensense are predicting that they will pass the 1 billion parts shipped milestone this quarter - that's quite an achievement.

 

So the glimpse into the future shows us many more MEMS devices coming to shoes, wrists, pets, robots and bikes at CES 2016.  Did you see any game-changing semiconductor devices at CES this week?

German steel works suffered “massive damage” after hack attack

 

Stolen login credentials gave the attackers access to the control system and prevented the plant from being shut down.

 

If you needed another example of why it’s a good idea to airgap your industrial plant’s production network from the rest of the internet, here it is!

 

http://blog.lumension.com/9630/german-steel-works-suffered-massive-damage-after-hack-attack/

 

If you want to learn more about securing embedded devices, please check out our secure Linux programming course.

It's the first official day of CES today (Jan 6, 2015) and the circus is in full swing.  It's hard to miss the big stories of the show, because every major media outlet in the world is there and the tech press in particular flock there en masse (CNET is my favorite), but those of us in the semiconductor world have to look behind the scenes, into the private meeting rooms and suites that surround the main event, to understand what's really going on. Remember, at CES we are seeing consumer products that were designed last year, or maybe even several years ago, so the chips in them are a generation or two behind the ones being announced at this CES.   If we take a look at the new devices being launched and previewed in the demo suites, we can take a unique peek into the silicon crystal ball and have some fun.

 

So I'm working on the theory that by looking at the chips being launched today, you get a glimpse into possible new end products that might appear at the 2016 event.  Obviously there are some trends that run for several years, so it's not that hard to predict that bigger TVs or more wearable devices are in the pipeline, but the fun part of CES is trying to spot an inflection point or two and make some guesses about what we might see next January.

 

Some might say that CES has become a Bluetooth show, because so many products there are utilizing this RF technology.  When Bluetooth first came out (15 years ago) there was much skepticism about how much power it used and whether it would interfere with other radios. Bluetooth has overcome all this to become a de facto standard for short-range wireless comms, and the developments continue here at CES 2015. Below is a chart from the Bluetooth SIG on their amazing growth.


Bluetooth growth.jpg


Bluetooth Smart and tiny ultra low-power devices are vital for the wearables market, and Atmel have a triple play with their BTLC1000 chip, announced today: it is only 2.1mm x 2.1mm in size, yet has a Cortex-M0 processor on board and battery life improved 30% over current devices.  So here is a perfect example of my point: this device is smaller, uses less power and combines more functionality, so after it samples in March it will make its way into new products that simply couldn't exist before.  What kind of predictions can we make from this, you may ask?  Things like smart bandages that take your temperature and remind you to take your antibiotics, or food packaging that warns of spoilage - the possibilities expand every year.  Expect to see more Bluetooth-connected "things" at CES 2016. This is an important step in the Internet of Things becoming a reality, and that could be an inflection point.


At the other end of the silicon spectrum we see a wave of multicore 64-bit devices coming out, not just from the big names like NVIDIA (see the spectacular Tegra X1 launch) and Samsung, but from Chinese innovators like Actions Semiconductor. Actions today announced a 64-bit quad-core Cortex-A53 chip with speeds up to 1.8GHz.  Not many years ago multicore was seen as a major hurdle for the whole industry, yet today we see new products like this that can run Android and 4K video on a 28nm process.  But the deeper point is that this technology is now ready to trickle down beyond tablets and set-top boxes into products at price points that could never before justify the cost of a powerful processor and multimedia display.  Thinking that point through, the connected kitchen might just become a reality and get beyond the jokes to really useful and informative appliances that serve the connected home and the IoT.


Tomorrow we will look for more CES 2016 clues. What did you see at CES that's a game changer?


Second part: Predicting the future using CES, part 2

Jama Software, a product delivery platform that helps companies bring complex products to market, will be hosting a Lunch & Learn in Santa Clara on January 28, entitled “Creating a Collaborative Product Definition Process”.  The theme is structured around how to help keep your team focused on building the right product. Adrian Rolufs of Eola will headline the session, and topics covered will include:

 

  • The value of communicating requirements instead of specifications
  • Communicating requirements in small batches to development teams early to get fast feedback
  • The value of focusing on problem statements and sharing them with the entire team so that everyone is marching toward a common goal

 

LH-hH5Yl_400x400.png

Please register using the following link: Creating a Collaborative Product Definition Process

 

Do you want to learn more about Jama Software’s technology and its relevance to ARM and the rest of the semiconductor and electronic hardware-software space? Then check out these articles:

 
