As you may be aware, my ARM-Powered LEGO robotic Rubik's Cube solver "Speedcuber" was demonstrated live at Mobile World Congress '10 in Barcelona in February this year (it actually almost didn't happen: the LEGO robot fell apart in transit, and as I didn't attend, a few other ARM engineers had to roll up their sleeves and work out how to piece it back together. Thanks, guys!). I had anticipated that after the push to prepare the robot for its first public outing, I could re-focus my energy on my regular day job and relax a bit more in my spare time... but things don't often go as expected in my life!
Speedcuber was a very popular demonstration at MWC so it was natural to consider demonstrating it again. However, someone (I won't name the "guilty party") suggested that we should switch to more relevant, contemporary technology; after all, the Nokia N95 phone controlling the first robot was launched almost four years ago in 2006. It was proposed that we show the robot again in April at the Embedded Systems Conference in Silicon Valley (time to forget relaxing in my spare time for another couple of months!?!).
We decided to use an Android-based phone and I set about downloading the Android SDK and acquainting myself with the unfamiliar APIs. The ARM Solution Center for Android (SCA) provided a good starting point for this. The application on the Nokia phone was written in Java using MIDP 2.0, so I hoped that the port to an Android application, also in Java, would be relatively painless. I managed to borrow an HTC Hero for my initial work. Starting from one of the "Hello World" style examples available via the ARM SCA, I was quickly able to construct a skeleton application and user interface (along the lines of the sketch below) and port the cube solver code across.
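For readers unfamiliar with Android, this is roughly what such a skeleton looks like. It is only an illustrative sketch, not the actual Speedcuber source: the class name, layout and widget IDs (SpeedcuberActivity, R.layout.main, R.id.status, R.id.solve) are placeholders I have invented for the example.

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.View;
    import android.widget.Button;
    import android.widget.TextView;

    // Minimal skeleton Activity of the kind generated from a "Hello World" example.
    public class SpeedcuberActivity extends Activity {

        private TextView status;

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);   // inflate the UI defined in res/layout/main.xml

            status = (TextView) findViewById(R.id.status);
            Button solve = (Button) findViewById(R.id.solve);
            solve.setOnClickListener(new View.OnClickListener() {
                public void onClick(View v) {
                    // The ported MIDP solver logic would be invoked from here.
                    status.setText("Solving...");
                }
            });
        }
    }

From there it was mostly a matter of dropping the existing solver classes into the project and wiring them up to the new user interface.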
For me, one of the coolest things about developing for Android was the ability to download and debug an application directly on the device, with the breakpoints, code stepping and variable inspection you would expect from an embedded software development environment. While developing for the N95, I was able to debug the application running in an emulator, an option which is also available with Android. This allowed me to debug the main solving algorithm, but not the Bluetooth and camera interfaces. From my previous experience, I expected that writing the code to interface from the phone to the LEGO NXT controller over Bluetooth would take some effort, but it was soon up and running on the Motorola DROID.
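To give a flavour of the Bluetooth side, here is a rough sketch of how a phone can open a serial-port-profile link to the LEGO NXT brick using the android.bluetooth APIs. It is not the actual demo code: the NxtLink class name is mine, the brick's Bluetooth address has to be supplied by the caller, and the application needs the BLUETOOTH permission in its manifest.

    import android.bluetooth.BluetoothAdapter;
    import android.bluetooth.BluetoothDevice;
    import android.bluetooth.BluetoothSocket;

    import java.io.IOException;
    import java.io.OutputStream;
    import java.util.UUID;

    // Illustrative sketch of a Bluetooth link from the phone to the LEGO NXT brick.
    public class NxtLink {

        // Well-known Serial Port Profile UUID, which the NXT uses.
        private static final UUID SPP_UUID =
                UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

        private BluetoothSocket socket;
        private OutputStream out;

        public void connect(String nxtAddress) throws IOException {
            BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
            BluetoothDevice nxt = adapter.getRemoteDevice(nxtAddress);
            socket = nxt.createRfcommSocketToServiceRecord(SPP_UUID);
            socket.connect();                // blocks until the link is up
            out = socket.getOutputStream();
        }

        // Send one NXT telegram, adding the two-byte little-endian length
        // header that the NXT firmware expects in front of each message.
        public void send(byte[] telegram) throws IOException {
            out.write(telegram.length & 0xff);
            out.write((telegram.length >> 8) & 0xff);
            out.write(telegram);
            out.flush();
        }
    }

With a link like this in place, the phone simply sends the move sequence produced by the solver to the NXT, which drives the motors.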
But it was all going too smoothly! Accessing the camera was more of a challenge. On the N95, I was only able to use the camera in snapshot mode rather than in streaming video mode. This limited the overall speed of the demonstration, since the initial scan took several seconds to capture each of the images for the six faces of the Rubik's cube. I was hoping to use the live preview image stream provided by the android.hardware.Camera class to significantly reduce the time required to capture each image in the new demonstration. It took a while to find enough information to convert the YUV images returned in the preview callback into RGB format so that I could use my existing colour recognition algorithm. However, once this was achieved, the Android version of Speedcuber soon sprang into life with its first solve!
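In case it is useful to anyone attempting something similar, the conversion boils down to the standard YUV420SP (NV21) to RGB arithmetic. The sketch below is my own illustration rather than the code used in the demo; the PreviewColourGrabber class is invented for the example, and the preview width and height would come from the Camera.Parameters in use.

    import android.hardware.Camera;

    // Illustrative preview callback that converts each NV21 frame to packed ARGB.
    public class PreviewColourGrabber implements Camera.PreviewCallback {

        private final int width;
        private final int height;

        public PreviewColourGrabber(int width, int height) {
            this.width = width;
            this.height = height;
        }

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            int[] rgb = new int[width * height];
            decodeNV21(rgb, data, width, height);
            // ... hand 'rgb' to the colour recognition algorithm here ...
        }

        // Standard NV21 (YCrCb 4:2:0 semi-planar) to packed ARGB conversion.
        private static void decodeNV21(int[] rgb, byte[] yuv, int width, int height) {
            final int frameSize = width * height;
            for (int j = 0, yp = 0; j < height; j++) {
                int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
                for (int i = 0; i < width; i++, yp++) {
                    int y = (0xff & yuv[yp]) - 16;
                    if (y < 0) y = 0;
                    if ((i & 1) == 0) {
                        v = (0xff & yuv[uvp++]) - 128;
                        u = (0xff & yuv[uvp++]) - 128;
                    }
                    int y1192 = 1192 * y;
                    int r = y1192 + 1634 * v;
                    int g = y1192 - 833 * v - 400 * u;
                    int b = y1192 + 2066 * u;
                    r = Math.max(0, Math.min(r, 262143));
                    g = Math.max(0, Math.min(g, 262143));
                    b = Math.max(0, Math.min(b, 262143));
                    rgb[yp] = 0xff000000
                            | ((r << 6) & 0xff0000)
                            | ((g >> 2) & 0x00ff00)
                            | ((b >> 10) & 0x0000ff);
                }
            }
        }
    }

A callback like this would be registered with Camera.setPreviewCallback() once the preview size is known, which is what makes scanning a face a matter of milliseconds rather than seconds.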
The other significant change to the demonstration was the addition of a large LED display to show messages and solve times. Luke from the "demo team" in Cambridge worked very hard to create this from scratch using a Cortex-M3 based microcontroller, the LM3S811 from Luminary Micro, and its evaluation kit. Of course, the software development environment supports debugging the embedded application while it is running on the microcontroller. Luke had to develop code to allow the LEGO NXT controller to communicate with the Cortex-M3 in the LED display over an I2C interface, so I wasn't the only person who got to play with LEGO at work this time!
Overall, I had a very positive experience porting the application to Android. The developer resources are generally clear and comprehensive. Since I was interacting directly with hardware interfaces to external devices, the ability to debug the application directly on the phone was a definite win for me.
You can see the Android version of the ARM Powered LEGO Speedcuber live on ARM's stand #1308 at the Embedded Systems Conference in San Jose until 29th April (assuming it survived the flight this time!).
Back to my day job... and maybe I can find some time now to relax a little?!?