Android gives eyes, ears, and a sense of direction to a Lego NXT robot

While some of us use our phones to communicate or play awesome games, there are those who take science and spare time to the next level.  Mike Partain is one of those people.  Using an off-the-shelf Lego NXT robotics kit, a Motorola Droid, and a whole bucketful of ingenuity and smarts, Mike has added the sensors the robotics kit was missing -- a camera, GPS, and a compass.  Hit the break to read about it in Mike's own words, find a link to the source code (special thanks, Mike!), and watch a couple of videos showing different views from the Droid's unblinking eye. [spike3 on PBase]

We reached out to Mike for a bit of an explanation about this project, because frankly, we needed things brought down to a level we could understand a bit better.  He was helpful enough to give us the lowdown, which is here in its entirety.

One of my all-time wishes was to have my very own robot. And now, Lego has made playing with robots practical with the Lego NXT robot kits. These kits follow the spirit of Legos in that they are easy, inexpensive, and fun to assemble into multiple varieties of wheeled and walking robots. The kits include a brain that controls three motors and up to four sensors. But what was missing was a remote camera, a compass, and a GPS. Enter the Motorola Droid, or most any Android device for that matter. Since the Android phone contains these advanced sensors, and is based on Java and open source, I figured it would be a simple matter to "wire" the Droid up to my computer and monitor its sensors side by side with my Lego robot's software. I wasn't too far off. I struggled a few days working out the code needed to decode the camera preview image, but everything else went fairly smoothly. In the end, I had a huge smile across my face as my Lego robot ventured out across the vast living room carpet, with me controlling it remotely from only the image I could see on my computer screen. It's not the robot from Lost in Space, but nonetheless, it can open up a lot of young minds to how their future may look.

I should note that the software is not "finished" quality. It's proof of concept, and it works. I may or may not ever improve or complete it, so I am making it available as is, with no license or restrictions (or promises).

There are three pieces of software:

The Android Java code (droidSense). This code is a simple TCP server that provides raw undecoded image data (from the camera preview), sensor data, and GPS data to any client.

The stand-alone .NET program (AssClient) that communicates with the Droid Java server, decodes the raw image data, and displays the image along with the raw sensor and GPS data.

A Microsoft Robotics Studio service (AndroidCameraService) that provides a generic WebCam image for use by a visual or standard MRDS program.

The Java piece runs on the Droid (there is no UI; it just displays the preview surface while running -- use 'force stop' to kill it), and you then run either piece on the PC (I haven't tried running them both at the same time, but I suppose it might work).
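Mike describes droidSense as a simple TCP server pushing raw image, sensor, and GPS data to any client. His actual wire format isn't shown in the write-up, but a common way to stream variable-size camera frames over a plain TCP socket is to length-prefix each one so the client knows where one frame ends and the next begins. A minimal sketch of that idea (the `FrameCodec` class and its framing are illustrative assumptions, not Mike's code):

```java
/**
 * Hypothetical framing for streaming raw camera frames over TCP:
 * each message is a 4-byte big-endian length followed by the payload.
 * (Illustrative only -- droidSense's real wire format may differ.)
 */
public class FrameCodec {

    /** Pack one raw frame into the length-prefixed wire format. */
    public static byte[] encode(byte[] payload) {
        int n = payload.length;
        byte[] out = new byte[4 + n];
        // Big-endian 32-bit length header.
        out[0] = (byte) (n >>> 24);
        out[1] = (byte) (n >>> 16);
        out[2] = (byte) (n >>> 8);
        out[3] = (byte) n;
        System.arraycopy(payload, 0, out, 4, n);
        return out;
    }

    // A server loop would then look roughly like:
    //   ServerSocket server = new ServerSocket(port);
    //   Socket client = server.accept();
    //   OutputStream os = client.getOutputStream();
    //   for each preview frame: os.write(FrameCodec.encode(frameBytes));
}
```

The client side simply reads four bytes, interprets them as the frame length, then reads exactly that many payload bytes before decoding.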
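The camera-preview decoding Mike says he struggled with for a few days is very likely conversion from NV21, the YVU 4:2:0 semi-planar format Android camera previews deliver by default, into viewable RGB pixels. Since the source isn't reproduced here, this is only a sketch of the kind of conversion involved (class name and conversion constants are mine, not from Mike's code):

```java
/** Sketch of decoding an NV21 (YVU420 semi-planar) preview frame to ARGB. */
public class Nv21Decoder {

    public static int[] decode(byte[] nv21, int width, int height) {
        int frameSize = width * height;
        int[] argb = new int[frameSize];
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int lum = nv21[y * width + x] & 0xFF;
                // After the luma plane, one interleaved V,U byte pair
                // is shared by each 2x2 block of luma samples.
                int uvIndex = frameSize + (y >> 1) * width + (x & ~1);
                int v = (nv21[uvIndex] & 0xFF) - 128;
                int u = (nv21[uvIndex + 1] & 0xFF) - 128;
                int r = clamp((int) (lum + 1.370705f * v));
                int g = clamp((int) (lum - 0.337633f * u - 0.698001f * v));
                int b = clamp((int) (lum + 1.732446f * u));
                argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int c) {
        return c < 0 ? 0 : (c > 255 ? 255 : c);
    }
}
```

Because each V,U pair covers a 2x2 block of luma, the chroma index only advances every other row and every other column, which is the part that tends to trip people up the first time.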

Awesome sauce right there.  Here are the videos Mike included:

YouTube link for StandAlone.wmv

YouTube link for AndroidCameraService.wmv

And finally, if you have the know-how and the time, here's a link to the source code Mike was kind enough to provide. Thanks again Mike, this is one of the coolest things ever!

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Threads.