For several years now, hobbyists have been building robots operated by cell phones (see video 1). The earliest experiments relied on dual-tone multi-frequency (DTMF) signaling: the phone sends commands through its headphone output to a micro-controller, which interprets them to drive the motors.
Video 1: Robot Controlled by a Cellphone through DTMF
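To make the DTMF scheme concrete, here is a minimal sketch in Python (the real builds run firmware on a micro-controller, so this is only illustrative). Each keypad symbol is encoded as the sum of two sine waves at the standard DTMF frequencies, and the receiving side recovers the key with the Goertzel algorithm, a cheap single-frequency detector well suited to small micro-controllers. The key-to-command mapping (e.g. "5" meaning "forward") is a hypothetical example, not taken from any of the projects mentioned here.

```python
import math

# Standard DTMF (low, high) frequency pairs in Hz for each keypad symbol.
DTMF_FREQS = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

SAMPLE_RATE = 8000  # Hz, a typical telephony rate


def dtmf_tone(key, duration=0.1):
    """Audio samples for one key press: the sum of the two DTMF sines."""
    low, high = DTMF_FREQS[key]
    n = int(SAMPLE_RATE * duration)
    return [
        0.5 * math.sin(2 * math.pi * low * t / SAMPLE_RATE)
        + 0.5 * math.sin(2 * math.pi * high * t / SAMPLE_RATE)
        for t in range(n)
    ]


def goertzel_power(samples, freq):
    """Signal power at `freq` via the Goertzel algorithm."""
    coeff = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2


def decode_key(samples):
    """Find the strongest row and column frequencies, map them back to a key."""
    rows, cols = (697, 770, 852, 941), (1209, 1336, 1477, 1633)
    row = max(rows, key=lambda f: goertzel_power(samples, f))
    col = max(cols, key=lambda f: goertzel_power(samples, f))
    for key, pair in DTMF_FREQS.items():
        if pair == (row, col):
            return key
    return None


# The phone plays the tone for "5" (say, "move forward");
# the robot-side decoder recovers the key.
assert decode_key(dtmf_tone("5")) == "5"
```

A real micro-controller implementation would run the same Goertzel loop over fixed-point ADC samples, but the signal path is identical: audio out on the phone, tone detection on the robot.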
DTMF is a one-way communication channel: you can send commands to the robot, but you cannot read sensor values back. Fortunately, smartphones such as the iPhone and Android phones were designed to interact with other devices. Apple has the Made for iPhone (MFi) program for developing electronic accessories that connect to the iPod, iPhone, and iPad. Its rival, Google, provides the Accessory Development Kit, introduced with Android 3.1 and backported to Android 2.3.4. Naturally, roboticists have taken advantage of these facilities to build robots that combine their favorite sensors and actuators with the smartphone's built-in sensors, such as GPS or accelerometers (see video 2).
Video 2: Robot operated by an Android Cellphone
At CES 2012 in Las Vegas, two startups demoed their products (see videos 3 and 4). On the one hand, Xybotyx presented the Xybot1, a mobile dock for the iPhone and iPod Touch. On the other hand, Romotive, which raised funds through Kickstarter, showcased Romo, a robot body that can be controlled either by an Apple product (iPhone or iPod Touch) or by an Android device (the Romo specification mentions the Samsung Galaxy and HTC Droid families, both with Android 2.2+). Another difference is that the Xybot1 is wheeled while Romo runs on caterpillar tracks.
Video 3: Xybot1 for iOS devices
The market for robots operated by smartphones is targeted by larger businesses too. In 2011, Engadget reported that the toy manufacturer Hasbro, in partnership with Google, was experimenting with a legged toy robot controlled by a Nexus. Another big name interested in this emerging market is iRobot, which is exploring tablet-operated robots with its AVA (see video 5). According to iRobot, the AVA is a technology demonstrator in the early stages of commercialization. It carries a rich suite of sensors, including a scanning laser and two Kinect sensors. For local control, the AVA reacts to touch through its touch-sensitive "skin".