The Autonomous Mobile Robot Group

Department of Computer Science
Northwestern University

Final Projects

Remote Robot Control via Everyday Devices

Team members

Andy Crossen
Louis Lapat
Nick Hofmeister

We’ve created three ways to control the robot remotely: through a web controller, with a standard joystick, and by telephone from any phone over a standard phone line.

Our work on telephony control of robot behavior was motivated by the need for a convenient, ubiquitous control device.  Why not control a robot from a device that everyone carries, namely a cellular phone?  A combination of Scheme, GRL, Java, and the Microsoft Telephony API provided the tools we needed.  The robot first detects a user’s presence with its front sonars (in effect asking, “Is someone standing in front of me?”).  Upon noticing someone, the robot sends a message to a desktop machine with a voice IP modem, which then calls the user’s cellular phone.  Once the user answers, he or she controls the robot with the phone’s keypad: a pre-programmed behavior is assigned to each number key, as well as the # and * keys, and is activated on the robot when the corresponding key is pressed.
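The keypad dispatch can be sketched as a simple table from DTMF keys to behavior strings.  The key assignments and behavior names below are illustrative assumptions, not the actual bindings used on the robot:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the keypad-to-behavior dispatch: each DTMF key
// maps to a short behavior string that is then sent to the robot.
public class KeypadDispatch {
    private static final Map<Character, String> BEHAVIORS = new HashMap<>();
    static {
        BEHAVIORS.put('2', "forward");    // assumed binding: drive forward
        BEHAVIORS.put('8', "backward");   // assumed binding: drive backward
        BEHAVIORS.put('4', "turn-left");
        BEHAVIORS.put('6', "turn-right");
        BEHAVIORS.put('5', "wander");
        BEHAVIORS.put('#', "stop");       // assumed: # halts the current behavior
    }

    // Return the behavior string for a pressed key; unknown keys map to "stop".
    public static String behaviorFor(char key) {
        return BEHAVIORS.getOrDefault(key, "stop");
    }
}
```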

The web controller provides an alternative interface to the same underlying functionality.  Using any forms-capable web client, a user controls the robot through form buttons corresponding to its behaviors.  A Java servlet provides the CGI interface: it renders the list of behaviors in HTML, reads the desired behavior (and the robot to perform it on) from an HTML form, and sends the command to the specified robot over UDP.  Because the controls are basic HTML, any device with a simple browser and a network connection can operate a robot.  We tested this functionality on a networked desktop machine, a Palm device outfitted with a wireless IP modem, and a WinCE device equipped with a wireless network card.
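The servlet’s forwarding step amounts to a one-shot UDP send.  A minimal sketch of that step, assuming a plain-text message format of one behavior string per datagram (host, port, and encoding are assumptions):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Sketch of how the servlet might forward a behavior command to a robot
// over UDP once it has read the behavior name from the HTML form.
public class CommandSender {
    public static void send(String behavior, String host, int port) throws Exception {
        byte[] data = behavior.getBytes(StandardCharsets.US_ASCII);
        try (DatagramSocket socket = new DatagramSocket()) {
            DatagramPacket packet = new DatagramPacket(
                data, data.length, InetAddress.getByName(host), port);
            socket.send(packet);  // fire-and-forget: UDP gives no delivery guarantee
        }
    }
}
```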

The joystick controller works on the same principle.  We used this input device to demonstrate directional motor vector commands on the robot with a flight-stick interface.  This interface proves the most natural for driving the robot around, with a directional vector mapped to each motion of the joystick.
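One common way to realize such a mapping is to combine the stick’s two axes into differential wheel speeds.  This is a sketch under assumed conventions (axes normalized to [-1, 1], arcade-style mixing), not necessarily the mixing the robot actually used:

```java
// Hypothetical mapping from a joystick deflection to a differential-drive
// motor vector: y controls forward speed, x controls turning.
public class JoystickMap {
    // x in [-1,1] (left/right), y in [-1,1] (back/forward)
    // returns {leftWheel, rightWheel} speeds, each clamped to [-1,1]
    public static double[] toWheelSpeeds(double x, double y) {
        double left  = clamp(y + x);
        double right = clamp(y - x);
        return new double[] { left, right };
    }

    private static double clamp(double v) {
        return Math.max(-1.0, Math.min(1.0, v));
    }
}
```

Pushing the stick straight forward drives both wheels forward equally; pushing it fully sideways spins the robot in place.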

Screenshots from the web controller running on a Compaq iPaq WinCE device

General Device Operation

These controllers were built within a Java, Scheme, and GRL framework, using the connectionless UDP protocol for messaging between these languages.  We chose UDP because it is relatively easy to implement on all of these platforms and requires little custom code.  The generic usage lifecycle of each control device is as follows:

  1. The control device provides a list of behaviors to perform. 

  2. Each behavior has a short string representation used in a UDP message. 

  3. The robot waits for a UDP message specifying which behavior to enact. 

  4. The control device sends the message to the listening robot. 

  5. Upon reception of the message, the robot begins performing the behavior. 

  6. The robot continues performing the behavior until explicitly told to stop or given another behavior.
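The robot’s side of this lifecycle (steps 3–6) can be sketched as a simple receive loop.  The port number and behavior names are assumptions, and the actual behavior execution lived in the Scheme/GRL layer rather than in Java:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;

// Sketch of the robot-side lifecycle: wait for a short UDP message naming
// a behavior, and keep the current behavior running until a new message
// replaces it.
public class BehaviorListener {
    // Decode one UDP packet into a behavior string.
    public static String decode(DatagramPacket packet) {
        return new String(packet.getData(), 0, packet.getLength()).trim();
    }

    public static void main(String[] args) throws Exception {
        String current = "stop";                        // runs until replaced
        try (DatagramSocket socket = new DatagramSocket(9999)) {  // assumed port
            byte[] buf = new byte[64];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);                 // step 3/4: wait for a command
                current = decode(packet);               // step 5: switch behaviors
                System.out.println("now performing: " + current);
                // step 6: the GRL/Scheme layer would enact `current` continuously
            }
        }
    }
}
```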

Reflections and Regrets

While all of our original milestones were met and exceeded, there are some things we would have done differently, as well as some things we would have liked to add, given more time.

  • UDP, while very easy to implement and use, proves unreliable because of its connectionless, best-effort delivery.  A more robust solution would be a TCP client/server architecture, which would ensure that messages arrive over an authenticated connection and in the order they were sent.
  • While a user can add a command message to send through the web controller, there is currently no way to add a new remotely-triggered behavior without manually adding it to the system’s behavior pool in GRL.  Remote addition and removal of behaviors would be a natural extension.
  • Complete teleoperation, including live video and debugging information from the robot, would make the system considerably more useful.
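The TCP alternative suggested above would look roughly like the following.  This is a sketch, not part of the original system; the line-per-command framing and method names are assumptions:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

// Sketch of a connection-oriented command channel: TCP acknowledges each
// segment and preserves ordering, so behavior commands arrive exactly as sent.
public class TcpCommandChannel {
    // Robot side: accept one controller connection and collect its commands.
    public static List<String> receiveAll(ServerSocket server) throws Exception {
        List<String> received = new ArrayList<>();
        try (Socket client = server.accept();
             BufferedReader in = new BufferedReader(
                 new InputStreamReader(client.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                received.add(line);   // arrives in the order it was sent
            }
        }
        return received;
    }

    // Controller side: send a sequence of behavior commands, one per line.
    public static void sendAll(String host, int port, String... behaviors)
            throws Exception {
        try (Socket s = new Socket(host, port);
             PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
            for (String b : behaviors) {
                out.println(b);
            }
        }
    }
}
```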
