Sensor networked mobile robotics project
In the traditional mobile robot architecture, the robot uses onboard sensors
to see its environment.
R2D2, the Terminator, and the majority of examples from science
fiction (as well as non-fiction) all follow this form. These
robots are limited to a first-person field of view. They also
must solve the difficult problem of moving-sensor data fusion,
which includes the correspondence problem and the self-localization
problem.
Images or data sensed from one viewpoint must be fused with data
sensed from additional viewpoints as the robot (and hence its sensors)
moves through the environment.
The sensor networked (SN) mobile robotics project investigated an
alternative architecture for mobile robotics. In this architecture, the
sensors are deployed as a stationary network distributed throughout the
environment. The robot itself is "blind", and sees by receiving
transmissions from the sensor network.
An SN robot enjoys a third-person perspective. This is sometimes
referred to in the literary world as the "God's eye view".
By being able to see the entire environment, the robot should be
able to more effectively plan and execute motion.
In addition, the data fusion problem is simplified, because
although the robot moves, all the sensors remain stationary.
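To make this concrete, here is a minimal sketch of what fusion from stationary sensors might look like, assuming each camera reports a local occupancy grid already registered to a common floor-plane map. The grid size, the visibility encoding, and the majority-vote rule below are illustrative assumptions, not the project's actual implementation.

    import numpy as np

    GRID_H, GRID_W = 100, 100   # global floor-plane map, in cells (assumed size)

    def fuse_occupancy(camera_grids):
        """Fuse per-camera occupancy grids into one global map.

        camera_grids: list of (GRID_H, GRID_W) arrays with values
        0 = free, 1 = occupied, -1 = cell not visible from that camera.
        """
        votes_occupied = np.zeros((GRID_H, GRID_W))
        votes_seen = np.zeros((GRID_H, GRID_W))
        for grid in camera_grids:
            votes_seen += (grid >= 0)
            votes_occupied += (grid == 1)
        # A cell is occupied if a majority of the cameras that see it say so.
        fused = np.full((GRID_H, GRID_W), -1, dtype=int)
        visible = votes_seen > 0
        fused[visible] = (2 * votes_occupied[visible] > votes_seen[visible]).astype(int)
        return fused

Because each camera's registration to the global grid is computed once, offline, this per-frame fusion needs no correspondence matching or self-localization step.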
Demos
The following videos show an SN robot running into objects to watch
what happens. From these interactions the robot can determine
dynamic properties of the objects, such as their mass or coefficient
of friction with the floor. Sort of like a dog nosing objects in
a sandbox.
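As a rough illustration of the kind of quantity such an interaction analyzer could estimate, the sketch below recovers an object's coefficient of sliding friction from how quickly it decelerates after being bumped, using floor-plane positions tracked over time. The 30 Hz frame rate, the tracking source, and the simple Coulomb-friction model are illustrative assumptions rather than the project's actual method.

    import numpy as np

    G = 9.81           # gravitational acceleration, m/s^2
    DT = 1.0 / 30.0    # assumed frame interval of the camera network

    def estimate_friction(positions):
        """Estimate the coefficient of sliding friction of a pushed object.

        positions: (N, 2) floor-plane positions (meters) tracked while the
        object slides freely after a collision.  With Coulomb friction the
        deceleration is a = mu * g, so mu = a / g.
        """
        pos = np.asarray(positions, dtype=float)
        speeds = np.linalg.norm(np.diff(pos, axis=0), axis=1) / DT
        times = np.arange(len(speeds)) * DT
        # Fit speed(t) = v0 - a*t; the (negated) slope is the deceleration.
        decel = -np.polyfit(times, speeds, 1)[0]
        return max(decel, 0.0) / G

    # Synthetic check: an object released at 0.5 m/s on a surface with
    # mu = 0.3 decelerates at about 2.9 m/s^2, so the estimate should be ~0.3.
    v0, mu_true = 0.5, 0.3
    ts = np.arange(0.0, v0 / (mu_true * G), DT)
    xs = v0 * ts - 0.5 * mu_true * G * ts ** 2
    print(estimate_friction(np.column_stack([xs, np.zeros_like(xs)])))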
The robot was modified and fitted with a "ram":
The objects used for experiments included a Tonka truck, a lightweight
plastic ball, a large empty box and a small heavy box:
Click here to see a movie clip demo of the robot moving around its box,
hitting objects. This clip is playing at 10x speed.

Click here to see a movie clip demo of the "interaction analyzer"
system in action. The collisions in this clip are playing at
normal speed, but the rest is playing at 10x speed.

Click here to see a movie clip of the polynomial path search.
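For context, the sketch below shows one way a polynomial path search could be formulated: candidate paths bend off the straight line from the robot to the goal by a polynomial lateral offset, and the first candidate whose samples stay clear of occupied cells in the map is kept. The offset family, the candidate amplitudes, and the clearance test are illustrative assumptions, not the project's exact algorithm.

    import numpy as np

    def polynomial_path_search(start, goal, occ, cell=0.1, samples=60):
        """Return the first collision-free polynomial path from start to goal.

        start, goal: (x, y) positions in meters.
        occ: 2D occupancy grid (1 = occupied), with cells of size `cell` meters.
        Candidate paths deviate from the straight line by a polynomial lateral
        offset d(s) = A * s * (1 - s); sweeping A tries wider and wider detours.
        """
        start, goal = np.asarray(start, float), np.asarray(goal, float)
        axis = goal - start
        length = np.linalg.norm(axis)
        unit = axis / length
        normal = np.array([-unit[1], unit[0]])          # perpendicular to the line
        s = np.linspace(0.0, 1.0, samples)
        for amp in (0.0, 0.5, -0.5, 1.0, -1.0, 1.5, -1.5, 2.0, -2.0):
            offset = amp * s * (1.0 - s)                # polynomial lateral offset
            pts = start + np.outer(s * length, unit) + np.outer(offset, normal)
            cols = (pts[:, 0] / cell).astype(int)
            rows = (pts[:, 1] / cell).astype(int)
            inside = ((rows >= 0) & (rows < occ.shape[0]) &
                      (cols >= 0) & (cols < occ.shape[1]))
            if inside.all() and not occ[rows, cols].any():
                return pts                              # first clear candidate wins
        return None                                     # no sampled path was free

Because the occupancy map comes from the stationary camera network, a search like this can run without any onboard perception on the robot.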
Papers about this project:
- B. Olsen and A. Hoover, "Calibrating a Camera Network Using a Domino Grid", Pattern Recognition, vol. 34, no. 5, May 2001, pp. 1105-1117.
- A. Hoover and B. Olsen, "A Real-Time Occupancy Map from Multiple Video Streams", in the proceedings of the IEEE International Conference on Robotics and Automation, Detroit, MI, May 1999.
- A. Hoover and B. Olsen, "Path Planning for Mobile Robots Using a Video Camera Network", in the proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Atlanta, GA, September 1999.
- A. Hoover and B. Olsen, "Sensor Network Perception for Mobile Robotics", in the proceedings of the IEEE International Conference on Robotics and Automation, San Francisco, CA, 2000, pp. 342-347.