Several earlier works by others, including a previous project for this same course, motivated us for this project. We briefly describe these works in
the first part of this section. In the second part we show some YouTube videos that used robots as
physical platforms for lighting instruments to create stunning visual displays.
Previous Work
Our primary source of inspiration for this project is Video 1. This video features a project completed for
the same course in 2009. The clip is shorter than we would like. We suspect -- based only
on our intuition and without any evidence for now -- that over a longer run (e.g., 2 minutes)
the formation of the robots would break.
Video 1: Dancing Roomba robots at UVic.
Our intuition was then corroborated by Video 2. This video was developed by a group of high school summer
interns at the University of California, Santa Barbara. In this video, the formation breaks within a minute.
Video 2: Dancing Roomba robots at UCSB.
These two videos clearly show that we need a motion tracker that frequently tells the Roomba robots
to adjust their trajectories.
Video 3 shows a successful formation of a number of robots using an overhead camera. This video
features the Master's work of Edward Macdonald at the Georgia Institute of Technology. For precise world localization, he used multiple
infrared motion-capture cameras. Figure 1 shows his experimental setup. His setup motivated us to develop a Roomba
robot tracker using depth data. However, as infrared motion-capture cameras are very expensive, we decided to use a Kinect sensor, which
gives reasonably accurate depth information. In the Kinect Tracker section we specify the requirements in detail.
Video 3: Robot formation (no coordinated motion).
Figure 1: Experimental setup of Edward Macdonald's Master's work at the Georgia Institute of
Technology.
The projects shown in Videos 1 and 2 are only concerned with maintaining a coordinated motion. They assume
that the robots can be placed at their initial positions manually.
We suspect that this assumption limits the coordinated motions the Roomba robots can perform.
In this respect our project, while similar in spirit to the projects shown in Videos 1-2,
both achieves a formation and maintains a synchronized motion among the robots.
The project shown in Video 3 is only concerned with achieving a formation; coordinated movement is not a
concern for that project. Therefore, we believe that our project is a combination of the projects shown in Videos 1-3.
Platform for Laser Shows
The following videos show robots as platforms for laser and LED lights to create stunning visual displays.
Notice that for such shows to work, the tracker must not rely on visual information. This has led us to
use only the depth data available from the Kinect sensor, even though optical information (RGB values) is
also available.
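
To make this design decision concrete, the sketch below shows how depth frames alone could be pulled from the Kinect and thresholded into robot candidates. It is only a minimal illustration, assuming the libfreenect Python bindings (the freenect module) and NumPy; the helper names and the height band are our own illustrative choices, not part of the actual tracker described later.

    # A minimal sketch of depth-only sensing with the Kinect, assuming the
    # libfreenect Python bindings ("freenect" module) and NumPy are installed.
    # The segmentation thresholds are illustrative raw-depth values, not
    # tuned numbers from our tracker.
    import numpy as np
    import freenect

    def grab_depth_frame():
        # sync_get_depth() returns a 480x640 array of raw 11-bit depth
        # readings plus a timestamp; no RGB stream is requested at all.
        depth, _timestamp = freenect.sync_get_depth()
        return depth

    def robot_candidate_mask(depth, height_band=(20, 120)):
        # With the sensor looking down, the floor dominates the frame, so
        # its raw depth is roughly the median of the image. Pixels that sit
        # about a Roomba-height closer to the camera than the floor are kept
        # as robot candidates. Band limits are in raw depth units.
        floor = np.median(depth)
        height_above_floor = floor - depth.astype(np.float64)
        low, high = height_band
        return (height_above_floor > low) & (height_above_floor < high)

    if __name__ == "__main__":
        frame = grab_depth_frame()
        mask = robot_candidate_mask(frame)
        print("candidate robot pixels:", int(mask.sum()))

Note that the RGB stream is never opened, which mirrors the depth-only constraint discussed above.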
Video 4: A light display using quadrotors. We do not know the details of this project.
Video 5: A beautiful LED display on an RC helicopter. (Rayhan's personal favourite.)
Video 6: Another display of carefully coordinated lights.