Image-directed control is the research field concerned with sensors that "see" and respond, much as humans and animals sense the world with their eyes. The technology is not new, but the Magicc Lab pursues effective and innovative approaches to the problem.
One example of image-directed control is computer vision, an extensive field focused on making computers "see" by recognizing objects and responding to them. As part of the Mars Lander project, sponsored by NASA's Jet Propulsion Laboratory (JPL), we have enabled our UAVs to detect a parachute deployed by a fellow UAV and follow it until that plane reaches the ground. The airborne plane then continues to orbit over the fallen plane to aid in recovery.
Another example of image-directed control is optic flow. We are currently working with an Australian university experienced in optic-flow sensing to build our own optic-flow-capable UAVs. An optic-flow sensor tracks visual information "flowing" past a narrow camera, often a single low-resolution line of pixels. By measuring how quickly features move across the sensor, the plane can estimate how far away obstacles are and can be programmed to avoid them. Obstacle avoidance is a major field of UAV research and is necessary to make UAVs safe and marketable. Capabilities such as computer vision and optic flow make our UAVs smarter and more robust.
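The idea can be sketched in a few lines. The following is a hypothetical illustration, not the lab's actual code: it assumes a side-looking one-line sensor, a known airspeed, and pure translation parallel to the obstacle, in which case the flow rate w (in rad/s) satisfies w = v / d, so distance is d = v / w. All function names and parameters here are invented for illustration.

```python
# Hypothetical sketch of 1-D optic-flow ranging (illustration only).

def pixel_shift(prev, curr, max_shift=10):
    """Estimate the integer pixel shift between two 1-D intensity lines
    by maximizing a simple sum-of-products correlation."""
    best, best_score = 0, float("-inf")
    n = len(prev)
    for s in range(-max_shift, max_shift + 1):
        score = 0.0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                score += prev[i] * curr[j]
        if score > best_score:
            best, best_score = s, score
    return best

def distance_from_flow(shift_px, fov_per_pixel_rad, frame_rate_hz, airspeed_mps):
    """For translation parallel to a surface, flow w = v / d, so d = v / w."""
    flow_rad_s = abs(shift_px) * fov_per_pixel_rad * frame_rate_hz
    if flow_rad_s == 0:
        return float("inf")  # no measurable flow: obstacle effectively far away
    return airspeed_mps / flow_rad_s

# Example: a feature slides 2 pixels between frames.
prev = [0, 0, 1, 2, 1, 0, 0, 0]
curr = [0, 0, 0, 0, 1, 2, 1, 0]
shift = pixel_shift(prev, curr)  # 2 pixels
d = distance_from_flow(shift, fov_per_pixel_rad=0.002,
                       frame_rate_hz=25, airspeed_mps=13.0)  # 130 m
```

Note the design point the paragraph above makes: fast-moving pixels (large flow) mean a close obstacle, while slow flow means the surface is far away, so the inverse relationship d = v / w does all the work.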
One current example of image-directed control on real planes is the work of David Casbeer, a graduate student at BYU. His work shows how a team of UAVs can help firefighters track the growth of forest fires. Upon discovery of a fire, the UAV team converges on it and begins to circle its perimeter.
As the planes travel, they track how far they have gone and relay this information to one another and to the ground station. Firefighters can then see how quickly the fire is growing, which helps keep them from being trapped by rapidly expanding flames. David's work includes modeling the fire growth in simulation and planning coordination routines so the planes relay information as effectively as possible. The movie clip to the left demonstrates the growing-fire simulation, with a single plane approaching the fire and beginning to circle it.
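To make the measurement concrete, here is a minimal sketch, and not David Casbeer's actual algorithm, of one way a ground station could turn the distances the planes log each lap into a growth-rate estimate: fit a least-squares line to timestamped perimeter lengths. The data format and function name are assumptions for illustration.

```python
# Hypothetical sketch: estimate fire-perimeter growth rate (m/s) from
# timestamped perimeter-length reports logged by circling UAVs.

def growth_rate(reports):
    """Least-squares slope of perimeter length vs. time, in meters per
    second, from a list of (t_seconds, perimeter_m) tuples."""
    n = len(reports)
    if n < 2:
        raise ValueError("need at least two perimeter reports")
    mean_t = sum(t for t, _ in reports) / n
    mean_p = sum(p for _, p in reports) / n
    num = sum((t - mean_t) * (p - mean_p) for t, p in reports)
    den = sum((t - mean_t) ** 2 for t, _ in reports)
    return num / den

# Example: three laps, five minutes apart, perimeter growing steadily.
reports = [(0, 4000.0), (300, 4150.0), (600, 4300.0)]
rate = growth_rate(reports)  # 0.5 m/s
```

A least-squares fit over several laps smooths the noise in any single lap's distance measurement, which matters when firefighter safety depends on the estimate.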
The planes use image-directed control to track the perimeter of the fire. This project therefore brings together many aspects of Magicc research: first, it relies on a well-functioning autopilot system; second, it requires coordinating multiple agents on a single project; and third, it requires image-directed capability to track the fire. It is also one way in which UAVs can become an effective commercial product. The movie at right models how the planes coordinate their paths and communicate with one another.