Feb. 12, 2009
Scientists at Rochester Institute of Technology are designing a new kind of optical sensor to fly aboard unmanned air vehicles, or surveillance drones, tracking suspects traveling on foot or in vehicles identified as threats.
"The Air Force has clearly recognized the change in the threat that we have," says John Kerekes, associate professor in RIT's Chester F. Carlson Center for Imaging Science. "I think we all understand that our military has a paradigm shift. We're no longer fighting tanks in the open desert; we're fighting terrorists in small groups, asymmetric threats."
Kerekes won a $1 million Discovery Challenge Thrust grant from the Air Force Office of Scientific Research to design efficient sensors using multiple imaging techniques to track an individual or a vehicle.
The sensor will collect only the data it needs. It will assess a situation and choose the best sensing mode (black-and-white, hyperspectral, or polarization imaging) for the purpose. Developing two strands of information, one about the target and the other about the background environment, will be key to maintaining a connection to the target and to piercing camouflage.
This is how it will work: The sensor will collect a black-and-white image of a target, say a car, and record the object's shape. A hyperspectral image will plot the object's color as it appears across multiple wavelengths, from visible light into the near- and shortwave-infrared parts of the spectrum, beyond what the eye can see. (This mode can tell the difference between two passing blue cars.) The third imaging mode, polarization, cuts through glare and reveals surface roughness, providing details that distinguish objects of similar color and shape. (This mode can lock onto the unique material properties of the blue car in question.)
"These are all complementary pieces of information and the idea is that if the object you are tracking goes into an area where you lose one piece of information, the other information might help," Kerekes says.
As the lead scientist on the project, Kerekes assembled a team of RIT collaborators and outside scientists to envision the system end to end, from the design of the optical and microelectronic devices to the synchronizing algorithms that tie everything together.
Zoran Ninkov, professor of imaging science at RIT, is working on the overall optical system, modifying one of his own astronomical optical sensors for this downward-looking purpose. Alan Raisanen, associate director of RIT's Semiconductor and Microsystems Fabrication Laboratory, is designing tunable microelectronic devices to collect specific wavelengths. Ohio-based Numerica Inc., a large subcontractor on the project, is creating the advanced algorithms needed to track a target and pick the right imaging mode for the scenario.
According to Kerekes, motivation for this project came from Paul McManamon, former chief scientist at the Air Force Research Laboratory's Sensors Directorate in Dayton, Ohio, partly as a means of eliminating data overload.
"The idea is to lead to more efficient sensing both from the point of view of collecting the data necessary and being able to adapt to these different modalities based on the conditions in the scene or the task at hand," says Kerekes.
"The catch phrase is 'performance-driven sensing,'" he continues. "The idea behind that is you let the task at hand and the desire to optimize the performance drive what information is collected."
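As a rough illustration of that principle, the sketch below greedily collects whichever modality buys the most expected performance per unit of data. The utility and data-cost numbers are invented for illustration, not figures from the project.

    # Hypothetical sketch of performance-driven sensing: rather than
    # recording every modality all the time, collect the one with the
    # highest expected task-performance gain per megabyte of data.

    # Invented (performance gain, data cost in MB) per modality.
    CANDIDATES = {
        "panchromatic": (0.10, 1.0),    # cheap, modest gain
        "hyperspectral": (0.40, 50.0),  # rich spectra, heavy data
        "polarization": (0.30, 5.0),    # glare-cutting, mid-weight
    }

    def next_collection(candidates):
        # Greedy choice: maximize gain per megabyte collected.
        return max(candidates, key=lambda m: candidates[m][0] / candidates[m][1])

    print(next_collection(CANDIDATES))  # panchromatic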
Kerekes and his team are testing their preliminary models using generic scenarios played out in a simulated world akin to Second Life. The computer program, known as the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model (http://dirsig.cis.rit.edu/), uses computer graphics code to predict simulated sensor data, providing a platform for testing scenarios built around imaging problems such as Kerekes' new sensor technology.
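DIRSIG itself is a full physics-based scene simulator; the toy loop below only gestures at the workflow it supports, replaying a scripted sequence of scene conditions through the same fallback rule sketched earlier. Every frame and score here is hand-written for illustration, not DIRSIG output.

    # Toy stand-in for a simulated-scenario test: step through scripted
    # scene conditions and watch which imaging mode the tracker selects.

    frames = [
        {"panchromatic": 0.9, "hyperspectral": 0.6, "polarization": 0.5},  # open road
        {"panchromatic": 0.3, "hyperspectral": 0.7, "polarization": 0.4},  # look-alike cars
        {"panchromatic": 0.2, "hyperspectral": 0.3, "polarization": 0.8},  # heavy glare
    ]

    mode = "panchromatic"
    for t, scene in enumerate(frames):
        # Same rule as the earlier sketch: keep the current mode while
        # its score clears 0.5, otherwise take the best scorer.
        if scene[mode] < 0.5:
            mode = max(scene, key=scene.get)
        print(f"frame {t}: tracking with {mode}")
    # frame 0: panchromatic; frame 1: hyperspectral; frame 2: polarization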