Nov. 12, 2014
Science begins with observation, and many defining moments in scientific progress followed the introduction of new ways to observe the world, from microscopes and telescopes to X-rays and MRIs.
The Electronic Visualization Laboratory at the University of Illinois at Chicago has been awarded a $3 million National Science Foundation grant to develop an extraordinary new camera -- really an array of dozens of video cameras -- that can capture images in 360 degrees and three dimensions.
Imagine the difference between seeing a fish dart through the narrow field of an underwater camera and seeing the fish and the entire surrounding area.
"We see a fish whip by, but you have no idea why. What might be chasing it? It's a very myopic view," said Daniel Sandin, a co-founder of EVL who worked on the development of the Sensor Environment Imaging (SENSEI) Instrument project.
SENSEI will show you what is chasing the fish, how big the pursuer is, and how far away it was when it spooked the fish.
SENSEI will capture the whole sphere of the surrounding environment, in stereo, with calibrated depth, providing information about how far away objects are and allowing estimates of their size and mass. The video will render true color and brightness, Sandin said.
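As a rough illustration of how calibrated stereo yields distance (a generic sketch, not SENSEI's actual pipeline), two cameras a known distance apart see the same object at slightly different pixel positions, and that shift, the disparity, determines depth:

```python
# Hypothetical illustration (not SENSEI's software): how a calibrated stereo
# pair turns pixel disparity into distance via Z = f * B / d.
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return distance (meters) to a point seen by two calibrated cameras.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: a 1400-pixel focal length, a 10 cm baseline, and a 20-pixel
# disparity place the object about 7 meters away.
print(depth_from_disparity(1400.0, 0.10, 20.0))  # -> 7.0
```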
EVL has worked before with multiple images to create a single complete 360-degree panoramic image, making it possible to explore Medinet Habu -- the mortuary temple of Ramses III, located on the west bank of Luxor in Egypt -- in the projected virtual reality of the CAVE2™.
To create a composite view of multiple images, "you have to stitch together pictures that overlap and are taken at different angles," said Maxine Brown, director of EVL and principal investigator on the project.
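To give a feel for what that stitching involves, here is a minimal sketch using OpenCV's off-the-shelf panorama stitcher; the image file names are placeholders, and this stands in for the general technique rather than EVL's own software:

```python
# Rough sketch of panoramic stitching with OpenCV's high-level Stitcher.
import cv2

# Load overlapping photos taken at different angles (file names are placeholders).
images = [cv2.imread(name) for name in ("view_left.jpg", "view_center.jpg", "view_right.jpg")]

# The stitcher matches shared features across the overlaps and warps the
# frames into a single composite panorama.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama.jpg", panorama)  # the merged panoramic view
else:
    print("Stitching failed; the images may not overlap enough.")
```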
A successful prototype will require the developers to engineer configurable, portable, sensor-based camera systems capable of handling huge amounts of data; displays for viewing the 3-D stereoscopic images; and network systems that allow collaboration.
In addition, SENSEI will need hardware scaffolding for its sensor arrays, data acquisition and computing platforms, telemetry and communications, and software.
The cell phone market has driven improvements in camera miniaturization and resolution; combined with EVL's strength in visualizing big data and in computer networks, those advances make the project doable, Brown said.
Investigators include oceanographers interested in studying coral reefs and kelp forests; computer scientists studying social networking in animals; astronomers looking at the night sky from a high-altitude balloon and telescopes around the world; archeologists opening a window on cultural heritage; and public health experts working on emergency preparedness.
Robert V. Kenyon, Andrew E. Johnson and Tanya Berger-Wolf, all of UIC computer science, are co-principal investigators on the grant.
Researchers at four other institutions will contribute to the project and receive sub-awards from UIC: University of California, San Diego, led by Truong Nguyen; University of Hawaiʻi at Mānoa, led by Jason Leigh; Scripps Institution of Oceanography, led by Jules Jaffe; and Jackson State University, led by François Modave.