
Novel Optical Innovation Expands the Capabilities of Cameras

Researchers at the Penn State College of Engineering have created an ultrathin optical element called a metasurface. The element attaches to a conventional camera and uses tiny, antenna-like nanostructures that modify the properties of light to encode the spectral and polarization data of images captured as snapshots or video. The research was published in Science Advances.

Xingjie Ni, left, associate professor of electrical engineering, displays a camera sensor integrated with a three-millimeter-by-three-millimeter metasurface. The metasurface turns a conventional camera into a hyperspectro-polarimetric camera. Zhiwen Liu, right, co-corresponding author on the paper, is a professor of electrical engineering. Image Credit: Kate Myers/Penn State

Butterflies can perceive a broader spectrum of colors and detect the polarization of light, allowing them to communicate, forage, and navigate with precision. Similarly, certain species, like the mantis shrimp, can detect a wider range of light wavelengths and the circular polarization of light waves, using these abilities to signal to potential mates.

Inspired by these capabilities in the animal kingdom, researchers at the Penn State College of Engineering have developed a metasurface that can be attached to a conventional camera and uses tiny nanostructures to capture both spectral and polarization data in images or videos. Additionally, the team created a machine learning framework capable of interpreting this complex visual information in real time on a standard laptop.

As the animal kingdom shows us, the aspects of light beyond what we can see with our eyes hold information that we can use in a variety of applications. To do this, we effectively transformed a conventional camera into a compact, lightweight hyperspectro-polarimetric camera by integrating our metasurface within it.

Xingjie Ni, Study Lead Corresponding Author and Associate Professor, Electrical Engineering, Pennsylvania State University

Ni explained that while polarimetric and hyperspectral cameras can capture either polarization or spectrum data, they are often large and expensive and cannot simultaneously capture both types of data. In contrast, the three-millimeter-by-three-millimeter metasurface developed by the researchers is inexpensive to produce and can capture both polarization and spectral data at once. When placed between a camera's lens and sensor, it sends this information directly to a computer.
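Conceptually, an encoding metasurface of this kind imposes a spatially varying spectral and polarization response on the light reaching the sensor, so a single raw frame becomes a compressed measurement of the full hyperspectro-polarimetric data cube. The article does not spell out the team's forward model, so the following is only a minimal, hypothetical simulation of that idea; the array names, dimensions, and random values are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical dimensions for illustration only (not from the paper):
# H, W - sensor resolution; S - spectral bands; P - polarization channels
H, W, S, P = 64, 64, 8, 4

rng = np.random.default_rng(0)

# The scene as a full hyperspectro-polarimetric data cube.
scene = rng.random((H, W, S, P))

# The metasurface acts as a spatially varying spectral/polarization
# transmission mask: each pixel weights the S x P channels differently.
encoding_mask = rng.random((H, W, S, P))

# A conventional sensor integrates everything into one grayscale frame,
# so the raw image is the channel-wise weighted sum at every pixel.
raw_frame = np.einsum("hwsp,hwsp->hw", scene, encoding_mask)

print(raw_frame.shape)  # (64, 64): one snapshot encodes all S * P channels
```

Recovering the data cube from such a frame is then a reconstruction problem, which is where the team's machine learning framework comes in.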

To extract the spectral and polarization data from the raw images, co-author Bofeng Liu, a Ph.D. student in electrical engineering, created a machine learning framework. This framework was trained on 1.8 million images using data augmentation techniques.
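The article does not describe the network architecture, loss, or augmentation pipeline in detail, so the sketch below is only a hypothetical illustration of how a reconstruction network might be trained to map encoded raw frames back to a stack of spectral-polarization channels, with flips and rotations as simple augmentations. The layer sizes, channel counts, and training setup are assumptions, not the authors' method.

```python
import torch
import torch.nn as nn

# Assumed dimensions, matching the illustration above (not from the paper).
S, P = 8, 4                  # spectral bands x polarization channels
OUT_CHANNELS = S * P

# A small convolutional decoder: one raw encoded frame in,
# S * P reconstructed channels out. Purely illustrative architecture.
decoder = nn.Sequential(
    nn.Conv2d(1, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, OUT_CHANNELS, kernel_size=3, padding=1),
)

optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def augment(raw, target):
    """Simple data augmentation: random horizontal flip and 90-degree rotation."""
    if torch.rand(()) < 0.5:
        raw, target = torch.flip(raw, dims=[-1]), torch.flip(target, dims=[-1])
    k = int(torch.randint(0, 4, ()))
    return torch.rot90(raw, k, dims=[-2, -1]), torch.rot90(target, k, dims=[-2, -1])

# One illustrative training step on a synthetic batch; real training would
# loop over the full augmented dataset of raw frames and ground-truth cubes.
raw = torch.rand(16, 1, 64, 64)                 # encoded raw frames
target = torch.rand(16, OUT_CHANNELS, 64, 64)   # ground-truth channel stacks
raw, target = augment(raw, target)

optimizer.zero_grad()
loss = loss_fn(decoder(raw), target)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```

Because inference is a single forward pass through a compact network, reconstruction can keep pace with the camera's frame rate, as Liu describes below.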

At 28 frames per second—primarily limited by the speed of the camera we used—we are able to rapidly recover both spectral and polarization information using our neural network. This allows us to capture and view the image data in real time.

Bofeng Liu, Study Co-Author and Doctoral Student, Pennsylvania State University

The researchers tested their neural network and metasurface using video recordings of translucent “PSU” letters illuminated by different laser beams. They also took images of the scarab beetle, which is known for reflecting circularly polarized light, a feature visible to others of its species.

Ni believes the technique could provide users with real-time hyperspectro-polarimetric information, offering practical applications in everyday life.

Ni added, “We could bring our camera along to the grocery store, take snapshots, and assess the freshness of the fruit and vegetables on the shelves before buying. This augmented camera opens a window into the unseen world.”

Ni also highlighted the potential biomedical applications of this technology. Hyperspectral and polarization data could help distinguish the composition and structural characteristics of tissues, aiding in the early detection of cancerous cells.

This research builds on Ni's previous work with metasurfaces and metalenses, including those mimicking the human eye's processing capabilities and lenses that can image distant objects like the moon.

The study was co-authored by Zhiwen Liu, a professor of electrical engineering at Penn State, along with postdoctoral scholar Hyun-Ju Ahn and several graduate students: Lidan Zhang, Chen Zhou, Yiming Ding, Shengyuan Chang, Yao Duan, Md Tarek Rathman, Tunan Xia, and Xi Chen, all from the Department of Electrical Engineering at Penn State.

The study was supported by the US National Science Foundation and the National Eye Institute of the US National Institutes of Health.

Journal Reference:

Zhang, L., et al. (2024) Real-time machine learning–enhanced hyperspectro-polarimetric imaging via an encoding metasurface. Science Advances. doi.org/10.1126/sciadv.adp5192
