Researchers at the University of Texas at Austin have developed an algorithm that can extract information from a single image and use it to measure how far objects are from the focus distance. Until now, only the visual systems of animals and humans could accomplish this.
Johannes Burge, co-author of the study and a postdoctoral fellow in the College of Liberal Arts' Center for Perceptual Systems, said that the ability of a statistical algorithm to accurately compute focus error is a major finding. Focus error is a measure of how much the lens must be refocused to produce a sharp image, and the algorithm computes it from a single image without any trial and error.
The researchers' algorithm can be applied to any blurry image to compute its focus error. Measuring focus error, in turn, makes it possible to compute how far objects are from the focus distance.
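To make the link between focus error and distance concrete, here is a minimal sketch, not taken from the paper: it assumes the focus error is expressed in diopters (the reciprocal of distance in meters), in which case the thin-lens relation turns a measured focus error and a known focus distance into an object distance. The function name and units are illustrative assumptions.

```python
# Illustrative sketch (assumption: focus error is given in diopters,
# i.e. reciprocal meters). If the lens is focused at focus_dist_m and
# the measured focus error is focus_error_diopters, the object's
# dioptric distance is the sum of the two, and its metric distance is
# the reciprocal of that sum.
def object_distance(focus_dist_m, focus_error_diopters):
    """Distance (in meters) of an object, given the distance the lens
    is focused at and the measured focus error, in dioptric units."""
    return 1.0 / (1.0 / focus_dist_m + focus_error_diopters)

# Lens focused at 2 m; a +0.5 D focus error places the object nearer.
print(object_distance(2.0, 0.5))  # → 1.0
```

A positive focus error here means the object is closer than the focus distance; a negative error would mean it is farther away.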
In the human eye, small imperfections in the lens, such as astigmatism, actually help the visual system, via the retina and brain, determine focus error. The phenomenon known as "defocus blur" occurs when the lens is focused at the wrong distance, and imperfections in the eye's lens enrich this blur with information.
According to co-author Wilson Geisler, director of the Center for Perceptual Systems, the blur pattern produced by focus errors, combined with the statistical regularity of natural images, makes it possible to simulate the way the human eye functions.
The researchers also observed that as focus error grows, progressively more fine detail is lost. Crucially, although image content varies widely from scene to scene, the statistical pattern of this detail loss remains the same. This constancy is what allows the amount of defocus to be computed and the image to be refocused precisely.
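The regularity described above can be illustrated with a toy sketch. This is not the authors' algorithm: it simply models defocus as Gaussian blur of a 1-D signal, measures how much fine detail survives, and estimates the blur by matching that detail level against a small table of candidate blur amounts. All names and the detail measure are illustrative assumptions.

```python
# Toy illustration (not the published algorithm): defocus modeled as
# Gaussian blur. Fine detail falls off predictably with blur, so the
# blur amount can be recovered by matching the observed detail level
# against candidates.
import numpy as np

rng = np.random.default_rng(0)

def gaussian_blur(signal, sigma):
    """Blur a 1-D signal with a normalized Gaussian kernel (edge-padded)."""
    radius = int(4 * sigma) + 1
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    padded = np.pad(signal, radius, mode="edge")
    return np.convolve(padded, kernel, mode="valid")

def detail_energy(signal):
    """Mean squared neighbor difference: a proxy for fine detail."""
    return np.mean(np.diff(signal) ** 2)

# A rough stand-in for a natural signal: smooth trend plus fine texture.
sharp = np.cumsum(rng.standard_normal(2000)) + rng.standard_normal(2000)

candidates = [0.5, 1.0, 2.0, 4.0, 8.0]
observed = gaussian_blur(sharp, 2.0)  # "unknown" defocus to recover

# Pick the candidate blur whose detail energy best matches the observation.
energies = {s: detail_energy(gaussian_blur(sharp, s)) for s in candidates}
target = detail_energy(observed)
estimate = min(candidates, key=lambda s: abs(energies[s] - target))
print(estimate)  # → 2.0
```

The sketch cheats by knowing the sharp signal; the insight of the actual research is that the statistics of natural images are regular enough that no such reference is needed.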
The research is described in the article "Optimal defocus estimation in individual natural images," published in the Proceedings of the National Academy of Sciences.