Your next eyeglass exam might get an upgrade thanks to a high-tech astronomy tool originally developed to bring distant celestial bodies into sharper focus.
The technique, called wavefront analysis, precisely measures how light reflected from the back of the eye exits it. The difference between how light would be refracted by a normally shaped lens and cornea and how it is actually refracted yields a precise map of optical abnormalities known as higher-order aberrations.
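The core idea can be illustrated with a toy numerical sketch (this is not the actual instrument's algorithm, and the grid values below are made up for illustration): the aberration map is simply the measured wavefront minus the ideal one, often summarized by its root-mean-square error.

```python
# Toy illustration of wavefront analysis: the aberration map is the
# point-by-point difference between the measured wavefront and an
# ideal, aberration-free wavefront.

def aberration_map(measured, ideal):
    """Point-by-point wavefront error (same units as the inputs)."""
    return [[m - i for m, i in zip(m_row, i_row)]
            for m_row, i_row in zip(measured, ideal)]

def rms_error(error_map):
    """Root-mean-square wavefront error, a common single-number summary."""
    vals = [v for row in error_map for v in row]
    return (sum(v * v for v in vals) / len(vals)) ** 0.5

# Hypothetical 3x3 wavefront grids (arbitrary units, e.g. micrometers);
# a perfect eye would produce a flat (all-zero) error map.
ideal = [[0.0, 0.0, 0.0],
         [0.0, 0.0, 0.0],
         [0.0, 0.0, 0.0]]
measured = [[0.0, 0.1, 0.0],
            [0.1, 0.2, 0.1],
            [0.0, 0.1, 0.0]]

err = aberration_map(measured, ideal)
print(round(rms_error(err), 4))  # → 0.0943
```

In practice, clinical and astronomical systems fit such error maps with standard basis functions to classify the aberrations, but the difference map above is the starting point.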
"Astronomers already used these techniques to enable a clear telescopic view of planets and stars, undistorted by the focusing aberrations resulting from the Earth’s atmosphere," says Dr. Anthony Adams, Editor-in-Chief of the journal Optometry and Vision Science. "In the past two decades, optometry and ophthalmology researchers have borrowed techniques for measuring and correcting these higher-order abnormalities.”
Robots these days often take their inspiration from nature. Now cameras mimicking bug eyes, able to look in many directions simultaneously, can be made en masse, researchers say. These novel devices, each possessing hundreds of microscopic lenses, could find use as surveillance cameras on flying drones or in minimally invasive surgical procedures.
The compound eyes of insects and other invertebrates are each made of up to thousands of relatively simple light-sensing facets known as ommatidia. These cover curved, hemispherical surfaces so that each facet points in a slightly different direction. As such, compound eyes can have much wider fields of view than human eyes or conventional cameras, including panoramic views spanning nearly 360 degrees.
Artificial organic light sensors could one day help lead to eye implants that integrate more naturally with the body to restore vision, researchers say.
Blindness often involves damage to the retina, the light-sensitive inner lining of the eye. For instance, retinitis pigmentosa, a group of inherited diseases that afflicts 1 in 4,000 people across the world, involves degeneration of the eye’s light-sensing cells, or photoreceptors.
Research teams worldwide are developing retinal prosthetics that seek to restore vision using electronics that essentially replace lost photoreceptors. In one line of research that has recently drawn attention, neuro-ophthalmologists at the University of Tuebingen in Germany have been implanting light-sensitive microchips within patients' eyes, helping them recognize facial expressions and see the faces of their loved ones.
However, current retinal prosthetics typically use electrodes made of inorganic semiconductors like silicon to interface with the body.
Why do humans see colors? For years the leading hypothesis was that color vision evolved to help us spot nutritious fruits and vegetation in the forest. But in 2006, evolutionary neurobiologist Mark Changizi and colleagues proposed that color vision evolved to perceive oxygenation and hemoglobin variations in skin in order to detect social cues, emotions and the states of our friends or enemies. Just think about the reddening and whitening of the face known as blushing and blanching. These signal distinct physiological states that would be imperceptible without color vision.
A few years ago Changizi left Rensselaer Polytechnic Institute, where he was a professor, to co-found 2AI Labs with Dr. Tim Barber. Their Boise, Idaho-based research institute, funded via technology spin-offs from their work, aims to solve foundational problems in cognitive science and artificial intelligence. The move allowed Changizi to continue academic work with more intellectual freedom and less reliance on grants.