Mountain Splendor? Scientists Know Where Your Eyes Will Look


Using precise brain measurements, Yale researchers predicted how people’s eyes move when viewing natural scenes, an advance in understanding the human visual system that the researchers say could improve a host of artificial intelligence efforts, such as the development of driverless cars.

“We are visual beings and knowing how the brain rapidly computes where to look is fundamentally important,” said Yale’s Marvin Chun, Richard M. Colgate Professor of Psychology, professor of neuroscience and co-author of new research published Dec. 4 in the journal Nature Communications.

Eye movements have been extensively studied, and researchers can predict with some certainty which elements of a scene will draw a person’s gaze. What hasn’t been understood is how the brain orchestrates this ability, which is so fundamental to survival.

In a previous example of “mind reading,” Chun’s group successfully reconstructed facial images that people viewed while being scanned in an MRI machine, using their brain imaging data alone.

Read more at Yale University

Photo credit: skeeze via Pixabay