The Eye Versus the Camera

Which is better: the camera or the eye (assuming normal eyesight)? Visual perception is the ability to interpret one's surroundings from the visible light reaching the eye; a camera, by contrast, merely records whatever image it receives. The human eye long ago solved a problem common to both digital and film cameras: how to get good contrast in an image while also capturing faint detail. The illusion of a bright and a dark band on either side of a central stripe is due to lateral inhibition, in which the cones of the retina inhibit their neighbors through negative feedback. Nearly 50 years ago, physiologists described the retina's tricks for improving contrast and sharpening edges, but new experiments by University of California, Berkeley, neurobiologists show that the phenomenon involves localized positive feedback as well, letting the eye sharpen edges without sacrificing shadow detail.

The visual system allows humans to assimilate information from their environment. The act of seeing starts when the lens of the eye focuses an image of the surroundings onto a light-sensitive membrane at the back of the eye, called the retina. The retina is actually an outpost of the brain, isolated to serve as a transducer that converts patterns of light into neuronal signals: its photoreceptive cells detect photons and respond by producing neural impulses.

"One of the big success stories, and the first example of information processing by the nervous system, was the discovery that the nerve cells in the eye inhibit their neighbors, which allows the eye to accentuate edges," said Richard Kramer, UC Berkeley professor of molecular and cell biology.

Kramer and former graduate student Skyler L. Jackman, now a post-doctoral fellow at Harvard University, discovered that while light-sensitive nerve cells in the retina inhibit dozens of their close neighbors, they also boost the response of the nearest one or two nerve cells.

That extra boost preserves the information in the individual light-detecting cells, the rods and cones, thereby retaining faint detail while accentuating edges, Kramer said. The rods and cones thus get both positive and negative feedback from their neighbors.

The fact that retinal cells inhibit their neighbors, an activity known as lateral inhibition, was first observed in horseshoe crabs by physiologist H. Keffer Hartline. That discovery earned him a share of the 1967 Nobel Prize in Physiology or Medicine. This form of negative feedback was later shown to occur in the vertebrate eye, including the human eye, and has since been found in many sensory systems, where it serves, for example, to sharpen the discrimination of pitch or touch. Lateral inhibition fails, however, to account for the eye's ability to detect faint detail near edges, including the fact that we can see small, faint spots that ought to be invisible if their detection were inhibited by encircling retinal cells.

Kramer noted that the details of lateral inhibition are still a mystery half a century after Hartline’s discovery. Neurobiologists still debate whether the negative feedback involves an electrical signal, a chemical neurotransmitter, or protons that change the acidity around the cell.

The retina in vertebrates is lined with a sheet of photoreceptor cells: the cones for day vision and the rods for night vision. The lens of the eye focuses images onto this sheet, and, like the pixels in a digital camera, each photoreceptor generates an electrical response proportional to the intensity of the light falling on it. That response triggers the release of a chemical neurotransmitter (glutamate) that affects neurons downstream, ultimately relaying the signal to the brain.

Unlike the pixels of a digital camera, however, photoreceptors affect the photoreceptors around them through so-called horizontal cells, which underlie and touch as many as 100 individual photoreceptors. The horizontal cells integrate signals from all these photoreceptors and provide broad inhibitory feedback. This feedback is thought to underlie lateral inhibition, a process that sharpens our perception of contrast and color, Kramer said.
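The broad inhibitory feedback described above can be sketched as a simple center-surround computation: each photoreceptor's output is its own response minus a fraction of the average response of a broad surround of neighbors. The following Python snippet is an illustrative toy model only; the surround width and feedback gain are assumed values, not physiological measurements. Run on a step edge, it produces the overshoot and undershoot of the bright and dark bands mentioned earlier.

```python
import numpy as np

def lateral_inhibition(stimulus, surround_width=10, gain=0.8):
    """Subtract broad inhibitory feedback (modeling horizontal cells)
    from each photoreceptor's response. Parameters are illustrative."""
    kernel = np.ones(surround_width) / surround_width
    surround = np.convolve(stimulus, kernel, mode="same")
    return stimulus - gain * surround

# A step edge: dim on the left, bright on the right.
edge = np.concatenate([np.full(50, 0.2), np.full(50, 0.8)])
out = lateral_inhibition(edge)

# The output undershoots just before the edge and overshoots just after
# it, accentuating the boundary the way lateral inhibition does.
print(out[45:55].round(3))
```

Note that in this toy model the inhibition also depresses the absolute response everywhere, which is why, as the article goes on to explain, inhibition alone would wash out faint detail.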

The positive feedback, however, involves chemical signaling. When a horizontal cell receives glutamate from a photoreceptor, calcium ions flow into the horizontal cell. These ions trigger the horizontal cell to talk back to the photoreceptor, Kramer said. Because calcium doesn’t spread very far within the horizontal cell, the positive feedback signal stays local, affecting only one or two nearby photoreceptors.
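Extending the same toy model with a narrow positive-feedback term illustrates Jackman and Kramer's point: broad inhibition alone reduces the contrast of a faint spot, while a boost confined to the nearest neighbors restores it. Again, the kernel sizes and gains here are illustrative assumptions, not measured values.

```python
import numpy as np

def retina_feedback(stimulus, surround_width=10, inhibit_gain=0.8,
                    boost_gain=0.5):
    # Broad inhibition: horizontal cells average many photoreceptors.
    surround = np.convolve(stimulus,
                           np.ones(surround_width) / surround_width,
                           mode="same")
    # Narrow boost: positive feedback reaches only the nearest neighbors.
    local = np.convolve(stimulus, np.ones(3) / 3, mode="same")
    return stimulus - inhibit_gain * surround + boost_gain * local

# A faint spot on a uniform background.
scene = np.full(100, 0.5)
scene[50] = 0.55

with_boost = retina_feedback(scene)
without_boost = retina_feedback(scene, boost_gain=0.0)

# Compare the spot's contrast against the background in each case.
contrast_with = with_boost[50] - with_boost[40]
contrast_without = without_boost[50] - without_boost[40]
print(contrast_with, contrast_without)
```

Because the positive feedback is spatially confined, it lifts the spot relative to its immediate surroundings more than it lifts the background, so the spot's contrast survives better than with inhibition alone.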

Jackman and Kramer found the same positive feedback in the cones of the zebrafish, salamander, anole (a lizard whose retina contains only cones) and rabbit, showing that "this is not just some weird thing that happens in lizards; it seems to be true across all vertebrates and presumably humans," Kramer said.

For further information: http://newscenter.berkeley.edu/2011/05/03/why-the-eye-is-better-than-a-camera/