Tracking the Evolution of a Theory

Article by Janet Wells

A theory proposed by a Berkeley Optometry Hall of Fame researcher 100 years ago, in the very year of the school’s inception, opened a path of inquiry and discovery that continues to engage and challenge the school’s research community, and that could lead to solutions improving the quality of life for people with a variety of vision- and brain-related disorders.

Herbert Wertheim School of Optometry and Vision Science professor Austin Roorda, PhD, hadn’t thought much about eye movement until he started making images of retinas. The image quality of the human eye, he discovered, is “surprisingly poor.”

The eye’s optics are fraught with imperfections, casting an upside-down, blurred image on a retina that is itself constantly moving. The eye’s three types of cones are arranged in a seemingly haphazard way, and the signals processed by the retina’s more than 300 million neurons must funnel through a comparatively whisper-thin conduit of axons connecting the eyeball to the brain.

For a person looking out at the world, none of this is a problem. Our visual system has evolved remarkably to tolerate these imperfections and deliver a stable, detailed picture.

“The problem is when I try to look into your eye,” says Roorda. The eye’s irregularities and movements result in distortions and limited image quality from standard equipment like a scanning laser ophthalmoscope.

So, Roorda set out to design and build better tools to study the earliest stages of the visual process. In the 20 years since, he has pioneered new technology to see more clearly not only into the eye, but also into what the eyes are seeing—and, in the process, helped foster an era of discovery at Berkeley Optometry & Vision Science.

Pushing Exploration: From Imager to Tracker

Roorda started by taking a cue from the field of astronomical imaging, which had already developed optical technology for ground-based telescopes that corrects for distortions from variables like atmospheric turbulence or temperature fluctuations.

The result? The Adaptive Optics Scanning Laser Ophthalmoscope (AOSLO). Built from a thicket of lenses, mirrors, and scanners, along with beam splitters, photomultiplier tubes, and a wavefront sensor array, all bolted to a large perforated tabletop, the AOSLO resembles an ordinary ophthalmoscope in one respect only: the chin rest.
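
In broad strokes, adaptive optics works as a feedback loop: the wavefront sensor measures the residual aberrations in light coming back out of the eye, and a deformable mirror (or similar corrector) is nudged on every pass to cancel them. The following is a minimal Python sketch of such a loop, offered purely as an illustration; the sensor and mirror stand-ins, the number of corrected modes, and the loop gain are assumptions, not details of the AOSLO’s actual control software.

```python
import numpy as np

# Minimal sketch of an adaptive-optics feedback loop (hypothetical, not the
# AOSLO's actual control code). The wavefront error and the mirror shape are
# represented as vectors of aberration-mode coefficients.

N_MODES = 12          # number of aberration modes corrected (assumed)
GAIN = 0.3            # integrator gain: fraction of the error removed per step
N_ITERATIONS = 50

def measure_wavefront(mirror_command, true_aberration):
    """Stand-in for a wavefront sensor: residual error = eye aberration
    minus the correction currently applied by the mirror, plus sensor noise."""
    noise = np.random.normal(0.0, 0.01, size=N_MODES)
    return true_aberration - mirror_command + noise

def run_ao_loop():
    true_aberration = np.random.normal(0.0, 1.0, size=N_MODES)  # the eye's aberrations
    mirror_command = np.zeros(N_MODES)                           # mirror starts flat

    for step in range(N_ITERATIONS):
        residual = measure_wavefront(mirror_command, true_aberration)
        # Integrator control law: add a fraction of the measured error to the
        # mirror shape so the residual shrinks over successive iterations.
        mirror_command += GAIN * residual
        rms = np.sqrt(np.mean(residual**2))
        if step % 10 == 0:
            print(f"step {step:2d}: residual RMS = {rms:.3f}")

if __name__ == "__main__":
    run_ao_loop()
```

In a real instrument the loop runs continuously, because the eye’s aberrations shift from moment to moment with the tear film and accommodation.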

Roorda credits the “heroic efforts of people in my lab and collaborators over the years” in pushing AOSLO’s capabilities through several rebuilds to its current incarnation. An integrated scanning and video system not only takes images of the retina, but can also be used to deliver images to targeted locations on the retina, while tracking and recording the eye’s incessant motion in real time. AOSLO’s high-resolution microscopy can now peer into a retina down to the level of single cone photoreceptors.

“Improved optical techniques get sharper images. Computer technology has improved real-time calculation of eye movement, and allows us to pin images on the retina in multiple wavelengths,” Roorda says. “We’ve effectively hacked human vision, meaning that we can bypass normal visual processes and control activity in the retina in the ways that the retina or the optics were never designed to do.”

AOSLO has been a wellspring for numerous publications and awards, including a 2015 Audacious Goals grant from the National Eye Institute (NEI) for Roorda to lead a retinal mapping project in support of an NEI long-term goal: restoring vision by regenerating neurons and neural connections in the visual system that have been lost or damaged.

Roorda’s more recent investigatory trajectory into eye movement arose out of a kind of scientific kismet, following curiosity and opportunity. The original idea—for an imager that removed optical distortions—had to address eye motion, which turned out to yield “a very accurate eye tracker,” says Roorda. “And that led to trying to understand the role of eye motion in vision.”

The Evolution—and Resolution—of a Theory

The eyes’ continuous motion of rotating and darting has a long history of both puzzling and inspiring scientists and clinicians. Early on, the conventional—and more intuitive—wisdom was that the eyes’ involuntary movements were more akin to a distracting chatter, and a detriment to seeing fine detail.

However, in 1923—the same year that Berkeley’s School of Optometry opened its doors—Frank Weymouth, AM, PhD, FAAO, went against the grain, proposing in an American Journal of Physiology paper, “Visual Perception and the Retinal Mosaic,” that very fine spatial vision is critically dependent on eye movements. Weymouth (who, after retiring from Stanford University and the Los Angeles College of Optometry, continued his research at Berkeley and is a member of the Berkeley Optometry Hall of Fame) and his colleagues devised two innovative analog experiments to make their case.

First, they had subjects detect tiny offsets along an otherwise straight edge that was projected onto a frosted glass screen through a minutely perforated sheet of aluminum patterned to mimic an enlarged array of photoreceptor cones. The researchers simulated a moving eye by shifting the edge behind the perforated screen and a non-moving eye by holding the edge stationary. The ability to detect the minute offset improved many times over when the edge was moving.

“We’ve developed technology that allows us to revisit long-standing questions with more accuracy.”

In a second experiment, they removed the perforated screen and had subjects detect the minute offsets in the straight edge directly, this time controlling retinal motion by either briefly flashing the shadow of the edge or allowing subjects up to seven seconds of continuous viewing. Just as before, eye movement conferred an improvement in performance.

The initially controversial theory gained acceptance over the decades, but it remained just that—theoretical. Until AOSLO.

Roorda didn’t set out to prove Weymouth et al., but “we’ve developed technology that allows us to revisit long-standing questions with more accuracy,” he says. The similarities between their findings, he acknowledges, “are pretty striking. What Weymouth and colleagues thought might be the case 100 years ago is for sure true.”

Roorda’s team reached a definitive answer thanks to their “Tumbling E Test.” Projecting a tiny “E” onto the eye of a participant, researchers asked whether the letter was facing up, down, left, or right.

While AOSLO tracked and recorded eye movement, the test was repeated rapid-fire under two conditions: a “static” projection, fixing the letter in the world and allowing it to slip around the retina with the eye’s natural movements; and a stabilized projection, using adaptive optics to essentially “pin” the letter onto the retina in one position.

Unstabilized E Video ("Natural")

Stabilized E Video ("Pinned")
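
The difference between the two conditions comes down to bookkeeping: how the eye trace that AOSLO records is used to place the letter on each video frame. Here is a minimal sketch of that logic, assuming a made-up eye-position trace; it illustrates the idea only and is not the lab’s stimulus-delivery code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical eye-position trace in degrees of visual angle, standing in
# for the motion that AOSLO records in real time (assumed values).
n_frames = 500                                      # 0.5 s at 1 kHz (assumed)
eye_x = np.cumsum(rng.normal(0, 0.002, n_frames))   # slow horizontal drift
eye_y = np.cumsum(rng.normal(0, 0.002, n_frames))   # slow vertical drift

letter_world = np.array([0.0, 0.0])  # the "E" starts at the display center

def retinal_position(condition):
    """Where the letter lands on the retina, relative to the fovea."""
    if condition == "natural":
        # Letter fixed in the world: as the eye drifts, the letter's
        # position on the retina moves opposite to the eye.
        return letter_world[0] - eye_x, letter_world[1] - eye_y
    if condition == "pinned":
        # Letter shifted on every frame by the measured eye motion, so
        # its position on the retina never changes.
        world_x = letter_world[0] + eye_x
        world_y = letter_world[1] + eye_y
        return world_x - eye_x, world_y - eye_y      # constant offset
    raise ValueError(condition)

for condition in ("natural", "pinned"):
    rx, ry = retinal_position(condition)
    span = np.ptp(rx) + np.ptp(ry)   # total retinal excursion of the letter
    print(f"{condition:8s}: retinal motion span = {span:.4f} deg")
```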

The findings, published in the Journal of Vision in 2017, showed that the human visual system has evolved not only to tolerate the eyes’ incessant movements, but also to leverage this motion, which improves acuity by about 25%. “When we allowed eye movements, the letter became clearer—people could see it and which direction it faced more consistently,” Roorda says.

AOSLO’s high-octane tracking capabilities provided further revelations about how eye movement confers an advantage—through time. “If you get one static look at an object, it’s hard to see,” Roorda explains. “If you’re standing still and looking through a slatted fence, for example, all you get is a little glimpse through the openings. But if you are moving past the slats, you can get a good idea of the house or garden behind it, because your eyes—and brain—have more time and more views, and the information accumulates dynamically.” (An illustration of this phenomenon is shown at left, and on our magazine cover).

AOSLO is also getting researchers closer to cracking the neural processing circuitry that underlies the benefit of eye movement. In a 2020 Journal of Vision paper, Berkeley computational neuroscientist Bruno Olshausen, PhD, and his student, Alex Anderson (Physics PhD), used the device’s tracking data to show that by simultaneously estimating object shape and eye motion, neurons in the visual cortex can compute a higher-quality representation of an object by averaging out non-uniformities—not unlike the computational imaging principles for achieving “super resolution” via camera motion.
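
The “super resolution” analogy can be made concrete with a toy example: detail finer than a sensor’s sampling grid is aliased in any single snapshot, but if the sensor jitters by known sub-pixel amounts, the snapshots interleave and the detail can be recovered. The sketch below is a deliberately simplified, one-dimensional illustration of that principle, not the model from the 2020 paper; the point-sampling “retina,” the particular frequencies, and the noise level are all assumptions.

```python
import numpy as np

# Toy illustration of "super resolution" from known sensor motion:
# a fine 1-D pattern is point-sampled by a coarse grid. One static frame
# aliases the fine detail; many frames taken at known sub-pixel offsets
# interleave to recover it.

rng = np.random.default_rng(0)

FINE = 1024                 # resolution of the underlying scene
STEP = 8                    # spacing of the coarse samples (in fine pixels)
NOISE = 0.05                # measurement noise per sample

x = np.arange(FINE)
scene = np.sin(2 * np.pi * 3 * x / FINE)          # coarse structure
scene += 0.7 * np.sin(2 * np.pi * 90 * x / FINE)  # detail above the coarse Nyquist limit

# Single static frame: sample every STEP pixels, then hold each value.
static_samples = scene[::STEP] + rng.normal(0, NOISE, FINE // STEP)
static_estimate = np.repeat(static_samples, STEP)

# Jittered frames: one frame per sub-pixel offset, each offset known.
accumulated = np.zeros(FINE)
counts = np.zeros(FINE)
for shift in range(STEP):
    samples = scene[shift::STEP] + rng.normal(0, NOISE, FINE // STEP)
    positions = np.arange(shift, FINE, STEP)
    accumulated[positions] += samples
    counts[positions] += 1
pooled_estimate = accumulated / counts

print("MSE, single static frame :", round(float(np.mean((static_estimate - scene) ** 2)), 4))
print("MSE, pooled jittered set :", round(float(np.mean((pooled_estimate - scene) ** 2)), 4))
```

In the toy, the pooled estimate recovers the fine pattern that a single frame scrambles, precisely because the sub-pixel position of every sample is known, much as AOSLO knows where the retina sits at every instant.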

From Bench Science to Clinical Care

Berkeley has become an informal hub of eye-tracking expertise with six different labs at the School of Optometry & Vision Science currently involved with research projects related to eye movement. Several of those investigators either use AOSLO data or devices built for them by the Roorda Lab, which has also nurtured innovative technology for direct patient care (see the sidebars at the end of this article).

“There’s a growing interest in trying to understand these processes for dynamic vision,” says Roorda, whose lab continues to advance basic science with AOSLO, while connecting the dots at every opportunity to clinical applications. The Roorda Lab is part of the first team, for example, to use adaptive optics imaging to monitor the efficacy of treatment for retinal degeneration—a collaboration with University of California, San Francisco (UCSF) Department of Ophthalmology Chair Jacque Duncan, MD.

AOSLO offers clinicians objective measurements of visual function—a necessary component of leveraging emerging treatments that can slow the progression of degenerative diseases, says Roorda, citing work by former student Kavitha Ratnam, PhD, who found that patients with retinal degeneration can lose 50% of their cones before they present with a significant drop in their visual acuity.

“This shows that eye movements and motion can help mitigate the effects of cone loss. It’s also a warning message to doctors that you can’t rely on visual acuity tests to determine the level of cone loss due to retinal diseases,” Roorda says. “With AOSLO we can ask questions like, ‘What is the function of that last cone at the edge of the lesion in this patient with a degenerative disease?’”

"Microscopic images of the living human eye offer cellular-level insights into how eye disease is manifest, how to slow its progression, and how it responds to treatment,” he adds. “This allows us to study human vision in health and disease in an unprecedented way.”

Further Reading: Investigation

Eye Movement and Amblyopia—Chicken or Egg? When the vital dance between the eyes and the brain breaks down, one consequence can be amblyopia.

Commonly referred to as “lazy eye” because over time the brain relies more on the other, stronger eye, amblyopia is the leading cause of vision loss in kids—more than all diseases and injuries put together. Its effects can last a lifetime, impacting visual acuity and depth perception.

“In people with amblyopia, their eye movements are exaggerated, constantly in motion, and they drift more. We always thought that was the cause of poor vision,” says Berkeley Optometry & Vision Science professor Dennis M. Levi, OD, PhD. “But we’ve learned from Dr. Roorda and others that normal eye movements have a purpose—to see fine detail and contrast, to get the eye where it needs to go.”

So, with the goal of improving clinical treatment options, Levi and Berkeley Optometry & Vision Science colleague Susana Chung, OD, PhD, are pursuing the answer to a “chicken vs. egg” question: Does amblyopia cause abnormal eye movements? Or do abnormal eye movements cause amblyopia?

Using adaptive optics built by the Roorda Lab, Levi and Chung are measuring study participants’ responses to visual cues in rapid succession—on the order of 1000 “tasks” per hour. “We can reverse-correlate what’s happening with perception and eye movement,” Levi explains. A scan can also be replayed to the other eye, to help tease out whether the perception problem is with eye movement or some other factor—a faulty neural signal, for example.

“We’re interested in understanding how much eye motion would be optimal to get the best vision,” Chung says. “Eye movements are very plastic. If we understand the limiting factors on visual performance, we can devise strategies that might solve the problem and help people see better.”

Further Reading: Innovation

Measuring Eye Movement for Better Health. With a background in optical engineering, Christy Sheehy, PhD, wanted to use her hands and “build something” for her graduate work at Berkeley Optometry & Vision Science. Roorda suggested a high-resolution imaging device that—unlike the rest of the lab’s projects—would not use adaptive optics.

The results? A promising prototype with the potential to evaluate eye and brain health, a UCSF post-doctoral fellowship focused on clinical applications for multiple sclerosis (MS), and a start-up company that recently received FDA clearance for Sheehy’s pioneering retinal eye movement monitor.

Neurologic diseases—like MS, Alzheimer’s, Parkinson’s, Huntington’s—as well as brain injuries, retinal disease, psychiatric disorders, and even cardiac health can be examined “through the window of the eye,” says Sheehy, chief executive officer and co-founder of C. Light Technologies. The company’s Retitrack™ tabletop eye movement monitor records 10-second, non-invasive retinal video scans that measure fixational and saccadic eye movement at the micron level.

“Depending on the condition and the area of the brain, eye motion can be affected in a unique way,” she says. “During my postdoc work looking at MS, we saw a lot of pattern changes, like nystagmus—uncontrolled, repetitive motion—or square-wave jerk intrusions.” Sheehy’s recent research on concussions showed microsaccades—small, jerk-like, involuntary eye movements—that are bigger and faster than those of age-matched controls.
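
For readers curious how events like these are pulled out of an eye trace at all, one widely used generic approach (in the spirit of Engbert and Kliegl’s velocity-threshold method) is to compute eye velocity sample by sample and flag excursions that stand well outside the trace’s own drift noise. The sketch below is a simplified, hypothetical illustration of that idea, not C. Light’s algorithm; the sampling rate, the threshold multiplier, and the synthetic trace are assumptions.

```python
import numpy as np

# Generic velocity-threshold detector for saccade-like events in an
# eye-position trace (simplified, Engbert & Kliegl-style). Not any
# particular product's algorithm.

FS = 500.0          # sampling rate in Hz (assumed)
LAMBDA = 6.0        # threshold = LAMBDA * robust estimate of velocity noise

def detect_saccades(x, y, fs=FS, lam=LAMBDA):
    """Return a boolean mask marking samples that belong to saccade-like events."""
    vx = np.gradient(x) * fs          # horizontal velocity (deg/s)
    vy = np.gradient(y) * fs          # vertical velocity (deg/s)
    # Median-based noise estimate, so drifts and saccades do not inflate it.
    sx = np.sqrt(np.median(vx**2) - np.median(vx)**2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy)**2)
    # Elliptic threshold: flag samples whose velocity lies far outside
    # the cloud of ordinary drift velocities.
    return (vx / (lam * sx))**2 + (vy / (lam * sy))**2 > 1.0

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 2000
    # Synthetic trace: slow drift plus two injected microsaccade-like jumps.
    x = np.cumsum(rng.normal(0, 0.002, n))
    y = np.cumsum(rng.normal(0, 0.002, n))
    x[600:610] += np.linspace(0, 0.3, 10)    # ~0.3 deg rapid horizontal shift
    x[610:] += 0.3
    y[1400:1410] += np.linspace(0, 0.25, 10) # ~0.25 deg rapid vertical shift
    y[1410:] += 0.25
    mask = detect_saccades(x, y)
    print(f"flagged {mask.sum()} of {n} samples as saccade-like")
```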

Currently cleared as a general eye movement monitor with subsequent clinical interpretation, the Retitrack™ is in use by clinicians at UCSF, the University of Miami Health System, and the Medical College of Wisconsin. Sheehy sees a broad future for the technology, and is aiming for FDA clearance for the device to detect specific neurological indications. The company is also building algorithms to develop future AI capabilities for early detection and prognostication in health care.

Applying her research to clinical care has a personal element for Sheehy, who lost an aunt to early onset Alzheimer’s, a grandmother to later stage Alzheimer’s, and has an immediate family member with mild cognitive impairment.

“Being able to use the output of the eye as an early indicator or biomarker is huge. Neurology is decades behind, relying on surveys, memory tests, even hand dexterity and walking speed for assessments,” she says. “I would love to help change neurology from a reactionary space of medicine to one that’s more preventative, objective, and forward thinking.”

Related Information

Roorda Lab
Levi Lab
Chung Lab

About the Images

Photo #1: Austin Roorda in his lab on the UC Berkeley campus. Photo #2: Bird’s-eye view of the Adaptive Optics Scanning Laser Ophthalmoscope (AOSLO). Photos by Elena Zhukova. Illustrations by Harry Campbell.