Using Eye Movements to Diagnose Brain Health


Article by Zac Unger

Inside the Lab of Dr. Jorge Otero-Millan

Dr. Jorge Otero-Millan’s lab feels like a cross between a video arcade and an underground goth night club. Everything from the walls to the door handles is painted pitch black, there’s not a window to be found, and screens of various sizes are placed in front of mysterious contraptions. The centerpiece of it all is a race car simulator, complete with captain’s chair, steering wheel, and three flat screen televisions, all atop a base that can move a seatbelted “driver” in any direction. “I have this here just for the ‘oh, wow’ factor,” Otero-Millan jokes. And while it’s undoubtedly true that not every professor at the Herbert Wertheim School of Optometry & Vision Science gets their own personal carnival ride, Otero-Millan’s devices are actually critical tools that have great promise for the study and treatment of multiple maladies, from post-concussion syndrome to Parkinson’s disease.

We often talk about “staring intently” or “fixing our gaze” when we want to indicate that somebody is trying to get a good look at an object. But our eyes are anything but static while they work to make sense of the world around us. “Our eyes are not perfect,” Otero-Millan explains, “only a little bit of our eyes see with high resolution, and if things move too fast in front of us we just see a blur.” At the center of the retina is a small area of densely packed photoreceptors known as the fovea; this area provides us with our sharpest vision. In order to send coherent images to the brain, the eye tries to keep the fovea stabilized and directed at the area of our main interest, Otero-Millan says. “But it also brings problems. Because as we have to look around, important images are jumping around in our retina,” which would make it hard to walk in a straight line if the path in front of us appeared to be in constant motion.

Different animals have developed different solutions to this problem. If you hold a chicken and rotate its body, for example, the bird’s head will stay in one place so that the eyes can remain locked on target. Human beings evolved a different strategy, using smooth involuntary eye motions that keep the fovea directed at the area of interest, even as our heads—and the things we’re looking at—are in constant motion. Whenever we want to change the object of interest, we use rapid motions, called saccades, which are one of the subjects of Otero-Millan’s research.

“If you were to take a phone’s camera and make a recording moving the phone like your eyes move, looking at that video would make you dizzy,” he says, because you’d see everything moving all over the place. “And yet, when it’s happening, you are completely unaware. It is a nice illusion that everything is stable in front of us, while in reality the retina is constantly in motion.”

These saccadic motions happen constantly, often as many as three to five times per second. They are different from slow, voluntary eye motions, such as when a doctor asks you to follow her finger as she moves it back and forth. Saccades are instead staccato motions: “It’s a jump and then a stop,” Otero-Millan says. “If I’m looking at a painting, my eyes don’t do a smooth scan of everything. They do a series of jumps, and each of those jumps is a saccade. Even if you are trying to stare directly at a point, your eyes are going to be making the very smallest of saccades that you can’t even notice.”

It’s the brain’s job, then, to take this scattered, disjointed information and assemble the discrete pictures into the smooth, coherent movie through which we experience the world. As an exercise, Otero-Millan has his students stand in front of a mirror, and asks them, “Focus on your left eye and then focus on your right eye, back and forth, and what kind of motion do you see?” If you try this at home you’ll notice…nothing. Your brain does such a good job of putting it all together that you won’t be able to detect any motion whatsoever. But look into the eyes of somebody else doing the exact same thing, and the eye motion will be immediately obvious.

“Life without eye movements would make completing simple tasks, like preparing food or navigating from one place to another, devastatingly difficult,” says Stephanie Reeves, a PhD student in Otero-Millan’s lab. Without eye movements, the visual input to our brains would be unstable and shaky due to our constant head and body movements. Without saccades, “we would probably experience neck pain because we’d be moving our heads rapidly to place the fovea on areas of interest,” Reeves says, “and we would experience double vision and poor binocular vision, as well.” Saccades are automatic, lying even further outside our control than other unconscious functions like breathing.

“We can control where we look,” Otero-Millan explains, “but we can’t control how we move our eyes. I can ask you to look from here to here,” he says, holding up a finger on each hand, “and you can do that. But if I ask you to do it slowly, you just can’t. You have some control over the general pattern, but in the end, the eyes are going to do what they’re going to do.”

This involuntary motion has great implications for the clinical discovery and monitoring of disease processes. “The eyes are like an EKG for the brain,” says Dr. Debora Lee Chen, Associate Professor of Clinical Optometry and chief of Berkeley’s Binocular Vision Clinic. “Optometrists,” Chen says, “diagnose and manage many visual problems related to all kinds of neurological conditions.” And this makes perfect sense when you consider that, according to researchers, over half the brain’s cortex is involved in visual processing in one way or another. When there is a problem with the brain, it is often expressed in the eyes and, in particular, in the way the eyes perform the precise saccadic movement that Otero-Millan studies. “When you have a problem in that circuitry,” Chen continues, “it can serve as a proxy to tell you that there’s a brain problem further upstream. And it could be one point or multiple points that start directing you towards the source of the problem.”


While the clinical application of using saccades to diagnose pathology is still in its infancy, there appears to be great promise. Alzheimer’s disease, stroke, epilepsy, dementia, and traumatic brain injury are just a few of the conditions that Otero-Millan and his colleagues hope to better understand through their work. “The great thing about eye movements,” he says, “is that they’re easy to measure very precisely.” Eyes can only move in three ways: up/down, side-to-side, and via torsion—a particular interest of Otero-Millan’s—where the eye rotates around the central axis, the way you would turn a volume knob on an old-school radio. A person’s arm, by contrast, has many more degrees of freedom along which it can travel, so measuring movements there is messy and imprecise. “The other advantage,” according to Otero-Millan, “is that the eye is directly connected to the brain through fewer steps than other muscles.”

Additionally, different types of eye motion are associated with particular and well-differentiated locations in the brain. For example, Otero-Millan explains, “If we look at patients who may have localized lesions in the brain, you may actually get eye movements and saccades that are completely normal when it comes to smooth following, but the eyes can’t stabilize when the head tilts or moves up and down.” The eyes, therefore, can manifest problems in very specific brain locales, pointing clinicians towards a diagnosis. “Instead of having to ask a patient whether they can see my finger or if a light was moving to the right or to the left,” Otero-Millan says, “I can introduce a measurable stimulus, track exactly how the eyes respond, and know how the brain is processing.”

One of Otero-Millan’s principal interests has been the study of vertigo and dizziness. When patients with these symptoms show up at the emergency room, it’s important to diagnose them quickly. A large percentage of these sufferers will have a relatively common and easily treatable case of benign paroxysmal positional vertigo. “But there is a percentage of people that may actually be having a stroke,” Otero-Millan explains, “and some of them may not have the other typical stroke symptoms, like arm weakness or struggling to speak. But if you do the right tests of eye movements, you can actually determine whether it’s a stroke or just a problem of the inner ear.” Because strokes must be treated within three hours of onset, the speed and clarity of diagnosis can have a profound impact on reducing morbidity and mortality.

Chen, who is hoping to expand on Otero-Millan’s research by bringing it more regularly into the clinical setting, is excited by the diagnostic possibilities. “Classically speaking, when we diagnose brain disorders, it is a drawn-out process of neuropsychological and neurological tests that can take hours,” she says. Direct brain imaging in an MRI machine can also be expensive, time-consuming, and even frightening. “We have certain patients who are brain injured, or elderly, or who have autism spectrum disorder,” she says, patients who can’t tolerate extended time in an MRI tube. “But eye movement is so accessible and easy to capture. From a clinical standpoint, that’s the dream come true.”

All the devices in Otero-Millan’s lab are designed to measure eye motion in various ways. The race car simulator allows a subject to be bounced around while wearing special goggles outfitted with inward-pointing cameras that track eye motion. Other devices are similar to what you might see in any ophthalmologist’s office, designed to minimize head motion while the clinician—or in this case, eye-tracking software—analyzes the patient. But the most effective device in Otero-Millan’s arsenal may be one that most people already spend hours a day staring into: our smartphones. “This is in its infancy,” Otero-Millan says, “but I think it’s about to explode. The devices people have at their homes are going to be able to measure eye movements with more and more accuracy.” Just as Apple Watches now have the ability to track our pulse and let us know when we may be having a cardiac event, we may one day be watching a TikTok video at the same time that the phone is watching our eyes, and then receive an alert that we are showing early signs of a stroke or another adverse health event.

And while that might sound a touch dystopian, using a phone as an extension of a clinician’s practice has far-reaching potential. Otero-Millan foresees a day when high school athletes record baseline eye-motion data, which can then be compared to data from after a suspected concussion.

On an even simpler level, a smartphone can instantly transport the clinic to the living room. “Symptoms can be frustratingly sporadic,” says Chen, who sees hundreds of patients a year in her office. “Patients will say that they felt terrible, that everything was spinning, but then when they arrive for their appointment, everything is fine on that day.” Otero-Millan is developing software that can be downloaded as an app, which would allow patients to monitor themselves over long periods of time or simply record their saccades at the precise moment their symptoms become acute.

“We can incorporate all of this with AI and machine learning,” says Otero-Millan, describing how computers can learn to recognize signs and symptoms of disease by comparing individual results to massive databases. “Computational modeling will assist us with diagnosis and a better understanding of how the brain functions.”

The science of eye tracking is of particular interest to tech companies, who not only want to know precisely where their users are focusing their attention, but also need to interpret how saccades work in order to make virtual reality successful. “If you want to understand why those [virtual reality] headsets make people nauseated,” Otero-Millan says, “you need to understand how the brain interprets motion, how we perceive things as being stable or not.”


Despite the commercial potential for his work, Otero-Millan remains committed to the clinical applications and the potential for making life easier for those suffering from diseases. “He doesn’t come from a clinical background,” says Chen, “and yet he has such a clinical eye, combined with a vantage point and a multidisciplinary perspective that will eventually be so valuable to our patients.” As for Otero-Millan himself, his dreams are even more expansive: “I want to create a full model of how the brain controls our eye movements,” he says. “How does the brain know if the eye has moved or the world has moved? Once we have that model, we can change any little piece of it and truly understand what our patients are experiencing. That model will be the ultimate diagnostic tool.”

About the Photos

1. Dr. Jorge Otero-Millan sits in a gaming pod that has been repurposed for investigations into how we perceive the world while we move.
2. Dr. Jorge Otero-Millan at his desk station.
3. Vision Science PhD student Stephanie Reeves (center) with Dr. Jorge Otero-Millan (right) and a fellow researcher (left).
4. Dr. Jorge Otero-Millan using eye-tracking software.

Photos by Elena Zhukova.

Related Information

Ocular-Motor Lab
Dr. Debora Lee Chen