Associate professor Geoffrey Wright tests out an Oculus virtual reality headset for a previous project.

In a cinder block basement room in Pearson Hall, inconspicuously down the corridor from the student swimming pools, Geoffrey Wright and his computer programmers create virtual realities. Wright, associate professor in the Department of Health and Rehabilitation Sciences, runs the Motion-Action-Perception (MAP) lab, where his team develops immersive fantasy landscapes to help diagnose real-world balance and motor skills impairments.

Now, with a new Loretta C. Duckworth Scholars Studio fellowship from Temple, Wright will spend a year building a new generation of software tools, employing VR technology not only to assess balance problems but also to treat them as part of clinical rehabilitation.

“We can use the same kinds of virtual environments that we use for assessments in rehabilitation,” Wright says.

Balance can be a problem for people who have suffered brain injury and can turn into a significant health issue for many older adults, so understanding how to evaluate it and create appropriate therapies is important. In the United States, a person 65 or older falls every second, according to the Centers for Disease Control and Prevention. “If you take someone in their 80s, and they fall, it can be a downward health spiral,” Wright says. “It’s the impact of maybe breaking your hip and not ever recovering from it.”

In the basement MAP Lab, Wright offers a demonstration of his existing prototype system for assessing vision-related balance issues. A test subject puts on an Oculus Quest headset, computer-connected digital goggles normally used for video games, and is transported inside a 360-degree simulation that looks like the cockpit of a Star Wars spaceship. The subject is asked to stand as still as possible while the computer takes baseline readings. Then the scenery begins spinning as if the ship is in a barrel roll, and the subject must stay steady. The computer measures physical responses in split-second detail. In a follow-up exercise, the subject has to withstand the dizzying spin while standing on a spongy pad that provides less support than the hard floor.
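The split-second measurements Wright describes boil down to quantifying how far a subject drifts from a baseline posture. As an illustrative sketch only (not the MAP Lab's actual software), one common way to summarize postural sway is the root-mean-square deviation of head-position samples; the sample values below are invented:

```python
import math

def rms_sway(positions):
    """Root-mean-square deviation of position samples (metres) from their
    mean -- a standard summary statistic for postural sway."""
    mean = sum(positions) / len(positions)
    return math.sqrt(sum((p - mean) ** 2 for p in positions) / len(positions))

# Hypothetical lateral head positions sampled during quiet standing...
baseline = [0.000, 0.002, -0.001, 0.001, -0.002]
# ...and during the spinning "barrel roll" scene.
perturbed = [0.000, 0.010, -0.012, 0.015, -0.011]

ratio = rms_sway(perturbed) / rms_sway(baseline)
print(f"Sway increased {ratio:.1f}x under visual perturbation")
```

Comparing sway during the spinning scene against the quiet-standing baseline is, in spirit, how a visually dominant subject would stand out: the more their balance depends on vision, the larger the increase when the visual scene is perturbed.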

This suite of tests that Wright is prototyping is designed to assess how much a person’s balance is influenced by visual input. Balance involves a coordination of three systems, he explains: vision, vestibular (“gyroscopes” in the inner ear) and somatosensory (such as feeling the floor with one’s feet). Disruptions to any of these, from injury or the natural process of aging, can compromise a person’s balance. 

“We can do a series of tests to see if you're likely to fall because of a loss of somatosensory input, or because of your being visually dominant, or because you have a vestibular disorder,” Wright says. “We can make people feel like they're turning upside down, and if they’re a visually dominant person, they're more likely to stumble and fall. All this helps the therapist or clinician determine what sort of intervention they should do, what sort of exercises the person should be doing.”

The new fellowship will let Wright work toward expanding the technology from diagnosis to intervention. It will also let him extend his work beyond his hidden-away lab to the well-equipped Duckworth Scholars Studio in the new Charles Library, where more programming support and research assistance will be available. New exercises will involve weight-shift training, dynamic balance exercises to a metronome, and keeping balance while moving one's head.

“By immersing somebody in a virtual environment, then having them reach or lean toward targets while viewing moving dynamic environments, we can habituate them to become a little less dependent on their visual system, become better at integrating that visual input with their other sensory inputs, and train them to multi-task,” Wright says. “In VR, all this can happen in a safe setting, as we expose them to progressively more difficult tasks, which ultimately will prepare them better for real-world challenges.”

Once the software is ready, “we’ll begin recruiting some older adults,” Wright says, “especially those at high fall risk.” The current method for determining whether a person is at high fall risk is a bit simple and overdue for an update, Wright suggests. When a person reaches 65 and becomes eligible for Medicare, a doctor will ask whether they have fallen in the past year.

"If they say 'no,' then the doctor usually doesn't pursue it any further. Which really is not a very exact way of doing it," Wright says. "That's why there needs to be more research like this."