
Virtual Reality Training Improves Social Skills and Brain Activity


Autism Speaks Meixner fellow working with young adults who have autism to improve the effectiveness of a virtual-reality training program

November 17, 2014

Twice a week, each of the young adults in Daniel Yang’s study spends an hour inside a virtual world. A webcam projects his or her facial expressions onto a digital avatar that interacts with the avatar of an autism therapist.

One or more virtual characters join in as the therapist presents the day’s situation. It may be a job interview, a new neighbor or a blind date. The counselor also describes the social skills they’ll be practicing. The task may involve recognizing the unspoken intentions behind a behavior or sharing an opinion in a socially acceptable way.

With funding from Autism Speaks, Dr. Yang and his team hope to improve the Virtual Reality Social Cognition Training program’s effectiveness in teaching social skills to adults with autism spectrum disorder. In the first phase of the study, they are evaluating how well the program improves social understanding. Using brain imaging and brain-wave monitoring, they’re also tracking whether and how the program changes brain activity and connections between brain regions involved in social behavior.

“Our early results are beginning to reveal a remarkable degree of malleability in the neural systems involved in social cognition in adults with ASD,” he says. Lay translation: On brain scans performed after the training, the researchers are seeing brain regions associated with social understanding light up in ways they hadn’t before. (See images below.)

They’re also seeing new connections form between brain regions that need to exchange information during an effective social interaction. (See images at right.)

Presentation at Neuroscience 2014
Dr. Yang, an Autism Speaks Meixner Postdoctoral Fellow, presented his study’s preliminary findings on Sunday at Neuroscience 2014, the world’s largest annual gathering of brain researchers. Dr. Yang’s fellowship mentors include Yale neuroscientist Kevin Pelphrey.

So far, four participants with autism (three men and one woman, between ages 18 and 35) have completed the 5-week training and undergone the before-and-after brain monitoring and social-understanding testing. In all, Dr. Yang hopes to complete the testing with 48 young men and women with autism. Participants are verbal and have typical or above-average IQs.

For comparison, Dr. Yang is also enrolling young adults not affected by autism. So far, two men and two women have completed the testing.  

Pre- and post-training test results
To gauge changes in social awareness and understanding, the participants watch an animation of two “dancing” triangles. In some cases, the triangles move randomly. In others, their movements suggest an interaction. (Watch the dancing triangles animation task here.)

After the virtual-reality training, the participants with autism were better able to discern the “interacting” triangles from those moving randomly. They also improved in their ability to identify the kinds of interactions the triangles were mimicking (teasing, chasing, etc.).

While the participants perform the task, the researchers monitor their brain activity with functional magnetic resonance imaging (fMRI). The fMRI reveals the location and degree of brain activity. Importantly, Dr. Yang says, the increased brain activity and connections seen after the virtual-reality training more closely resemble the brain activity seen in the four “neurotypical” adults serving as controls.

In addition, the researchers will use electroencephalography (EEG) to track whether different areas of the brain respond more quickly or efficiently to social cues. Dr. Yang and his team are still analyzing the results of their first EEGs.

“Virtual reality and avatar-based programs may be especially promising for people with autism who are uncomfortable in social interactions where subtle social cues are important,” comments Daniel Smith, Autism Speaks senior director of discovery science. “Drs. Yang and Pelphrey are taking this area of research to a new level by studying how a virtual social skills experience changes the brain. Ultimately, their work may reveal new clues about the unique biology of autism and what types of interventions are effective for which people.”

View Dr. Yang’s complete poster presentation here.

Read more coverage of autism research at Neuroscience 2014 here.

 

Explore all the research and family-service projects that Autism Speaks is funding using this website’s grant search. These projects are made possible by the passion and generosity of our families, donors and volunteers.

Subscribe to Autism Speaks Science Digest for autism research news, perspective and expert advice delivered biweekly to your inbox.

 

 

