Programmed to Treat Autism

 
Scientists hope a lifelike robot named Zeno can help diagnose and treat children with autism. Dan Popa, who directs the Next Generation Systems group at the UT Arlington Research Institute, is developing robots that are smaller, less expensive, and more intelligent than current models.

Zeno has empathetic eyes in a beautiful hazel hue and can walk and gesture with two hands. His lifelike skin, called Frubber, allows his face to smile, frown, and look inquisitive. But he doesn’t make judgments.

Zeno is a 2-foot-tall robot, and researchers believe he may be able to recognize autism in infants and toddlers earlier than traditional diagnoses that rely on speech and social interaction.

UT Arlington scientists have teamed with colleagues at the University of North Texas Health Science Center, the Dallas Autism Treatment Center, Texas Instruments, and Hanson Robotics in Plano to rework Zeno and other lifelike robots to diagnose and treat children with autism spectrum disorders. The robot would not only interact with the children but would measure their movement and indicate what therapies work best.

“It’s more than just seeing how autistic children react when interacting with the robot,” says electrical engineering Associate Professor Dan Popa, principal investigator of the project, which is funded in part by a grant from the Texas Medical Research Collaborative. “Eventually, we want to customize the robot to better fit individual needs of children with autism.”

Carolyn Garver, director of the Dallas Autism Treatment Center, says the earlier the disorder is identified, the sooner it can be treated.

“Children with autism are intrigued by the robot. Robots are nonjudgmental. Sometimes autistic children just shut down with human interaction,” says Garver, who notes that one in every 88 children will have an autism spectrum disorder.


She believes the best possible outcome of the research would be to identify biomarkers through a child’s movement to aid in diagnosis.

“There really are no biological methods of determining autism. Right now we just observe. If we can document that a certain eye gaze or motor movement means some level of autism, this could help in developing ways to treat it early on.”

Nicoleta Bugnariu, an associate professor at the UNT Health Science Center and a physical therapist/neuroscientist, is most interested in motor control issues.

“How these children keep their balance, reach for an object, and move about a room may be extremely important in diagnosing autism,” she says. “If we can detect these motor biomarkers and determine the timing of these differences during the developmental process, that would be of great benefit for diagnosis and treatment.”

Autism is typically diagnosed based on deficits in social interaction and problems with speech. But with infants or toddlers, an emphasis on motion could aid early detection.

“In the first two years of life, language is a small part of a person,” Dr. Bugnariu says. “Children move first, then speech comes. We can’t wait until they use speech. We need to determine sooner who has autism.”

Dr. Popa, who directs the Next Generation Systems group at the UT Arlington Research Institute, focuses on developing robots that are smaller, less expensive, and more intelligent, agile, and networked than those on the market today. Hanson Robotics sought his help to make the robots more human and take them from the lab to the home. Hanson provided the initial robot, and Popa’s team embedded a higher-performance control system into it.

“That way you can adapt the robot behavior to do anything you want,” he says.

Popa, who has worked with Hanson since 2005, says responsive cameras similar to the technology in game systems like Microsoft’s Xbox Kinect could be placed in Zeno’s eyes. Such vision tools would record a child’s movements and mimic behavior. Hardware based on Texas Instruments chips and cameras could be used to fashion a control and perception system to record movement.

“We believe the research will lead to a better life for the child with autism,” he says.

Zeno the Robot

Explanation of research accomplished

The goal of this research project was to create the technological and clinical basis for a one-of-a-kind Human-Robot Interaction System and Virtual Environments, which could be used as early assessment and treatment tools for children suffering from Autism Spectrum Disorders (ASD). All four specific aims proposed have been accomplished:

Specific aim 1: Develop advanced human-robot interaction hardware, software and protocols specific for ASD

Specific aim 2: Evaluate the sensory-motor performance in children with ASD using robotic hardware

Specific aim 3: Evaluate motor performance and postural control in children with ASD using virtual environments

Specific aim 4: Test the hypothesis that sensory-motor performance can serve as a potential marker for early diagnosis of Autism Spectrum Disorder. We consolidated data collected with both the robot and the virtual environments and compared children with ASD to typically developing children.

Advances in Zeno control and production of human-like movement

Zeno is a child-sized, 2-foot-tall articulated humanoid robot by Hanson Robotics with an expressive, human-like face, shown in Fig. 1. It has 9 degrees of freedom (DOF) in the upper body and arms, an expressive face with 8 DOF, and a rigid lower body. The robot can move its upper body using a waist joint and has four joints in each arm, implemented with Dynamixel RX-28 servos. It carries a 1.6 GHz Intel Atom Z530 processor onboard and is controlled by an external Dell XPS quad-core laptop running LabVIEW. Zeno’s appearance is based on a fictitious character: he looks like a 4-to-7-year-old child, and his head is about one quarter the size of an adult human head. His unique features include life-like skin made of Frubber™ material. We implemented real-time sensing, processing, and actuation so that the robot’s motion does not appear “choppy” or “lagging” during tracking and interaction with a human. We achieved smooth, human-like movement despite Zeno’s lower number of DOF compared to a human. Because smooth motion is essential for generating human-like movement, we paid particular attention to the update rate of the control system and the available processing power; the system runs more smoothly on quad-core than on dual-core processors. We achieved a robot response time of 0.0235 seconds.
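The update-rate and smoothness considerations above can be illustrated with a minimal sketch, assuming a fixed-rate control loop that limits how far each joint may move per cycle. The loop rate, speed limit, and callback names are hypothetical placeholders, not the project's actual LabVIEW/RoboKind implementation.

```python
import time

# Hypothetical parameters -- placeholders, not the values used on Zeno.
UPDATE_RATE_HZ = 40        # ~25 ms per cycle, comparable to the reported 0.0235 s response time
MAX_JOINT_SPEED = 2.0      # rad/s, stand-in for the RX-28 servo velocity limit


def step_toward(current, target, dt, max_speed=MAX_JOINT_SPEED):
    """Advance each joint toward its target without exceeding the speed limit."""
    max_step = max_speed * dt
    return [cur + max(-max_step, min(max_step, tgt - cur))
            for cur, tgt in zip(current, target)]


def control_loop(get_target_angles, send_to_servos, joints=9, duration_s=10.0):
    """Fixed-rate loop: read the desired pose, clamp the step, command the servos."""
    current = [0.0] * joints
    dt = 1.0 / UPDATE_RATE_HZ
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        target = get_target_angles()       # e.g. a pose derived from Kinect tracking
        current = step_toward(current, target, dt)
        send_to_servos(current)            # e.g. Dynamixel position commands
        time.sleep(dt)
```

Clamping the per-cycle step rather than the raw target is what keeps the commanded motion from looking choppy when the tracked human moves abruptly.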

Implementation of Zeno in different configurations


Zeno can interact with children in three modes:

Interactive mode – responds to user actions

In this mode, the Zeno robot can generate any desired motion, with joint velocities derived from the tracked human motion as long as they do not exceed the servos’ velocity constraints, and can produce more human-like motion during therapy using Kinect-based control.

Teaching mode – the robot leads the child to perform exercises

Zeno can play a specific combination of body movements, facial expressions, and speech for controlled sessions. Specific movements for waving hello/goodbye, a “good job” fist bump, and a tummy rub have been implemented (a keyframe playback sketch follows this list).

Reward mode – the robot rewards the child for appropriate actions.
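The teaching-mode playback described above could look roughly like the following sketch, assuming gestures are stored as timed keyframes of joint angles. The gesture table, joint names, and callbacks are illustrative assumptions, not the RoboKind script format.

```python
import time

# Hypothetical keyframe table for one scripted gesture; angles are placeholders.
GESTURES = {
    "wave_hello": [
        {"t": 0.0, "shoulder_pitch": 0.0, "elbow": 0.0},
        {"t": 0.5, "shoulder_pitch": 1.4, "elbow": 0.6},
        {"t": 1.0, "shoulder_pitch": 1.4, "elbow": -0.6},
        {"t": 1.5, "shoulder_pitch": 1.4, "elbow": 0.6},
        {"t": 2.0, "shoulder_pitch": 0.0, "elbow": 0.0},
    ],
    # "fist_bump" and "tummy_rub" would be defined the same way.
}


def play_gesture(name, send_pose, say=None):
    """Replay a scripted gesture keyframe by keyframe, optionally prompting the child."""
    frames = GESTURES[name]
    start = time.monotonic()
    if say:
        say(f"Can you {name.replace('_', ' ')} like me?")   # text-to-speech prompt
    for frame in frames:
        # Wait until this keyframe's timestamp, then command the pose.
        while time.monotonic() - start < frame["t"]:
            time.sleep(0.005)
        send_pose({k: v for k, v in frame.items() if k != "t"})
```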

Development of Dynamic Time Warping Algorithm

 

There is a need in the autism research community for quantitative measurements of imitation quality. We developed a Dynamic Time Warping (DTW) algorithm to obtain a similarity measure between the time series of joint-angle signals produced as the children imitated the robot’s movements. The Zeno RoboKind software allows pre-programming of scripted motions as well as conversation using voice recognition and text-to-speech software. The robot interacts with children using the following behaviors: verbal dialog (“look at me,” “follow me”) and imitation of arm and hand movements. Children are directed by Zeno to follow along and imitate its movements by performing several gestures. Specific movements such as waving hello/goodbye, signaling “I am hungry” with a tummy rub, and a “good job” fist bump are performed with the left and right arms at least six times. Reflective markers are placed on equivalent anatomical locations on the child’s body and on Zeno’s structure. The 3D joint position data of the head, trunk, and limbs are captured with a 12-camera motion analysis system at 120 Hz (Motion Analysis Corp., Santa Rosa, CA). Figure 3 shows a child participant and Zeno, both instrumented with reflective markers, performing a wave motion while the system captures the markers and generates skeletal models of Zeno and the child.
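For illustration, a joint-angle time series can be derived from three marker trajectories (e.g., shoulder, elbow, and wrist markers for the elbow angle) before comparing child and robot signals with DTW. The sketch below is a generic three-marker angle computation under that assumption, not the project's specific processing pipeline.

```python
import numpy as np


def joint_angle(prox, joint, dist):
    """Angle (radians) at `joint` formed by markers prox-joint-dist.

    Each argument is an (N, 3) array of 3-D marker positions sampled at
    120 Hz, e.g. shoulder, elbow, and wrist markers for an elbow angle.
    """
    u = prox - joint
    v = dist - joint
    cos_a = np.sum(u * v, axis=1) / (np.linalg.norm(u, axis=1) *
                                     np.linalg.norm(v, axis=1))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))


# Usage (hypothetical variable names): elbow = joint_angle(shoulder_xyz, elbow_xyz, wrist_xyz)
```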

DTW is an established signal processing method that provides a distance measure between signals, similar to the Euclidean distance. However, time warping is first applied to align the signals optimally before taking the difference. Optimal alignment in this context is the alignment of the signals’ time samples that makes the total distance between them as small as possible; this alignment induces a non-linear mapping between the two signals, i.e., a warping. The strength of DTW lies in its ability to compare the similarity of signals while ignoring time delays and uneven time sampling, which is very relevant here since the motion of the child and the robot exhibits both effects. A typical result of the DTW algorithm, with gray lines depicting the non-linear mapping between two signals, can be seen on the right side of Fig. 4.
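A minimal NumPy sketch of the DTW recurrence described above is shown here; it returns both the accumulated distance and the warping path (the gray lines in Fig. 4). The absolute-value local distance and the lack of normalization are simplifying assumptions, not necessarily the project's exact formulation.

```python
import numpy as np


def dtw(x, y):
    """DTW distance and warping path between two 1-D joint-angle signals.

    A smaller distance means the child's trajectory more closely matches the
    robot's, ignoring time delays and uneven sampling.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])               # local distance
            cost[i, j] = d + min(cost[i - 1, j],        # insertion
                                 cost[i, j - 1],        # deletion
                                 cost[i - 1, j - 1])    # match
    # Backtrack from the corner to recover the alignment between samples.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return cost[n, m], path[::-1]
```

For example, `dist, _ = dtw(child_elbow_angle, robot_elbow_angle)` (hypothetical variable names) would yield one similarity score per gesture repetition.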

Development of Virtual Environments (VE) for assessment of motor function in ASD

Novel approaches to evaluating function and providing therapeutic intervention for children incorporate technologies such as virtual reality, which allow children to engage in interactive gaming scenarios. Presenting children with virtual environments that they can explore in a game-like manner facilitates monitoring and recording of motor and postural tasks. Sensory-motor and postural control were assessed with a unique tool, the V-Gait Computer Assisted Rehabilitation Environment Network (CAREN) system, which allows children to play while kinematic and balance data are collected. Three VEs were developed and used in this project: “Shooting Ducks” for assessment of reaching and pointing movements, “Walk in the Park” for assessment of gait, and “Driving a Car” for assessment of weight shift and postural control in standing.
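As one hedged illustration of how the collected kinematic and balance data might be summarized (the project's actual outcome measures are not stated here), the sketch below computes path length, mean speed, and excursion range from a 2-D weight-shift or center-of-pressure trace.

```python
import numpy as np


def sway_metrics(trace, fs=120.0):
    """Simple postural-control summaries from a 2-D trace (N x 2 array).

    `trace` could be a center-of-pressure or trunk-marker trajectory sampled
    at `fs` Hz during a weight-shift task; both the metric choice and the
    sampling rate are illustrative assumptions.
    """
    trace = np.asarray(trace, dtype=float)
    steps = np.diff(trace, axis=0)
    path_length = float(np.sum(np.linalg.norm(steps, axis=1)))
    mean_speed = path_length / (len(trace) / fs)
    sway_range = trace.max(axis=0) - trace.min(axis=0)   # excursion per axis
    return {"path_length": path_length,
            "mean_speed": mean_speed,
            "range_x": float(sway_range[0]),
            "range_y": float(sway_range[1])}
```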


“The TexasMRC grant program has been a strong catalyst for research. It has achieved the goal of bringing together a robotics and electrical engineer from UTA, a neuroscientist/physical therapist from UNTHSC, and a clinician from ATC to work together on complex, relevant medical problems encountered in autism. The program has also benefited the graduate students who worked on this project by exposing them to an interprofessional way of conducting research. We are very grateful for the opportunities the TexasMRC grant program has created.”


 

 

Principal Investigator:

 

Dan Popa, UTA

Professor, College of Engineering; Associate Dean of Engineering for Research, College of Engineering

popa@uta.edu // 817-272-3342 // Biography

 

Team Members:

Nicoleta Bugnariu, UNTHSC
