RI Seminar

Andrew B. Schwartz, Professor, University of Pittsburgh
Friday, February 12
3:30 pm
Useful signals from motor cortex

Event Location: NSH 1305
Bio: Dr. Schwartz received his Ph.D. from the University of Minnesota in 1984 with a thesis entitled “Activity in the Deep Cerebellar Nuclei During Normal and Perturbed Locomotion”. He then went on to a postdoctoral fellowship at the Johns Hopkins School of Medicine, where he worked with Dr. Apostolos Georgopoulos, who was developing the concept of directional tuning and population-based movement representation in the motor cortex. While there, Schwartz was instrumental in developing the basis for three-dimensional trajectory representation in the motor cortex. In 1988, Dr. Schwartz began his independent research career at the Barrow Neurological Institute in Phoenix. There, he developed a paradigm to explore the continuous cortical signals generated throughout volitional arm movements, using monkeys trained to draw shapes while single-cell activity was recorded from their motor cortices. After developing the ability to capture a high-fidelity representation of movement intention from the motor cortex, Schwartz teamed up with engineering colleagues at Arizona State University to develop cortical neural prosthetics. The work has progressed to the point that monkeys can now use these recorded signals to control motorized arm prostheses to reach out, grasp a piece of food, and return it to the mouth. Schwartz moved from the Barrow Neurological Institute to the Neurosciences Institute in San Diego in 1995 and then to the University of Pittsburgh in 2002. In addition to the prosthetics work, he has continued to use the neural trajectory representation to better understand the transformation from intended to actual movement, using motor illusions in a virtual reality environment.

Abstract: Over the years, we have shown that detailed predictive information about the arm’s trajectory can be extracted from populations of single-unit recordings from motor cortex. Using drawing movements as a behavioral paradigm, these signals have been shown to contain instantaneous velocity information and many of the invariants describing animate movement. Furthermore, this technique can be used to study visuo-perceptual processes taking place as objects are drawn. By developing techniques to record these populations and process the signal in real time, we have demonstrated the efficacy of these recordings as a control signal for intended movements in 3D space. Having shown that closed-loop control of a cortical prosthesis can produce very good brain-controlled movements in virtual reality, we have been extending this work to robot control. We are using an anthropomorphic robot arm with our closed-loop system to show how monkeys can control the robot’s movement with direct brain control in a self-feeding task. The animals controlled the arm continuously in 3D space to reach out to the food and retrieve it to their mouths. Currently we are extending this work to include control of an artificial wrist and hand. Because the recorded signals are a high-fidelity representation of the intended behavior and contain features of animate movement, neural prosthetic devices derived from this technology are capable of producing agile, natural movement.
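
For readers unfamiliar with how a movement direction can be read out of a recorded population, the short sketch below illustrates the classic population-vector idea associated with this line of work: cosine-tuned units whose preferred directions are weighted by their normalized rate changes. The unit count and tuning parameters are invented for illustration, and this Python toy is not the real-time decoding pipeline described in the talk.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical population: each unit gets a random 3D preferred direction,
    # a baseline firing rate, and a modulation depth (all values illustrative).
    n_units = 100
    pref_dirs = rng.normal(size=(n_units, 3))
    pref_dirs /= np.linalg.norm(pref_dirs, axis=1, keepdims=True)
    baseline = rng.uniform(10, 20, n_units)    # spikes/s
    modulation = rng.uniform(5, 15, n_units)   # spikes/s

    def simulate_rates(velocity):
        # Cosine tuning: a unit's rate rises as the movement direction
        # aligns with its preferred direction.
        v_dir = velocity / np.linalg.norm(velocity)
        return baseline + modulation * (pref_dirs @ v_dir)

    def population_vector(rates):
        # Weight each preferred direction by the unit's normalized rate
        # change and sum; the result points along the decoded direction.
        weights = (rates - baseline) / modulation
        pv = weights @ pref_dirs
        return pv / np.linalg.norm(pv)

    true_velocity = np.array([1.0, 0.5, -0.2])
    decoded = population_vector(simulate_rates(true_velocity))
    print("true direction   :", true_velocity / np.linalg.norm(true_velocity))
    print("decoded direction:", decoded)

With enough units whose preferred directions cover 3D space, the decoded vector closely matches the true movement direction, which is the intuition behind using population recordings as a continuous control signal.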

Recently we have been using the brain-control paradigm to examine learning as it takes place across the network of recorded neurons. Our paradigm allows us to drive neurons to adopt new tuning functions, and we can track this process continuously as it unfolds. This shows that there are distinct global and local processes taking place as subjects regulate their neural activity while learning to operate novel tools.