Program

Upcoming Technical Program Announcement

We are excited to announce that the technical program for the BMI Workshop will be published soon; a first draft is expected in August 2026. The program will detail the sessions, keynote speakers, and topics that will be covered during the event.

Participants can look forward to a diverse range of presentations and discussions aimed at advancing knowledge in the field, and we will ensure that the program addresses the specific interests of our attendees.

Stay tuned for the official release, as we will provide all the necessary information to help you prepare for an engaging and informative experience. Thank you for your patience, and we can’t wait to share the program with you!

Keynote Speakers

Amy Orsborn, Ph.D.

Cherng Jia and Elizabeth Yun Hwang Associate Professor
University of Washington

Dr. Orsborn is the Cherng Jia and Elizabeth Yun Hwang Associate Professor in the departments of Electrical & Computer Engineering and Bioengineering at the University of Washington. Her research explores sensorimotor plasticity in brain-computer interfaces and how that plasticity is influenced by the algorithms used. She completed her Ph.D. in the UC Berkeley/UCSF Joint Graduate Program in Bioengineering and her postdoctoral training at NYU’s Center for Neural Science. She recently received the NSF CAREER award and a Sloan Fellowship, and was named an Emerging Leader by the American Institute for Medical and Biological Engineering.

Title: Predicting and Shaping User-device Interactions in Neural Interfaces

Abstract: Neural interface technologies provide new opportunities to assist and augment human behaviors. For instance, muscle activity can be transformed into commands for an assistive device for people with disabilities, or can provide richer computer control than interfaces like mice and keyboards. Connecting signals from the nervous system to an external device in this way presents users with a new, potentially unintuitive mapping between their movements and those of the device. Users often change their behavior as they learn to control neural interfaces, and many neural interfaces leverage machine learning to let the device adapt to the user. This co-learning creates complex, high-throughput interactions between algorithms and the nervous system. In my talk, I will present recent research in my lab demonstrating that the algorithms we use in neural interfaces influence neural computations and user learning. I will then present new computational frameworks we’ve developed to predict and shape user-algorithm interactions. These discoveries open possibilities to build neural interfaces that intelligently interact with the nervous system to assist and rehabilitate motor function across diverse users and applications.

Adam O. Hebb, MD, FRCSC, FAANS

Neurosurgeon, Colorado Permanente Medical Group
Research Associate Professor, University of Denver

Adam O. Hebb, MD, FRCSC, FAANS, is a neurosurgeon with Colorado Permanente Medical Group and a Research Associate Professor at the Knoebel Institute for Healthy Aging, University of Denver. After his medical education at Dalhousie University in Halifax, he trained in neurosurgery at the University of Minnesota and completed fellowship training in neuro-oncology and epilepsy surgery at the University of Washington. His clinical work includes deep brain stimulation for movement disorders, stereoelectroencephalography, stereotactic robotics, and MRI-guided laser interstitial thermal therapy for epilepsy and brain tumors. His research focuses on human electrophysiology, closed-loop neurostimulation, and decoding behavior from subthalamic nucleus local field potentials using machine learning. He has collaborated with engineering groups on adaptive DBS and brain-machine interface development and is an inventor on a patent for motor task detection using electrophysiological signals. He is also pursuing a JD at the University of Denver Sturm College of Law, reflecting a broader interest in the regulatory, commercial, and health-system pathways needed to bring emerging neurotechnology from the laboratory and operating room into real-world clinical use.

Title: Neural Decoding to Clinical Action: A Neurosurgeon’s View of Adaptive Brain-Machine Interfaces

Abstract: Brain-machine interface engineering often begins with a signal. Neurosurgery begins with a patient, anatomy, a trajectory, and a decision that must be safe enough to make in the operating room. This keynote tells the story of BMI translation from that clinical vantage point. Drawing on work in deep brain stimulation, stereoelectroencephalography (SEEG), chronic subthalamic nucleus local field potential recording, and machine-learning approaches to behavior recognition, I will discuss what human brain signals look like when they are acquired through real implanted systems rather than idealized channels. The talk connects three clinical realities: stereotactic access to the brain, decoding behavior and disease state from noisy neural recordings, and adaptive neuromodulation that updates therapy according to the patient’s current state and goal. The aim is to give BMI engineers a practical neurosurgical framework for designing systems that are not only accurate in analysis, but useful, safe, and durable in patients.