2026-04-13 · Tags: ALS, brain-computer interface, EEG, BCI, brainwave technology, assistive technology, neurotechnology, digital avatar

ALS Dancer Performs Live via Brainwave Avatar in Amsterdam

ALS dancer Breanna Olson performed live in Amsterdam using only brainwaves — a world-first brain-computer interface performance powered by AI and EEG.


Breanna Olson has danced since childhood — ballet, contemporary, jazz. Two and a half years ago she was diagnosed with ALS (Amyotrophic Lateral Sclerosis, a disease that kills the motor neurons controlling your muscles, leaving the mind completely intact while the body goes silent). She hadn't performed since. Then, in December, she stepped onto a stage in Amsterdam — virtually, powered by AI and brain-computer interface (BCI) technology. Her brainwaves did the dancing.

At the OBA Theatre in Amsterdam, Breanna sat with an EEG headset (electroencephalogram — a device that reads the electrical signals produced every time a neuron fires in your brain) and watched a mixed-reality avatar — a constellation of blue dots — move across the stage. Every step, turn, and gesture was driven by her mental intention, translated in real-time by a system built by Dentsu Lab and NTT. It was described as the first live performance of its kind anywhere in the world.

[Image: Breanna Olson's brainwave-controlled blue-dot digital avatar performing live on stage at the OBA Theatre, Amsterdam, via the Waves of Will BCI system]

From Tacoma to Amsterdam: 2.5 years of ALS and one night on stage

Breanna grew up in Tacoma, Washington, training from childhood in three dance disciplines — ballet, contemporary, and jazz. Dance was not a hobby. It was her primary language for expression and connection with others.

ALS — the most common form of MND (Motor Neurone Disease) — progressively destroys the motor neurons (nerve cells that relay every "move now" signal from your brain to your muscles). As it advances, walking, talking, swallowing, and eventually breathing all become impossible. There is no known cure. Average life expectancy after diagnosis ranges from 2 to 5 years. Approximately 450,000 people globally are living with ALS at any given time, with around 5,000 people in the UK living with the disease at any one time.

By the time Breanna sat in Amsterdam, she had been living with ALS for approximately 2.5 years. Her physical mobility had declined significantly. But her capacity to imagine movement — to mentally rehearse a pirouette, a reach, a step — remained completely untouched. That is the neurological gap Dentsu Lab built their system to bridge.

How your brain becomes a choreographer: the AI-powered EEG and BCI pipeline explained

The core insight behind Waves of Will is both simple and striking: when you imagine moving, your motor cortex (the brain region responsible for planning and initiating physical actions) generates the same electrical patterns it would if you actually moved. The body does not follow. But the signal is there — readable, consistent, and mappable.

Dentsu Lab, a Japanese technology research firm, worked with NTT (Nippon Telegraph and Telephone, one of the world's largest data infrastructure companies) to build a system that captures imagined-movement signals and converts them into real-time choreographic instructions. The AI layer does the heavy lifting: machine learning models trained on Breanna's specific brain activity patterns learn to distinguish "raise arm" from "turn" from "step forward" with enough precision for a live performance. Here is the full pipeline:

  • Intent formation — Breanna imagines a specific dance movement (raising her arm, turning, a step sequence)
  • Neural firing — her motor cortex generates the same electrical pattern it would if she physically performed that movement
  • EEG capture — the headset records those electrical signals across her scalp
  • AI signal translation — a BCI (brain-computer interface, the software layer that maps brain signals to machine commands) matches the pattern to a pre-learned choreographic library using trained AI models
  • Avatar execution — the matched movement is sent to a mixed-reality avatar composed of blue motion-capture dots, which performs it live on the OBA stage
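The pipeline above can be sketched in miniature. The snippet below is purely illustrative: the channel count, the mu-band (8–12 Hz) feature, and the per-movement "centroid" vectors are invented stand-ins for Dentsu Lab and NTT's actual unpublished models, which were trained on Breanna's own EEG data. A nearest-centroid match substitutes here for those trained AI models.

```python
import numpy as np

FS = 250.0  # assumed EEG sampling rate in Hz (illustrative)

# Hypothetical per-movement signatures: relative mu-band power per channel,
# standing in for a choreographic library learned during calibration.
MOVEMENT_CENTROIDS = {
    "raise_arm":    np.array([0.90, 0.10, 0.10]),
    "turn":         np.array([0.10, 0.90, 0.10]),
    "step_forward": np.array([0.10, 0.10, 0.90]),
}

def mu_band_power(window: np.ndarray) -> np.ndarray:
    """Mean 8-12 Hz spectral power per channel for one (channels, samples) window."""
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / FS)
    power = np.abs(np.fft.rfft(window, axis=1)) ** 2
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return power[:, band].mean(axis=1)

def classify_intent(window: np.ndarray) -> str:
    """Match a captured EEG window to the closest pre-learned movement."""
    feats = mu_band_power(window)
    feats = feats / (np.linalg.norm(feats) + 1e-12)  # scale-invariant comparison
    return min(
        MOVEMENT_CENTROIDS,
        key=lambda name: np.linalg.norm(
            feats - MOVEMENT_CENTROIDS[name] / np.linalg.norm(MOVEMENT_CENTROIDS[name])
        ),
    )

def avatar_command(movement: str) -> dict:
    """Package the matched movement as an instruction for the avatar renderer."""
    return {"movement": movement, "source": "eeg", "live": True}

# Simulated 1-second window: channel 0 carries strong 10 Hz (mu-band) activity,
# as if the motor cortex were rehearsing "raise arm".
t = np.arange(int(FS)) / FS
window = np.zeros((3, t.size))
window[0] = np.sin(2 * np.pi * 10.0 * t)

print(avatar_command(classify_intent(window)))
# → {'movement': 'raise_arm', 'source': 'eeg', 'live': True}
```

The real system faces challenges this sketch sidesteps entirely — noisy scalp signals, per-person variability, and the latency budget of a live stage performance.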

The performance was not pre-recorded or scripted. Breanna performed live alongside human dancers on the Amsterdam stage. Her avatar moved in real-time, driven entirely by her mental choices in each moment — making it a genuine interactive performance rather than a replay.

[Image: Breanna Olson wearing an EEG headset during brain-computer interface training for Dentsu Lab's AI-powered Waves of Will dance project]

Waves of Will: built to return expression to people with motor-degenerative conditions

The project is called Waves of Will — named deliberately. It is about transmitting intent when the physical pathways are no longer available. But Dentsu Lab is clear that this is not just an art installation. It is part of a broader initiative to make brainwave-based expression technology available to people living with ALS, MS (Multiple Sclerosis), Parkinson's, and other motor-degenerative conditions.

Naoki Tanaka, Dentsu Lab's Chief Creative Officer, acknowledged the fundamental obstacle the project faces at scale: most EEG and BCI-based systems today are "very expensive and not accessible to everyone." The goal of Waves of Will is to advance toward a version of this technology that works outside of a performance venue — something reachable by people managing these conditions in daily life, not just controlled research settings.

Mariko Nakamura, a representative from NTT, pointed to the next frontier: the same brainwave-to-instruction pipeline could extend to wheelchairs (moving them via mental commands), home environments (controlling lights, doors, and appliances without physical input), and other assistive devices. The Amsterdam performance was the proof of concept. Scale-out is what the team is building toward now.
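That fan-out idea can be made concrete with a small sketch. Everything below is hypothetical: the backend names, commands, and intent labels are invented for illustration and imply no real NTT or Dentsu Lab API — the point is only that one classified intent can be routed to many assistive targets.

```python
from typing import Callable

def drive_wheelchair(intent: str) -> str:
    # Invented command vocabulary for a hypothetical wheelchair backend.
    moves = {"step_forward": "wheelchair: forward 0.5 m", "turn": "wheelchair: rotate 45 deg"}
    return moves.get(intent, "wheelchair: hold")

def control_home(intent: str) -> str:
    # Invented command vocabulary for a hypothetical smart-home backend.
    actions = {"raise_arm": "home: lights on"}
    return actions.get(intent, "home: no-op")

# One classified intent, many possible assistive targets.
BACKENDS: dict[str, Callable[[str], str]] = {
    "wheelchair": drive_wheelchair,
    "home": control_home,
}

def dispatch(intent: str, target: str) -> str:
    """Route a single classified brainwave intent to the selected backend."""
    return BACKENDS[target](intent)

print(dispatch("turn", "wheelchair"))  # → wheelchair: rotate 45 deg
print(dispatch("raise_arm", "home"))   # → home: lights on
```

The design point is that the hard part — decoding intent from EEG — is shared; each new device only needs a thin command mapping on top.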

What Breanna said — and why it matters beyond one performance

For someone who grew up as a dancer — for whom physical expression was as fundamental as language — an ALS diagnosis does not just remove mobility. It systematically removes the mechanism through which identity is communicated. When BBC News spoke with Breanna about the Amsterdam performance, she did not describe the technology. She described what it gave back:

"I never dreamed that I would be able to dance on stage again. It was just a beautiful and memorable moment I will remember for the rest of my life."

"This is a new way of expression. To be able to move in a new way and a different way is just freeing."

"We can do more than we think we can."

That last line carries real weight. One consistent pattern in assistive technology research is that a single demonstrated proof — one person who genuinely does the thing everyone assumed was out of reach — shifts both public imagination and research investment. Breanna's Amsterdam performance is exactly that kind of proof.

[Image: Breanna Olson, the ALS dancer from Tacoma, Washington, who performed live in Amsterdam via the Waves of Will brainwave-controlled BCI avatar]

AI, BCI, and EEG: what to watch as brainwave technology moves beyond the stage

The honest assessment of where Waves of Will stands in 2026: proven in a controlled performance environment, not yet accessible to most people who need it. The EEG hardware remains expensive. Training the AI models to read a specific person's motor signal patterns takes significant time. Miniaturizing this into something a person with ALS could use at home is a meaningful unsolved engineering challenge.

But the trajectory is worth paying attention to. When eye-tracking communication technology was first used by people with ALS in research settings in the early 2000s, no one expected it to become a standard consumer device within a decade. BCI technology is on a similar curve — hardware costs falling, AI signal recognition improving, latency shrinking toward real-time viability. Dentsu Lab and NTT have now demonstrated that it works outside a lab. That is the threshold that matters.

If you work in accessibility design, healthcare technology, or human-computer interaction, the Waves of Will project is worth bookmarking. If you or someone you know is navigating ALS or another motor-degenerative condition, BBC's full feature on Breanna's story is the best starting point. And if you are watching where AI-powered brain-computer interface research is heading in 2026, note that the gap between EEG, real-time avatar control, and daily assistive technology is closing faster than it looks from the outside. You can follow this space through our AI automation guides as the technology evolves.

