04-14-2024

New brain-computer interface lets users game with their minds

Researchers at the University of Texas at Austin have made a major leap forward in brain-computer interfaces. Forget those controllers! Imagine navigating the twists and turns of games like Mario Kart with nothing but your mind. Turns out, it might not be so far from reality.

Brain-computer interfaces and games

Brain-computer interfaces have the potential to revolutionize the lives of people with motor disabilities. Imagine the freedom of controlling a wheelchair or even a robotic prosthetic limb using only your thoughts. This technology could restore independence and improve daily activities for millions of people.

However, despite significant research progress, a major hurdle has limited the widespread use of brain-computer interfaces: calibration.

Every brain is unique, just like a fingerprint. Traditional brain-computer interfaces require extensive calibration for each individual user.

This involves a lengthy process of mapping brain activity patterns to specific commands, making it impractical for many potential users.

The ‘one-size-fits-all’ brain decoder

The heart of the University of Texas team’s innovation is the integration of machine learning into their brain-computer interface system.

Machine learning algorithms excel at identifying patterns, and in this case, they learn to recognize the unique patterns of an individual’s brain activity.

Machine learning allows the brain-computer interface to adapt its “translation” between brain signals and commands much faster, essentially calibrating itself as the user interacts with the system.
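The study's exact algorithms aren't reproduced here, but the core idea of self-calibration can be sketched with any classifier that supports incremental updates. The Python snippet below is a toy illustration only, not the authors' code: the fake_eeg_features helper, the two-command setup, and every number in it are invented for the example.

```python
# Toy sketch of a self-calibrating decoder: the classifier is updated
# incrementally as the user interacts, instead of requiring a long
# offline calibration session. (Illustrative only -- not the study's code.)
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
classes = np.array([0, 1])            # e.g. 0 = "turn left", 1 = "turn right"
decoder = SGDClassifier()

def fake_eeg_features(label):
    """Stand-in for features extracted from a short EEG window (invented)."""
    center = np.array([1.0, -1.0]) if label == 0 else np.array([-1.0, 1.0])
    return center + rng.normal(scale=0.8, size=2)

# Seed the decoder with a handful of labeled windows...
X0 = np.array([fake_eeg_features(l) for l in [0, 1, 0, 1]])
decoder.partial_fit(X0, [0, 1, 0, 1], classes=classes)

# ...then keep refining it online while the user plays.
for trial in range(200):
    true_label = int(rng.integers(2))
    x = fake_eeg_features(true_label).reshape(1, -1)
    command = decoder.predict(x)[0]          # command sent to the game
    decoder.partial_fit(x, [true_label])     # incremental "self-calibration"
```

The key point is the final loop: every new interaction both drives the game and nudges the decoder, so calibration happens while the user plays rather than in a separate session.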

“When we think about this in a clinical setting, this technology will make it so we won’t need a specialized team to do this calibration process, which is long and tedious,” explained study lead author Satyam Kumar.

Impact of a self-calibrating brain-computer interface:

  • Eliminating specialists: Traditionally, setting up a brain-computer interface might require trained technicians or clinicians to interpret the user’s brain signals and map them to device actions. Self-calibration dramatically reduces the need for this specialized expertise.
  • Reduced setup time: Lengthy calibration sessions can be tiring and frustrating. A system that learns as you go cuts setup time significantly.
  • Increased accessibility: The combination of easier setup and less reliance on specialists means that brain-computer interface technology could be used in a wider range of healthcare settings and potentially even in a user’s home, significantly widening its reach and potential benefit.

Brain-computer interfaces in gaming

Let’s delve deeper into the fascinating interplay between your brain and this technology:

The thinking cap (or EEG)

This seemingly simple cap plays a crucial role. It’s equipped with dozens of electrodes that detect incredibly faint electrical changes on the surface of your scalp.

These changes reflect the coordinated activity of millions of neurons (brain cells) as they communicate with each other. Imagine it like trying to overhear the buzz of a massive crowd from outside a stadium – a complex mixture of signals!
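To make that "crowd noise" analogy concrete, here is a small, purely illustrative NumPy simulation (all channel counts, frequencies, and weights are invented): each simulated electrode records a weighted blend of several underlying rhythms plus sensor noise, which is roughly the kind of mixture a real decoder has to untangle.

```python
# Toy illustration of why scalp EEG is a "mixture": each electrode records
# a weighted sum of many underlying sources plus noise (all numbers invented).
import numpy as np

rng = np.random.default_rng(1)
fs = 250                                   # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)            # 2 seconds of data

# Pretend there are 3 underlying neural "sources" oscillating at different rates.
sources = np.vstack([
    np.sin(2 * np.pi * 10 * t),            # ~10 Hz rhythm (alpha-band-like)
    np.sin(2 * np.pi * 22 * t),            # ~22 Hz rhythm (beta-band-like)
    rng.normal(size=t.size),               # broadband background activity
])

# Each of 8 electrodes sees its own weighted blend of the sources, plus sensor noise.
mixing = rng.normal(size=(8, 3))
scalp_eeg = mixing @ sources + 0.5 * rng.normal(size=(8, t.size))

print(scalp_eeg.shape)   # (8 channels, 500 samples): what the decoder has to untangle
```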

The decoder

This is where the real challenge lies. The decoder (made of sophisticated computer algorithms) must first make sense of the chaotic symphony of electrical signals picked up by the EEG. It then has to pinpoint the specific patterns of brain activity associated with your thought of, let’s say, “turn left” or “accelerate”.

The innovation here is a decoder that doesn’t just work, but learns. Using machine learning, it continuously adapts to your unique brain signals, making the translation process more efficient and personalized over time.
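The study's exact pipeline isn't spelled out here, but a common way EEG decoders work is to convert each short window of signal into band-power features and let a classifier map those features to commands. The sketch below assumes a generic scikit-learn-style classifier and a hypothetical label-to-command table; it illustrates the general approach, not the authors' implementation.

```python
# Sketch of one common decoding step: turn a short EEG window into band-power
# features, then map the classifier's output to a game command.
# (A generic approach, not necessarily the study's exact pipeline.)
import numpy as np
from scipy.signal import welch

fs = 250                                    # assumed sampling rate (Hz)

def band_power(window, fs, lo, hi):
    """Average power of each channel within a frequency band."""
    freqs, psd = welch(window, fs=fs, nperseg=fs, axis=-1)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[:, mask].mean(axis=-1)

def extract_features(window):
    """Alpha (8-13 Hz) and beta (13-30 Hz) power per channel, concatenated."""
    return np.concatenate([band_power(window, fs, 8, 13),
                           band_power(window, fs, 13, 30)])

COMMANDS = {0: "turn left", 1: "turn right"}    # hypothetical label-to-command map

def decode(window, classifier):
    """Translate one EEG window (channels x samples) into a command."""
    features = extract_features(window).reshape(1, -1)
    return COMMANDS[int(classifier.predict(features)[0])]
```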

The way our brains represent thoughts isn’t fixed. There’s natural variation between people, and even within the same person, things like focus or fatigue can alter how brain activity looks.

The electrical signals from the brain are incredibly faint by the time they reach the scalp. It takes very sensitive equipment and smart algorithms to pick up the relevant signals within all the background “noise”.
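One standard piece of that "smart algorithm" toolbox is a band-pass filter, which keeps only the frequency range of interest and discards much of the out-of-band noise. Below is a minimal SciPy sketch, assuming a 250 Hz sampling rate and an 8-30 Hz band of interest; the specific numbers are illustrative, not taken from the study.

```python
# Minimal example of one standard noise-reduction step: band-pass filtering
# the raw EEG so that only the frequency range of interest survives.
from scipy.signal import butter, filtfilt

fs = 250                                   # assumed sampling rate (Hz)

def bandpass(eeg, lo=8.0, hi=30.0, order=4):
    """Zero-phase band-pass filter applied along each channel (channels x samples)."""
    b, a = butter(order, [lo, hi], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg, axis=-1)

# e.g. clean = bandpass(scalp_eeg) on the (8, 500) array from the earlier sketch
```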

Brain-computer interfaces: from games to robot rehab

While the racing game analogy makes for a great hook, the true impact of this work lies far beyond the virtual track. “We want to translate [brain-computer interfaces] to the clinical realm to help people with disabilities,” says Professor José del R. Millán.

The researchers envision their breakthrough as a stepping stone towards a whole new generation of brain-computer interfaces designed specifically to assist people with motor disabilities.

The goal is to create brain-computer interfaces that provide more intuitive and effortless control over assistive devices, ultimately helping individuals regain greater independence.

The potential extends far beyond theoretical applications. In recent demonstrations, researchers showcased their technology working with robotic rehabilitation devices designed for hands and arms. This highlights two crucial points:

  1. Adaptability: The self-calibrating decoder isn’t just about games. It demonstrates the ability of the system to adapt to diverse control tasks, opening up applications in many areas of rehabilitation.
  2. Proof of concept: The success with rehabilitation robots underscores that this technology is moving from the lab into real-life clinical scenarios. It shows the potential to create brain-computer interfaces that interact with a wide array of tools and devices designed to improve motor function for those with injuries or neurological conditions.

Where will mind control take us next?

The potential here goes beyond even healthcare. This new type of interface could revolutionize the way we interact with technology in general. Imagine controlling your smart home, composing emails, or even creating digital art with just your thoughts. The possibilities seem endless.

“The point of this technology is to help people, help them in their everyday lives. We’ll continue down this path wherever it takes us in the pursuit of helping people,” said Professor Millán.

The study is published in the journal PNAS Nexus.
