Mind Control For The Masses – No Implant Needed

A wave of startups wants to make brain-computer interfaces accessible without needing surgery. Just strap on the device and think.

By Arielle Pardes | WIRED 

WHEN SID KOUIDER showed up at Slush, the annual startup showcase in Helsinki, wearing an ascot cap and a device he claimed would usher in a new era of technological mind control, no one thought he was crazy. No, he was merely joining the long line of entrepreneurs (see: Elon Musk, Mark Zuckerberg) who believe that we will one day manage our machines with our thoughts.

The quest to meld mind and machine dates back to at least the 1970s, when scientists began, in earnest, to drill into people’s skulls and implant the first brain-computer interfaces—electrodes that translate brain cell activity into data. Today, BCIs can regulate tremors from Parkinson’s disease and restore some basic movement in people with paralysis. But they are still surgically implanted, and still quite experimental. Even so, the likes of Musk already envision a future where we’ll all have chips in our brains, and they’ll replace our need for keyboards, mice, touchscreens, joysticks, steering wheels, and more.

Of course, that won’t happen anytime soon. The mysteries of the mind remain vast, and implanting hardware in healthy brains—well, forget about that, at least until the FDA deems it safe (light-years away). In the meantime, a wave of companies is betting on bringing Mind Control Lite to the masses with a neural interface that requires no surgery at all.

Mind Control Technology (No Implant Needed)

That’s where Kouider comes in. His startup, NextMind, makes a noninvasive neural interface that sits on the back of one’s head and translates brain waves into data that can be used to control compatible software. Kouider’s vision begins with simple tasks (sending text messages with a thought; calling up a specific photo in your camera roll with passing thoughts) and ends somewhere close to science fiction (controlling every device in our world, like the sorcerer in Fantasia). “This is real,” he said onstage at Slush, “and the possibilities are endless.”

Going the nonsurgical route comes with some trade-offs, namely all that skin and bone between your soggy brain and any device that’s trying to read the neural signals it emits. On the other hand, it’s cheaper, it’s safer, and it’s much easier to iterate or push software updates when you don’t need to open someone’s head. And for all the promise of BCIs, people first need to see that this stuff can be useful at all. For that, devices like NextMind’s do the trick.

I had a chance to try out the NextMind device a few weeks after Kouider gave his Slush talk. He had taken a flight from Paris to San Francisco and carried the device casually in his bag. It weighs 60 grams, about as much as a kiwi fruit, and bears a passing resemblance to a flattened TIE fighter.

The NextMind device is basically a dressed-up electroencephalogram, or EEG, which is used to record electrical activity in the brain. It’s not so different from the tools Kouider used as a professor of neuroscience before he ran NextMind. His lab, in Paris, specialized in studies of consciousness. In a hospital setting, EEGs often require the use of gel and some skin preparation, but more recently, researchers have developed functional dry electrodes that require only contact with the scalp. The NextMind device uses these, along with a proprietary material that Kouider says is “very sensitive to electrical signals.” (He wouldn’t tell me what, exactly, the material is.)

Kouider placed the device on my head; it comes with little comb-like teeth that brush through hair to hold the device in place, right on the back of the skull. (Kouider, who is bald, wears it clipped to the back of his hat.) There, the device’s electrodes are well positioned to record activity from the visual cortex, a small area in the rear of the brain. The device digitizes those signals, sends them to a computer for processing, and runs them through a machine learning algorithm that deciphers them and translates them into commands.
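Kouider wouldn’t say much about what happens after the electrodes pick up a signal, but the general shape of an EEG decoding chain is well known from the research literature. As a rough sketch only, in Python with numpy and scipy, it might look like the following; the sampling rate, filter band, and features are generic EEG defaults, not NextMind’s actual parameters:

```python
# A minimal, hypothetical EEG decoding chain: filter the raw signal,
# extract features, and map them to a command. Every number here is a
# generic EEG default, not anything NextMind has disclosed.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate, in Hz

def bandpass(eeg, low=1.0, high=40.0, fs=FS):
    """Keep the 1-40 Hz band, where most scalp EEG activity lives."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def band_power_features(eeg):
    """Crude per-channel feature: log of the mean signal power."""
    return np.log(np.mean(eeg ** 2, axis=-1) + 1e-12)

def decode(features, classifier):
    """Map a feature vector to a discrete command with any fitted model."""
    return classifier.predict(features.reshape(1, -1))[0]

# Usage, given a fitted `classifier` and a window of raw samples:
#   window = ...                       # shape: (channels, samples)
#   command = decode(band_power_features(bandpass(window)), classifier)
```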

On a laptop, Kouider walked me through a calibration exercise to create my “neural profile”—in essence, how my visual cortex lit up in response to my eyes focusing on specific things. (I followed a series of flashing triangles around the screen; you only have to do this once, and only for a couple of minutes.) The NextMind device is designed to work on anyone, but it works faster when someone has had practice. Kouider says it’s about a neural feedback loop: Ah, when I focus on that, then that happens on the screen.
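Kouider didn’t describe how the calibration data is used, but a standard approach in BCI research is to treat it as a supervised learning problem: record a window of brain activity while the user fixates each target, label it with that target, and fit a classifier. A minimal sketch, using scikit-learn’s linear discriminant analysis (a common BCI baseline, not NextMind’s disclosed method) and a hypothetical `record_window` helper:

```python
# Hypothetical calibration loop: record an EEG feature vector while
# the user fixates each flashing target, label it with that target,
# and fit a classifier. The fitted model is the "neural profile."
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def calibrate(record_window, targets, repetitions=10):
    """record_window(target) -> features for one fixation (assumed)."""
    X, y = [], []
    for _ in range(repetitions):
        for target in targets:
            X.append(record_window(target))  # e.g. band-power features
            y.append(target)
    model = LinearDiscriminantAnalysis()
    model.fit(np.array(X), np.array(y))
    return model
```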

Neural profile generated, I was ready to play some games. NextMind will announce its developer kit at CES in January. In an effort to court developers, the company has designed a few demos to show off what its device can do. I tried one that’s a riff on Nintendo’s Duck Hunt, which Kouider played as a kid. As ducks danced across the screen, Kouider leaned over. “Try to shoot him,” he whispered, “with your brain.”

I focused my gaze on the ducks and, in less than a second, they exploded. This little magic trick was repeated through a series of demos. I changed the channel on a mock TV set by glancing at one corner of the screen. I cracked a digital vault by concentrating on the right numbers of a PIN code. I changed the colors on a set of smart light bulbs that Kouider had set up for me. It’s hard to say why you’d need to do these things with your mind, but when you do, you really feel like a Jedi.
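NextMind hasn’t said how its software figures out which object you’re focusing on. A classic trick for visual-cortex interfaces, though, is frequency tagging: each on-screen target flickers at its own rate, and whichever target you attend to produces the strongest response at that rate over the visual cortex. A toy illustration of the idea, with invented flicker frequencies and a single occipital channel:

```python
# Toy frequency-tagging (SSVEP-style) detector: each target flickers
# at a distinct rate, and the attended one shows the strongest peak
# at its frequency in the EEG spectrum. All values here are invented.
import numpy as np

FS = 250  # assumed sampling rate, in Hz
TARGET_FREQS = {"duck_1": 8.0, "duck_2": 10.0, "duck_3": 12.0}

def attended_target(eeg_window):
    """Return the target whose flicker frequency carries the most power."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1 / FS)
    power = {name: spectrum[np.argmin(np.abs(freqs - f))]
             for name, f in TARGET_FREQS.items()}
    return max(power, key=power.get)
```

Whether NextMind uses this scheme or something subtler, the design logic is the same: put the information in the stimulus, then read it back out of the visual cortex.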

NextMind isn’t the only company trying to develop noninvasive BCIs for the masses. Another startup, CTRL-Labs, released a developer kit last year for a similar noninvasive neural interface. It also uses dry electrodes, but its device is an armband that captures signals from nerves. Facebook acquired the company for close to $1 billion in September.

A few months earlier, I had a chance to try out CTRL-Labs’ device myself. The demo was designed to show off the company’s vision: “The question at CTRL-Labs is not, how do we make our devices more capable?” as cofounder Thomas Reardon told an audience at Slush in 2018. “It’s, how do we ourselves become more capable?” I strapped the device to my arm and played some games. One involved a dinosaur jumping over a series of obstacles. I thought jump and, with just a twitch of my arm, the dinosaur jumped. At one point, Patrick Kaifosh (then CTRL-Labs’ CTO, now Facebook Reality Labs’ research manager) entered the credentials to unlock his laptop by simply staring at it. Neuroauthentication, he called it.
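CTRL-Labs has described its armband as reading the electrical activity that motor neurons send toward the muscles of the arm, which is roughly what surface electromyography measures. As a loose illustration only (not the company’s algorithm), the twitch that made the dinosaur jump could be detected by rectifying the signal, smoothing it into an envelope, and firing on a threshold crossing:

```python
# Loose illustration of twitch detection on an EMG-like signal:
# rectify, smooth into an envelope, and fire when it crosses a
# threshold. The window length and threshold are arbitrary.
import numpy as np

def twitch_detected(emg_window, threshold=0.3, smooth_samples=25):
    """True if the smoothed, rectified signal exceeds the threshold."""
    kernel = np.ones(smooth_samples) / smooth_samples
    envelope = np.convolve(np.abs(emg_window), kernel, mode="same")
    return bool(np.max(envelope) > threshold)

# In a dinosaur-style demo, the game loop might poll this:
#   if twitch_detected(latest_window):
#       dinosaur.jump()
```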

That device, like most of the work in BCIs, makes use of the motor cortex, the part of the brain that manages movement. Reardon’s breakthrough was in singling out the neurons in your spinal cord, which send electrical signals to your arms and hands, rather than going to the brain region itself. Most of the clinical work around BCI also involves the motor cortex, in part because so much of the research has focused on movement disorders: Parkinson’s, paralysis, and so on. But Kouider thinks the visual cortex offers a richer set of neural signals for people trying to control their personal devices. When I asked him why so much of the work was being done in the motor cortex, he paused, and then said, “I think that’s because they’re making a mistake.”

Because the NextMind device utilizes signals associated with sight, the technology can feel a little like gussied-up eye-tracking. So what if you can change the channel with your eyes? People have been doing that for years. (After the demo, Kouider claimed his BCI could work even if I closed my eyes.) Right now, you control things with your gaze. Soon, Kouider believes, the device will be able to tap into our imagination, turning visual thoughts into actions.

The problem with some of these BCI devices, though, is not whether they can become fast enough to enhance gameplay or control smart-home devices. It’s whether anyone cares to. InteraXon, a Canadian startup, used to make a head-worn device that could control lights with the power of thought but eventually gave it up. “Frankly, you could just turn the thing with your hand much more readily,” the company’s cofounder, Ariel Garten, told Scientific American. While there are arguably accessibility use cases for this technology, InteraXon pivoted to make Muse, a meditation headband.

As he gins up interest in his developer kit, Kouider is pitching the idea that NextMind’s device and other noninvasive neural interfaces of its ilk will be like the touchscreen, or the computer mouse: the thing that upends the way we interact with our personal technology. At this early stage, though, BCI is more like the virtual-reality headset than the Next Great Interface: mind-blowing in its demos, but easy to put back in the box.

*  *  *

Via WIRED
