Voice-controlling your gadgets is so 2012. In the not-too-distant future, you may be able to tell your smartphone to launch an app, send an email or adjust its volume with a single thought. Samsung is currently experimenting with tablets that interact with your brain through the use of head-mounted EEG sensors, and other companies are working on similar products.
Although still in their early phases, brain-computer interface (BCI) technologies are gaining traction in the consumer-electronics space.
InteraXon, one of the pioneers of BCI, sells a headband called Muse that measures brain activity in real time and displays it on your smartphone or tablet. The headband has four clinical-grade EEG sensors embedded in its body to monitor your brain waves.
“I’ve never met anybody who’s not excited about being able to control something with their mind,” said InteraXon CEO Ariel Garten.
Brain-Controlled Gadgets Arrive
Although thought-controlled computing is the ultimate goal of devices like the Muse, the headband also helps you understand what’s happening inside your head by displaying your brain activity on screen.
“The technology is so amazing,” said Garten. “For us, one of the really compelling things that people wanted was the ability to improve their own mind, whether you want to perform at your peak at work or are looking for stress control.”
The Muse is just one of several devices in a growing market of BCI-oriented gear. Necomimi’s Brainwave Cat Ears ($69.99) wiggle and move based on your brain activity. During this year’s Consumer Electronics Show, neuroscientist and former software engineer Ruggero Scorcioni won AT&T’s Hackathon with his Good Times app, which communicates with the Brainwave Cat Ears to redirect your phone calls while you’re busy.
“I think this is just one of the eventually many [brain-controlled technologies],” Scorcioni said. “[Smartphones] are an integral part of our life, and in the future, we may not always want to [input commands] manually.”
Good Times communicates with the Brainwave Cat Ears to read your brain’s current level of activity. The iOS app then uses these readings to decide whether to let the phone ring or send the caller to voicemail.
“If you are mentally busy, the signal doesn’t get higher,” Scorcioni said. “It’s just changing the type of electrical activity that is there. It’s not a matter of finding a threshold; it’s a matter of analyzing the data and detecting when the brain is under a mental workload.”
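The workload detection Scorcioni describes — analyzing the character of the brain activity rather than checking a single amplitude threshold — can be approximated with standard EEG band-power analysis. Below is a rough sketch built on a common heuristic (theta power tends to rise and alpha power to fall under mental workload); the sampling rate, band edges, function names and threshold are illustrative assumptions, not Good Times’ actual implementation:

```python
import numpy as np

FS = 256  # assumed sampling rate (Hz), typical for consumer EEG headsets

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` in the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def is_mentally_busy(eeg_window, fs=FS, ratio_threshold=1.0):
    """Crude workload proxy: under mental workload, theta (4-8 Hz)
    activity rises while alpha (8-12 Hz) drops, so a high theta/alpha
    ratio suggests the user is busy."""
    theta = band_power(eeg_window, fs, 4.0, 8.0)
    alpha = band_power(eeg_window, fs, 8.0, 12.0)
    return theta / (alpha + 1e-12) > ratio_threshold

def route_call(eeg_window):
    """Mimic the app's decision: ring through, or divert to voicemail."""
    return "voicemail" if is_mentally_busy(eeg_window) else "ring"
```

With synthetic two-second windows, a relaxed (alpha-dominant, 10 Hz) signal rings through while a workload-like (theta-dominant, 6 Hz) signal is diverted.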
Both Muse and Good Times can carry out smartphone commands at the most basic level, but neither can handle complicated, multistep tasks. This means you won’t be able to post on a friend’s Facebook wall or play a complex game with your mind, but you could adjust the phone’s volume or brightness with Muse.
“When you think of Muse in the control concept, it’s like a dimmer switch,” said Garten. “So it’s not really good for switching between apps. It generally works for things that go in one direction or the other.”
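Garten’s “dimmer switch” framing — a single continuous value that can only be nudged up or down — maps naturally onto one-dimensional controls like volume. A minimal illustration (the score range, clamping bounds and function name here are made-up assumptions, not Muse’s actual API):

```python
def to_volume(calm_score, lo=0.2, hi=0.8):
    """Map a continuous brain-state score (e.g. relative alpha power,
    0.0-1.0) onto a 0-100 volume level, clamped at the band edges --
    a dimmer switch, not a discrete app selector."""
    clamped = min(max(calm_score, lo), hi)
    return round(100 * (clamped - lo) / (hi - lo))
```

The clamping reflects why this style of control suits brightness or volume but not switching between apps: the input is a noisy scalar, not a discrete choice.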
Although BCI research is growing, it remains unclear whether a completely touchless mobile experience is possible: BCI technology can’t actually process or interpret your thoughts; it merely measures the electrical activity happening in your brain.
“The BCI is not a mind reader,” said Hasan Ayaz, assistant research professor of biomedical engineering at Drexel University in Philadelphia. “It’s basically capturing a response to what the user engages. It’s not capturing the individual’s thinking, which is good for privacy.”
Right now, BCI is used primarily for clinical purposes, to help patients with muscle disorders or limited mobility, including those who do not have full use of their limbs. According to Ayaz, BCI research started about 23 years ago to help clinical patients communicate more easily.
“Even though the BCI is sort of in its early stages and is still forming, it is the only alternative for some,” Ayaz said. “The target population is mainly clinical patients because anything that can help improve their interaction is appreciated and helpful.”
It’s this concept of creating new ways to interact with mobile devices that drives Samsung’s research in the BCI field. In collaboration with the University of Texas at Dallas, the smartphone maker is crafting a cap studded with EEG sensors that would let users control a Galaxy tablet with their minds. Researchers found that people could launch applications and make selections by concentrating on an icon that blinks at a specific frequency.
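Blinking each icon at its own frequency is a classic steady-state visually evoked potential (SSVEP) setup: staring at an icon flickering at, say, 10 Hz produces a measurable 10 Hz component in the EEG over the visual cortex. A minimal sketch of the frequency-matching step (the icon-to-frequency mapping and sampling rate are invented for illustration; Samsung’s actual pipeline is not public in this detail):

```python
import numpy as np

FS = 256  # assumed EEG sampling rate (Hz)

# Hypothetical mapping of on-screen icons to flicker frequencies (Hz)
ICON_FREQS = {"email": 8.0, "browser": 10.0, "music": 12.0}

def select_icon(eeg_window, fs=FS):
    """Pick the icon whose flicker frequency shows the strongest
    steady-state response in the EEG spectrum."""
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg_window)) ** 2

    def power_at(f):
        # Power at the spectral bin nearest the flicker frequency
        return power[np.argmin(np.abs(freqs - f))]

    return max(ICON_FREQS, key=lambda icon: power_at(ICON_FREQS[icon]))
```

Feeding it a synthetic window dominated by a 10 Hz component selects the icon assigned that flicker rate.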
“Several years ago, a small keypad was the only input modality to control the phone, but nowadays, the user can use voice, gesture, touch and eye movement to control and interact with mobile devices,” Insoo Kim, Samsung’s lead researcher, told MIT Technology Review. “Adding more input modalities will provide us with more convenient and richer ways of interacting with mobile devices.”
Overcoming the Geek Factor
A major challenge in developing mainstream brain-controlled mobile devices is the placement and presentation of the EEG sensors. According to Scorcioni, the advantages that come with using this type of technology in everyday life won’t be enough to convince most people to wear a headset regularly.
“Based on our research, not many people are willing to wear a headset,” Scorcioni said. “There aren’t many benefits just yet. There are limitations that need to be considered before you can really say, ‘Well, can you control that [with your mind]?’ The answer is, ‘Well, what are consumers willing to wear on their heads?’”
InteraXon has tackled this obstacle head-on with its Muse headset, which makes it “look like you’re walking out of an American Apparel catalogue,” Garten said. “You put something into a cap or a hat, and it ends up hiding the technology in a way that doesn’t bring it to the forefront.”
At the same time, those wearing BCI gear may want to let others know that they’re in the vanguard, much as Google Glass users do. “In a headband, it’s purposeful and stylish,” Garten continued. “But it also has an edge to it that demonstrates that it really is a technology that does something phenomenal.”
There’s also the challenge of refining dry EEG sensors to the point where wearing them is convenient for mainstream use. Typical EEG sensors in clinical environments require a layer of conductive gel between the sensor and the scalp to pick up a strong signal. Samsung is exploring head-mounted dry sensors in its BCI research and hopes to one day create a device, such as a cap, that can be worn regularly.
“There’s more interest in dry sensors,” Ayaz said. “They’re practical for everyday use because you don’t need to use gel every time. If we see improvement in the types of signal processing and machinery used, we can reach more sophisticated outlets.”
Although it will be quite some time before brain-controlled interfaces become part of people’s everyday lives, Garten speculates that the technology could take on other forms that fit more seamlessly into people’s daily routines.
“It could be another device on your prescription glasses, or a part of cellphones…things that sit near your ear naturally,” she said. “There will be some sort of technology we’ll be wearing in order to mediate all of our communications years from now.”
How many years are we talking before BCI goes mainstream? That will depend not just on the pace of innovation, but also on consumer acceptance.
“Just given the advances that need to happen in algorithm detection and consumer behavior, 20 to 25 years is really the right time frame for [BCI] to be pertinent in the way that touch screens are today,” Garten predicted.
Copyright 2013 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.