Thought-controlled devices have long been a staple of science fiction but, thanks to advances in neuroscience, they may soon become a staple in smart homes. Adelle King reports.
Since the development of electroencephalography (EEG) technology in the 1920s, scientists and researchers have been exploring the possibility of using human brain signals to control electronic devices.
Known as brain-computer interface (BCI), this technology creates a unidirectional communication system between the brain and a computer system via a direct electrical connection to the human body. BCIs pick up brain signals taken from an EEG monitoring machine attached to the scalp, translate these into control signals and relay them to smart devices and various virtual reality applications.
It sounds like something out of The Matrix but BCI technology has been used for years in the medical field to help replace or restore function to people disabled by neuromuscular disorders.
Now, thanks to advancements in the sensors used in EEG monitoring and the growth of the Internet of Things (IoT) market, companies such as EMOTIV, BrainCo and NeuroSky have begun mainstreaming BCI technology. These companies have each developed ‘wearable’ EEG-monitoring products that can be integrated into home automation systems, revolutionising BCI technology by making it accessible and affordable for everyday consumers.
The integration of BCI technology into smart homes seems like a natural progression given that the idea behind home automation systems is the ability to understand and predict human needs. BCI technology enhances these systems by allowing smart devices to determine the appropriate action simply by reading brain signals, rather than by interpreting a huge range of environmental and situational factors.
“The ability to initiate control through a mental command and having an environment adapt and adjust without pushing a button or controlling a dial is very appealing,” says EMOTIV vice president of corporate development Kim Old.
EMOTIV was founded in 2011 by tech entrepreneurs Tan Le and Dr Geoff Mackellar as a US entity of Emotiv Systems, an Australian company that Tan founded in 2003.
EMOTIV initially targeted the research field, developing a wireless, portable EEG monitoring headset called EPOC+ that could collect data in an organic, natural environment. In 2015, the company moved into the consumer market with the release of EMOTIV Insight, a second-generation, five-channel wireless headset that allows users to monitor their mental performance and control electronic devices.
EMOTIV Insight, which is designed for everyday use, features a proprietary polymer biosensor system that absorbs moisture from the environment and eliminates the need for extensive preparation and conductive materials, such as gels or saline solutions. Electrical inputs and sensors rest on the scalp over the frontal, temporal, parietal and occipital lobes to monitor activity across all of the brain’s key areas.
“The control aspect of this device is the algorithms and consumer number system we’ve created that allows users to record and train a baseline neutral setting as the resting state of the brain. This baseline is then used to train the system to recognise thought patterns related to different desired outcomes, such as moving objects,” says Kim.
Essentially, users record their brain activity while thinking a basic mental command, such as ‘rotate right’, and then assign that command to an action, such as ‘increase temperature’, in the system and in pre-programmed smart devices. The next time the user thinks ‘rotate right’, the thermostat adjusts its temperature based simply on the user’s thought. The way EMOTIV’s learning algorithms work, the more users build up the profile of various actions, the more intuitive and responsive the system becomes.
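The command-to-action mapping described above can be sketched in a few lines of code. Everything here is hypothetical: the `Thermostat` class, the command label and the confidence threshold are stand-ins for illustration, not EMOTIV’s actual SDK.

```python
# Illustrative sketch only: dispatching a trained mental-command label to a
# smart-home action. Names and thresholds are invented for the example.

class Thermostat:
    def __init__(self, temperature=21.0):
        self.temperature = temperature

    def increase_temperature(self, step=0.5):
        self.temperature += step


def handle_command(command, confidence, thermostat, threshold=0.8):
    """Dispatch a classified mental command only when the classifier is confident."""
    actions = {
        "rotate_right": thermostat.increase_temperature,  # user-assigned pairing
    }
    if confidence >= threshold and command in actions:
        actions[command]()
        return True
    return False


t = Thermostat()
handle_command("rotate_right", 0.92, t)   # confident: temperature rises
handle_command("rotate_right", 0.40, t)   # below threshold: ignored
print(t.temperature)  # 21.5
```

A real system would receive the `(command, confidence)` pairs as a stream from the headset’s classifier rather than as direct function calls.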
EMOTIV Insight can also detect facial expressions, including blinking, left and right winking, frowning, raised eyebrows, smiling and clenching teeth. These facial expressions are then used to measure and track users’ focus, engagement, excitement, interest, stress and relaxation levels.
Kim says this is particularly exciting in terms of smart home innovations.
“With this technology you could set up a home automation system where the EEG sensors are able to detect stress in a user and communicate this to smart devices throughout the home. These devices can then respond with pre-set filters, adjusting the light or temperature to optimal settings to reduce stress in that user.”
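The stress-response scenario Kim outlines amounts to a simple rule: when a detected stress level crosses a threshold, apply calming presets. A minimal sketch, with invented thresholds and preset values that are not EMOTIV parameters:

```python
# Illustrative sketch only: a detected stress level triggering pre-set home
# adjustments. The threshold and preset values are invented for the example.

def respond_to_stress(stress, lights, thermostat, threshold=0.7):
    """Apply calming presets when stress (a 0.0-1.0 score) crosses a threshold."""
    if stress > threshold:
        lights["brightness"] = 30       # dim to a softer level
        lights["colour_temp_k"] = 2700  # warm white
        thermostat["target_c"] = 22.0   # comfortable preset
    return lights, thermostat


lights = {"brightness": 80, "colour_temp_k": 4000}
thermostat = {"target_c": 20.0}
respond_to_stress(0.85, lights, thermostat)
print(lights["brightness"])  # 30
```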
This idea of the brain becoming the terminal to interact with everything directly was behind the 2015 founding of Boston-based BrainCo by Bicheng Han. Born out of Harvard Innovation Lab with $US5.5 million ($AUD7.2 million) in seed funding, BrainCo has developed the Lucy headset, which can control everything from lights and toys to a robotic hand.
The Lucy series, which debuted at CES 2017 alongside proprietary application systems, reads a user’s brain waves through the EEG-monitoring headband, which features medical-grade hydrogel electrodes that can detect voltages below one microvolt. The headset also features a proprietary circuit board, LED light indicators, WiFi smart syncing and shape-memory polymer.
The brainwaves read by Lucy are run through complex NASA-related algorithms developed by researchers from Harvard and the Massachusetts Institute of Technology (MIT), which translate the signals into numerical attention scores from zero to 100. This readable focus index enables users to control smart devices, with rising attention levels sent in real-time via a smart application to any device with a WiFi connection.
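An attention score stream driving an on/off device can be sketched with a simple hysteresis rule, so the device does not flicker when attention hovers near a single threshold. The thresholds below are invented for the example; BrainCo’s actual algorithms are proprietary.

```python
# Illustrative sketch only: converting a 0-100 attention score stream into a
# stable on/off control signal using two thresholds (hysteresis).

def attention_to_switch(scores, on_at=70, off_at=40):
    """Yield the device state for each score: switch on above `on_at`,
    switch off only once the score falls back below `off_at`."""
    state = False
    for score in scores:
        if not state and score >= on_at:
            state = True
        elif state and score <= off_at:
            state = False
        yield state


stream = [30, 55, 72, 80, 65, 45, 38, 75]
print(list(attention_to_switch(stream)))
# [False, False, True, True, True, True, False, True]
```

Note how the scores 65 and 45 keep the device on: with a single threshold at 70 they would have switched it off, causing flicker.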
“Through years of research and development, scientists at BrainCo have successfully adopted advanced techniques from NASA and optimised the algorithms that can accurately process EEG data. The algorithms we use are based on big data analysis to accommodate the majority of the population and compare the attention levels of a given user,” says BrainCo representative Ruthy Li.
The EEG analysis is also collected and presented in real-time on an app for smartphones, which enables users to monitor and analyse their brainwaves. Users can then train to attain certain frequencies to enhance their focusing abilities.
By interacting with smart devices, users can passively train their focus-related brainwaves to improve their productivity.
“The training tasks can be applied to the IoT market as BCI technology becomes the new interface and transforms the way users interact with electronic devices. In the future, users will not only train brainwave frequencies to improve focus levels but also bundle BrainCo-developed wearables with other smart home appliances from the market and open communities.”
It’s NeuroSky that is really bringing BCI technology to mass markets though, primarily working as an original equipment manufacturer (OEM) and distributing BCI technology to organisations that are building next-generation, consumer-facing wearable and mobile products.
“We wanted to help advance the human to machine interface beyond the standard touch pads and keyboards, with our ultimate goal to make this interface more natural,” says NeuroSky senior manager of business development Masamine Someha.
NeuroSky’s ThinkGear AM (TGAM) EEG sensor PCB module is found in over one million consumer EEG devices around the world including toys, mobile devices and educational products. It uses low-cost dry electrodes that sense signals from the brain, filter out unrelated noise and electrical interference, and then convert these into digital signals.
“Initially it takes around four seconds to calibrate a user’s data but after this auto-calibration, the machine learning algorithm takes over and the headset provides output every 0.5 seconds. Additionally, our technology has an accuracy rate of 98% in interpreting brain signals,” says Masamine.
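The timing Masamine describes, a short auto-calibration window followed by a reading every 0.5 seconds, can be sketched as follows. The sample source is simulated; NeuroSky’s real ThinkGear stream protocol is different.

```python
# Illustrative sketch only: auto-calibration followed by fixed-interval output.
# read_sample() is a stand-in for a real TGAM packet source.

import random

def read_sample():
    # Simulated attention value from 0 to 100.
    return random.randint(0, 100)

def run(duration_s=2.0, interval_s=0.5, calibration_samples=8):
    """Average a short calibration window, then collect readings at a fixed
    interval (simulated here without real-time delays)."""
    baseline = sum(read_sample() for _ in range(calibration_samples)) / calibration_samples
    readings = []
    elapsed = 0.0
    while elapsed < duration_s:
        readings.append(read_sample())
        elapsed += interval_s
    return baseline, readings

baseline, readings = run()
print(len(readings))  # 4 readings over two simulated seconds
```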
TGAM was used to power Octoblu, an IoT manager developed by Citrix, which uses brain signals to control Philips Hue lights, as well as Nervanix Clarity, a study tool that determines how much attention a student is paying to a specific element of a teaching curriculum.
NeuroSky is also behind the Necomimi headset and Shippo tail made by Japanese company Neurowear.
The Necomimi headset is a cat-ear shaped headset that uses two light-touch sensors on the forehead and ear to determine a user’s attentiveness based on alpha and beta brainwaves. This attentiveness is expressed by the ears, which droop when the user is relaxed, stand up when they’re concentrating and wiggle when they’re happy or excited.
Shippo, which was demonstrated at the 2012 Tokyo Game Show but will not have a commercial launch, is a tail that synchronises with Necomimi via an external output module. Based on the readings sent by Necomimi, the tail will move side to side or top to bottom at three different intensities to show relaxation, concentration and excitement.
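The mapping from brain state to ear movement can be sketched as a small lookup. The classification rule and pose names below are simplified stand-ins for illustration, not Neurowear’s actual firmware logic.

```python
# Illustrative sketch only: mapping relaxed/concentrating/excited states,
# derived from alpha and beta band power, to Necomimi ear poses.

def classify(alpha, beta):
    """Crude example rule: relaxation from alpha power, focus from beta."""
    if beta > alpha * 1.5:
        return "concentrating"
    if alpha > beta * 1.5:
        return "relaxed"
    return "excited"

EAR_POSE = {
    "relaxed": "droop",
    "concentrating": "stand up",
    "excited": "wiggle",
}

print(EAR_POSE[classify(alpha=10.0, beta=30.0)])  # stand up
```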
Necomimi and Shippo are a bit on the strange side, but the idea behind the products, devices that can detect human emotion, is something that could become more common in the future.
BCI technology like this could help give artificial intelligence (AI) the capacity for human-like cognitive behaviour, which could power machines to understand human thoughts and emotions.
BCI technology provides a greater understanding of the human brain, opening up the possibility of EEG devices teaching learned patterns and algorithms to AI to create personalised smart home experiences.
“Applications of machine learning methods for decoding speech and language, optical neuroimaging systems with advanced spatial resolution and next-generation neural prosthetics demonstrate the collaborative development of BCI and AI,” says Ruthy.
This collaborative development looks set to increase too, with Tesla chief executive Elon Musk and Facebook looking at ways to develop a closer blend of biological and artificial intelligence in devices.
Although the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative says we still have a limited understanding of how the brain works and how individual cells and complex neural circuits interact, advancements in these areas are accelerating.
“In the near future smart devices and robots will be able to better understand people’s thoughts and work on complicated tasks. This is the next frontier of neuro-technology,” says Ruthy.