Brain hacking is a hot topic right now, having moved from science fiction into reality. At the Usenix Security Symposium, one mind hack aimed to create better security with an “unbreakable crypto” system; another focused on threats to privacy, extracting secrets through brain-computer interfaces. Still other scientists have built a helmet that creates an Inception-like world in which reality can be manipulated.
With breaches mounting and users continually reusing pathetically weak passwords, neuroscientists and cryptographers came up with an “unbreakable crypto” system that relies on implicit learning, so your subconscious can remember a 30-character passphrase. The trick is that the password cannot be consciously recalled, and so cannot be extracted via “rubber hose attacks,” meaning torture or coercion. The method teaches the password through a computer game similar to Guitar Hero, which plants “a secret password in the participant’s brain without the participant having any conscious knowledge of the trained password.”
The game creates a random sequence of 30 letters chosen from six buttons corresponding to notes marked S, D, F, J, K and L, ExtremeTech explained. The game lasts about 45 minutes, during which users make around 4,000 keystrokes that subconsciously teach them the long, random password. “Neuroscience Meets Cryptography: Designing Crypto Primitives Secure Against Rubber Hose Attacks” was presented at the Usenix Security Symposium. If you are interested, you can download the research paper [PDF], slides [PDF] or watch the video presentation.
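To make the scheme concrete, here is a rough illustration, not the researchers’ actual code, of how such a training session might be assembled: a secret 30-keystroke sequence drawn from the six game keys, repeated many times and mixed with random filler so the player never consciously notices which runs encode the password. The function names, repetition count and noise ratio are all illustrative assumptions.

```python
import random

KEYS = ["S", "D", "F", "J", "K", "L"]  # the six game buttons

def make_secret_sequence(length=30, rng=None):
    """Generate a random 30-keystroke secret, drawn uniformly
    from the six game keys (illustrative sketch)."""
    rng = rng or random.SystemRandom()  # OS entropy, not a seeded PRNG
    return [rng.choice(KEYS) for _ in range(length)]

def build_training_session(secret, repetitions=100, noise_ratio=0.2, rng=None):
    """Interleave repetitions of the secret sequence with random
    filler keystrokes, so the player cannot tell which runs
    encode the password (hypothetical parameters)."""
    rng = rng or random.SystemRandom()
    session = []
    for _ in range(repetitions):
        session.extend(secret)
        filler_len = int(len(secret) * noise_ratio)
        session.extend(rng.choice(KEYS) for _ in range(filler_len))
    return session

secret = make_secret_sequence()
session = build_training_session(secret)
print(len(secret), len(session))  # → 30 3600
```

With these toy parameters the session comes to 3,600 keystrokes, in the same ballpark as the roughly 4,000 the real 45-minute game produces.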
The flipside of such security is an invasion of privacy: extracting your secrets with commercially available brain-computer interfaces (BCIs) that are sold for hands-free gaming. “The security risks involved in using consumer-grade BCI devices have never been studied and the impact of malicious software with access to the device is unexplored,” the scientists wrote. “On the Feasibility of Side-Channel Attacks with Brain-Computer Interfaces” explored how the “technology could be turned against users to reveal their private and secret information.” The researchers used $200 to $300 off-the-shelf neuro-tech headsets, such as those made by Emotiv or NeuroSky, to read the brainwaves of 28 people. They were able to “extract hints directly from the electrical signals” about “private information” such as the location of subjects’ homes, “persons known to the user” and bank PINs.
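Conceptually, this kind of side-channel attack flashes candidate stimuli (digits, bank logos, map locations) at the user and looks for the telltale P300 spike roughly 300 milliseconds after the stimulus the subject recognizes. The toy sketch below, which is not the researchers’ code, illustrates the core idea with plain Python lists standing in for EEG epochs; the window bounds, sample rate and averaging scheme are simplifying assumptions.

```python
def p300_amplitude(epoch, sample_rate=128, window=(0.25, 0.5)):
    """Mean signal amplitude in a typical P300 window,
    roughly 250-500 ms after stimulus onset (assumed bounds)."""
    start = int(window[0] * sample_rate)
    end = int(window[1] * sample_rate)
    segment = epoch[start:end]
    return sum(segment) / len(segment)

def guess_secret(candidate_epochs):
    """Given {candidate: [epochs...]} of recorded responses to each
    flashed stimulus, return the candidate that evoked the strongest
    averaged P300 -- i.e. the one the subject likely recognized."""
    scores = {}
    for candidate, epochs in candidate_epochs.items():
        scores[candidate] = sum(p300_amplitude(e) for e in epochs) / len(epochs)
    return max(scores, key=scores.get)
```

For example, if flashing the digit “7” (one digit of a PIN the subject knows) produces a bump in that window while the other digits evoke flat responses, `guess_secret` picks out “7”. Real EEG is far noisier, which is why the actual attack only extracted probabilistic hints rather than clean answers.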
The Usenix Security conference posted the video presentation and research paper. The researchers relied on identifying the “P300” response we all have when our brain recognizes someone or something familiar. This is the same type of brain activity measured by Veritas Scientific’s “mind-reading” helmet, a machine that could obliterate privacy of the mind and make Orwell’s Thought Police a reality. Although the U.S. military has expressed interest in Veritas’ brain-spying device, the truth-telling technology is eventually expected to be used in “law enforcement, criminal trials and corporate takeovers.”
The scientists who presented brain hacking via BCI also noted that the technology will continue to improve. Developers already have access to build apps for these headsets, and games could be crafted to secretly extract sensitive information from the brain; eventually there could be “brain malware.” We’ve looked at similar scenarios before, such as how DNA hackers might weaponize a virus to infect your brain and behavior, and how future zero-day exploits may target your biology, not just your computer or mobile device.
Lastly, the Laboratory for Adaptive Intelligence, part of the RIKEN Brain Science Institute, created a helmet from cheap, commercially available components to produce an Inception-like world in which reality can be manipulated. Nature.com reported on the “Substitutional Reality (SR) System: A Novel Experimental Platform for Experiencing Alternative Reality,” which could be used to study cognitive dysfunction in psychiatric disorders such as schizophrenia. In Inception, participants had a “kick” to bring them out of the dream world; for some psychiatric conditions, there is no kick to distinguish reality from a “dream state.”
Keisuke Suzuki, one of the research paper [PDF] authors, told the Guardian, “In a dream, we naturally accept what is happening and hardly doubt its reality, however unrealistic it may seem on reflection. Our motivation is to explore the cognitive mechanisms underlying our strong conviction in reality. How can people trust what they perceive? Answering these questions requires an experimental platform which can present scenes that participants believe are completely real, but where we are still able to manipulate the contents.”
Additionally, Suzuki believes hacking the mind with SR will "open a new direction in cybertherapy. Virtual reality technologies effectively treat post-traumatic stress disorder and phobias by repeatedly exposing patients to traumatic episodes in immersive devices. The SR system provides the conviction of being in the 'real' world, which is absent in current VR technologies."
Be it cool or somewhat creepy, brain hacking is no longer something that only applies to science fiction. While there are certainly good potential uses, brain hacking also opens the door to all sorts of new security and privacy risks.