Brain-Computer Interfaces: What Are They?
Science fiction is littered with far-fetched characters like cyborgs, androids, Terminators, Daleks, and Cybermen: blendings of man and machine, complete with cybernetic arms and enhanced strength and intelligence. That future might be here sooner than you think, albeit minus the Hollywood exaggeration of mayhem and destruction (we hope). Brain-computer interfaces (BCIs) are defined as devices that “acquire brain signals, analyze them, and translate them into commands that are relayed to output devices that carry out desired actions” (1). Long story short, BCIs are the key piece of technology that binds man to machine. They act as the middleman between your brain and the machine you wish to control. Think of them as a translator: your brain speaks only in neural impulses, and machines understand only digital signals, so a BCI converts neural activity into machine commands and vice versa. These devices have already enabled “complex control of cursors, robotic arms, prostheses, wheelchairs, and other devices” (1), and in recent years there have been remarkable breakthroughs in their development.
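To make the “translator” idea concrete, here is a minimal sketch of the acquire-analyze-translate pipeline in Python. The sampling rate, channel count, feature, and threshold rule are all illustrative assumptions, not any real device’s API.

```python
import numpy as np

# Illustrative acquire -> analyze -> translate pipeline.
# The 250 Hz rate, 8-channel montage, and threshold rule are assumptions.

SAMPLE_RATE_HZ = 250
N_CHANNELS = 8

def acquire_window(duration_s=1.0):
    """Stand-in for reading one window of raw signal from an amplifier."""
    n_samples = int(duration_s * SAMPLE_RATE_HZ)
    return np.random.randn(N_CHANNELS, n_samples)  # placeholder noise

def analyze(window):
    """Reduce the raw window to a simple feature: mean power per channel."""
    return (window ** 2).mean(axis=1)

def translate(features, threshold=1.1):
    """Map features to a discrete device command (hypothetical rule)."""
    return "MOVE" if features.max() > threshold else "IDLE"

if __name__ == "__main__":
    command = translate(analyze(acquire_window()))
    print("Decoded command:", command)
```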
In 2019, researchers at Carnegie Mellon University, in collaboration with the University of Minnesota, demonstrated a noninvasive BCI that allowed a robotic arm to continuously track a cursor on a screen in real time, without noticeable lag (2). The development matters for two reasons: the BCI required no surgery to read the user’s brain signals, and the arm’s movements were smooth and continuous rather than jerky and discrete. Both points are critical for future development. Invasive BCIs require extensive surgery and neural implants, which carry real risks for the patient, and continuous movement makes a robot safer and more effective at its task; a surgery would have a very poor success rate if the robot holding the scalpel could only move in discrete increments.
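The difference between discrete and continuous control can be sketched in code. The toy decoder below assumes per-window signal features and a simple ridge regression, which is not the research team’s actual method; the point is that each window maps to a smooth two-dimensional velocity update rather than a jump between preset positions.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Toy continuous decoder: features from each time window -> (vx, vy) velocity.
# The feature dimensions, synthetic data, and model are illustrative assumptions.

rng = np.random.default_rng(0)
n_windows, n_features = 500, 32              # e.g., 8 channels x 4 frequency bands

X = rng.standard_normal((n_windows, n_features))             # per-window features
true_W = rng.standard_normal((n_features, 2)) * 0.1
y = X @ true_W + 0.05 * rng.standard_normal((n_windows, 2))  # (vx, vy) targets

decoder = Ridge(alpha=1.0).fit(X[:400], y[:400])

# Each new window yields a continuous velocity command, which is what
# makes the arm's motion look smooth instead of stepwise.
for window_features in X[400:405]:
    vx, vy = decoder.predict(window_features.reshape(1, -1))[0]
    print(f"velocity command: vx={vx:+.3f}, vy={vy:+.3f}")
```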
Another major line of BCI research centers on integrating artificial intelligence (AI) into BCIs. A BCI takes in an enormous amount of neural data, far more than it can process at once, and much of it is noisy: weak signals from the brain may be artifacts, background noise, or incomplete neural transmissions. Distinguishing the information that matters from that noise is difficult, and this is where AI comes in. Machine-learning algorithms can take on the neural load and rapidly decode the incoming signals from the brain. This opens up a world of possibilities for future BCI applications. One example is the safety of new medical devices coming to market: using BCI techniques such as electroencephalogram (EEG) signal processing, emotions such as frustration, joy, and relaxation can be extracted from patients or healthcare providers in real time (3). This can inform better decisions in the design and manufacturing of medical devices, and such early-stage optimization can cut testing time, approval time, and other costs without sacrificing the thoroughness of the testing.
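As a rough illustration of the EEG-plus-machine-learning idea, the sketch below extracts band-power features from synthetic one-second windows and trains a classifier to label them with an affective state. The sampling rate, frequency bands, synthetic signals, and three-state label set are assumptions for the example, not the method used in the cited work.

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256  # assumed sampling rate, Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(window):
    """Average power in each band for one single-channel EEG window."""
    freqs, psd = welch(window, fs=FS, nperseg=FS)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

rng = np.random.default_rng(1)
labels = ["frustration", "joy", "relaxation"]

# Fake training data: 60 one-second windows per state with slightly
# different spectral content, standing in for recorded EEG.
X, y = [], []
for i, label in enumerate(labels):
    for _ in range(60):
        t = np.arange(FS) / FS
        window = np.sin(2 * np.pi * (6 + 8 * i) * t) + rng.standard_normal(FS)
        X.append(band_powers(window))
        y.append(label)

clf = LinearDiscriminantAnalysis().fit(X, y)
print("Predicted state:", clf.predict([band_powers(rng.standard_normal(FS))])[0])
```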
Merging AI with BCIs can also greatly improve medical prosthetic devices. Researchers at Duke University are looking into combining AI and BCIs for two purposes: closed-loop neuroprosthetic systems, in which the brain or nerves receive sensory feedback from the prosthetic so the user can experience the “feel” of input conveyed through the limb, and wireless neuroprosthetics that can be controlled remotely by brain signals (4). Both approaches have the potential to revolutionize medical prosthetics and give mobility and independence back to people living with paralysis, amputation, or muscle atrophy in certain limbs.
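The closed-loop concept can be outlined in a few lines of code: intent decoded from neural features drives the prosthetic, and the prosthetic’s touch sensor is encoded back as stimulation so the user feels the grasp. Every function below is a hypothetical placeholder, not Duke’s actual system.

```python
import random

def decode_grip_intent(neural_features):
    """Map decoded neural features to a grip force in [0, 1] (placeholder rule)."""
    return max(0.0, min(1.0, sum(neural_features) / len(neural_features)))

def actuate_hand(grip_force):
    """Command the prosthetic hand and return the fingertip pressure it senses."""
    return grip_force * random.uniform(0.8, 1.0)  # fake sensor reading

def encode_feedback(contact_pressure):
    """Turn sensed pressure into a stimulation intensity sent back to the nerves."""
    return round(contact_pressure * 100)  # arbitrary stimulation units

# One decode -> actuate -> feedback cycle per step closes the loop.
for step in range(3):
    features = [random.random() for _ in range(8)]   # stand-in decoded features
    force = decode_grip_intent(features)
    pressure = actuate_hand(force)
    stim = encode_feedback(pressure)
    print(f"step {step}: force={force:.2f}, pressure={pressure:.2f}, stim={stim}")
```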
The developments taking place in brain-computer interface research are outstanding. Now, with the growing integration of AI, the outlook for this technology is even more promising. Even with the current pandemic keeping us separated, the medical community is still working to develop new products and ideas that improve our everyday lives. The future of cyborgs and androids is coming, and it will shape the world we live in and the laws we abide by. Regulations and standards will be required, and the safety of these advanced intelligent devices will have to be properly assessed. We can only hope that when BCIs are widely implemented, it is done in a safe and effective manner that helps evolve the medicine and technology we rely on.
References
1. Shih, Jerry J., et al. “Brain-Computer Interfaces in Medicine.” Mayo Clinic Proceedings, vol. 87, no. 3, 2012, pp. 268-279, doi:10.1016/j.mayocp.2011.12.008.
2. Technology Networks. “First Ever Non-Invasive Brain-Computer Interface Developed.” Informatics from Technology Networks, Technology Networks, 21 June 2019, www.technologynetworks.com/informatics/news/first-ever-non-invasive-brain-computer-interface-developed-320941.
3. “AI in BCI: The New Era of Human Factor Design and Research.” Mc.ai, 12 May 2020, mc.ai/ai-in-bci-the-new-era-of-human-factor-design-and-research/.
4. Song, EunYoung. “Brain-Computer Interface Based Neuro-Prosthetics.” SciPol.org, 17 Jan. 2020, scipol.duke.edu/learn/science-library/brain-computer-interface-based-neuro-prosthetics.