Brain-Machine Interface, AI and Sixth Sense

Original article was published on Artificial Intelligence on Medium

Technology Innovation Helps Create a Sixth Sense.

A brain-computer interface (BCI) is a computer-based system that acquires brain signals, analyzes them, and translates them into commands that are relayed to an output device to carry out a desired action. BMIs (brain-machine interfaces) are an area of research with huge potential, offering the ability to directly connect the human brain to computers to share data or control devices. Such work has been the stuff of science fiction for decades, and we have long dreamed of this technology.
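As a rough illustration of that acquire/analyze/translate loop, here is a minimal sketch in Python. Everything in it is a hypothetical stand-in: the simulated signal, the power feature, and the threshold are arbitrary assumptions for illustration, not any real BCI hardware API.

```python
import numpy as np

def acquire_signal(n_channels=8, n_samples=256, seed=0):
    """Stand-in for an EEG amplifier read: one window of raw samples."""
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, 1.0, size=(n_channels, n_samples))

def extract_features(signal):
    """Analyze: reduce each channel to a simple mean-power feature."""
    return np.mean(signal ** 2, axis=1)

def translate_to_command(features, threshold=1.0):
    """Translate: map the features to a discrete device command."""
    return "MOVE" if features.mean() > threshold else "REST"

window = acquire_signal()          # acquire brain signals
features = extract_features(window)  # analyze them
command = translate_to_command(features)  # relay a command to a device
print(command)
```

A real system would replace each stage with far more elaborate machinery (artifact rejection, trained classifiers, closed-loop feedback), but the three-stage shape is the same.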

Brain-computer interfaces are systems that use signals recorded from the brain to enable communication and control applications for individuals with impaired function. BCIs are gaining traction in a range of fields, from DARPA experiments in which soldiers pilot a pack of drones and alter their flight, to a Chinese school monitoring students’ attention. The devices are also used in medicine, including versions that let people who have been paralyzed communicate and operate tablets and other devices with their minds, or that give epileptic patients advance warning of a seizure. In the future, BMIs could provide a path to brain and memory enhancement.

A mind-reading device uses AI (artificial intelligence) to turn brainwaves into audible speech. Electrodes on the brain have been used to translate brainwaves into words spoken by a computer, which could one day help people who have lost the ability to speak. Further, a brain implant is the first step toward returning sight to the sightless, and a groundbreaking AI-powered brain implant translates thoughts into spoken words.

The sixth sense is just that: an extrasensory perception (ESP) beyond our five commonly recognized senses of hearing, taste, sight, smell and touch. A sixth sense covers abilities such as predicting events, intuition, and awareness of one’s body in space. Intuition draws from that deep memory well to inform the decisions we make going forward.

Columbia’s laboratory experiment monitors subjects’ brain activity through electroencephalography (EEG), while a VR headset tracks their eye movement to see where they are looking, a setup in which a computer interacts directly with brain waves, called a brain-computer interface (BCI). In this experiment, the goal is to use information from the brain to train the artificial intelligence in self-driving cars, so they can monitor drivers’ attention. Elon Musk, the founder of Tesla, has launched his venture Neuralink, which could implant BCIs in people’s brains to achieve an intimate association with artificial intelligence.
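One common way such setups estimate attention from EEG is to compare power in different frequency bands. The sketch below is an assumed, simplified version of that idea (the beta/alpha ratio as an engagement proxy is a textbook heuristic, not the Columbia team’s actual method), using synthetic one-second signals.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def band_power(signal, fs, low, high):
    """Power of a single EEG channel within a frequency band, via FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return spectrum[mask].sum()

def attention_index(signal, fs=FS):
    """Crude engagement proxy: beta (13-30 Hz) over alpha (8-13 Hz) power."""
    beta = band_power(signal, fs, 13.0, 30.0)
    alpha = band_power(signal, fs, 8.0, 13.0)
    return beta / (alpha + 1e-12)

# Synthetic one-second traces: a relaxed 10 Hz alpha rhythm vs. a 20 Hz beta rhythm.
t = np.arange(FS) / FS
relaxed = np.sin(2 * np.pi * 10 * t)
engaged = np.sin(2 * np.pi * 20 * t)
alert = attention_index(engaged) > attention_index(relaxed)
```

A production driver-monitoring system would combine many channels, artifact removal, and a trained classifier, but band-power ratios like this are often the starting features.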

For every thought, feeling, and movement, our brains emit electrical signals. Some BMIs use sensors mounted in a removable cap or MRI technology to read signals from the brain. Others connect directly to the surface of the brain through tiny wires and an array of nano-electrodes. BMIs can also be entirely implanted in the brain. Neuralink plans to take this even further by inserting into people’s brains a chip connected to as many as 3,072 electrodes distributed across thin, flexible threads.

Brain-computer interfaces can be used to control everything from drones to bionic arms, and they have become a trending topic in emerging technology. Elon Musk’s Neuralink project aims to use cybernetic implants to let people interface with gadgets and software, and Facebook is working on its own brain-reading computer system. However, these projects are a long way from workable prototypes: for humans to interface neurally with computers, researchers must first find a way to integrate incoming information from a computer into the brain.

Much brain-machine interface research focuses on replacing lost sensory information, such as restoring a sense of touch to people with spinal cord injuries. A study from Penn Medicine, however, took a different approach, using a brain-machine interface to augment existing sensory systems and create a “sixth sense” in rats. The researchers implanted tiny electrodes into the rats’ brains and fed them information in the form of sensory feedback. The rats could not see an object, so they received no visual information about how to navigate; they did, however, have information from the interface. The electrodes stimulated their brains to tell the rats where the object was located relative to their current position, and the rats used this information to reach the object even in the dark.
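To make the idea concrete, a navigation signal like this can be reduced to two steps: compute the bearing from the animal to the target, then encode that bearing as stimulation amplitudes on paired electrodes. The sketch below is purely illustrative geometry under assumed conventions (the study’s actual encoding scheme is not described here): positive bearings are to the left, and stimulation is stronger on the side the target lies.

```python
import math

def encode_bearing(rat_xy, rat_heading, target_xy):
    """Angle from the rat's heading to the target, wrapped to (-pi, pi]."""
    dx = target_xy[0] - rat_xy[0]
    dy = target_xy[1] - rat_xy[1]
    angle = math.atan2(dy, dx) - rat_heading
    return math.atan2(math.sin(angle), math.cos(angle))

def stimulation_pattern(bearing, max_amp=1.0):
    """Encode one bearing as (left, right) stimulation amplitudes:
    positive bearing (target to the left) drives the left electrode."""
    turn = bearing / math.pi  # normalized to -1 .. 1
    left = max_amp * max(0.0, turn)
    right = max_amp * max(0.0, -turn)
    return left, right

# Target directly to the rat's left (heading along +x, target at +y):
left, right = stimulation_pattern(encode_bearing((0, 0), 0.0, (0, 1)))
```

In a closed loop, this pattern would update continuously as the animal moves, letting it home in on the target without vision.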

One eventual application of this brain-computer device is to restore sensation to individuals who have suffered a spinal cord injury, which opens the door for applications that connect devices in the brain to devices elsewhere in the body. The long-term vision is to link such a system with implantable sensors in paralyzed limbs, providing a complete sensory experience for paralyzed patients and helping people with disabilities.

Some BCIs raise privacy, ethical, and social concerns, and there is also the potential for alteration of one’s sense of self, or identity, which raises questions of autonomy, the capacity for self-direction. Brain-computer interfaces are an emerging technology, and the technology may eventually change how a person’s brain functions, or even how users perceive their own identity. A further challenge is processing the externally produced signals as successfully as the brain processes input from one’s natural-born senses.