James Pero | The Daily Mail
The Defense Advanced Research Projects Agency (DARPA) is funding research that could give a future generation of soldiers the power to control machines and weapons with their minds.
The agency said it will fund six organizations through the Next-Generation Nonsurgical Neurotechnology (N3) program to design and build interfaces for the U.S. military that could be worn by soldiers and translate their brain signals into instructions.
Those instructions could be used to control swarms of unmanned aerial vehicles, wield cyber defense systems, or facilitate military communications.
While the feat may sound firmly in the realm of science fiction, DARPA says it has set a completion date within four years.
‘DARPA is preparing for a future in which a combination of unmanned systems, artificial intelligence, and cyber operations may cause conflicts to play out on timelines that are too short for humans to effectively manage with current technology alone,’ said Al Emondi, the N3 program manager.
‘By creating a more accessible brain-machine interface that doesn’t require surgery to use, DARPA could deliver tools that allow mission commanders to remain meaningfully involved in dynamic operations that unfold at rapid speed.’
According to a report from IEEE Spectrum, DARPA has not only benchmarked aggressive timelines for when preliminary versions of the technology could be completed, but has backed the initiatives with substantial funding.
Two grant recipients have reported receiving between $18 million and $19.5 million to carry out their work, IEEE Spectrum reports.
While brain-to-computer interfaces have been studied and tested by DARPA in the past, previous applications, like those used to control prosthetic limbs and restore a sense of touch, have relied on invasive surgical implantations.
A new crop of technologies, says DARPA, will avoid those surgical methods and instead develop hardware that can read brain signals simply by being close to a user's head.
‘If N3 is successful, we’ll end up with wearable neural interface systems that can communicate with the brain from a range of just a few millimeters, moving neurotechnology beyond the clinic and into practical use for national security,’ Emondi said.
‘Just as service members put on protective and tactical gear in preparation for a mission, in the future they might put on a headset containing a neural interface, use the technology however it’s needed, then put the tool aside when the mission is complete.’
DARPA says those non-surgical methods range from technology that uses ultrasound to read electrical signals in the brain to ‘minutely invasive’ approaches that require ingesting a ‘nanotransducer’ to help the brain communicate with a helmet-mounted transceiver.
DARPA will be joined in efforts to develop a viable brain-to-computer interface by at least one high-profile private company, Neuralink, which is backed by Tesla CEO Elon Musk.
According to a report from Bloomberg this month, Neuralink has continued to raise millions of dollars in its endeavor to build a device capable of linking brains to computers. The latest filings show Neuralink has raised $39 million of a $51 million funding goal.
What exactly Neuralink’s device will look like or what it will be capable of doing, however, remains a mystery as Musk has offered little detail since the company’s founding several years ago.
HOW DO MIND-CONTROLLED PROSTHETICS WORK?
Prosthetics that attach to part of the human body are often objects that allow a person to perform a specific function – such as blades for running.
Scientists are working to develop prosthetics that are personalised and respond to the commands of the wearer.
To do this, small pads are placed on the skin of the patient.
They are placed near the ends of muscles, where the nerve endings begin.
The pads detect the electrical signals produced by the muscle nerves, and a computer translates those signals into commands.
To trigger these sensors, the patient must actively think about performing an action.
For example, in order to signal a bicep contraction, the person wearing the prosthetic would have to think about bending their arm.
By understanding what muscles are being signalled by the brain to contract, scientists can predict how a limb would move.
The prosthetic then recreates this movement in real time, allowing wearers to think of an action and have the artificial limb perform it.
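The sense–translate–actuate pipeline the box describes can be illustrated with a toy thresholding routine: rectify a raw electrical trace, smooth it into an amplitude envelope, and flag samples where the envelope suggests an intentional contraction. This is a minimal sketch under stated assumptions; the signal values, threshold, window size, and function names below are illustrative, not part of any real prosthetic system:

```python
import math

def moving_average(signal, window):
    """Smooth a signal with a simple trailing moving average."""
    out = []
    for i in range(len(signal)):
        start = max(0, i - window + 1)
        chunk = signal[start:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def detect_contraction(samples, threshold=0.3, window=5):
    """Flag samples where the smoothed, rectified amplitude
    exceeds a threshold, standing in for 'intent detected'."""
    rectified = [abs(s) for s in samples]   # full-wave rectification
    envelope = moving_average(rectified, window)
    return [e > threshold for e in envelope]

# Synthetic trace: 20 samples of quiet baseline, then a burst
# of higher-amplitude activity mimicking a muscle contraction.
quiet = [0.02 * math.sin(i) for i in range(20)]
burst = [0.8 * math.sin(3 * i) for i in range(20)]
flags = detect_contraction(quiet + burst)

print(any(flags[:20]), any(flags[20:]))  # → False True
```

Real systems use far more sophisticated signal processing and pattern recognition across many electrodes, but the core idea is the same: only a deliberate, sustained rise in muscle activity should trigger the limb, so the raw signal is smoothed before any decision is made.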