
Neuroscience and the military

What if the government could read your mind? If it chooses to use recent research from the field of neuroscience, it already can: several recent studies have used computing power to match brain scan patterns with what a person thinks and feels, effectively reading their mind.

Whether a person is experiencing fear or looking at a picture of a bird can be identified with great accuracy simply by correlating their MRIs with existing data. And as scary as that thought is, all sorts of practical applications of learning to read someone's mind have nothing to do with the government or military, and many of them involve helping those with damage to their bodies regain function. In the case of "mind reading" by interpreting the brain's electrical impulses, people with paralyzing brain damage might be able to communicate via thought translated by a computer, the accused could prove their innocence by passing the ultimate lie detector test, or someone missing a limb could control where a prosthetic limb goes simply by thinking about it.

Though trying to read what people are thinking (neuroimaging) is an especially controversial area of research, it's a good example of the phenomenon of dual use, in which a technology has both constructive and destructive applications. Most neuroscience research has dual-use aspects that go unrecognized both by the public and by the scientists actually conducting the research, who either don't realize what the military might use their work for or think the benefits outweigh such potential uses.

The military is, of course, interested in neuroscience for the advantages it can provide in war and conflict. The most interesting areas of advancement are improving the soldier and incapacitating the enemy. Neuroscience is getting much better at human-machine and brain-machine interfaces, meaning humans controlling machines with their minds, and the military is interested in how soldiers might control weapons like robots and guns with their thoughts. Other military research interests include improving soldiers' memories and mental functions with drugs, incapacitating the enemy with nonlethal weapons like paralyzing chemicals, and designing successful interrogations with the aid of neuroscience.

What directions would unchecked military research take us in? Besides the freaky aspect of some of these technologies, the military sways research directions like no other single organization, and thus the areas it chooses deserve scrutiny. Transparency can be difficult; at times even researchers don't know where all their funding comes from. Without disclosure of the motives behind each project and without thinking through the implications of research, technology risks outstripping the wisdom of society. A classic example is the use of chemical weapons in WWI, a new technology adopted by the military that was later deemed inhumane. Technology that makes its way into society can also have harmful effects: radioactive novelties were widespread before radiation poisoning was understood. Beyond health effects, imagine a world where soldiers were given brain implants that let them access knowledge and memory at will, and the technology then entered society. What if we had a society where the rich could afford mental enhancements and the poor could not? While a scenario of brain chips is extreme, it and the other technologies in this article are all either already possible or likely to be within the next 50 years. Our society must decide where we want to focus our efforts and which capabilities we want to gain or avoid. The answers to these questions, not the military, should direct what research is being done in the present.

So what should be done? It would be foolish to put a stop to research with obvious benefits to the sick and injured, and possibly immoral to deny help to those for whom it could be available. But putting more types of research through ethical review, pressing for transparency in what research the military is funding and raising awareness of dual-use possibilities in the scientific community could go a long way toward creating a wiser and more peaceful future. Ethical reviews of neuroscience research could be conducted by the institution carrying out the research, using the system already in place for reviewing other research: the Institutional Review Board (IRB). The IRB could also require scientists to submit the dual-use potential of their research along with the rest of their forms for evaluation, ensuring that each researcher has thought through the possibility that the military might use their technology. While the military is often cagey about revealing budget information, the public should at least be informed about who funds each civilian study, and legislation should be enacted requiring the military to disclose as much. Perhaps all it will take to avoid making irreversible mistakes is a bit more awareness on all our parts.

This topic, among others, is being discussed at the EPIIC Symposium (this year's theme is "Conflict in the 21st Century"), which runs from now until Feb. 26.

--

Laurel Woerner is a sophomore majoring in international relations.