We’ve talked about free will on these screens before. We’ve referred to consciousness as the pointy-haired boss who takes credit for decisions made endless milliseconds before it was even aware of them; tumors that turn people into pedophiles, and do violence to the very concept of “culpability”; military hardware that bypasses conscious thought entirely and takes its cues from the far-faster processes of the visual cortex. We’ve gone down that inevitable, what-if road (scroll down to June 30 in the right-hand column): if I’m not responsible for my behavior when my brain was hacked by a tumor, why should I be held responsible for my behavior at other times? Is a “normal” individual any more responsible for the wiring of his brain than a sociopath? In a world of mechanical minds, how can anyone be held accountable for anything?
Such arguments are both unassailable and inadmissible: unassailable because the very concept of “free will” doesn’t make sense in a cause-effect world unless one invokes supernatural elements, and inadmissible because society has to pretend that it does regardless. Our whole legal system is predicated on the culpability of individuals for their actions: we’re not going to throw out our whole worldview just because science shows us that it’s wrong.
As it turns out, though, I spoke far too soon on that score; I gave the legal profession too little credit. Because, of course, not being responsible for one’s actions is a venerable favorite amongst defense lawyers whose mass-murdering clients have been caught red-handed; and not guilty by reason of predestination works even better than not guilty by reason of insanity in some cases.
High-tech war crimes using neuroelectronic weaponry, for example.
An article over on Technovelgy pointed me to “Brave New World: Neurowarfare and the Limits of International Humanitarian Law”, by Stephen E. White. Thirty-four heavily-footnoted pages of legal opinion on the implications of brain/weapons interfaces on the battlefield. I learned a bunch of neat things poking around in there (did you know that the Geneva Conventions include a protocol that proscribes the use of weapons which could cause “widespread, long-term and severe damage to the natural environment”?), but primarily I was reminded that to be found guilty of a criminal act, you have to have intended to commit that act. And since certain toys already in DARPA’s stockpile act on preconscious neural cues, it’s easy to imagine a scenario in which a soldier massacres unarmed civilians — using weaponry that responds directly to the activity of her own brain — and yet remains innocent of any crime for the simple reason that she had no conscious intent.
The most obvious application of this involves any interface (such as these brainoculars) that improves reaction time by bypassing the conscious thought process. The whole point of the exercise is to act faster than you can think, faster than the conscious self can veto¹. But there is other technology that acts not by bypassing conscious thought, but by anticipating it, by acting on predicted intent. As White says:
“…a computer can make a correct prediction of what a subject will do 71% of the time by analyzing the electrical activity generated by the subject’s medial prefrontal cortex when he or she makes a decision. Theoretically, a brain-machine interface weapon could fire a weapon based on such a predictive response, thereby making it uncertain whether or not a volitional act actually took place.
“…a brain-interface guided weapon could circumvent the pilot’s normal volitional processing signals and rely solely on the recognition activity, thereby making it impossible for courts to determine whether a volitional act occurred before weapon targeting. Alternatively, a brain-interface guided weapon could employ a combination of sensory recognition of the pilot’s incipient volitional thought and probabilistic software calculations in such a way that a prosecutor could never definitively prove anything more than the most attenuated guilt for misdirected attacks on protected persons.”
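White’s argument has an arithmetic core that’s easy to make concrete. Here’s a toy simulation of the predictive trigger he describes — a decoder that matches the subject’s actual intent 71% of the time (his figure; everything else, including the function names and the fifty-fifty split of genuine intent, is my own illustrative assumption), with the weapon firing whenever the decoder says “yes” and no conscious go-signal ever entering the record:

```python
import random

# Toy sketch of White's predictive-trigger scenario. The only number
# taken from his paper is the 71% decoder accuracy; the rest of the
# setup (trial structure, intent distribution) is illustrative.

DECODER_ACCURACY = 0.71  # decoder matches the subject's true intent 71% of the time


def predicted_intent(true_intent: bool, rng: random.Random) -> bool:
    """Simulate a decoder that agrees with true intent with p = 0.71."""
    return true_intent if rng.random() < DECODER_ACCURACY else not true_intent


def run(n_trials: int, seed: int = 0) -> tuple[int, int]:
    """Return (times fired, times fired with NO actual intent)."""
    rng = random.Random(seed)
    fired = misfires = 0
    for _ in range(n_trials):
        true_intent = rng.random() < 0.5        # half the trials carry real intent
        if predicted_intent(true_intent, rng):  # weapon fires on the prediction alone
            fired += 1
            if not true_intent:
                misfires += 1                   # fired, but no volitional act occurred
    return fired, misfires


if __name__ == "__main__":
    fired, misfires = run(20000)
    print(f"fired {fired} times; {misfires} with no intent "
          f"({misfires / fired:.0%} of firings)")
```

Under these assumptions, somewhere around 29% of all firings happen with no volitional act behind them — and, crucially, nothing in the firing record distinguishes those shots from the intended ones. That indistinguishability, not the error rate itself, is what sinks the prosecutor.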
Who’s culpable in these cases? A drunk who runs someone over doesn’t get to invoke his own impairment as an alibi; presumably there was some point at which he made a sober, conscious decision to start drinking, knowing the risks. But what if your CO orders you to get behind the wheel impaired? What if you get court-martialled for refusing?
Blame the CO, then. But does the CO have the technical expertise to assess the weaponry under his command? Probably not; the stuff shows up in crates one day, along with an instruction manual and the usual upbeat spin about “game-changing technology” and being “master of your domain”. So charge the geeks who designed the stuff. Some grunt wipes out a village in the blink of an eye, put the Tophers and the Bryces up against the wall; surely they must have known the limits of their own technology.
All of this stuff is disturbing enough as it is. But White takes the next obvious step down that road. Just as Big Pharma cranks up its prices two days before the new regs kick in, just as the logging industry finds out which forests are slated for protection and then clear-cuts them before that legislation passes, the military now has an incentive: not to limit the technology, not to improve its ability to discriminate foe from friend, but to deploy these weapons as widely as possible:
“…international humanitarian law would create perverse incentives that would encourage the development of entire classes of weapons that the state could use to evade criminal penalties for even the most serious types of war crimes.”
So whatever you do, son, stay jacked in, keep online, because zombies— zombies can never be found guilty.
Silly me, thinking that our legal system was too hidebound to adapt to the latest neurological findings. I shouldn’t have worried: ultimately, we’re still totally fucked.
In the meantime, though, I bet I could wring a story or two out of this.
¹ Of course, this all depends on your definition of the “self”. If you define the self as the brain with all its running processes, whether conscious or not, then the “self” remains ultimately culpable even in this scenario. Any “I didn’t know!” bleatings by the pointy-haired boss would be true, but irrelevant; after all, no one has ever stayed an execution just because the tapeworm in the condemned’s gut was innocent of wrongdoing. But let us save that discussion for another time.