Technepathy

Intercontinental brain-to-brain interface to transfer cortical tactile information.

From Pais-Vieira et al

You’ve probably heard about the rat-brain network by now — it showed up in the popsci threads back at the end of February, provoking breathless comparisons with Vulcan mind melds and The Matrix. And I gotta say, the coverage certainly sucked me in: an actual (albeit rudimentary) network of brains, linked together to solve problems? Hive-Mind stuff; Mind-Hive stuff. Something very much like it shows up in Echopraxia. It makes a cameo in “Giants”. The talk I gave at last year’s SpecFic Colloquium got into it big-time. Right up my alley.

Then you read the actual research paper and, well… not so much.

This is how they sell it; this, technically, is how it was. The brains of two rats, each connected to the other by an array of microelectrodes implanted in the motor cortex. One of them is presented with a stimulus; the other, with the means to act on that stimulus [1]. If the second rat reacts correctly to the stimulus the first one perceives, both get a reward. Rat #2 — gifted with no clues or insights save those piped directly from the brain of Rat #1 — reacts correctly 70% of the time, far more often than the 50% hit rate that random chance would serve up.
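For the number-curious: the post only quotes the rates, so the trial count below is pulled out of thin air, but a quick exact binomial check shows how un-chancelike that 70% is under any reasonable session length:

```python
from math import comb

def binom_sf(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): odds of k or more hits by luck."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_trials = 50                    # assumed; not reported in this post
hits = round(0.70 * n_trials)    # ~70% correct vs the 50% chance baseline
print(f"P(>= {hits}/{n_trials} hits by luck) = {binom_sf(hits, n_trials):.4f}")
```

Even at a modest 50 trials, 35 hits against a coin-flip baseline comes out well under the usual significance thresholds.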

Ergo, the rats’ brains are in direct communication via an electronic network. Technologically mediated telepathy. Technepathy.

There are the rudiments of a B2B interface here, certainly. Motor cortices and embedded electrode microarrays. A computer that mediates signal transmission from one to the other. Options presented, choices made, a reward for pressing the right lever. The way the Results are worded you really get a sense of linked minds, that the signals received by decoder-rat were pretty much the signals generated by the motor cortex of the encoder (“The primary factor that influenced the decoder rat’s performance was the quality of spatial information extracted from the encoder rat’s M1,” Pais-Vieira et al tell us. “The performance was high if the chosen neuronal ensemble accurately encoded left versus right presses”.) You imagine the array reading the motor commands off the very cortex, sending them through the internet, inserting them into the recipient’s motor control system where — possessed by an alien command planted in her brain — the little rodent feels an irresistible urge to move her paw just so.

Only when you move on to the Methods (Yes, “Methods” come after the “Results” in this paper for some reason), do you discover: oh, wait. The recipient was trained beforehand to push one lever or the other, depending on the incoming stimulus. And the stimulus wasn’t a motor command copied-and-pasted from one brain to the other, it was an arbitrary signal sent by the computer after the computer had decoded the sender-rat’s neural activity. It’s the difference between experiencing an orgasm and watching a tiny figure on a faraway hill spell out oh-god-oh-god-oh-god-yes using signal flags.
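If you want that distinction in code: here’s a hedged sketch of the signal path the Methods actually describe, with every function name, threshold, and pattern size invented, since only the shape of the protocol matters. The computer classifies the encoder rat’s cortical activity and then delivers an arbitrary, pre-agreed stimulation pattern to the trained decoder rat; nothing neural gets copied verbatim.

```python
def classify_press(spike_counts, threshold=10):
    """Crude stand-in decoder: total ensemble firing picks 'left' or 'right'."""
    return "left" if sum(spike_counts) < threshold else "right"

# The decoder rat was trained beforehand to map each pattern to a lever;
# the pattern itself is a convention, not a relayed motor command.
STIM_PATTERNS = {"left": [1], "right": [1] * 20}  # e.g. 1 pulse vs 20 pulses

def transmit(encoder_spikes):
    """What goes over the wire: a label re-encoded as an arbitrary stimulus."""
    return STIM_PATTERNS[classify_press(encoder_spikes)]

print(transmit([0, 1, 2, 0, 1]))  # low firing -> the one-pulse 'left' pattern
```

The point of the sketch: swap `STIM_PATTERNS` for a blinking LED and the decoder rat’s task is unchanged, which is exactly the complaint above.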

In other words, this is a brain dyad only in the most trivial sense. There’s no real meeting of the minds here, no sensory input or motor commands flitting between cortices in native neuro-ratspeak. Sure, the recipient must be feeling something; but whatever that is, it’s not the vicarious feel of walls against whiskers or the urge to move a muscle. It’s more itch than insight; the little guy was just trained beforehand to push one lever in response to Itch X, and another in response to Itch Y. For all the telepathy involved, he might as well have reacted to a blinking LED.

I admit I’m curious as to what that itch actually felt like, mind you. After all, the electrodes were embedded in motor wiring, not sensory; I’m a bit surprised that Pais-Vieira’s decoders felt anything at all (I’d have expected some kind of uncontrollable muscle tremor). And this is the standard approach used by all those wired-up primates who’ve been making news by controlling robot and/or virtual limbs with their minds. You don’t move that thing on the screen by actually sending a motor command down your arm; you just kinda concentrate, and the computer takes those arbitrary brainwaves and interprets them as left or right. Somewhere out there, someone might have trained an on-screen avatar to jump whenever she thinks the first six notes of “Aqualung”. It’s all pretty cool.

But a mind-meld? A “brain-net”? Not even close, not unless I become part of the Borg Collective every time I have a two-sentence conversation with someone.

Which is not to say that I find the idea of brain-to-brain networks ludicrous in principle. I’m actually a bit scared by them. Everybody knows what happens when you split a brain down the middle, force the hemispheres to resort to the dial-up speeds of the hypothalamus instead of the broadband pipe of the corpus callosum: two distinct personalities emerge in the space where one had been before. Fewer people, I suspect, know that it works the other way around; when isolated hemispheres are reconnected (when an anesthetized hemisphere wakes back up, for example), the persona manifested by the lone hemisphere gets swallowed into the greater whole. I see no reason why that wouldn’t scale up. If you could build a fat enough pipe between two intact brains — an electronic corpus callosum, as it were — and if you could keep latency down to below the few-hundred milliseconds that seems necessary for coherent self-identity — would the result be two linked minds, or a single distributed one? Would the parts retain their identity, or would consciousness expand to fill the space available? If you joined such a network, would you retain any more autonomy, any more sense of self, than your parietal lobe enjoys now?
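For a sense of scale on that “fat enough pipe” (the axon count is the commonly cited ballpark for the human corpus callosum; the informative bits-per-axon rate is a pure guess for illustration):

```python
# Back-of-the-envelope scale for an "electronic corpus callosum".
CALLOSAL_AXONS = 200_000_000      # commonly cited figure, order-of-magnitude
BITS_PER_AXON_PER_S = 5           # assumed: a few informative spikes/second

bandwidth_bps = CALLOSAL_AXONS * BITS_PER_AXON_PER_S
print(f"~{bandwidth_bps / 1e9:.0f} Gbit/s")  # on the order of 1 Gbit/s

# The latency budget from the paragraph above: coherent self-identity seems
# to tolerate at most a few hundred milliseconds of lag between the halves.
MAX_LATENCY_S = 0.3               # assumed upper bound for "few-hundred ms"
```

Gigabit links with sub-300 ms latency are mundane hardware today; whatever makes this hard, it isn’t the wire.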

I feel a story coming on: a few decades from now, a glitch in the iMind servers inadvertently surpasses that magic bandwidth threshold and a few million streaming subscribers meld for a few minutes. But the subsequent class-action lawsuit fails when Apple argues that none of the affected individuals have standing to bring the case, since none of them actually existed as individuals during the alleged events. Maybe it even countersues on behalf of that vast and ephemeral mind who took their place, only to be torn apart and murdered after a measly ten minutes of life.

I bet I could keep it short enough to fit onto the back page of Nature. Still need a punchline, though.

 

 


[1] Die-hard fans will remember that this is pretty much the same experimental protocol that the crew of the Theseus inflicted on the captive Scramblers in Blindsight, albeit without the active torture component.



This entry was posted on Wednesday, April 10th, 2013 at 10:23 am and is filed under Dumbspeech, neuro, sentience/cognition.
19 Comments
Charles R
11 years ago

That’s an interesting lawsuit, but what would have been the harm that brought the suit in the first place?

Sheila
11 years ago

Charles R,

If a person exists in its brain, and then it breaks into two people due to a split brain, would it be the same person if the split brain could be repaired? Maybe they’d only be related.

So maybe relatives would have a class action lawsuit for all the people who died.

(Oh, and maybe the people in the same bodies would have to be defending against relatives who are suing for inheritance now that they can claim the person actually died)

Adam Etzion
11 years ago

Punchline: The super-brain personality was quick and intelligent enough to anticipate both lawsuits and issued a last will and testament screwing both sides over.

Adam Etzion
11 years ago

Thus proving, by being a vindictive, hateful asshole, that despite being spread over multiple brains and bodies, it was still human.

Seth
11 years ago

The punchline is that apple users were ever considered to be people in the first place. 😉

Will Sargent
11 years ago

“Thus proving, by being a vindictive, hateful asshole, that despite being spread over multiple brains and bodies, it was still human.”

That is a fascinating argument to make — if groupminds are people in the same sense as corporations, can groupminds be held to the same legal standards… and at what point can groupminds marry, adopt, etc.?

Jon Evans
11 years ago

Write the NATURE piece as a short legal brief, and have the punch line be the-lawyer-is-a-hive-mind?

Juan
11 years ago

Thank you for another great, thought-provoking article. It left me musing over the sort of therapy necessary for dealing with reintegration.

Whoever
11 years ago

Punchline: they “lose” and have to pay damages to one plaintiff.

helpful hints
11 years ago

Hmm is anyone else encountering problems with the images
on this blog loading? I’m trying to determine if its a problem on my end or if it’s
the blog. Any suggestions would be greatly appreciated.

Ensley F. Guffey
11 years ago

Your last couple of paragraphs about human brain networks reminded me of an old Theodore Sturgeon novella I read recently: To Marry Medusa. Long story short, an alien mass-mind, unable to grasp that any other form of mind could evolve naturally, “fixes” the human race in order to facilitate a take-over in the form of swallowing up the human network in the greater, alien whole.

I won’t spoil the story, but the human mass-mind ends up being frighteningly effective, but also ready and willing to sacrifice individual parts (people) in order to protect/serve the whole, and the people concerned, while they want to live, sacrifice themselves/are sacrificed by the whole willingly.

Sturgeon’s thought experiment is right down the alley you’re talking about, and I recommend it to you – plus, you know, Sturgeon!

nas
11 years ago

I actually just looked up the rat paper today as part of story research. There goes my bed-time reading.

Legal point one: not existing during the period when they were a single consciousness doesn’t deprive the plaintiffs of standing. They have a general continuity of persona, having existed as discrete, identifiable persons before and after their shared experience, and their post-event selves have arguably suffered harm, including (but not limited to):
(a) not existing for a little while (not worth much in itself, but worth including),
(b) any sequelae that follow from the period of non-existence (this may be trivial or substantial, depending on what went on during that time; if someone couldn’t exercise an option on a share purchase because they were all mind-melded and therefore missed a big payday, this could be worth a fair bit), and
(c) the pain and suffering of freaking the fuck out after they return to their individual existences because of having just been subjected to (and possibly retaining memories of) all the weird, screwed up, terrifying little secrets contained in the minds of all those other people, not to mention having their own secret thoughts and memories exposed to total strangers (the value of this will depend on the outlook of the judge, but I’d bet it would be reasonably substantial).

Legal point two: Apple doesn’t stand in loco parentis (or anything comparable) with respect to the joint mind of the affected people, so it’s Apple that has no standing to sue. Still, in order to preserve this part of the plot, it *is* possible that, once Apple has raised the issue, the Public Trustee (who represents the interests of persons who can’t act on their own behalf, for instance because of mental incompetence) would make an application to act for the joint persona. Then the PT’s office could launch or defend any action for that multiple-person consciousness.

As for the idea of a many-in-one consciousness, in my research I did find a paper that, while working at the very humble beginnings of the problem, does at least try to consider it seriously (and in the process alludes to the potential parallel between the way that a single personality results from the union of the two hemispheres of one brain and the possibility that joining brains could give rise to joint consciousness).

“Coalescing Minds: Brain Uploading-Related Group Mind Scenarios,” by Kaj Sotala and Harri Valpola. You can find it here: http://kajsotala.fi/Papers/CoalescingMinds.pdf

Alexey
11 years ago

helpful hints:
Hmm is anyone else encountering problems with the images
on this blog loading? I’m trying to determine if its a problem on my end or if it’s
the blog. Any suggestions would be greatly appreciated.

You should have no problem viewing the image for this post, unless you cannot reach nature.com, but other posts require a browser with large data URI support, which excludes IE versions prior to 9.

Hugh
11 years ago

What about the class action lawsuit from the people who want to be reconnected into the shared mind?

And no that’s not a crack at Apple consumers in particular: many religions have the goal of losing your individuality in some kind of communal bliss.

Recent Convert
10 years ago

Not sure if related, but does anyone have any thoughts on this – Dr. Yang Dan’s cat experiments, wherein living conscious cats are basically used as furry camcorders:

http://www.youtube.com/watch?v=FLb9EIiSyG8
http://news.bbc.co.uk/2/hi/science/nature/471786.stm

This seemed crazier to me than the rat thing, yet … it’s 14 years old. Has it been exposed as fake? Why isn’t it famouser?

Recent Convert
10 years ago

>Maybe she just doesn’t have as good a publicist as these Johnnie-come-latelies…

Paging one of the best hard SF writers alive … paging …

Well, I was amazed by it, not sure if it’s your kind of thing.

>Do the Dharmic systems say anything about coming back from Nirvana once you’ve been there? Anyone?

My understanding was that you’re supposed to help others get there:

http://en.wikipedia.org/wiki/Bodhisattva_vow
>http://en.wikipedia.org/wiki/Ten_Bulls

Also possibly relevant: Ramez Naam’s novel “Nexus” about a hive mind drug.

Jeremy
10 years ago

You have remarkable faith in the robustness of the law if you think it will be able to prospectively cope with fungible identity or gestalt consciousness.