So DARPA’s feeling a little overwhelmed by the blizzard of data at its disposal. All that telemetry. All those intercepted signals. All those eyes in the sky and ears to the ground, sucking up the terabytes so fast they can barely slap new storage into place in time to catch it all. And that’s just collecting the stuff. What about actually analyzing it?
Well, DARPA’s a government bureaucracy, so obviously the first step is to create a whole new department: the Mathematics of Sensing, Exploitation, and Execution Program. Then you call for bids on a specific deliverable, to wit: a unified mathematical language for everything the military sees or hears, to get the “economy and efficiency that derives from an intrinsic, objective-driven unification of sensing and exploitation”.
Wired puts it more eloquently:
“existence is, to a sensor, what William James called a ‘blooming, buzzing confusion’: an unmediated series of events to be vacuumed up, leaving an analyst overloaded with unsorted data. Wouldn’t it be better if a sensor could be taught how to filter the world through a perceptual prism, anticipating what the analyst needs to know?”
Um, no. Or at least, maybe not. At least I think I disagree.
The thing is, we’re pretty much dealing with a description of how the human brain works: a filter, which by definition excludes most of your data. A prism, which refracts and distorts reality for the sake of more elegant and conspicuous highlights. And the problem with that is, well, the same problem we have with our brains. Unseen blind spots in the middle of the visual field. Phantom limbs. The unconscious rejection of anything the brain’s front lines regard as anomalous or improbable; if you’ve hung around here long enough, you already know how easily the conscious mind simply ignores everything from disappearing buildings to people in gorilla suits (if you haven’t hung around long enough, check out these demos from the University of Illinois’s Visual Cognition Lab). Hell, if the wrong part of your brain bleeds out you’ll even deny the existence of half your own body.
Brains are optimized (insofar as natural selection “optimizes” anything) for short-term survival, not objective truth. If believing absurd falsehoods increases the odds of getting laid or avoiding predators, your brain will believe those falsehoods with all its metaphorical little heart. And there’s a difference between parsing the Pleistocene savannah with an eye to remaining uneaten, and processing a million tactical inputs from landlines and geostationary satellites and everything in between with an eye to maintaining political stability in the Middle East. It may even be a significant difference, requiring fundamentally different modes of pattern-matching.
Because that’s what DARPA’s really talking about here: not analysis per se (which will still be done by the generals), but the preliminary massaging, the distillation of an Executive Summary to highlight the salient points. Our brains already filter out ninety percent of sensory input in the name of high-grading the Important stuff; now we’re going to stick software in front to filter out ninety percent of the raw input before the brain even gets its turn. Maybe it’s an inevitable consequence of information overload; the data pipe is so thick that by now it’s physically impossible to analyze even a fraction of it before changing events render it all irrelevant.
But sometimes buildings do disappear, and the world can change as a result. And I wonder if adding even more reflexive filters blinding us to such events is the best way to go.
If I ever commit a terrorist attack, maybe I’ll dress up in a gorilla suit first.