Monday 20 June 2005
I Have A Dream
We're seeing some amazing advances in neuroscience. Today people are taking the first steps towards mental input-output --- such as reading basic thoughts, mental control of joysticks, and stimulation of vision centers. Where could this lead?
Wouldn't it be amazing to have a surgically implanted computer capable of directly receiving thoughts and inducing sensory stimulation? Think of the applications!
- Do away with the whole messy business of voice, keyboard, mouse, etc.
- Work at full efficiency in any environment doing any activity
- Differential GPS capability makes being lost an anachronism
- Cellphones become telepathy
- High-fidelity virtual experiences of all kinds (nudge nudge, wink wink)
- Hooks into sensor and actuator networks make your devices direct extensions of yourself: your house, your car, your pet robotic dog, all at your direct mental command at any range, and all acting as additional senses
But it doesn't stop there. Choose to share your experiences with others in real time. Experience group consciousness. Weave together many experiences of the same event into an incredibly detailed whole.
There are some risks. For protection, all this has to be optional. But who could shut themselves off from this for long? It's all humanity's dreams rolled into one. (Well, most of them.)
Now imagine that stolen nukes go off in New York, London, Beijing and Bangalore. Tens of millions are dead, hundreds of millions more at risk, and the world teeters close to an all-out nuclear exchange. World leaders are told it is possible to create a virus that will cover the world in hours and provide temporary access to the minds of the implanted, maximising the chance of detecting terrorists --- even disabling any who are already implanted --- and pulling the world back from the brink. It's the only ray of hope, and the plan is executed successfully.
After the immediate threat has passed, citizens are given the option of withdrawing from the emergency cooperative. Such objectors are a potential security threat, and become the focus of the rest of the network. They are, in fact, disconnected from the system lest they interfere with it. It's not a popular option, taken up only by a few eccentrics --- almost no-one who has adjusted to the extraordinary lifestyle of the implanted can bring themselves to withdraw.
Over time, it is judged prudent for the security of the system --- and therefore humanity --- for all implanted minds to be nudged towards approval of the system, to avoid those rare cases where there are doubts. Likewise, they are encouraged to have their children implanted at a young age. Other mental traits that cause social friction are removed ... or if they cannot be removed, their exercise is immediately detected and countered within the mind of the perpetrator. The only remaining external threat comes from the dwindling communities of unimplanted, who are therefore relocated to zones where they cannot endanger themselves or others.
Now unrestricted thought belongs only to the men and women at the root of the system, who control its software and thereby shepherd humanity. What happens to them? Do they automate their oversight and relinquish their power, joining the blissful masses? Do they turn to some new and terrible direction? Or do they err, and all succumb to some terrible catastrophe?
I admit to being paranoid --- but to my knowledge, nothing here is technically improbable. We desperately need a new Orwell for the twenty-first century, someone to terrorise us with plausible visions so we know to flee from the hint of them.
Comments
As long as the world's split up into sovereign countries scrabbling for influence and internal stability, there'll always be an outside. Personally I don't see countries or sovereignty going away in the foreseeable future; witness the reaction of many Europeans to ceding even a small part of their national sovereignty to the EU. Of course, if that ever does happen it's worth starting to worry, although I think by then pervasive technologies such as you describe will be the least of our worries.
If such technologies come true, there will be not one system but several, and these systems will have complex subsystems. Society is a complex mess and always will be, and such a system will be just another reflection of society and its complexity.
A book I find very interesting in this regard is Shirow Masamune's Ghost in the Shell. Yes, that's Japanese manga.
But the world he describes is a step beyond Philip K. Dick and Gibson. It takes their concepts as a given and builds a complex and sometimes quite political reality on top of them. On the one hand there is little distinction between AI and human consciousness; on the other, its carrier can be virtual, mechanical or biological. All interchangeable and hackable.
Highly recommended :)
The scenario I described starts with a unified technology base and suggests how tyranny could emerge from it. It is not overtly a top-down political imposition.
As to whether it could be long-term stable ... obviously I don't know, but it's hard to argue that it *could not* be, if it wields powers of a scope and quality we have never seen before.
I guess one of my points, which was also Orwell's point, is that technology puts us off history's map. We can't look to Jared Diamond for reassuring universal truths of human civilization, because the rules are different now. And my hypothetical mind interface device takes us a lot further into the unknown than Orwell went. Orwell himself wrote that the one thing his Party desired that they still lacked was the ability to read minds.
Does that mean you'd see people with a BFOD: Blue Face Of Death?
/be
It seems to me that, in the spirit of the internet and those who believe "Information wants to be free," any control of such a system by an oligarchy "at the root of the system" would be somewhat unlikely, though not necessarily impossible. Leaks and hacks in such a system, whether accidental or willful, benevolent or malicious, would only help to keep the system more open, much like the internet today with its widespread sharing of information (often in the form of piracy). The hackers of the day would surely not rest until all possible avenues of restoring the equality of the system had been explored and exploited. After all, on today's internet, do the authorities have any tools at their disposal that hackers do not, and why should they in the world of the future? The system you describe sounds to be an eerie mix of my personal paradise and the collective consciousness of the Borg. That being the case, it would be hard for an elite group at the top to make decisions independent of the rest of society, as it seems that many, if not all, decisions of public policy would be instantaneously carried out through pure democratic polling of everyone in the system. Such a society would become adherent to the whims of the majority (or at least the plurality). The likelihood seems remote of even a plurality thinking it best to take away the voice of the individual, despite the ease with which it can be heard, only to hand power over to a small few when every member of society necessarily has the same capabilities as this would-be elite.
Now, that being said, I would appeal to the incompleteness theorem and say that there is no way we can ever flawlessly duplicate the human mind in any system, since we are constrained by the bounds of the human mind itself when creating such a system. Even after creating AI and enlisting it in our endeavor, it seems logical that the same limitation would apply to it as a product of our own minds. This would, of course, leave an un-sealable door open to the outside world, giving members of the system the potential for escape, even if only on a temporary basis. This is reminiscent of the premise of "The Matrix," where "if you are not one of us, you are one of them," but it is still possible to become "free" through purely mental hacking of the system. This, though, sounds more likely to be the means through which a terrorist strike would occur (short-term escape) than the path of Big Brother, since, seemingly, once he rejoined the system to take control, his intentions would become known to everyone and would instantly be vetoed.
Now, like many others, my views must seem quite optimistic, but just as we need doomsayers to warn us of heading down the wrong path, we need others who have a firm faith in the heart of humanity, so that we are not afraid to explore potentially beneficial avenues for fear of what else lies beyond the good. A balanced argument is always what is needed, and for that, we all need to be heard.
> have any tools at their disposal that hackers do
> not and why should they in the world of the
> future.
Today we have open hardware; most of the important network endpoints are ultimately under the control of the users, if you're willing to work hard enough. In the future most important network endpoints will contain "rights management" hardware that is ultimately under the control of the hardware and/or software vendors and in practice cannot be subverted by users or hackers. There will still be bugs and hacks, but it will be a lot more difficult to repurpose the infrastructure the way, say, the Internet was turned into an MP3 distribution network, or the way the WWW grew up in the face of commercial opposition.