Wednesday 18 February 2009
Hazardous Biology
This is the scariest thing I've read in a long time. If the leading lights in synthetic biology are this naive, humanity is toast. The idea that we can prevent abuse while democratising access, by bonding researchers into a loving utopian community, is laughable. The comparison to computers is completely inappropriate. Thomas Lord's comment near the bottom is a pretty good summary of the problems here.
I especially dislike the rhetorical technique that treats suppressing a technology as de facto absurd. On the contrary, if a technology poses a serious danger to humanity, then it absolutely should be suppressed, or at least tightly controlled and not "democratised". We've done it for nuclear weapons with moderate success.
God have mercy on us.
Comments
The fears suggested are Hollywood stuff. Rather than Thomas Lord, consider the immediately preceding comment by Roy Batty... Why is a computer virus harmless in the wild? Because pigs will fly before it'll run outside of a computer.
To a non-specialist, it might sound plausible that these bio-engineered organisms might "take over the world", but in actual fact the chance of that occurring is nonexistent. There are all kinds of reasons why it's unlikely, but consider that there are far more prosaic forms of far more dangerous bio-engineering which we constantly engage in, with far less oversight and far more dangerous repercussions: namely plain, centuries-old selective breeding, or the transport of invasive species.
Bio-engineered organisms are a Hollywood threat; lab-grown organisms are fragile and quite likely to succumb to infection or competition (and the more engineered they are, the less threatening they tend to be).
A software engineer's instincts suggest that a subtle bug can cause widespread problems. That's a problem caused by the utter lack of competition and the relative vacuum in which software lives.
In biology? If a simple fix could miraculously give a particular strain such an advantage, it would already exist. The idea that humans might make a microorganism that can out-compete existing ones (outside of an engineered situation) is laughable.
I'm a programmer; by happenstance many of my friends are from a medical or biological background, and a year ago I considered participating in a project much like the one Ms. Shetty describes, so I looked into it a little.
That experience leads me to believe that there's no cause for alarm. If, at some time in the future, the state of the art advances and somewhat survivable organisms become possible...
Right now, the much less sexy plain old breeding is still much, _much_ more likely to be dangerous.
Knowing the way these conversations go, I doubt I convinced you, but you never know...
IIRC, many more herbicide-resistant plant strains have been created by deliberate but random disruption of DNA by radiation than by deliberate and specific modification.
Developing such a pathogen would probably require an understanding of biology we're not even close to. We've been messing about with DNA manipulation for years, and we can still barely come up with things that copy functions nature already performs.
Every technology brings catastrophes. Always. The real question is: are we prepared for them?
Are we prepared to deal with car/train/plane crashes? Yes, I think we are.
Are we prepared to deal with computer crashes? Yes, we definitely are.
Are we prepared to deal with biological crashes? No, we're not!
Are we prepared to deal with the possibility that God doesn't exist? Yes, we are. Are you, Robert? ;-)
Sorry, I'm a deep agnostic.
This is _identical_ to the LHC fear-mongering from last year (http://hasthelargehadroncolliderdestroyedtheworldyet.com/)
I doubt that any of these three is going to happen.
The LHC is about evaluating risks. The risk was calculated to be less than what already happens in nature.
Biotech creates things that don't happen naturally. There _will_ be hyper-resistant bacteria. There _will_ be over-aggressive rats. There _will_ be carcinogenic fungi. There _will_ be something bad I can't think of.
If you don't believe it, you're underestimating that technology.
There _will_ also be bad things happening with LHC, but not the monster black hole thing. Things we're prepared to deal with.
It's not fear. It's just how it goes. Every technology knows catastrophes. And they happen sooner when the technology is democratised.
You're not digging a giant ring in your backyard, are you?