

Nick Bostrom Interview December 2001

The following is an interview with Dr. Nick Bostrom of the Department of Philosophy, Yale University. Currently teaching at Yale, Dr. Bostrom specializes in the philosophy of science, the foundations of probability theory, and the ethical and strategic implications of science and technology. He earned a PhD in philosophy at the London School of Economics and has published many intriguing papers, such as "The Simulation Argument: Are You Living in a Computer Simulation?", "How Long Before Superintelligence?", and "When Machines Outsmart Humans". He also co-founded the World Transhumanist Association (WTA), which takes an "interdisciplinary approach to understanding and evaluating the possibilities for overcoming biological limitations through technological progress."

You would do well to invest some of your precious time reading about Dr. Bostrom's ideas and passions. He speaks from the heart, clearly, forthrightly, and with boldness. And don't be surprised if you see him around come the next century, and beyond...

Regarding Molecular Nanotechnology, the associated sciences, and philosophies [such as Transhumanism]:

1. Considering where you thought we'd be by 2002, how do you feel about current technological progress?

Nanotechnology has been moving a little faster than I expected, virtual reality a little slower.

2. Besides getting themselves informed about MNT, what proactive steps can the general public take to help reduce the negative possibilities of these future technologies [such as the Gray Goo or "runaway" scenario]?

To ask oneself this question seriously is an excellent first step!

One can then adopt something like the following action plan:

(1) Support moves toward a world order in which it is possible to regulate dangerous uses of MNT effectively -- at least for an interim period until adequate defenses can be developed. This could mean working for peace, transparency, democracy, and some sort of world security pact (which in an ideal world might be based on a world federation but in the actual case may have to be a US-led coalition of the most powerful nations).

(2) Work to accelerate beneficial technologies -- such as nanotech "active shields" or immune systems, surveillance technologies, Friendly AI, human intelligence augmentation, and information technologies that enable us to work better as epistemic communities.

(3) Promote public awareness and informed debate.

(4) Support individuals and groups that are working on (1)-(3), such as the World Transhumanist Association and the Foresight Institute.

What we need in this instance is not pro-tech or anti-tech ideology or radical political agendas, but rather a constructive, pragmatic approach. We also need good will, and foresight.

3. Are you seeing as much cooperation among the sciences as you expected? If so, how can it be further improved? If not, how can we get everyone involved synergistically?

We could do more to make interdisciplinary work practicable, but even more important is that transhumanism becomes a legitimate area of academic research. So far, most of the development in this field has been done outside academia (partly by scholars writing for popular audiences and partly by non-academics). Academia has inertia, but it also has some really powerful and creative minds, and once a topic is recognized as legitimate and is funded, progress can go beyond what is otherwise possible. I'm stressing academic research not because it is more important than the contributions by people working in technology, business, non-profits, or government (it isn't!), but because I think we've been lagging in this regard (although I've seen signs in the past couple of years that this may be about to change).

Any entrepreneur or social benefactor reading this may also want to look at Robin Hanson's Idea Futures concept. It would be worth giving it a try.

4. Based upon where we are now, do you anticipate any dramatic breakthroughs in the near future -- in 5 years, in 10, in 25?

If you look at a single field, breakthroughs look like discontinuities, yet if you zoom out and look at the overall picture, breakthroughs occur all the time -- and they add up to a roughly constant growth rate.

Superintelligence may be a true global discontinuity when it appears. A singularity.

But we should also think about how we measure the "magnitude" or "importance" of some development. Looking at the technological gee-whiz impact factor is a poor proxy. Looking at its effect on economic growth is better but still woefully inadequate. What we are ultimately concerned with is its implications for human well-being, our capacity to live great lives. Much of the fancy stuff doesn't have much bearing on that level, especially not for people who are already materially well off.

5. Where do you see the most dramatic changes occurring with the advent of mature MNT?

Uploading and superintelligence, or gray goo -- that is the question. I think it will take several decades to get there. But we will get there if technological development continues.

6. And how can society and industry prepare for it?

It's premature to view these developments as business opportunities. Yet society should begin to prepare, e.g. by implementing the action plan outlined above and especially by starting a serious debate about our long-term prospects.

7. What event has caused you the greatest concern? the greatest hope? or is the most contentious?

Paradoxically, the tragedy of September 11th could be a blessing in disguise. Suppose for the price of a few thousand lives that we become more alert to the threat of rogue weapons of mass destruction, and we consequently take decisive action to reduce the likelihood of much worse attacks in the future... If it leads us to take a more proactive approach down the road to preventing destructive nanotech, the gain could be enormous. It all depends on what lessons we choose to learn.

8. Do you believe in the tenets of Transhumanism and the Extropian viewpoint?

Not surprisingly (since I've played a part in formulating them) I do agree with the tenets of transhumanism.

We can't imagine how wonderfully good life will become (if things go well...) when we succeed in overcoming many of our current biological limitations through technology. Material abundance, a clean environment, etc., are not that bad -- but the really profound thing is when we improve ourselves: when we finally abolish aging and disease, when we expand our minds, when we step off the hedonic treadmill and get to explore new realms of well-being and emotional richness. Don't think you can intuit what these things will be like "from the inside," because we have little clue.

There are some problems that technology can't solve. Many human preferences are for so-called positional goods, which are essentially scarce. For example, if two persons want the exclusive love of person X, at least one of them is going to be frustrated. Technology could give one the illusion of being loved by X, or the emotions that being loved by X would trigger, or a loving replica of X, or even the option of getting rid of the desire to be loved by X. But none of those alternatives may be what we want.

9. Are you prepared to become Transhuman?

Would I want right now to jump into some totally alien world leaving those I care about behind? No.

But taking steps - slowing the aging process, getting a little smarter - and then gradually, after having lived much of a normal human life, upgrading our capacities together to become transhumans and one day posthumans - Yes, absolutely!

