Revision as of 05:49, 24 February 2013
![Nick Bostrom, Stanford 2006](https://upload.wikimedia.org/wikipedia/commons/thumb/7/75/Nick_Bostrom%2C_Stanford_2006_%28square_crop%29.jpg/220px-Nick_Bostrom%2C_Stanford_2006_%28square_crop%29.jpg)
Nick Bostrom (born Niklas Boström on 10 March 1973[1]) is a Swedish philosopher at St. Cross College, University of Oxford, known for his work on existential risk and the anthropic principle. He holds a PhD from the London School of Economics (2000). He is currently the director of both the Future of Humanity Institute and the Programme on the Impacts of Future Technology, part of the Oxford Martin School at Oxford University.[2]
He is the author of some 200 publications, including Anthropic Bias (Routledge, 2002), Global Catastrophic Risks (ed., OUP, 2008), and Human Enhancement (ed., OUP, 2009). He has been awarded the Eugene R. Gannon Award and has been listed in the FP 100 Global Thinkers list. His work has been translated into more than 20 languages, and there have been some 100 translations or reprints of his works.[citation needed]
In addition to his writing for academic and popular press, Bostrom makes frequent media appearances in which he talks about transhumanism-related topics such as cloning, artificial intelligence, superintelligence, mind uploading, cryonics, nanotechnology, and the simulation argument.
Philosophy
Ethics of human enhancement
Bostrom is favourable towards "human enhancement", or "self-improvement and human perfectibility through the ethical application of science",[3][4] and is a critic of bio-conservative views.[5] He has proposed the reversal test for reducing status quo bias in bioethical discussions of human enhancement.[6]
In 1998, Bostrom co-founded (with David Pearce) the World Transhumanist Association[3] (which has since changed its name to Humanity+). In 2004, he co-founded (with James Hughes) the Institute for Ethics and Emerging Technologies. In 2005 he was appointed Director of the newly created Future of Humanity Institute in Oxford. Bostrom is the 2009 recipient of the Eugene R. Gannon Award for the Continued Pursuit of Human Advancement[7][8] and was named in Foreign Policy's 2009 list of top global thinkers "for accepting no limits on human potential."[9]
Existential risk
Bostrom has addressed the philosophical question of humanity's long-term survival.[10] He defines an existential risk as one in which an "adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential." In the 2008 volume "Global Catastrophic Risks", editors Bostrom and Cirkovic offer a detailed taxonomy of existential risk, and various papers link existential risk to observer selection effects[11] and the Fermi paradox.[12]
Simulation hypothesis
Bostrom contends that at least one of the following statements is overwhelmingly likely to be true:
- No civilization will reach a level of technological maturity capable of producing simulated realities, or such simulations are physically impossible.
- No civilization that reaches the aforementioned technological status will run a significant number of simulated realities, for any of a number of reasons, such as diversion of computational processing power to other tasks or ethical objections to holding entities captive in simulated realities.
- Any entities with our general set of experiences are almost certainly living in a simulation.
The following equation is used to quantify those three statements:[13]

:<math>f_\textrm{sim} = \frac{f_\textrm{p} N H}{f_\textrm{p} N H + H}</math>

where:
- <math>f_\textrm{p}</math> is the fraction of all human civilizations that will reach a technological capability to program reality simulators.
- N is the average number of ancestor-simulations run by the civilizations mentioned by <math>f_\textrm{p}</math>.
- H is the average number of individuals who have lived in a civilization before it was able to perform reality simulation.
- <math>f_\textrm{sim}</math> is the fraction of all humans who live in virtual realities.

Because N will be such a large value, at least one of the following three approximations will be true:
- <math>f_\textrm{p}</math> ≈ 0
- N ≈ 0
- <math>f_\textrm{sim}</math> ≈ 1
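The trilemma can be seen directly from the algebra: the shared factor H cancels from the ratio, so (in the notation defined above) the argument turns only on the product of the fraction of simulating civilizations and the number of simulations each runs:

```latex
f_\textrm{sim}
  = \frac{f_\textrm{p}\,N\,H}{f_\textrm{p}\,N\,H + H}
  = \frac{f_\textrm{p}\,N}{f_\textrm{p}\,N + 1}
% If the product f_p * N is astronomically large, f_sim is driven
% toward 1. The only way to keep f_sim far from 1 is for that
% product to be small, i.e. f_p ~ 0 or N ~ 0.
```

Note that a large H alone does not force the trilemma, since H cancels; it is the potentially enormous size of N that drives the conclusion.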
Books
- Anthropic Bias: Observation Selection Effects in Science and Philosophy, ISBN 0-415-93858-9
- Global Catastrophic Risks, edited by Nick Bostrom, ISBN 978-0-19-857050-9
- Human Enhancement, edited by Julian Savulescu and Nick Bostrom, ISBN 0-19-929972-2
See also
References
- ^ nickbostrom.com
- ^ http://www.oxfordmartin.ox.ac.uk/people/22
- ^ a b Sutherland, John (9 May 2006). "The ideas interview: Nick Bostrom; John Sutherland meets a transhumanist who wrestles with the ethics of technologically enhanced human beings". The Guardian.
- ^ Bostrom, Nick (2003). "Human Genetic Enhancements: A Transhumanist Perspective" (PDF). Journal of Value Inquiry. 37 (4): 493–506. doi:10.1023/B:INQU.0000019037.67783.d5.
- ^ Bostrom, Nick (2005). "In Defence of Posthuman Dignity". Bioethics. 19 (3): 202–214. doi:10.1111/j.1467-8519.2005.00437.x.
- ^ Bostrom, Nick; Ord, Toby (2006). "The reversal test: eliminating status quo bias in applied ethics" (PDF). Ethics. 116 (4): 656–679.
- ^ http://gannonaward.org/The_Gannon_Award/The_Gannon_Group.html
- ^ http://www.fhi.ox.ac.uk/archive/2009/eugene_r._gannon_award_for_the_continued_pursuit_of_human_advancement
- ^ "73. Nick Bostrom". The FP Top 100 Global Thinkers. Foreign Policy. December 2009.
- ^ Bostrom, Nick (March 2002). "Existential Risks". Journal of Evolution and Technology. 9.
- ^ Tegmark, Max; Bostrom, Nick (2005). "Astrophysics: is a doomsday catastrophe likely?" (PDF). Nature. 438 (7069): 754. doi:10.1038/438754a.
- ^ Bostrom, Nick (May/June 2008). "Where are they? Why I Hope the Search for Extraterrestrial Life Finds Nothing" (PDF). MIT Technology Review: 72–77.
- ^ Bostrom, Nick (19 January 2010). "Are You Living in a Computer Simulation?".
External links
- Nick Bostrom's homepage.
- Bostrom's Anthropic Principle page, containing information about the anthropic principle and the Doomsday argument.
- Online copy of book, "Anthropic Bias: Observation Selection Effects in Science and Philosophy" (HTML, PDF)
- Bostrom's Simulation Argument page.
- Bostrom's Existential Risk page.
- Oxford Future of Humanity Institute
- The Guardian interviews Bostrom about the World Transhumanist Association
- Interview on transhumanism
- TED Talks: Nick Bostrom on our biggest problems at TED Global in 2005
- http://www.fhi.ox.ac.uk/archive/2009/eugene_r._gannon_award_for_the_continued_pursuit_of_human_advancement
- http://sexgenderbody.com/content/interview-nick-bostrom-and-david-pearce-about-transhumanism
- http://www.consensuspoint.com/prediction-markets-blog/prediction-marketsand-the-creation-of-government-policy
- http://www.edge.org/q2010/q10_4.html
- http://www.theatlantic.com/technology/archive/2012/03/were-underestimating-the-risk-of-human-extinction/253821/