Human extinction
{{Refimprove|date=March 2008}}
'''Human extinction''' is the end of the [[human]] [[species]]. Various scenarios have been discussed in [[science]], [[popular culture]], and [[religion]] (see [[End time]]). This article focuses on [[existential risks]] to humanity.
==Possible scenarios==
''See also [[Risks to civilization, humans and planet Earth]]''
* Severe forms of known or recorded disasters
** Warfare, whether [[Nuclear warfare|nuclear]] or [[Biological warfare|biological]]; see [[World War III]].
** Universal [[pandemic]] involving a [[gene]]tic disease, [[virus]], [[prion]], or [[antibiotic]]-resistant [[bacteria|bacterium]].
** [[Famine]] resulting from [[overpopulation]] (see [[Malthusian catastrophe]])
* Environmental collapses
** Catastrophic [[climate change]] as a result of [[global warming]], or the effects of extensive [[deforestation]] and [[pollution]] (e.g., the warnings of [[James Lovelock#Mass human extinction|James Lovelock]])
** Loss of a breathable [[Earth's atmosphere|atmosphere]], for example due to an [[anoxic event]].
** Occurrence of a [[supervolcano]].
** Extreme [[ice age]] leading to [[Snowball Earth]]
**The destruction of the [[ozone layer]], allowing increased [[ultraviolet radiation]] to reach the surface.
* Long term habitat threats
** In about 1.4 million years the star [[Gliese 710]] will pass within 1.1 [[light year]]s of Earth and might catastrophically perturb the [[Oort cloud]].
** In about 3 billion years, our [[Milky Way galaxy]] is expected to [[Andromeda-Milky Way collision|collide with the Andromeda galaxy]]. Collisions of individual bodies will likely be scarce; however, the consequences for orbits of stars and planets are unclear, and impossible to predict for individual stellar systems.
** About 5 billion years from now, the [[Sun]]'s [[stellar evolution]] will reach the [[red giant]] stage, in which the Sun will expand and engulf Earth. Long before this happens, the Sun will already have changed Earth's climate, and its radiated spectrum may alter in ways Earth-bound humans could not survive.[http://www.space.com/scienceastronomy/080226-vaporized-earth.html]
**In the far future, the main risks to human survival could be [[heat death]] and the cooling associated with the [[expansion of the universe]].
* [[Evolution]] of humanity into a [[Posthuman (Human evolution)|posthuman]] life-form or [[existence]] by means of [[technology]], leaving no trace of original humans
** Commentators such as [[Hans Moravec]] argue that humanity will eventually be supplanted and replaced by [[artificial intelligence]] or other forms of [[artificial life]]; others, such as [[Kevin Warwick]], point to the possibility of humans evolving by linking with technology;<ref>[[Kevin Warwick|Warwick, K]]: ''I, Cyborg'', University of Illinois Press, 2004</ref> still others argue that humanity will inevitably experience a [[technological singularity]], and furthermore that this outcome is desirable (see [[singularitarianism]]).
** [[Transhumanism|Transhumanist]] [[genetic engineering]] could lead to a species unable to inter-procreate, accidentally resulting in actual (rather than [[pseudoextinction|pseudo]]) extinction.
**[[Isaac Asimov]]'s short story "[[The Last Question]]" offers a variation on this theme.
* [[Evolution]] of humanity into another hominid species. If humans continue to evolve by natural selection over millions of years, ''[[Homo sapiens]]'' may gradually transition into one or more new species.
* Extinction in a [[Population decline|whimper]]
** Preference for fewer children: if [[developed world]] [[demographics]] are extrapolated, they mathematically lead to 'soft' extinction before 3000 AD ([http://lifeboat.com/ex/bios.john.leslie John Leslie] estimates that if the reproduction rate drops to the [[Germany|German]] level, the extinction date will be 2400{{ref|2400}}; see the illustrative model after this list).
** ''Political intervention in reproduction'' has failed to raise the birth rate above the [[Demographics of Russia#Figures and age structure|replacement level]] in the rich world, but has dramatically succeeded in lowering it below the replacement level in [[China]]{{Fact|date=December 2007}} (see [[One child policy]]). A [[World government]] with a [[eugenic]] or small population policy could send humanity into 'voluntary' extinction.
** [[Infertility]]: a chemical, biological, or other disruption of humans' ability to reproduce, whether partially or completely. Suggested causes include [[hormone|hormonal]] disruption from the chemical and [[pharmaceutical]] industries, and [[biological process|biological]] changes such as the (controversial) findings of falling [[spermatozoon|sperm cell]] counts in human males. (In fiction, see the novel and film ''[[Children of Men]]''.)
** Disease: the argument that medicine keeps alive the 'weak-gened' and those with birth defects who, under natural selection, would be less likely to survive and reproduce. On this view, flawed genes accumulate and grow increasingly severe until the human body can no longer fight disease even with the help of advanced medicine, and disease ends the species. Arguably, however, if this point were reached, natural selection would again become a factor, potentially reversing the 'decline'.
** [[Voluntary human extinction movement|Voluntary extinction]]
* Scientific accidents
** In his book ''[[Our Final Hour]]'', Sir [[Martin Rees]] claims that without the appropriate regulation, scientific advancement increases the risk of human extinction as a result of the effects or use of new technology. Some examples are provided below.
*** Uncontrolled nanotechnology ([[grey goo]]) incidents resulting in the destruction of the Earth's ecosystem ([[ecophagy]]).
*** Creation of a [[naked singularity]] (such as a "[[micro black hole]]") on Earth during the course of a scientific experiment, or other foreseeable accidents in [[high-energy physics]] research, such as a [[Vacuum metastability disaster|vacuum phase transition]] or [[strangelet]] incident. Worries were raised about the [[Large Hadron Collider]] at [[CERN]], amid fears that collisions of protons at nearly the speed of light could create a black hole, but it has been pointed out that far more energetic collisions already take place in Earth's atmosphere.
**Accidental contact with an [[Extraterrestrial life|alien civilization]] alerted to Earth by its radio and television signals, radar, and other emissions.
** Biotech disaster (e.g., the warnings of [[Jeremy Rifkin]])
* Scenarios of extraterrestrial origin
** Major [[impact event]]s.
** [[Gamma-ray burst]] in [[Gamma ray burst#Mass extinction on Earth|our part]] of the [[Milky Way]] (bursts observable in other galaxies are calculated to act as "sterilizers", and have been used by some [[astronomer]]s to explain the [[Fermi paradox]]). The lack of interruptions in the fossil record and the distance of the nearest [[hypernova]] candidate make this a long-term rather than imminent threat.
** [[Invasion]] by militarily superior aliens (see [[alien invasion]]). Though often considered a scenario purely from the realm of [[science fiction]], this possibility has been given serious consideration by professional [[SETI|SETI researchers]], who conclude that it is unlikely. {{ref|AlienConquestUnlikely}}
** [[Gerard O'Neill]] has cautioned that [[first contact (anthropology)|first contact]] with alien intelligence may follow the precedent set by historical contacts between human civilizations, in which the less technologically advanced civilization has inevitably succumbed to the other, regardless of its intentions.
** [[Solar flare]]s may suddenly heat the Earth, or sunlight may be blocked by dust, slowly freezing it (e.g., dust and vapour from a [[Kuiper belt]] disturbance).
**It is possible that the space of our universe, the [[Big Bang]], and all its consequences are events taking place within a computer or other device on another cosmological plane; if that process were to end, everything within the universe would summarily vanish (see [[Simulated reality]]).
* Philosophical scenarios
** ''See [[End of the world (philosophy)]]''
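The 'soft' extinction arithmetic mentioned above (under "Extinction in a whimper") can be illustrated with a simple geometric-decay model. The Python sketch below is only an illustration: the replacement fertility of 2.1 children per woman, the 25-year generation length, and the minimum viable population of 1,000 are assumptions chosen for the example, not figures taken from Leslie (1996).

<source lang="python">
# Illustrative geometric-decay model of 'soft' extinction under sustained
# sub-replacement fertility. Every parameter here is an assumption chosen
# for illustration, not a figure from Leslie (1996).

REPLACEMENT_TFR = 2.1     # children per woman needed to hold the population steady
GENERATION_YEARS = 25     # assumed mean generation length
START_YEAR = 2000
START_POPULATION = 6.0e9  # rough world population around the year 2000
MINIMUM_VIABLE = 1000     # assumed population threshold for 'soft' extinction

def soft_extinction_year(tfr):
    """Year the population falls below MINIMUM_VIABLE if every generation
    reproduces at the given total fertility rate (must be below replacement)."""
    ratio = tfr / REPLACEMENT_TFR  # per-generation multiplication factor
    population, year = START_POPULATION, START_YEAR
    while population >= MINIMUM_VIABLE:
        population *= ratio
        year += GENERATION_YEARS
    return year

# A total fertility rate near the German level of the 1990s (about 1.4):
print(soft_extinction_year(1.4))  # -> 2975 under these assumptions
</source>

Under these assumptions a German-level fertility rate pushes the population below the threshold within roughly a thousand years; the exact date is sensitive to every parameter, which is why published estimates such as Leslie's 2400 differ from this sketch.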
== Attitudes to human extinction ==
Attitudes to human extinction vary widely depending on beliefs concerning [[Spirituality|spiritual]] [[survival]] (souls, heaven, [[reincarnation]], and so forth), the value of the human race, whether the human race evolves individually or collectively, and many other factors. Many [[religions]] [[prophesy]] an "[[end times]]" for the [[universe]]. Human extinction is therefore part of the [[faith]] of many humans, insofar as the end times would mean the absolute end of their physical humanity, though perhaps not of an inner soul.
However, not all faiths connect human extinction to the end times: some believe in cyclical regeneration, or hold that the end times actually mark the beginning of a new kind of existence (see [[eschatology]] and [[utopianism]]).
== Perception of human extinction risk ==
The general level of fear about human extinction in the near term is very low, despite the pronouncements of some fringe groups; few consider it a credible risk. Suggested reasons for this low public visibility include the following:
# There have been countless prophecies of extinction throughout history; in all cases the predicted date of doom has passed without much notice, making future warnings [[cry wolf|less frightening]]. However, a [[survivor bias]] would undercut the credibility of accurate extinction warnings: [[John von Neumann]] was probably wrong to hold “a certainty”{{ref|Putnam}} that nuclear war would occur, but our survival is not proof that the chance of a fatal nuclear exchange was low (or that such an event could not occur in the future).
# Extinction scenarios (see above) are speculative and hard to quantify. A [[frequentist]] approach to probability cannot be used to assess the danger of an event that has never been observed by humans (see the worked example after this list).
# [[Nick Bostrom]], head of the James Martin 21st Century School's [[Future of Humanity Institute]], has suggested that extinction risk analysis may be an overlooked field because the subject is too psychologically troubling to attract researchers, and because the lack of previous human extinction events leads to a depressed view of the likelihood of one occurring under changed future circumstances (an 'inverse [[survivorship bias]]').
# There are thousands of [[public safety]] jobs dedicated to analyzing and reducing the risks of individual death. There are no full-time ''existential safety commissioners'' partly because there is no way to tell if they are doing a good job, and no way to punish them for failure. The inability to judge performance might also explain the comparative governmental apathy on preventing human extinction (as compared to [[panda]] extinction, say).
# Some [[anthropology|anthropologists]] believe that risk perception is biased by social structure; in the "[[Cultural Theory of risk]]" typology, "[[Cultural Theory of risk#Individualist|individualist]]" societies predispose members to the belief that nature operates as a self-correcting system which will return to its stable state after a disturbance. People in such cultures feel comfortable with a "trial-and-error" approach to risk, even for rare dangers to which it is ill-suited (such as extinction events).
# It is possible to do something about dietary or motor-vehicle health threats. Since it is much harder to know how existential threats should be minimized{{ref|Minimize}}, they tend to be ignored. High technology societies tend to become "[[Cultural Theory of risk#Hierarchist|hierarchist]]" or "[[Cultural Theory of risk#Fatalist|fatalist]]" in their attitudes to the ever-multiplying risks threatening them. In either case, the average member of society adopts a passive attitude to risk minimization, culturally, and [[psychology|psychologically]].
# The bias in popular culture is to relate extinction scenario stories with non-extinction outcomes. (None of the 16 'most notable' [[World War III in popular culture#Film and television|WW3 scenarios in film]] are resolved by human extinction, for example.{{ref|JournalOfReligionAndFilm}})
# The threat of nuclear annihilation actually was a daily concern in the lives of many people in the 1960s and 1970s. Since then the principal fear has been of localized [[terrorism|terrorist]] attack, rather than a global war of extinction; contemplating human extinction may be out of fashion.
# Some people have philosophical reasons for doubting the possibility of human extinction, for instance the [[final anthropic principle]], [[plenitude principle]] or [[intrinsic finality]].
# [[Amos Tversky|Tversky]] and [[Daniel Kahneman|Kahneman]] have [[experimental economics|produced evidence]] that humans suffer [[cognitive bias]]es which would tend to minimize the perception of this unprecedented event:
## ''Denial'' is a negative "[[availability heuristic]]" shown to occur when an outcome is so upsetting that the very act of thinking about it leads to an increased refusal to believe it might occur. In this case, [[imagination|imagining]] ''human extinction'' probably makes it seem less likely.
## In cultures where human extinction is not expected the proposition must overcome the "[[disconfirmation bias]]" against heterodox theories.
## Another reliable [[psychology|psychological]] effect relevant here is the "[[Positive outcome bias (prediction)|positive outcome bias]]".
## [[Behavioural finance]] has strong evidence that [[recency effect|recent]] evidence is given undue significance in [[risk analysis]]. Roughly speaking, "100 year storms" tend to occur every twenty years in the [[stock market]] as traders become convinced that the current good times [[Dot-com bubble#Free spending|will last forever]]. Doomsayers who hypothesize rare [[stock market crash|crisis-scenarios]] are dismissed even when they have statistical evidence behind them. An extreme form of this bias can diminish the [[Bayesian probability#Varieties of Bayesian probability|subjective probability]] of the unprecedented{{ref|unprecedented}}.
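The frequentist limitation noted in point 2 of the list above can be made concrete with Laplace's rule of succession, a standard Bayesian device for assigning a non-zero probability to an event that has never been observed. The sketch below is illustrative only; the 2,000-century observation window (roughly the age of ''Homo sapiens'') and the treatment of centuries as independent trials are simplifying assumptions, not claims drawn from the sources cited in this article.

<source lang="python">
def laplace_estimate(successes, trials):
    """Posterior-mean probability of an event under a uniform prior, after
    observing `successes` occurrences in `trials` opportunities
    (Laplace's rule of succession)."""
    return (successes + 1.0) / (trials + 2.0)

# Homo sapiens has existed for very roughly 2,000 centuries with zero
# observed extinctions. A naive frequentist estimate of the per-century
# extinction probability is 0/2000 = 0; the Bayesian estimate is small
# but non-zero:
print(laplace_estimate(0, 2000))  # ~0.0005 per century, under these assumptions
</source>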
In general, humanity's sense of [[self preservation]] and [[intelligence (trait)|intelligence]] are considered to offer safeguards against extinction. It is felt that people will find [[creative]] ways to overcome potential threats, and will observe the [[precautionary principle]] when attempting dangerous [[innovations]]. The arguments against this are, firstly, that the management of destructive technology is becoming more difficult, and secondly, that the precautionary principle is often abandoned whenever the reward appears to outweigh the risk. At least one instance where the principle may have been overruled occurred before the [[Trinity test|Trinity]] nuclear test, when one of the project's scientists ([[Edward Teller|Teller]]) speculated that the [[Nuclear fission|fission]] explosion might destroy New Mexico, and possibly the world, by igniting a reaction in the nitrogen of the atmosphere. A calculation by [[Hans Bethe]] showed such a possibility to be theoretically impossible, but the fear remained among some until the test took place. (See ''Ignition of the atmosphere with nuclear bombs'', LA-602, [http://www.fas.org/sgp/othergov/doe/lanl/docs1/00329010.pdf online] and [[Manhattan Project]].)
==Observations about human extinction==
The fact that the majority of species that have existed on Earth have become extinct has led to the suggestion that all species have a finite lifespan, and thus that human extinction is inevitable. David Raup and Jack Sepkoski found, for example, a twenty-six-million-year periodicity in elevated extinction rates, caused by unknown factors (see [[David M. Raup]], ''Extinction: Bad Genes or Bad Luck?'', Norton, 1992). Based on evidence of past extinction rates, Raup and others have suggested that the average longevity of an invertebrate species is between 4 and 6 million years, while that of vertebrates seems to be 2 to 4 million years. The shorter survival of vertebrates reflects their position further up the food chain than many invertebrates, and therefore a greater liability to suffer the effects of environmental change. A counter-argument is that humans are unique in their adaptive and technological capabilities, so reliable inferences about the probability of human extinction cannot be drawn from the past extinctions of other species. Certainly, the evidence collected by Raup and others suggests that generalist, geographically dispersed species like humans generally have a lower rate of extinction than species that require a particular habitat. In addition, the human species is probably the only species with conscious prior knowledge of its own possible demise, and would therefore be likely to take steps to avoid it.
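Raup's lifespan figures can be translated into an implied short-term risk if one assumes, purely for illustration, that species lifetimes follow an exponential (constant-hazard) distribution. That simplifying assumption belongs to this sketch, not to Raup; it is shown only to indicate the order of magnitude involved.

<source lang="python">
import math

def per_century_risk(mean_lifespan_years):
    """Probability of extinction within the next century, assuming species
    lifetimes are exponentially distributed (constant hazard) with the
    given mean. A simplifying assumption, not Raup's own model."""
    hazard = 1.0 / mean_lifespan_years      # extinctions per year
    return 1.0 - math.exp(-100 * hazard)    # P(extinct within 100 years)

# Raup's estimated mean lifespans for vertebrate species: 2-4 million years.
for mean in (2e6, 4e6):
    print("%.0e yr mean lifespan -> %.6f%% per century"
          % (mean, 100 * per_century_risk(mean)))
# -> about 0.005% and 0.0025% per century
</source>

Under this constant-hazard reading, the background risk is a few thousandths of one percent per century; whether such background rates apply to a generalist, technological species at all is exactly what the counter-arguments above dispute.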
Another human characteristic that may be unique is religious belief, which in most situations encourages respect for life but may also create the conditions for warfare and genocide. Consequently, thinkers such as [[Albert Einstein]] believed that "We shall require a substantially new manner of thinking if mankind is to survive."<ref>[http://nobelprize.org/nobel_prizes/peace/laureates/1985/press.html The Nobel Peace Prize 1985 - Presentation Speech]</ref>
Humans are very similar to other [[primate]]s in their propensity towards intra-species [[violence]]; [[Jared Diamond]]'s ''[[The Third Chimpanzee]]'' (ISBN 0-09-980180-9) estimates that 64% of hunter-gatherer societies engage in warfare every two years. Although it has been argued (e.g. in the [[UNESCO]] [[Seville Statement]]) that warfare is a cultural artifact, many [[anthropology|anthropologists]]{{Fact|date=February 2007}}<!--Who are the "many" anthropologists?--> dispute this, noting that small human tribes exhibit similar patterns of violence to [[chimpanzee]] groups, the most murderous of the primates and our nearest living [[genetics|genetic]] relatives. The '[[neopallium|higher]]' functions of reason and speech are more developed in the brain of ''[[human|Homo sapiens]]'' than in other primates, but the relative size of the [[limbic system]] is constant across [[ape]]s, [[monkey]]s and [[human]]s; as human rational faculties have expanded, so has the [[wetware]] of [[emotion]]. The combination of inventiveness and an urge to violence in humans has been cited as evidence against humanity's long-term survival{{ref|HumanSelfDestructionCitation}}.{{lopsided}}<!--This paragraph paints a dark portrait of humans, what counter-views exist?-->
===Omnicide===
Omnicide is human extinction as a result of human action. Most commonly it refers to extinction through [[nuclear warfare]],<ref>Somerville, John. 1981. ''Soviet Marxism and nuclear war : an international debate : from the proceedings of the special colloquium of the XVth World Congress of Philosophy''. Greenwood Press. Pg.151</ref><ref>Goodman, Lisl Marburg and Lee Ann Hoff. 1990. ''Omnicide: The Nuclear Dilemma''. New York: Praeger.</ref><ref>Landes, Daniel (ed.). 1991. ''Confronting Omnicide: Jewish Reflections on Weapons of Mass Destruction''. Jason Aronson Publishers.</ref> but it can also apply to extinction through means such as global [[anthropogenic]] [[Environmental disaster|ecological catastrophe]].<ref>Wilcox, Richard Brian. 2004. ''The Ecology of Hope: Environmental Grassroots Activism in Japan''. Ph.D. Dissertation, Union Institute & University, College of Graduate Studies. Page 55.</ref>
Omnicide can be considered a subcategory of [[genocide]].<ref>Jones, Adam. 2006. "A Seminal Work on Genocide", in ''Security Dialogue'', vol. 37(1), pp. 143-144.</ref> Using the concept in this way, one can argue, for example, that
<blockquote>the arms race is genocidal in intent given the fact that the United States and the Soviet Union are knowingly preparing to destroy each other as viable national and political groups.<ref>Santoni, Ronald E., 1987. "Genocide, Nuclear Omnicide, and Individual Responsibility" in ''Social Science Record'', vol. 24(2), pp.38-41.</ref></blockquote>
As this claim illustrates, the concept of omnicide raises issues of [[human agency]] and, hence, of [[moral responsibility]] in discussions about large-scale social processes like the [[nuclear arms race]] or ecologically destructive industrial production. That is, part of the point of describing a human extinction scenario as 'omnicidal' is to note that, if it were to happen, it would result not just from natural, uncontrollable [[evolution]]ary forces, or from some random catastrophe like an asteroid impact, but from deliberate choices made by human beings. This implies that such scenarios are preventable, and that the people whose choices make them more likely to happen should be held morally accountable for such choices. In this context, the label 'omnicide' also works to de-[[Normalization (sociology)|normalize]] the course of action it is applied to.
==Scenarios of the world without humans==
The book ''[[The World Without Us]]'' by [[Alan Weisman]] deals with a [[thought experiment]] on what would happen to the planet, and especially to man-made infrastructure, if humans suddenly disappeared. Weisman suggests that apes, having the highest IQ among animals other than humans, may be the species that succeeds humanity. The [[Discovery Channel]] film ''[[The Future is Wild]]'' shows a possible future of [[evolution]] on Earth without humans. The [[History Channel]] two-hour special ''[[Life After People]]'' examines the possible future of life on Earth without humans.
==See also==
*[[Disaster]]
*[[Doomsday event]]
*[[Extinction]]
*[[Extinction event]]
*[[Law of Limited Competition]] (if violated, [[Daniel Quinn]] predicts coextinction for humanity, in the novel ''[[Ishmael (novel)|Ishmael]]'')
*[[Novelty Theory]] (a mathematically, perhaps numerologically, derived [[eschatology]] with an arbitrary extinction mechanism)
*[[Risks to civilization, humans and planet Earth]]
*[[Voluntary Human Extinction Movement]]
*[[Mutual Assured Destruction]]
==Further reading==
*Cawthorne, N. (2004). ''Doomsday''. Arcturus Publishing Limited. ISBN 1-84193-238-8
*Leslie, J. (1999). [http://lifeboat.com/ex/risking.human.extinction ''Risking Human Extinction'']
*Leslie, J. (1996). ''The End of the World: The Science and Ethics of Human Extinction''. Routledge. ISBN 0-415-18447-9
== Notes ==
<div class="references-small">
<references />
{{note|Putnam}} [[John von Neumann|Von Neumann]] said it was ''"absolutely certain (1) that there would be a [[Nuclear warfare|nuclear war]]; and (2) that <u>everyone would die in it</u>"'' (underline added to quote from: ''The Nature of the Physical Universe'' – 1979, John Wiley & Sons, ISBN 0-471-03190-9, in H. Putnam’s essay ''The place of facts in a world of values'' - page 113). This example illustrates why respectable scientists are very reluctant to go on record with extinction predictions: they can never be proven right. (The quotation is repeated by Leslie (1996) on page 26, on the subject of [[Nuclear warfare|nuclear war]] annihilation, which he still considered a significant risk – in the mid 1990s.)
{{note|Minimize}} Although existential risks are less manageable by individuals than health risks, according to [[Ken Olum]], [[Joshua Knobe]], and Alexander Vilenkin the possibility of human extinction ''does'' have practical implications. For instance, if the “universal” [[Doomsday argument]] is accepted it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: ''"...you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life than we believe."'' Source: “Practical application” page 39 of the [[Princeton University]] paper: [http://www.princeton.edu/~jknobe/physics.pdf Philosophical Implications of Inflationary Cosmology]
{{note|JournalOfReligionAndFilm}} The 2000 review [http://www.unomaha.edu/jrf/armagedd.htm Armageddon at the Millennial Dawn] from ''The Journal of Religion and Film'' finds that ''"While end of the world threats perhaps are not avoidable, the cinematic formulation of millennial doom promotes the notion that the end can be averted through employing human ingenuity, scientific advance, and heroism."'' Since this review was conducted, there has been a Hollywood production which postulates a (far future) outcome in which humans are extinct (at least in the wild): [[A.I. (movie)|A.I.]].
{{note|unprecedented}} For research on this, see ''Psychological Science'' volume 15 (2004): ''[http://www.psycho.unibas.ch/fakultaet/angewandt/articles/hertwig-psysci04.pdf Decisions From Experience and the Effect of Rare Events in Risky Choice]''. The under-perception of rare events mentioned above is actually the opposite of the phenomenon originally described by [[Daniel Kahneman|Kahneman]] in "[[prospect theory]]" (in the original experiments the likelihood of rare events was over-estimated). However, further analysis of the [[cognitive bias|bias]] has shown that both forms occur: when judging from ''description'', people tend to over-estimate the stated probability, so this effect taken alone would suggest that reading the extinction scenarios described here should make the reader over-estimate the likelihood of any probabilities given. The effect more relevant to common consideration of human extinction, however, is the bias that occurs with estimates from ''experience'', which runs in the opposite direction: people who have never observed or experienced their species becoming extinct would be expected to dramatically under-estimate its likelihood. [[Sociobiologist]] [[E. O. Wilson]] argued that: "''The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the [[genus]] Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth.''" (Is Humanity Suicidal? ''New York Times Magazine'', [[May 30]], [[1993]])
{{note|HumanSelfDestructionCitation}} An [http://www.abrupt.org/EDITORIAL/despair.html Abrupt.org 1996 editorial] lists (and condemns) the arguments for humanity's tendency to self-destruction. In this view, the [[history]] of humanity suggests that humans will be the cause of their own extinction. However, others have reached the opposite conclusion from the same data on violence, hypothesizing that as societies develop armies and [[weapon]]s with greater destructive power, those weapons tend to be used less often. It is claimed that this implies a more secure future, despite the development of [[Weapons of mass destruction|WMD]] technology; as such, the argument may constitute a form of [[deterrence theory]]. Counter-arguments to such views include the following: (1) all weapons ever designed have ultimately been used, and states with strong military forces tend to engage in military aggression; (2) although modern states have so far generally shown restraint in unleashing their most potent weapons, whatever rational control was guaranteed by government monopoly over such weapons becomes increasingly irrelevant in a world where individuals have access to the technology of mass destruction (as proposed in ''[[Our Final Hour]]'', for example).
{{note|PlanForDestruction}} [http://www.religioustolerance.org/destruct.htm ReligiousTolerance.org says that ''Aum Supreme Truth'' is the only religion known to have planned Armageddon for non-believers]. Their intention to unleash deadly [[virus]]es is covered in ''[[Our Final Hour]]'', and by [http://www.rickross.com/reference/aum/aum276.html Aum watcher, Akihiko Misawa]. The [[Gaia Liberation Front]] advocates (but is not known to have active plans for) total human [[genocide]], see: [http://www.churchofeuthanasia.org/resources/glf/glfsop.html GLF, A Modest Proposal]. Leslie, 1996 says that Aum’s collection of nuclear physicists presented a doomsday threat from nuclear destruction as well, especially as the cult included a rocket scientist.
{{note|Unbiased}} Leslie (1996) discusses the [[survivorship bias]], which he calls an "observational selection" effect (page 139). He says that the ''[[A priori and a posteriori (philosophy)|a priori]]'' certainty of observing an "undisasterous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes [[Holger Bech Nielsen]]'s formulation: ''"We do not even know if there should exist some extremely dangerous decay of say the [[proton]] which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe."'' (From: Random dynamics and relations between the number of fermion generations and the fine structure constants, ''Acta Physica Polonica B'', May 1989.)
{{note|BillJoy}} For example, in the essay ''Why the future doesn't need us'', computer scientist [[Bill Joy]] argued that human beings are likely to guarantee their own extinction through [[transhumanism]]. See: [http://www.wired.com/wired/archive/8.04/joy_pr.html Wired archive, ''Why the future doesn't need us''].
{{note|2400}} For the “West Germany” extrapolation see: Leslie, 1996 (''The End of the World'') in the “War, Pollution, and disease” chapter (page 74). In this section the author also mentions the success (in lowering the birth rate) of programs such as the [[Sterilization (surgical procedure)|sterilization]]-for-[[INR|rupees]] programs in [[India]], and surveys other [[infertility]] or falling-birth-rate extinction scenarios. He says that voluntary small-family behaviour may be counter-[[evolution]]ary, but that the [[meme]] for small, rich families appears to be spreading rapidly throughout the world. The world population is expected to start falling around 2150.
{{note|AlienConquestUnlikely}} See an estimate of the probability of contact at [http://www.galactic-guide.com/articles/8R88.html galactic-guide]. Former [[NASA]] consultant [[David Brin]]'s lengthy rebuttal to [[SETI]] enthusiasts' optimism about alien intentions concludes: "The worst mistake of first contact, made throughout history by individuals on both sides of every new encounter, has been the unfortunate habit of making assumptions. It often proved fatal." ([http://www.setileague.org/iaaseti/brin.pdf See full text at SETIleague.org].)
</div>
{{Doomsday}}
<!--Categories-->
[[Category:Eschatology]]
[[Category:Extinction]]
<!--Other languages-->
[[he:אומניסייד]]
[[ru:Гибель человечества]]