Longtermism is the idea that, because humanity's future is potentially vast, we could have a massive altruistic impact by positively influencing it. In this video, we illustrate the papers "The Case for Strong Longtermism" by Hilary Greaves and William MacAskill and "Astronomical Waste: The Opportunity Cost of Delayed Technological Development" by Nick Bostrom (links below). We'll examine two main ways we might most positively influence the far future: accelerating technological development and reducing existential risk, which is the risk of human extinction and of catastrophes so severe that they would permanently curtail humanity's potential. Advancing technological progress and preventing existential risk look much more compelling under a totalist view of population ethics, but they still look extremely important even under a person-affecting view.
Interested in donating to safeguard the long-term future of humanity? You can donate to an expert-managed fund at: www.givingwhatwecan.org/chari...
Clarification: Toby Ord estimates that the chance of an existential catastrophe (not just extinction) this century (not across all time) is around 1/6; his estimate for extinction only might be lower, and his estimate across all time is higher. For more on the concept of existential catastrophe, see www.existential-risk.org/conc... and forum.effectivealtruism.org/t...
Many thanks to Robert Miles, who narrated this video. He has a channel about AI safety; go check it out: / @robertmilesai
A huge thank you to Matthew Barnett, who made me read the papers I mentioned in this video. He's great and he has a blog on Substack: matthewbarnett.substack.com/
🟠 Patreon: / rationalanimations
🟢Merch: crowdmade.com/collections/rat...
🔵 Channel membership: / @rationalanimations
🟤 Ko-fi, for one-time and recurring donations: ko-fi.com/rationalanimations
If you enjoyed this video, help others enjoy it by adding captions in your native language: support.google.com/youtube/an...
-----------------------------------------------------------------------------------------------------------------
Sources:
The Case for Strong Longtermism, by Hilary Greaves and William MacAskill: globalprioritiesinstitute.org...
Astronomical Waste: The Opportunity Cost of Delayed Technological Development: www.nickbostrom.com/astronomi...
-------------------------------------------------------------------------------------------------------------------
Many thanks to our first patron, Francisco Lillo ^^
20 Jun 2021