
Computer scientist Timnit Gebru and philosopher Émile P. Torres coined the acronym "TESCREAL" in 2023.

TESCREAL is an acronym neologism, proposed and advocated by computer scientist Timnit Gebru and philosopher Émile P. Torres, standing for transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism. [1] Gebru and Torres argue that these ideologies should be treated as an "interconnected and overlapping" group with shared origins, [1] and allege that the movement allows its proponents to invoke the threat of human extinction to justify societally expensive or detrimental projects. They consider it pervasive in social and academic circles in Silicon Valley centered on artificial intelligence. [2] As such, the acronym is sometimes used to criticize a perceived belief system associated with Big Tech. [3] [2] [4] [5]

Origin

Gebru and Torres coined the "TESCREAL" acronym in 2023, first using it in a draft of a paper titled "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence". [1] [3] The paper was later published in First Monday in April 2024, though Torres and Gebru had popularized the term elsewhere before its publication. According to Gebru and Torres, transhumanism, extropianism, singularitarianism, (modern) cosmism, rationalism, effective altruism, and longtermism are a "bundle" of "interconnected and overlapping ideologies" that emerged from twentieth-century eugenics, with shared progenitors. [1] They use the term "TESCREAList" to refer to people who subscribe to, or appear to endorse, most or all of the ideologies captured in the acronym. [1] [2]

Analysis

According to critics of these philosophies, TESCREAL describes overlapping movements endorsed by prominent individuals in the tech industry to provide intellectual backing for pursuing and prioritizing projects including artificial general intelligence (AGI), life extension, and space colonization. [1] [3] [6] Science fiction author Charles Stross, using the example of space colonization, argued that the ideologies allow billionaires, driven by a right-wing interpretation of science fiction, to justify massive personal projects on the grounds that not pursuing them would present an existential risk to society. [7] Gebru and Torres write that, by invoking the threat of extinction, TESCREALists can justify "attempts to build unscoped systems which are inherently unsafe". [1] Media scholar Ethan Zuckerman argues that by considering only the goals valued by the TESCREAL movement, its adherents can justify futuristic projects whose more immediate harms include racial inequity, algorithmic bias, and environmental degradation. [8]

Philosopher Yogi Hale Hendlin has argued that by both ignoring the human causes of societal problems and over-engineering solutions, TESCREALists ignore the context from which many problems arise. [9] Camille Sojit Pejcha wrote in Document Journal that TESCREAL is a tool for tech elites to concentrate power. [6] Dave Troy described TESCREAL in The Washington Spectator as an ends-justify-the-means movement that is antithetical to "democratic, inclusive, fair, patient, and just governance". [3] Gil Duran wrote that "TESCREAL", "authoritarian technocracy", and "techno-optimism" were phrases being used in early 2024 to describe a new ideology emerging in the tech industry. [10]

Gebru, Torres, and others have likened TESCREAL to a secular religion due to its parallels to Christian theology and eschatology. [1] [2] [7] [11] Writers in Current Affairs compared these philosophies and the ensuing techno-optimism to "any other monomaniacal faith... in which doubters are seen as enemies and beliefs are accepted without evidence". They argue pursuing TESCREAL would prevent an actual equitable shared future. [12]

Ozy Brennan, writing in a magazine affiliated with the Centre for Effective Altruism, criticized Gebru and Torres's approach of grouping disparate philosophies together as if they were a "monolithic" movement. Brennan argues that Torres has misunderstood these philosophies and taken philosophical thought experiments out of context. [13] James Pethokoukis, of the American Enterprise Institute, objected to criticism of alleged TESCREAL proponents, arguing that the tech billionaires criticized in a Scientific American article for allegedly espousing TESCREAL had used their wealth to achieve significant advances for society. [14]

Artificial General Intelligence (AGI)

Much of the discourse around existential risk from AGI occurs among supporters of the TESCREAL ideologies. [3] [8] [15] TESCREALists are considered either "AI accelerationists", who see AI as the only way to achieve a utopian future in which problems are solved, or "AI doomers", who consider AI likely to be misaligned with human survival and to cause human extinction. [8] [11] Despite the risk, many "AI doomers" argue that existential risk can be averted only by developing and aligning AGI first. [16]

Gebru has likened the conflict between "AI accelerationists" and "AI doomers" to a "secular religion selling AGI enabled utopia and apocalypse". [11] Torres and Gebru argue that both groups use hypothetical AI-driven apocalypses and utopian futures to justify unlimited research, development, and deregulation of technology. By considering only far-reaching future consequences, creating hype for unproven technology, and fear-mongering, Torres and Gebru allege TESCREALists distract from the impacts of technologies that disproportionately harm minorities and negatively impact society and the environment. [1] [4]

Bias against minorities

Gebru and Torres argue that TESCREAL ideologies directly originate from twentieth-century eugenics, [1] and argue that the bundle of ideologies advocates for a second wave of new eugenics. [1] [17] Others have similarly argued that the TESCREAL ideologies developed from earlier philosophies that have been used to provide justification for mass murder and genocide. [6] [16] Some prominent figures who have contributed to TESCREAL ideologies have been alleged to be racist and sexist. [1] [15] [18] [19]

Alleged "TESCREALists"

Venture capitalist Marc Andreessen has self-identified as a TESCREAList. [8] He published "The Techno-Optimist Manifesto" in October 2023, which Jag Bhalla and Nathan J. Robinson have described as a "perfect example" of the TESCREAL ideologies. [12] In the document, he argued that more advanced artificial intelligence could save countless future potential lives, and that those working to slow or prevent its development should be condemned as murderers. [8] [6]

Elon Musk has been described as sympathetic to some TESCREAL ideologies. [4] [20] [18] In August 2022, Musk tweeted that William MacAskill's longtermist book What We Owe the Future was a "close match for my philosophy". [21] Some writers consider Musk's Neuralink to be pursuing TESCREAList goals. [4] [20] Some AI experts have criticized the focus of Musk's xAI company on existential risk, arguing that it and other AI companies have ties to TESCREAL movements. [22] [23] Dave Troy traces Musk's natalist views to TESCREAL ideals. [3]

Sam Altman and much of the OpenAI board have been described as supporting TESCREAL movements, especially in the context of Altman's attempted firing in 2023. [22] [24] [11] [19] Gebru and Torres have urged Altman against pursuing TESCREAL ideals. [5]

Self-identified transhumanists Nick Bostrom and Eliezer Yudkowsky, both influential in discussions around existential risk from AI, [19] have also been described as leaders of the TESCREAL movement. [2] [4] [15] [19]

Sam Bankman-Fried, former CEO of the FTX cryptocurrency exchange, was a prominent and self-identified member of the effective altruist community prior to his arrest, [25] and has been described as a TESCREAList [1] [3] or influenced by the ideologies. [18] After FTX's collapse, administrators of the bankruptcy estate raised concerns over Bankman-Fried's donations to help purchase property used for conferences and workshops associated with longtermism, rationalism, and effective altruism. The property was used for controversial conferences that hosted liberal eugenicists and other speakers with racist and misogynistic histories. According to The Guardian, people within the TESCREAL movement may have continued to benefit from money obtained via the fraud at FTX even after bankruptcy proceedings commenced. [18]

Longtermist and effective altruist William MacAskill, who had frequently collaborated with Bankman-Fried to coordinate philanthropic initiatives, has been described as a TESCREAList. [1] [3] [8]

References

  1. Gebru, Timnit; Torres, Émile P. (April 14, 2024). "The TESCREAL bundle: Eugenics and the promise of utopia through artificial general intelligence". First Monday. 29 (4). doi:10.5210/fm.v29i4.13636. ISSN 1396-0466. Archived from the original on July 1, 2024. Retrieved June 27, 2024.
  2. Torres, Émile P. (June 15, 2023). "The Acronym Behind Our Wildest AI Dreams and Nightmares". Truthdig. Retrieved October 1, 2023.
  3. Troy, Dave (May 1, 2023). "The Wide Angle: Understanding TESCREAL — the Weird Ideologies Behind Silicon Valley's Rightward Turn". The Washington Spectator. Archived from the original on June 6, 2023. Retrieved October 1, 2023.
  4. Ahuja, Anjana (May 10, 2023). "We need to examine the beliefs of today's tech luminaries". Financial Times. Archived from the original on December 11, 2023. Retrieved October 1, 2023.
  5. Russell, Melia; Black, Julia (April 27, 2023). "He's played chess with Peter Thiel, sparred with Elon Musk and once, supposedly, stopped a plane crash: Inside Sam Altman's world, where truth is stranger than fiction". Business Insider. Archived from the original on October 11, 2023. Retrieved October 1, 2023.
  6. Pejcha, Camille Sojit (May 23, 2024). "Techno-futurists are selling an interplanetary paradise for the posthuman generation—they just forgot about the rest of us". Document Journal. Archived from the original on June 29, 2024. Retrieved June 29, 2024.
  7. Stross, Charles (December 20, 2023). "Tech Billionaires Need to Stop Trying to Make the Science Fiction They Grew Up on Real". Scientific American. Archived from the original on June 26, 2024. Retrieved June 27, 2024.
  8. Zuckerman, Ethan (January 16, 2024). "Two warring visions of AI". Prospect. Archived from the original on July 1, 2024. Retrieved June 29, 2024.
  9. Hendlin, Yogi Hale (April 1, 2024). "Semiocide as Negation: Review of Michael Marder's Dump Philosophy". Biosemiotics. 17 (1): 233–255. doi:10.1007/s12304-024-09558-x. ISSN 1875-1342. Archived from the original on July 1, 2024. Retrieved June 29, 2024.
  10. Duran, Gil (February 12, 2024). "The Tech Plutocrats Dreaming of a Right-Wing San Francisco". The New Republic. ISSN 0028-6583. Retrieved July 2, 2024.
  11. Piccard, Alexandre (November 30, 2023). "The Sam Altman saga shows that AI doomers have lost a battle". Le Monde. Archived from the original on July 1, 2024. Retrieved June 30, 2024.
  12. Bhalla, Jag; Robinson, Nathan J. (October 20, 2023). "'Techno-Optimism' is Not Something You Should Believe In". Current Affairs. ISSN 2471-2647. Retrieved July 2, 2024.
  13. Brennan, Ozy (June 2024). "The "TESCREAL" Bungle". Asterisk. Archived from the original on June 12, 2024. Retrieved June 18, 2024.
  14. Pethokoukis, James (January 6, 2024). "Billionaires Dreaming Of a Sci-Fi Future Is a Good Thing". American Enterprise Institute. Archived from the original on June 27, 2024. Retrieved July 1, 2024.
  15. Helfrich, Gina (March 11, 2024). "The harms of terminology: why we should reject so-called "frontier AI"". AI Ethics. doi:10.1007/s43681-024-00438-1. ISSN 2730-5961. Archived from the original on July 1, 2024. Retrieved June 29, 2024.
  16. Van Rensburg, Wessel (June 7, 2024). "AI and the quest for utopia". Vrye Weekblad. Archived from the original on June 30, 2024. Retrieved June 30, 2024.
  17. Torres, Émile P. (November 9, 2023). "Effective Altruism Is a Welter of Lies, Hypocrisy, and Eugenic Fantasies". Truthdig. Retrieved June 30, 2024.
  18. Wilson, Jason; Winston, Ali (June 16, 2024). "Sam Bankman-Fried funded a group with racist ties. FTX wants its $5m back". The Guardian. ISSN 0261-3077. Archived from the original on July 1, 2024. Retrieved June 29, 2024.
  19. Brownell, Claire (November 27, 2023). "Doom, Inc.: The well-funded global movement that wants you to fear AI". The Logic. Retrieved July 2, 2024.
  20. Kandimalla, Sriskandha (June 5, 2024). "The dark side of techno-utopian dreams: Ethical and practical pitfalls". New University. Archived from the original on June 30, 2024. Retrieved June 30, 2024.
  21. Kulish, Nicholas (October 8, 2022). "How a Scottish Moral Philosopher Got Elon Musk's Number". The New York Times. ISSN 0362-4331. Retrieved July 2, 2024.
  22. Goldman, Sharon (July 24, 2023). "Doomer AI advisor joins Musk's xAI, the 4th top research lab focused on AI apocalypse". VentureBeat. Archived from the original on June 29, 2024. Retrieved June 29, 2024.
  23. Torres, Émile P. (June 11, 2023). "AI and the threat of "human extinction": What are the tech-bros worried about? It's not you and me". Salon. Archived from the original on June 30, 2024. Retrieved June 29, 2024.
  24. Melton, Monica; Mok, Aaron (November 23, 2023). "'Black Twitter' asks 'What if Sam Altman were a Black woman?' in the wake of ouster". Business Insider. Archived from the original on March 3, 2024. Retrieved June 29, 2024.
  25. Wenar, Leif (March 27, 2024). "The Deaths of Effective Altruism". Wired. ISSN 1059-1028. Retrieved July 2, 2024.