Not As Crazy As You Think Podcast
Mental health is attainable for anyone, especially those labeled with mental illness. Join artist, memoir writer, and bipolar psychiatric survivor Jen Gaita Siciliano as she challenges our world's limited understanding of mental illness in interviews with artists, healers, educators, and shamans who offer fresh perspectives on mental health and creativity. Episodes also include Jen's personal writing on living as a bipolar creative, as well as news commentary that exposes psychiatry as a paradigm incompatible with the true landscape of the human mind. If you are ready for a new narrative on the mental realm, in a place that celebrates crazy and cool without penalty, then Not As Crazy As You Think is for you!
A.I. News: Open Letter To Pause, Existential Risks to Humanity, Its Supporters & Deniers (S5, E8)
In the episode "A.I. News: Open Letter To Pause, Existential Risks to Humanity, Its Supporters & Deniers (S5, E8)," I discuss the Open Letter to Pause Giant AI Experiments recently published by the Future of Life Institute and present the arguments for taking the risk analysis more seriously. Signers of the letter include Elon Musk, Emad Mostaque, Steve Wozniak, Max Tegmark, Tristan Harris, and Aza Raskin; I share some of their positions, drawing on recent articles, podcasts, and videos that discuss the dilemma.
#deeplearning #AIrevolution #humanextinction #generativeAI #blackbox #dontlookup #LivBoeree #Danielschmachtenberger #tristanharris #azaraskin #maxtegmark #eliezeryudkowsky #centerforhumanetechnology #lexfridman #Moloch #machinelearning #AGI
References:
Pause Giant AI Experiments: An Open Letter
https://futureoflife.org/open-letter/pause-giant-ai-experiments/
The A.I. Dilemma - March 9, 2023 by Center for Humane Technology
https://youtu.be/xoVJKj8lcNQ
Meditations on Moloch by Scott Alexander
https://slatestarcodex.com/2014/07/30/meditations-on-moloch/
Misalignment, AI & Moloch | Daniel Schmachtenberger and Liv Boeree
https://youtu.be/KCSsKV5F4xc
Pausing AI Developments Isn't Enough. We Need to Shut it All Down
https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/
Live: Eliezer Yudkowsky - Is Artificial General Intelligence too Dangerous to Build?
https://www.youtube.com/live/3_YX6AgxxYw?feature=share
The 'Don't Look Up' Thinking That Could Doom Us With AI
https://time.com/6273743/thinking-that-could-doom-us-with-ai/
Max Tegmark: The Case for Halting AI Development | Lex Fridman Podcast #371
https://youtu.be/VcVfceTsD0A
Please visit my website at: www.jengaitasiciliano.com
Don't forget to subscribe to the Not As Crazy As You Think YouTube channel @SicilianoJen
Connect:
Instagram: @jengaita
LinkedIn: @jensiciliano
Twitter: @jsiciliano