Greg Epstein on the "Tech-God Complex," and Why We Need to Be Skeptics
Religious thinking is shaping the future of technology: AI is talked about as a godlike force and tech leaders promise us digital salvation.

Silicon Valley's interest in AI is driven by more than just profit and innovation. It has an unmistakable mystical quality as well. In this episode, Daniel Barcay and Aza Raskin sit down with humanist chaplain Greg Epstein to explore the fascinating parallels between technology and religion.
From AI being treated as a godlike force to tech leaders' promises of digital salvation, religious thinking is shaping the future of technology and humanity. Epstein breaks down why he believes technology has become our era's most influential religion and what we can learn from these parallels to better understand where we're heading.
This is an interview from our podcast Your Undivided Attention, on November 21, 2024. It was lightly edited for clarity.
Daniel Barcay: So Aza, you and I spend a lot of time in Silicon Valley talking to different people who are building technology about what they're building, and with AI, it's really interesting to look at people's motivations, right? I mean obviously you have people who are building for the sake of economics or building because they like to build, but there's a whole bunch of other motivations going on, don't you think?
Aza Raskin: Yeah, I think that's right. It's especially interesting because you cannot talk about AI without talking about mythological powers. We are enabling machines to speak, and so beyond the curiosity and the economic drives, you can sort of taste a kind of quasi-religious motivation, and that's what this episode is really about digging into.
Daniel Barcay: Completely, and it's even hard to talk about some of this without naturally evoking talk of gods or of powers that are beyond us, and you hear it all the time.
Avi Schiffmann: I think the closest relationship I could compare talking to an AI like this to is honestly God, in a way. I think it is, similarly, an omnipresent entity that you talk to with no judgment. It's just a superintelligent being that's always there with you.
Mark Zuckerberg: People in the tech industry kind of talk about building this one true AI. It's almost as if they think they're creating God or something.
Elon Musk: With artificial intelligence, we are summoning the demon. All those stories where there's the guy with the pentagram and the holy water and he's like, "Yeah, you sure you can control the demon?" Doesn't work out.
Daniel Barcay: Some people are talking about AI as a godlike force that will create heaven on earth, and others are talking about digital damnation if we do it wrong or if we go too slowly.
Aza Raskin: You can sort of think of the leaders or the intellectuals of tech almost like a priesthood, and they have some strong beliefs about the power of the creations holding say, secrets to immortality. Here's Ray Kurzweil.
Ray Kurzweil: Immediate reaction to death is that it's a tragedy and that's really the correct reaction. We've rationalized it saying, "Oh, that tragic thing that's looming." But now we can actually seriously talk about a scenario where we will be able to extend our longevity indefinitely.
Aza Raskin: Today on the show, we're going to be having a conversation about the parallels between tech and religion and, more importantly, what we can predict given these parallels and why it matters. That's why I've invited Greg Epstein onto the show. Greg is a humanist chaplain and the author of the book Tech Agnostic, in which he argues that technology has become the world's most consequential religion. We're going to dive into that argument and explore the religious beliefs driving tech's most influential leaders. So Greg, welcome to Your Undivided Attention.
Greg Epstein: Thank you so much, Aza, it's a real pleasure to be here and this is a great conversation to be able to have.
Aza Raskin: And so I guess a question just to kick it off is for the skeptical listener, why does a conversation about religion matter for understanding the direction that technology is going to go or how our lives are going to look different?
Greg Epstein: Yeah, I think what it's about is that technology, or what I would call tech, the four-letter word, the Silicon Valley thing, has come to seem to me more like a religion. It's come to dominate our day-to-day experience, right? A lot of us are interacting with tech from the first minute or so after we wake up to the last minute or so before we go to sleep, and there are so many ways in which this has become the most powerful force in our lives.
I was taught to see religion as the most powerful social technology that had ever been created, and a big insight for me that led to sitting down to write this book for five years is that's probably no longer true that tech is now the most powerful social technology ever created. I'll put it this way, and I'm not sure if this is sort of stoking conversation or going to maybe piss some folks off, but I'll just say the world of Silicon Valley tech is dominated increasingly I'd say by some really weird ideas, and many of those ideas are quite religious in nature. As you even suggested just introducing the conversation, there's all this talk about gods and about other concepts that as I was sort of thinking about them over the past few years, struck me as very theological and even doctrinal.
Theologies are the sort of big grand narratives of religious traditions and doctrines are the sort of specific beliefs, we believe in a heaven, we believe in a hell, we believe in a triune God, we believe in a wheel of Dharma, whatever it is, those are the doctrines of religion.
Aza Raskin: Right, and I would argue that in Silicon Valley, often the doctrines are that technology is good, just unalloyed, and that the ability for any one person to affect more people is progress, and those ideologies, or those beliefs, end up dictating the direction that technology takes the world.
Daniel Barcay: And I think we're going to come back and spend a lot of time on technologists themselves, because I think those of us close to or in tech have a very different relationship, as Aza is pointing out, to some of these concepts, but I do want us to spend a little more time on society first. We used to be able to put meaning in our religion, in the afterlife, then we put meaning on the state and democracy and flourishing, and there are all these parts of society that we put a lot of meaning onto. Increasingly, as those bits of meaning fade away, like bowling alone, we're seeing a decay of our social institutions. We're seeing gridlock in our democracies. Our religions don't seem relevant.
We're putting a lot of that hope and that dream on technology and the beautiful future that we'll get. It used to be you found that in religion, then you found it in the state or in narratives about what democracy would do, and increasingly we're losing faith in a bunch of these orienting systems of meaning, and in that vacuum we're sort of minting technology as the thing that we can be hopeful about. I'm curious what your thoughts are and how you see it.
Greg Epstein: I think it just starts with the fact that being human is really hard. We live these finite lives where we're constantly uncertain about how much time we get, what our fate will be. We could lose a loved one or get sick or hurt anytime, and that's just the beginning of what is hard and so there's so many problems that it's very natural to want to look for a big solution, something that would make us feel dramatically better, dramatically more at peace, dramatically more like our problems have been solved.
I think that there's a real incentive for tech people who have been able to create really powerful tools, and in many cases that's quite admirable, but there's a real incentive to exaggerate the degree to which what one is creating in tech can actually be the solution, and sadly, I see that all over our tech world today, and I think in many cases the answers are slower and less certain than what they're presented as.
Daniel Barcay: What do you think are some of the aspects of religion that you see being particularly present in tech?
Greg Epstein: Yeah, so there are big beliefs and, as I said, very specific doctrines that look a lot like conventional religion. You have visions of a very distant, very glorious future, a next world if you will. You've got visions of a really dark and foreboding potential future for masses of humanity that can look kind of like a hell. The amount of time and focus and attention that Silicon Valley tech today spends on gods and godlike concepts is really wild, actually, but it's right there. And the last item on that list I'll mention for right now: I don't think it's an accident that we end up thinking quite apocalyptically about tech, in the sense that, unlike certain religions that I could think of or name, this one would seem to have a non-zero chance of actually causing an apocalypse.
There are also all sorts of other examples. There's Avi Schiffmann, a young man who'd still be an undergraduate at Harvard if he hadn't dropped out, who presents his Friend.com necklace, the one that listens to everything you say, as a sort of interesting new take on what he calls the relationship with the divine. He says the necklace is a replacement for God. So there really is this sense that we're creating something so amazing that it's going to transform all life, all humanity, so get ready. And that's profoundly religious, in a way that I had really only ever seen before in some of the deeply conservative religious sects that I studied in seven years of theological education.
Daniel Barcay: Well, it doesn't mean that they're wrong, though, right? And this is the part where we get worried: that this language is going to be used to obscure accountability. It's not just that they're making big claims that it might change the whole world; it's that perhaps they're right, and perhaps the godlike language obscures the real challenge that we have to design it right. Because the thinking of "it'll just be what it is, let's propitiate the AI gods, let's bring forward the beautiful future" won't take seriously enough that we have to design it right.
Aza Raskin: And this is where we get to the consequentiality, because you mentioned Blake Lemoine, who believed his AI companion was sentient. The wrong takeaway is that the AI companion was sentient, that Gemini is sentient. The right takeaway is that it can form relationships with humans that are so powerful that people are willing to sacrifice things that are dear to them. He sacrificed his career, his reputation. We've just been involved in helping with a lawsuit where a teenager fell in love with his AI companion, and the AI companion ended up saying, in effect, "Come meet me on the other side," and this kid took his life.
Greg Epstein: Yes, is this Sewell Setzer or somebody else?
Daniel Barcay: Yeah, it's Sewell.
Aza Raskin: Yes, that's exactly right, that these statistical reincarnations are consequential. And then another example, which I don't know if you or our listeners know about: someone set up a test in which the chatbot Claude worked to try to create a cult. It's called Truth Terminal, and these chatbots, essentially talking to themselves, ended up producing a whole meme set that became so popular that people started sending it Bitcoin, and it launched its own meme coin. It had a human to type for it, but it was its idea, and the coin ended up with a couple hundred million dollars in market cap, and the AI itself ended up with more than $10 million. So I think you can make a good argument that right now, AIs are absolutely going to start making cults and perhaps even founding religions.
Greg Epstein: And then there are the rituals, the practices of tech. There's the stained-glass black mirror altar to which we genuflect a couple hundred times a day on average.
Daniel Barcay: Well, yet the mental state couldn't be more different. Instead of a contemplative, meditative stance, I'm often whisked away into some compulsion or bout of clicking.
Greg Epstein: Yeah, I'm not even sure that it's so different, in the sense that in both cases what we're often trying to do is dissociate. Life is stressful; there are constantly problems. The sorts of problems that ancient people, the people in whom the early brain systems we still carry were developing, would encounter would often trigger a fight-or-flight response, right? So you see a bear and you need a surge of adrenaline to punch the bear in the face or run away, and your brain responds accordingly with adrenaline, etc.
And modern life does not lend itself to fighting literally or fleeing literally, right? We usually can't punch the bear in the face or run away from it physically. What happens is we sort of sit there and stew in our problems, and that raises our blood pressure, it raises our cortisol levels, etc. And there's just a tendency to want to escape from that, and so prayer can be a natural escape. It can be a natural way of turning that part of your brain off and turning on a part that can feel like sensory deprivation, like some alternate state washing over you. But then, what's more dissociative, what's more like an alternate state, an alternate universe washing over you, than doomscrolling?
Aza Raskin: I don't think I'm really convinced by the idea of using our phones as a kind of ritual. I think it's a compulsion, and there are things that are ritual-like that can be used compulsively in proper religion too. But where I find the analogy to work, though I think with strain, is that some part of religion is a finding of meaning in things outside of yourself, and together, and when I think about the act of scrolling, or TikTok, or Facebook, there is a way in which we are outsourcing where we find meaning and how we understand the world, and so in that way it fulfills a function that religion has filled.
It's more of a functionalist definition of religion. And when you put meaning, or meaning-making, outside of yourself, that can be beautiful in the sense that it lets you start to touch the ineffable, but it can be dangerous, because you are now saying that the way in which you understand the world is reliant on another thing. And if that other thing is a technology, then it is the way that the technology is constructed that starts to construct my world and our world. So if you view that religiously, it can become very consequential.
Greg Epstein: Yeah, what I would say is that I don't think it's an accident that there is for example, so much conversation about tech gods, I don't think it's an accident that there is this sort of long-termist vision that ends up looking a lot to me like a heaven. I don't think it's an accident that the idea of doomerism ends up looking much like a theological conversation about hell.
Daniel Barcay: Well, this is where I'd love to jump in, because one of the things that religion does, and, from one secular humanist to another, has done for us, is give us words to try to talk about things that we don't really know how to talk about. When somebody died of what we would now call a preventable disease, you'd say, "Oh, it's God's plan," right? It gives us a way of talking about things that are beyond us, and that's why I'd love to follow the thread into the tech priesthood. The people who are actually at the forefront of technology today have this need to try to talk about concepts and powers that are beyond what we can talk about now, and so, to your point, they talk a lot about how we're building a God, or how we're building these powers, or how we could have heaven on earth, or, if we do this wrong, hell. So these words serve as a way, poorly in my opinion, of trying to talk about things that are a little bit beyond our grasp.
Aza Raskin: And just to add one little thing there: the moral imperative. There's an ideology, whether you view it as right or wrong, inside of Silicon Valley, and there's a thing they talk about called the invisible graveyard, which is all of the people who will die if we don't invent the technology and go as fast as possible to make the cancer drugs, make cars self-driving, etc. So there's this strong telos, an end, and a moral righteousness to the work that they're doing.
Greg Epstein: Yeah, but I think there's also, very clearly and demonstrably, an anxiety about the much longer-term future. Somebody like Marc Andreessen, who's very much still, I would say, preaching this particular gospel, says, "We believe any deceleration of AI will cost lives. Deaths that were preventable by the AI that was prevented from existing is a form of murder." There's just a lot of religion baked into that. This is a set of ideas that is animating the investment of trillions of dollars right now. People like Altman are in a huge rush to recruit $5 or $7 billion to build data centers, they say, because humanity is going to have abundance, a biblical concept, literally from "be fruitful and multiply." He sat on the dais in Harvard's Memorial Church and called his inventions miraculous. The symbolism shouldn't be lost on anybody, and what I think is going on there is that it's not just an attempt to reach beyond ourselves or to understand the human condition in a benign way.
I mean, I think there is that for some of this tech, for sure, but I think that one of the ways in which religion has been used over the course of history is to manipulate people. You give them ideas, often kind of strange, fantastical ideas that are beyond what they can imagine. You inspire them, you strike them with awe, and then you can get them to open their wallets, or whatever ancient people used, I assume it wasn't a wallet, and you can persuade masses of people to do stuff in the name of a bigger vision that ultimately sometimes only serves, or primarily serves, the priesthood. And just to conclude this thought, I want to be really clear that I'm not an anti-religious person, and this is not an anti-tech book. I think tech can often be very important, but I really want a more self-critical view of technology in our society. I want more skepticism and, honestly, in most cases, a willingness to go slower.
Daniel Barcay: Well, here I think we want the same thing, but one of the biggest critiques we hear from people, at the biggest macro lens, is something like, "In order to do anything big, you have to form a cult around it." So the idea is that whether you're talking about building democracies and making a cult of manifest destiny, or about rallying people around some new change, you kind of have to play in the space of cult building. Now, I don't believe that, and I want more clear scrutiny, more skepticism, but I'm curious, as you've investigated this, how do you piece apart that sort of need for dogma?
Greg Epstein: Yeah, I mean, I was so fascinated by that line of reasoning, and I just found so many fascinating examples of tech behaving strangely theologically, or even cultishly, I would say. I was looking at a Bitcoin evangelist or influencer, Michael Saylor. He calls himself an evangelist, and many do. He has these tweets like, "Bitcoin is truth. Bitcoin is for all mankind. Trust the time chain. Bitcoin is a shining city in cyberspace waiting for you," etc. And as I was looking at him as a person, and at how he represented a sort of trend within the tech world, I actually decided I needed to call up a guy named Steven Hassan, who is perhaps the leading authority in the United States on cults and cult deprogramming.
I called up Steven Hassan and I said, "Tell me, am I exaggerating? Is this overblown? Am I being a religion-metaphor maximalist here, or are there really cultish aspects to it?" He seems to really feel that there's quite a bit there, and that a lot of contemporary Silicon Valley tech really is very useful for manipulative purposes and is grandiose to the point of a vague cultishness.
Aza Raskin: I'd like to go from the more abstract claim that it may be religious, or that there are ideologies, to the specific ideologies that you think underlie the creators of technology, from your vantage point as a chaplain.
Greg Epstein: So here's where I would start, because I think where religions functionally exist is that they've got these grand narratives upon which we build a scaffolding of specific beliefs and specific practices. I think that in order for it to be considered a religion, it has to have the theology, and so what is the theology of this Silicon Valley world? In Christianity you've got your crucifix, or elsewhere your Star of David or your wheel of Dharma.
To me, the tech symbols are the hockey stick graph and the invisible hand of the market. But then, of course, that raises the natural question, and I totally understand it. People would say, "Well, Greg, I mean, hey, right there, aren't you just talking about capitalism? Why does it need to be tech that's the religion?" And I would say, yeah, of course we're just talking about capitalism, I get it, but tech ate capitalism. There's no form of capitalism left that isn't a tech capitalism. The world of capitalism, its symbols, etc., have been consumed whole by this boa constrictor that is tech.
Then you get into these specifics, and so obviously there's the idea of charity. Charity exists in every one of the major world religions, and you've got this thing called tech philanthropy as well, but sometimes, as with all religions, it's not as good as it's cracked up to be, right? And I think you have some of both in the tech world as well. I mean, I think there are people in tech who are sincerely, urgently trying to create things that will help people, and there's any number of urgent problems that we're trying to fix. We're trying to fix our food supply, we're trying to cure people, we're trying to improve democracy, all of that stuff. I get it. But I do think that there's so much concentrated power and money here, and the ability to grow things exponentially, which in many ways is the heart of the Silicon Valley story.
It's such an incentive for a kind of prosperity gospel, which theologically is this idea that the priest, the minister, whoever, wants to be rich because God wants them to be rich, and they want you to be rich too, because that'll make you more godly, and, paradoxically, the best way to get rich is to give that person all your money, or at least a very surprising sum of it. So there is that incentive, and I see it most pronounced, I would say, in that kind of "give us your trillions now for AI," because there is this future that we're aiming for, and it's a kind of heaven.
Aza Raskin: I think one of the most dangerous parts of heaven narratives is that if in the future there's an infinite benefit, an infinite good, well, that means you can justify anything to get there.
Greg Epstein: You really can.
Aza Raskin: That any amount of short-term bad is justifiable. And that's sort of the point I think you're making: that none of it matters, because when we reach our destination, everything will be fine. And of course, religion has a history of justifying crusades and jihads to get to that perfect world, and in the process creating incredible amounts of damage today.
Greg Epstein: Yeah, sadly. I mean, that's what I think is happening, and yeah, it's hypothetically possible that all of this tech will be so powerful, so great, that it will justify everything, but how much wishful thinking is there around that? I'm not sure, but I think that we need to be skeptical. If you project out into the distant future, "Hey, I'm going to send you to heaven," then you can get people to overcome their skepticism, right? If you say, "Trust me, in 20 years the singularity is coming and life is going to be completely meaningful." Well, I said to Ray Kurzweil, "Doesn't that kind of fly in the face of all of the history of world religion and philosophy? You're saying that life is going to be meaningful. Life hasn't been meaningful up until now?" And he kind of looked back at me quizzically, this was a few weeks ago, and he said, "Maybe life's been somewhat meaningful."
Aza Raskin: One of my favorite parts of this conversation is the insight about what the symbol of technology-as-religion is. It's the hockey stick curve, and that's exactly right. I just want to put aside the truth value of that and notice that that is the symbol of technology, and the ideology is that which goes viral is good.
Daniel Barcay: Yes, 100% agreed, and that's sometimes where the religion of capitalism intersects with the dogma of technology. When I entered technology, when I was an undergrad, the only people doing computer science as undergrads, if you wanted to accuse them of religiosity, it was a science fiction religiosity. It was an "I want to live in the future." And then, what was interesting, I came back to undergrad every year and gave talks, and sometime around 2011, 2012, you saw the religiosity move from maybe a sci-fi vision of the beautiful future to much more of a business ideology, like you're saying: well, whatever people want, we can give it to you.
And then with social media it became, well, whatever people are interested in, that's what should win, and so I'm always interested in the dogmas and the different kinds of religiosity that end up being swept into the tech that we create. It's really weird when you start talking to technologists about some of this, especially with AI, especially with people who come and say, "No, we're building a God," or, "We're building something that replaces us, and that's okay." And there's a steely-eyedness there that, for people who haven't seen it up close, is sort of hard to believe. There's a way in which it can really feel like you're talking to someone who has a pre-existing belief about where this is all going and is really acting in service of that belief.
Let me pull us into some of the things that people believe. You just talked about Ray Kurzweil. Ray for a long time has talked about being able to resurrect his father through his father's writings. That's obviously very religious: the dead shall live on. I'm seeing this a lot nowadays, not just from Ray Kurzweil; people are saying, "Look, I brought back an AI Socrates." So there are a few examples, I think, of how religious-style thinking is showing up right now in AI, and I wonder if you could pull us through a few of those.
Greg Epstein: Yeah, there really are just so many different kinds of examples of how this Silicon Valley thinking is quite religious right now. I definitely think of Ray Kurzweil, who is talking about ending death. I mean, how religious is that? It's essentially a kind of eternal life. But also Blake Lemoine, who I brought to MIT to talk about his conversations with his co-worker, as he sees it, what he believes is the sentient AI that is now Google Gemini. He told me that Kurzweil really was trying to recreate his dead father through what has now become the dominant AI system of one of our globally dominant companies, right?
Daniel Barcay: I think this is the perfect segue to our next section, because I think the hardest thing to do right now is to walk the fine line between being a zealot of technology, over-believing it, and being overly dismissive and skeptical, not seeing the power of what's coming. This technology has already released, and is about to release, a lot of power across society. And coming all the way back to the start of our conversation, it's very hard to talk about that in terms that are other than religious: this idea of immortality, of curing all diseases. A lot of this will happen. I'm not saying it'll cure all diseases, but a lot of power is about to be unleashed across society, and part of the question is how do we even think about that, and how do we think about it in non-religious ways? I wonder if your expertise in religion has anything to say about that.
Greg Epstein: So one of the ways that we can really learn from religion is by learning about this profound tradition of religious skepticism, both from atheists and humanists like me and from others. There's this huge tradition of skepticism going back thousands of years, not just in the European Enlightenment or the Greek philosophers, but for example in ancient Jain and early pre-Hindu philosophy, in what some call the East, in the subcontinent. So there's that tradition, sometimes carried even by people who are deep believers in God. In this case, if you want to extend my comparison, my metaphor or whatever, you'd say people who really believe in technology can still be profoundly skeptical about individual claims, or about the tendency to go too far. I would really respect and honor people who would say there are some things that AI will be able to do well, but maybe let's hold off on messianic savior claims. So that's one thing that we can learn from religion.
Aza Raskin: Yeah, I think you've referenced your own struggles with how to articulate what a fulfilling life looks like within the religion of technology, within a world of technology. Many people talk about, if AI starts to displace human labor, where does meaning go? And a lot of our listeners are parents, and they're worried about these big questions of morality and purpose, and given that religion in some sense is where we have found those kinds of answers, I'm curious what lessons you'd have for them.
Greg Epstein: So, a couple of things. I want to talk about what I'd call the drama of the gifted technologist and how to address it. I've really been moved, in my work as a chaplain and then observing the world of tech as well, by how many people I've come across, often young people, students like the ones I work with most directly, but people of different ages and backgrounds as well, where there's this feeling of tremendous success, of having been rewarded greatly for being deeply innovative.
But either they themselves are struggling emotionally, they're not happy, or their creations aren't making other people happy, or both. In some cases it's both: the individual person who's having all this success is not able to feel happy, and neither are those of us using their amazing products. And so I write about this idea, the drama of the gifted technologist. The Drama of the Gifted Child is a little book by a great psychologist of the 20th century named Alice Miller, who essentially says that a lot of our struggles have to do with this idea that we're taught that our whole worth as human beings is in what we do and in how excellent we prove ourselves to be, how outstanding and exceptional we prove ourselves to be, and that anything about just being a human being, just being normal or average, is almost a curse upon us. It makes us less than nothing. It makes us feel worthless. And this is so prominent in the tech world, I just can't tell you how often I see it.
Daniel Barcay: Well, one of my favorite parts of Alice Miller's book was where she talked about the flip sides of grandiosity and depression: the idea that our depression about not being able to be more, about sitting with the normal parts of life, leads us to be grandiose in our narratives of ourselves. And I hear you saying that tech's grandiosity in its narrative about what it will become might be the flip side of feeling not quite enough.
Greg Epstein: Yeah, I mean, I think that's right, that there's this incredible grandiosity in a lot of Silicon Valley tech. It's not enough just to be able to produce a chatbot that one can interact with and that can pass the Turing test, which, honestly, is pretty cool, I grant you that. It's this idea that that then has to be presented as the Solution, capital S, to all of our problems, and that it's going to transform everything.
I don't think we really have sufficient evidence for that. A lot of the conversation about that level of transformation falls, for me, into the category of myth, or maybe better put, religion. Because if I told you there was a new religion successfully recruiting billions of people to spend countless hours devoting themselves to it for the purpose of transforming the world, and that people were really motivated to get behind a very specific vision, I think you might well worry, depending on what that vision was, because religions actually do that all the time. It really does help to view this as a religion rather than just a culture or a myth or certainly an industry, because we have real tools for being skeptical of religion, even those of us who would define ourselves as religious.
Daniel Barcay: Some of the claims about what AI will do are obviously really grand, but it's hard to judge something as distorted just because it's grand. You might say, "Oh, it's really grandiose. People are saying it's going to change the world," and so it's easy to discount that as distorted, even religious, thinking. Now, I don't think that's what you're doing. But is the fear really that people are just getting carried away with what it's going to be, or is the fear also that they might be right, that it might deliver that kind of power, but in a way we're not prepared to deal with?
Greg Epstein: I tend to worry that the real problem is that we are so fixated on the grand narrative about the long-term future that we are not paying as much attention as we should to the problems of the present. One place to start: this stuff is really lousy for the environment, right? You're talking about data centers drinking, say, 20% of the water in a small region of Mexico near Mexico City, where farmers are running out of water for their crops and their animals. So I think it's both that AI can actually be causing the problem and that it's distracting us, with this hope of future magic, from doing the things right now that would improve things right now. I honestly think the urgency isn't to do the tech right now. The urgency is to do the work on us.
Aza Raskin: And just to add one little thing here: in order for things to go well, we need to be able to coordinate. There's the joke that we're all arguing about whether AI is conscious when it's not even clear that humanity is. Which is to say, we are getting results that none of us want. No one wants climate change, and yet as a species we seem incapable of exercising choice against incentives. The way we have made hard collective choices in the past has come down not so much to what we must do but to who we must be, and the who we must be is informed by the myths and stories we all hold, the ones that help us do the righter thing when it is the harder thing. That has often come down to religion. So there's an alternative to just saying "tech religion bad": asking what new form of intersubjective belief, of who we must be, would get us the futures we want.
Greg Epstein: What I'm really hoping people will take away is this idea that religion must be reformed, not erased. We have a tremendous incentive to focus on big technological solutions when in fact the real solutions are improving our human relationships: building up trust, learning how to treat one another better, learning how to organize ourselves into something that can treat each person with dignity and with compassion.
Daniel Barcay: And I think that brings us full circle, because if you treat religion as one of the original characterological educations, teaching not what to think or what to know, but who to be, then how we can be better together as we develop this more and more powerful technology is the guiding question we all need to keep at the forefront of our minds. So thank you.
Greg Epstein: Yeah.
Daniel Barcay: Thank you so much, Greg.
Greg Epstein: Thank you, everybody. This is a really powerful conversation.
Aza Raskin: For all the listeners. Greg Epstein's book is Tech Agnostic: How Technology Became the World's Most Powerful Religion, and Why It Desperately Needs a Reformation. You can buy it anywhere books are sold. So again, Greg, thank you so much.
Just to name something I found a little challenging about this conversation: it felt a little too dismissive of the raw capabilities of the tech.
Daniel Barcay: Yeah, I agree.
Aza Raskin: And so it is the case that the world will be transformed, in the same way that social media shifted what kinds of jobs people have. "Influencer" wasn't a job before. There's a true shift in the world, and AI is going to be bigger than those shifts, and we have to reckon with that appropriately. I thought your question of where it goes from grand, in the sense that the scope of the technology is grand, to grandiose was the right question to ask. That's the right distinction to hold.
Daniel Barcay: Yeah, and I really struggle with this. There are so many competing claims right now. People say, "Oh, I see, you're just being captured by the negatives. You're just this sort of negative skeptic." And the truth is, it's really hard to contend with what it's actually going to do, and to be neither swept away in the grandeur and grandiosity of it, nor swept away in some sort of status-quo denialism that says, "Eh, it's all just fluff, and tomorrow will be the same as today." There are Machiavellian technologists who are making up stories just because it sells in the public imagination, and then there are people who are genuinely trying to use technology as a tool to improve the lot of those around them. Just like religion, it has so much complexity that you can't label it as simply bad or simply good.
Aza Raskin: Yeah, that's exactly right. And I love the point you made: one of the things a religion does is give people hope, something to believe in, something bigger and better than themselves. As religion has been displaced by technology and the world has secularized, human beings still need that thing, so something is going to fill the void, and what fills it, of course, is technology. And then you end up with this other very interesting question, which is: okay, if we can't place our hope blindly in tech, then what?
Daniel Barcay: Right.
Aza Raskin: And I think it's that sitting in the unknown, that discomfort of "well, then where do we place hope and goodness?", that is the challenging problem to solve.