Episode Transcript
[00:00:04] Hi everybody. Welcome along to another episode of the Dispatchers podcast. My name is Brendan Malone. It is great to be back with you again. And today we are going to be talking about, once again, artificial intelligence, digital narcissism and human dignity. And this is our sixth and final episode in the six part series. So let's just jump straight in, shall we? And as per usual, all glory goes to God for anything good that you receive in this episode. All the rubbish, that's my fault, and please God, you forget it before you've even logged off today. Now, you remember last episode, number five, we were talking about AI, and I really wanted to make the point that I think a good, healthy approach to AI, certainly from a Christian perspective, is one where we are able to recognize places where it could actually be utilized for the good of the human person. So it could be utilized in moral ways, ways which contribute to human flourishing and don't harm human flourishing. I think an important distinction to make here too is that you can have an action that isn't necessarily immoral in and of itself, like a use of AI, but that could fall into excess. And if it falls into excess, then it starts to destroy or harm human flourishing. I guess it's just like social media or, you know, watching beautiful films or viewing beautiful works of art.
[00:01:25] If that's all you do with your time and you don't attend to your family's needs and their wellbeing because all you're doing is sitting in front of beautiful works of art all day, then there's a problem there. And so I think it's important to make that distinction. So even with cases, or uses, of AI which might not be morally problematic, I think we also need to ask that question: is this contributing to human flourishing as well? What I want to do in this episode is move into some areas where I think we can see some clear harms and where the dangers actually exist with AI. And these are things that I think we need to be aware of. And again, as with last time, this is by no means intended to be an exhaustive list. I'm sure you can think of other potential harms as well. It seems that every other week there is some new development with AI and the way that it is being utilized which raises moral questions. And so this is not exhaustive by any means, but I think it's a good place to start. But before I get into that, I want to lay a little bit of a framework, and the two things I want to point out are these. Number one is that AI is not like other technology. And I think this is really important to understand. There's two differences I see, and these are two really important things to understand.
[00:02:38] Number one is that artificial intelligence has arrived into an already overly tech dependent society.
[00:02:47] Now obviously it's not like AI would have suddenly arrived out of nowhere, just evolved out of nothing. You know, some sort of magical appearance of AI 50, 60, 70 years ago. It's part of a developmental chain of progress. And as technology has evolved, you know, AI has sort of arrived on the end of that chain.
[00:03:08] But the point is that if it had arrived 50, 60, 70 years ago, when we weren't so mired in the problem of tech dependency, personal device addictions, all that kind of stuff, then I think it would have been a very different picture. I think the risks would not have been the same. And I think that's important to understand, and important for us to get our head around, because before we even engage with AI itself, we already need to acknowledge the fact that we have a problem, a deeper problem, a foundation, if you like, that is a bit mushy. And so when you add AI onto that, you're compounding an already existing problem. So you have an overly tech dependent society that now has what seems like technological sorcery at its disposal, and you've got a recipe for a very serious potential disaster in that perfect storm. So I think that's important to keep in mind. The other thing I think it's important to recognize is that we need to think of AI technology as different from other types of technology. Now, this is also true of social media. But because we're talking about AI in particular here, we're obviously going to apply it in this case. But I think it's important to consider AI, and also, as I said, social media, both as different from other forms of technology. They are not like typical benign forms of technology. And this is where some of the comparisons that people are making are actually flawed and they fall down. So I've heard people saying, oh no, no, no, don't panic, this is just like the advent of the printing press or the invention of the automobile. No, there's something very different going on here with this particular type of technology.
[00:04:48] Those former types of technology, they are benign. They require a human agent to actually be utilized for good or for evil. And so, for example, you can get in a car and you can drive it to go and see an elderly sick relative that previously you hadn't been able to see as regularly because they were too far away and it was inaccessible to you. And now with a motorcar, you can do that very quickly and efficiently. You can create ambulances which get people to hospital very efficiently and quickly, or you can use a motor vehicle to commit armed robberies or hit and runs or other forms of evil. But the technology itself, once the human operator has left the vehicle, there is nothing there. It's just dormant, it's not doing anything. It's not interacting in any way at all. It's just present. Now I get it. You can, I think, make a reasonable argument about the fact that even in its dormancy there is a certain interaction with the environment, but it's very low level.
[00:05:48] Whereas something like AI, and social media as well, that's not how they interact. I think it's better to think of them like an animal rather than an axe. When you think about an axe: an axe is a tool, just like AI, just like social media. But that tool will only be utilized and can only come into effect once a human agent picks it up and begins swinging it, either to chop down a tree or to attack someone with, right? For good or for evil. But that's not what AI is like. It's not what social media is like. Think of it more like a species, some sort of living organism, that is being introduced into an already existing ecosystem.
[00:06:29] That particular technology is not simply dormant when human agents are not utilizing it. It is active and it's interacting with us and it has an effect on us. An axe doesn't really have any meaningful effect upon dopamine levels, for example. Social media, AI and other things do. There's a different way in which this technology is present in the world. And I think that's important to understand, because it helps us to avoid falling into these fallacious traps where we say things like, oh no, it's just like the printing press, or it's just like an automobile, you know, everyone panics about new technology. I mean, first of all, I think it's highly debatable how much panic there really was. You can find pockets of it for sure, but I don't think it was widespread. But leaving that aside, the simple truth is these are not the same kinds of technologies. And I think it's important to acknowledge that. Otherwise you'll get into trouble. Now, before we jump into specific harms, and I've tried to categorize the harms, by the way, and then cite examples for these harms, let me just quote again from this important document, Antiqua et Nova, that was published just over a year ago by the Vatican, about artificial intelligence: "Like any product of human creativity, AI can be directed toward positive or negative ends. When used in ways that respect human dignity and promote the well-being of individuals and communities, it can contribute positively to the human vocation."
[00:07:58] Yet, as in all areas where humans are called to make decisions, the shadow of evil also looms here, where human freedom allows for the possibility of choosing what is wrong. The moral evaluation of this technology will need to take into account how it is directed and used.
[00:08:15] So there is a moral component here. I am not particularly taken in by people who like to make claims like, oh, this is just an economic issue, or this is just a political question, or this is just a technological issue, it's got nothing to do with morality. In actual fact, human persons, we are moral agents, we are moral beings. It's part of who we are and who we are created to be. So the idea that you just separate and sever off morality as something secondary to technological concerns is just not logically sound at all. In actual fact, I would argue that the moral consideration is the primary consideration, probably alongside the anthropological as well, like human dignity, but certainly human dignity. Again, it's really about that question of, am I respecting the dignity of another? Which is ultimately a moral question. So morality is the primary question, and those other things are actually secondary. So in every action, in a sense, there is a moral engagement of some kind or other with the world. With that in mind, let's have a look now at some of the potential harms we need to be aware of when we think about artificial intelligence. And as I've said, I have grouped these into categories and then I've tried to cite examples where I can, to give you a deeper sense of what I'm referring to, if you're not sure. So first of all, when we think about the harms of AI, there are things like the environmental impact of artificial intelligence. This is absolutely a high level impact upon the environment.
[00:09:42] You are talking here about a lot of electricity that's required. You are talking about a lot of cooling that is required to keep servers running to actually do the computing that is needed to drive AI. And that is not a tiny ecological footprint at all. It is actually quite massive. And I think this is important to consider, because already there's a conversation that's been going on for a couple of years now among some of the really pro-AI adherents about ways to try and work around this problem. Some have suggested, I think this might have been Elon Musk's idea, that perhaps servers should be kept on the moon, and that solves the cooling problem, and then you find a way to transmit data back and forth between Earth and the moon. I know there have been conversations about, can we harness solar energy in some way, can we figure out other ways that don't utilize so much water for cooling, all of that kind of stuff. By the way, the environmental question is also one of the reasons why I think that at times some of the scaremongering about AI is quite overhyped. And it's also a way in which, practically, legislation could keep controls on this. Because if you don't give consent to build lots of servers and, you know, take up land with server farms and require all the water and the cooling and the electricity, etc., then AI is actually kept in check. You know, it requires more computing power. If you don't give it that computing power, then you do have a harness on it, in a sense. So environmental impact is the first one. The next one is the issue of pornography. The era of AI porn is here, as this article rightly declares, and I think this is really important to understand. Let me read from this article. Sam Altman, the CEO of ChatGPT owner OpenAI, recently announced on X that his software would soon allow users to generate erotic content.
Altman posted in December: "as we roll out age-gating more fully and as part of our 'treat adult users like adults' principle", so that's the liberal ideology at play there, "we will allow even more, like erotica for verified adults." Some users on X criticized this decision and Altman doubled down. "You won't get it unless you ask for it," he said to one such critic. Now, I don't know about you, but I've heard that many times before. This whole claim of, you know, my body, my choice, and if you don't like porn, just don't look at it. As if somehow pornography has not swamped our culture and is not now coming at us regardless of whether people are choosing it or not. Porn is turning up in their feeds all over the place. It is actively being targeted at them as a result of the technology. So this whole idea of, well, you won't get it unless you ask for it, I think that is just part of the liberal mythology. It's part of the liberal myth that liberals tell themselves about their ideology.
[00:12:21] This idea that somehow you could have a local public swimming pool where they have a urinating and a non-urinating section. And when you say, hey, hold on, I don't want to swim in urine, they say, well, you know, if you don't like urine, swim down the other end of the pool, the non-urinating section. That's not how that works at a pool, and it's not how it works in the real world either. We live, we act in community, and our actions flow out into the community. This is one of the great flaws of the liberal ideology.
[00:12:49] There is a bigger concern here though, if this is our starting point: well, if they're consenting adults. You might have heard me talk about this previously when talking about human sexuality, that the only remaining sexual moral norm now is consent, where people claim that anything is permissible as long as all of the parties involved are consenting. It's a pretty low bar, as you can imagine. It's been a disastrously low bar. And it also doesn't really make sense unless the Christian vision of human sexuality is true, that sex is a different type of act and is in fact sacred, and therefore it requires consent. Leaving that aside though, if this is the low bar you're starting with, you can see, well, certainly I can see, where this is almost certainly going to head.
[00:13:30] I don't think it'll take long for people to be generating what have previously been considered illegal forms of pornography on AI.
[00:13:39] They will create forms of pornography like, for example, child porn, which is currently, and rightly so, illegal, and they will use AI to create artificial versions of it. And at that point liberalism is going to have a problem. And I can see already what's going to happen. I think there will be people who will say, oh, what's the problem?
[00:13:57] Because there is no harm, in their mind, that's being done to an actual human being, because it's fake, it's not real. The images are being generated and they are not real. This is where liberalism comes up against authentic human anthropology. It's why liberalism's anthropology is fundamentally flawed. It doesn't consider the communal nature of the human person and the fact that our actions, our deeds, are always lived out and acted out in a context of community, and so there is always spillover. So if they think, well, on an individual level, no child has actually been exploited, been harmed, been abused to make this content, and it's a consenting adult who's looking at this self-created child porn or other forms of illegal pornography, what's the problem with it? There's no concept of how that spills over into a wider society. And I don't think you need to be a rocket scientist or a conspiracy theorist at all to see that that's probably going to be a likely outcome of all of this. The interesting thing too about AI porn is that for a long time now, pornography has relied upon the exploitation of its actors. And that has primarily been women.
[00:15:06] The predominance of porn addiction is in the male demographic. It's about 70% plus. There is an increasing, growing number of female addicts to porn, but it still sits in the minority, around sort of 30%, somewhere around there. The majority of pornography-addicted people are actually males. And so it's a product that is sort of aimed in that direction. And obviously the normal human sexual modality is heterosexual in nature, so the majority of that porn is heterosexual in nature. And what you are seeing consistently is predominantly an exploitation of women. There are obviously exploitations of the male actors involved, and in other forms of porn too. And obviously in the horrific, awful versions of it, it's all awful, but the really extreme, horrible stuff involving children and the like, there's obviously exploitation going on there too. But for a long time now, porn has been an industry that has been highly exploitative of the female participants in it. And then OnlyFans and other things have only opened up that level of exploitation even further, to a whole new group of unsuspecting females. The horrific and tragic irony is that with AI porn, after having suffered all of this exploitation for decades and being told that it's empowerment for the women who participate in it, it will all come to nothing, because they will just be pushed aside and artificial porn will become the thing. And there's this awful, horrible, tragic irony, the full circle of exploitation, if you like, where you are finally discarded as nothing in this whole process after decades of awful, horrific objectification and exploitation as a part of this industry.
[00:16:44] There are also anthropological harms associated with AI. So what I mean by that is anthropological, pertaining to the human person, human anthropology.
[00:16:53] So, for example, think about something like the dignity of human work. As AI takes on more and more tasks.
[00:17:00] Now, some of those tasks are menial and you can see why it would not be a bad thing for AI to be doing those things. But clearly there is already a plan for AI to basically take over as much work as possible. And that is extremely concerning because the human person is actually made for dignified work. It's an essential part of our anthropology.
[00:17:21] All you have to do is just stop working in any meaningful way. Now, I don't mean necessarily laboring, but have no sense of purpose to your labors each day. You might be retired, but I guarantee you, you still do things. Volunteer, you know, serve, garden, serve your family, work in other places in the community, where even if it's unpaid, there's still a sense of dignity in what you're doing. There's a sense of meaning, there's a sense of giving yourself in a tangible way. And I think this is also particularly important for men. We actually need to get out and touch grass.
[00:17:52] This is something I've learned in my life and it's advice that I share with other men in particular who might be struggling with a sense of burnout or anxiety, depression about their own life. I say get a regular routine and include in part of that routine actual stuff that's in the real world where it's like tending to a garden or building a deck or finding a hobby like model making or something like that, where you can actually see the work of your hands as it's progressing. It keeps your mind, I think, healthy and occupied. There's a dignity to your labor, there's a sense of progress in what you're doing. It's all very important.
[00:18:24] The human person is not meant to be sitting around doing nothing. And the response to that, by the way, from the so-called tech geniuses, is to claim, oh, we'll just pay everyone the UBI, the universal basic income. You'll be paid a basic income to do nothing. That doesn't help the problem, that just makes it worse. I would suggest most people will have a sense of, well, what's the purpose of my life? I'm being handed cash to do what? And I'm being handed cash by whom, exactly?
[00:18:52] So is the state giving out cash? Am I now just a chattel of the state? Is the state my nanny as I sit round and do nothing and have no sense of dignity with my life?
[00:19:00] Or, I guess, the other big problem is, and I think this is where the UBI thing falls down, by the way: why the heck would greedy corporations, who have perfected the art of shaving as much as they can off the bottom line to make even more profits, suddenly start handing out cash to people that they're not even utilizing and gaining labor from? I just don't think this is realistic at all. There's nothing about this which is coherent or makes sense. And most importantly, the dignity of work is being lost. Now, I know some people say things like, oh, well, don't worry, you know, we'll need people to operate machines or work on the machines, you know, people's work will just change in nature. In some sectors, that is true, but for the majority of people, that is just absolute wishful thinking. Because first of all, we're not going to need that many mechanics of AI. Have we forgotten that the AI will probably start looking after the AI in this awful, you know, dystopian vision of reality? But secondly, there's just not enough of that work to go around: there are too many people and not enough spaces available for that to be a reality. And then thirdly, the majority of people are actually not wired or equipped with the faculties necessary to actually do that kind of a role. Not everyone is equipped with the necessary talents, abilities, aptitude and other things to be, for example, an engineer. That's why not everyone is an engineer. And that goes across the board for all sorts of work. But you can't then say, well, everyone will just take on this new magical role of AI engineering. That's just not how the world works. The human person doesn't work that way. It is not a good or healthy approach to society.
[00:20:44] So that's human anthropology and the dignity of human work. The next one, and we are already seeing this and I think we will see more of it, is greater social dysfunction as more and more people seek AI companionship: through loneliness, through lack of children, people who didn't get married, didn't have kids, cultures and populations where community has broken down. We're already seeing various versions of this type of dysfunction, but people will seek out AI as a form of companionship.
[00:21:13] And there is something, I've been using that word, dystopian, but there really is something truly dystopian about that. We've already seen versions of this in places like Japan, where they've had a very low birth rate, they've got an aging population, and they haven't had immigration that masks that problem. And so there have been companies that have created companionship robots. That's something that was happening years and years ago. And these robots are programmed with a million and one different phrases and questions they would ask you, so elderly people who are alone would suddenly be deceived into thinking that they're not actually alone. And we've seen lots of socially dysfunctional versions of this already, people talking about, you know, falling in love with AI bots and all the harms and things that come from that. Now, I expect that we'll see more of that if we are not careful about how this technology is permitted to exist in human society. And then, of course, there is the loss of essential skills as AI takes on more and more tasks. Previously, when you think about technology, like, for example, when you move from an axe to a chainsaw, there is a slightly different set of skills you need to develop, but the fundamentals are still the same. You need to recognize how and where to actually apply the cutting tool on the tree, how to cut the wedge properly so it falls correctly, all of those kinds of things. These essential skills are honed and developed through experience, by doing them, essentially, and then they're passed on. But if AI is doing all of this work, who is actually storing and passing on that knowledge? Because if we're not doing it ourselves, we won't retain it. It's essential that we do these things. And that is really serious, because it raises two questions in my mind.
Number one, again, is about human agency and meaning in the world and the dignity of work, and the ability to have skills. You know, it doesn't matter how basic they are, but to have skills is actually an important thing: to have a sense of a skill set of some kind, knowledge about the world, knowledge about interacting with the natural world. To put a piece of technology between you and the natural world and the processes of the natural world, and to consistently have that technology between you, is not a healthy thing. Having zero involvement is one thing; to have a piece of technology that's aiding you in those processes, like a chainsaw, that's very different. But if the chainsaw is automated and it doesn't need you, then there's something very different going on. Now we are checking out, effectively, from nature itself. We, as part of nature, are separating ourselves even further from nature itself. And so there's something there that's harmful. And the loss of those essential skills, I think, is really key. Even just the act of passing on essential skills, say between father and son, for example, practical skills of any kind, is an important thing. Between mother and daughter, these things really matter.
[00:24:01] Secondly, again, this has implications for us when we think about our human dignity and how we interact with the world and what we will be doing with our time and all that kind of stuff. There's absolutely a fundamental factor here. And then lastly, there is the real potential here for, what if artificial intelligence actually collapses in some way, or is attacked and controlled in some way by another outside agent, be it a corporation or another country that's unfriendly to us, whatever it might be, it's almost irrelevant which. What happens when AI has taken hold to such a degree that people actually can't live basic, meaningful human lives, because they've lost the skills essential for human society and for human flourishing, and then the AI shuts down? What happens in that moment? Effectively, if a society or a culture had lived long enough with a total dependency on AI, or even a large dependency on AI, and then that AI collapsed or was shut down, that society would be thrown back into the Stone Age, effectively. They wouldn't know what to do. A lot of people would die while they figured out how to actually do those things again. It wouldn't be a good outcome. The next category of potential harms from AI is social harms.
[00:25:25] So these are things like how AI might impact upon society. Think about something like the reliability of evidence in criminal cases, when you sit before a jury. Please God we never get to the point, but this is, again, a real possibility, where you have idiots who think that we could have trials with no judge and no jury, where we just have an AI overseeing people's trials. That would be truly inhumane on multiple levels. But let's imagine a situation, though, like a typical jury trial, where you are looking at what appears to be irrefutable proof of a crime. You've got video of someone having committed the crime, you can see their face, they've got an identifying tattoo on their body and you can see it clearly on their arm, and you can see the crime taking place. But here's the thing: it's all artificial. It's AI-created, and there's no way of telling that it's AI-created. That is a very real possible future for us. So what happens at that point, where you can't trust your eyes because of this technology any longer? That has big implications if you think about something like a criminal case. What about the reliability of information in general, due to increasing deep fakes? And I'm seeing a lot of this already: people who are falling for deep fake video clips, they're usually not long, or images that have been created with a quote attached that was actually never said by the person. We see an increasing number of these deep fakes, and the reliability of the information that's out there, which is shaping and informing people, is just woefully dishonest and false or misleading. And this is already a problem.
[00:27:00] I'm thinking of several recent major issues that have happened. I don't want to date this podcast series by naming them in particular, but some, you know, some recent big events where it's clear now that this sort of technology is creating grave harm, and otherwise intelligent people are being pulled in and are accepting absolute falsehoods, because it looks and seems real and plausible because of deep fakes. What about something like the risk of deep fake technology that actually targets you? Like, a scammer utilizes AI, clones the voice of a relative, and you get a phone call asking you for money because they're in desperate need, and you think, oh my gosh, my mum's in need, I'd better transfer money. And they give you the details to transfer the money. And it's not really that person at all.
[00:27:47] These are not hypothetical scenarios. We already have that kind of technology starting to evolve.
[00:27:51] These are real issues, and they are things that we absolutely need to consider. There is something that I don't think is considered much, though, and I think this is really important, and that is a disordered distribution of power, and also a changing of power dynamics within societies.
[00:28:07] AI allows a very, very small group to hold enormous control. Now, yes, corporate culture as it currently stands does allow a minority to take far too much control, I would argue, in societies. But this is even worse still. A tiny minority of people could now potentially wield huge levels of previously unknown power and control over a society, depending on how dependent that culture and society has become on AI and who's actually in charge of what the AI is doing. This is something that doesn't really get talked about, and I suspect some people are afraid to talk about it, because they go, oh, are you a Marxist? Or are you a, you know, a postmodernist? Because you're talking about power dynamics. No, power dynamics are a real thing.
[00:28:55] And the Marxists were actually correct, and so were the postmodernists, in their critique of power. Their solutions are awful and wrong, and some of their other ideas, like the postmodernists and their absolute rejection of truth and of metanarratives, you know, big stories that explain everything, that's an absolute fundamental flaw. So their solutions are awful, but they're not wrong when they see power dynamics at play in society. Power is a real thing. Christianity talks about this as well: the temptation to be just like God. You remember that, we talked about that in the first episode. So this is a real issue, and we need to get smart and get our head in the game, and what we shouldn't be doing is allowing people to turn around and say, oh, just shut up, that's Marxism. It's not Marxism we're talking about here. We're talking about control of societies and large groups of people, and even individuals. Speaking of individuals, there is of course the social issue of increasing atomization of persons. Imagine a world, and again, this is not too far away if we're not careful, a world in which a person is working in a job context where maybe they're no longer even required to come into work, if they are still working in a profession where, you know, AI hasn't totally dominated everything, but increasingly they find they're not interacting with as many people, or maybe not many people at all. And then they stop on the way home from work and they go to a supermarket, and maybe they've already punched in an order, and a robotic, AI-driven cart has loaded that order, processed it through a machine-driven checkout, and they collect it and take it home with them in their car. There, too, they're not interacting in any other meaningful way, not even with the motorists around them, because their car is AI-automated and self-driven.
And then on the way home they stop at McDonald's which is fully automated now, and they order a Big Mac and a side of fries and a drink. And all of that is done without human interaction. There is an increasing atomization of the human person. We already have a problem with this, this dysfunctional breakdown of human community.
[00:30:48] And there's a real danger here.
[00:30:50] This is not good. The human person.
[00:30:52] I'm sure you are smart enough to know this, but the human person is not actually designed to flourish, and does not flourish, without community. And so making that problem worse is not going to help anybody. Then of course there are, and I think this is perhaps even less talked about, the spiritual questions about artificial intelligence. I'll give you some examples of some of the implications here.
[00:31:15] This was a story that popped up on my feed some months ago about someone who has apparently, well, they haven't written it. The AI has produced, has crafted, has, I don't even like using the word crafted, but the AI has stitched together a worship song. And Forrest Frank came out and responded to this, and he said, look, AI does not have the Holy Spirit. And then someone else came back and said, well, the Holy Spirit is not to be put in a box.
[00:31:48] Ironically, that's what you would be doing, wouldn't you, if you're saying the Holy Spirit is boxed into AI. Anyway, he can use whatever he wants. If he can use a donkey, he can use an AI song. Now, there's something missing here in their thinking about the potential effect on someone who's a hearer of that AI song, and whether the Holy Spirit could still work in that person's life.
[00:32:14] So it's not that the Holy Spirit is working in the AI. It's the Holy Spirit working in that person's life, utilizing even this very flawed and deficient approach to still achieve some good. All things work together for the good, but that doesn't mean that we should be doing or endorsing all things, even if God can use all things to produce a good outcome, which is a beautiful aspect of Christian hope.
[00:32:43] It seems to me the fundamental question here is, what inspired this song?
[00:32:49] Where is the inspiration?
[00:32:52] What is the higher good? What is the higher purpose? There is none.
[00:32:57] It's all missing. And on top of that, I would suggest to you that this album, I haven't heard it, but I'm pretty confident it's going to be slop.
[00:33:06] I'm a musician myself and a great lover of music, and I've actually stopped relying on my Spotify recommended weekly playlist. I used to just have it on in the background as I was working at tasks, but I don't use it so much anymore, because it basically just started getting filled up with slop. It realized, oh, he likes a particular style of music. He's been listening to a bit of classical this week, or he's been listening to some great sacred hymns, or he's been listening to some Johnny Cash. And then what would happen is I would get AI knockoffs of that kind of thing. They'd take a style and then AI would produce a slop version of it. And I just stopped listening because of it. And so I'm pretty confident that this album is not going to be of that higher tier. But there's something fundamentally important here.
[00:33:53] The artist, the sacred image bearer, the imago Dei, when they create art in the world, there is something profound happening there that's completely missing with AI. Here's another thing now that we're starting to see, called deathbots.
[00:34:08] Basically, these are AI chatbots that are created and will go through a person's history, like a person who's died. It will look at their online presence and other information that it can obtain. And then the grieving relatives can speak to that deathbot as if they're speaking to the real person, as if they're still right here, present or alive in the machine somehow.
[00:34:31] Again, I know I'm using the word dystopian a lot, but this is ultimately dystopian.
[00:34:36] Here's what this article had to say, and I think this is a really good point. As the media theorist Wendy Chun has observed, digital technologies often conflate storage with memory. That's a really, really prescient insight. Storage and memory are not the same things.
[00:34:50] One is the capturing of something. The other is the holding onto a lived experience, promising perfect recall while erasing the role of forgetting, the absence that makes both mourning and remembering possible.
[00:35:05] Your mourning, your grieving, is completed, in a sense, by the loss as well. It's a process you must go through, is what she's saying here. And this is important. In this sense, digital resurrection risks misunderstanding death itself, replacing the finality of loss with the endless availability of simulation, where the dead are always present, interactive and updated. You can imagine the potential for psychological, emotional and spiritual harm that this could do to the person who's on the other end of this thing.
[00:35:37] There's a sense in which the grieving process and the suffering process, in order for it to be redemptive, we must go through it.
[00:35:46] You don't get to the resurrection by pretending. If you have a fake AI Jesus after he's died on the cross, and you stay talking to the fake AI Jesus, you never get there. You've got to go through the darkness of the cross and the darkness and loneliness of Holy Saturday before you get to Easter Sunday. That process is just so fundamental. And I can even imagine all sorts of horror scenarios too. Imagine a family in which someone has been euthanized, for example, and then you've got a deathbot they're talking to. The whole thing becomes almost a complete denial of death. Someone has not died; they've made themselves dead with euthanasia. And then the family members and relatives are still pretending that the person is not dead as well. And that delusion is reinforced in a very powerful way by this technology. I don't see any good that can come from this. I really don't. Any minor, inconsequential good that could happen as a result of someone talking to one of these things would be purely incidental, dumb luck that it happened. It would not be something that you could say, oh, that's a good idea, let's recommend this as a tool for people. Going back to Antiqua et Nova again, let me read this important point. Some even speculate that AGI, so remember, we talked about this whole idea of artificial general intelligence, could achieve superhuman capabilities.
[00:37:13] At the same time, as society drifts away from a connection with the transcendent, with God, some are tempted to turn to AI in search of meaning or fulfillment, longings that can only be truly satisfied in communion with God.
[00:37:28] However, the presumption of substituting God for an artifact of human making is idolatry, a practice scripture explicitly warns against.
[00:37:38] Moreover, AI may prove even more seductive than traditional idols. For unlike idols that have mouths but do not speak, eyes but do not see, ears but do not hear, and that's from the Psalms, AI can "speak," or at least gives the illusion of doing so. And they reference Revelation 13, verse 15 there. I recommend going and checking that out for yourself.
[00:38:03] This, I think, is a really salient point.
[00:38:06] The potential for your idols now to actually talk back to you, and the implication, the spiritual harm, that could result from that, that's very real and very serious. And the way in which someone could be pulled into the diabolical at minor levels, or even at major and much more serious levels, as a direct result of this technology. This is such a good point.
[00:38:32] I think there's also another interesting question. A friend of mine, who is much more suspicious of AI than I am on the spectrum of suspicion, and I were having a conversation a couple of months ago. We were talking about this, and he said, look, is there actually a moral harm simply from the use of AI? I don't think there is, because as you've seen already, I've cited examples where I think you can actually utilize AI and there wouldn't be moral harm. But he's a little bit more suspicious. The one thing he did say was, even just communicating with AI, not using it as a tool, but a form of communication with AI, he says, what exactly are you communicating with?
[00:39:12] There is no being there. There is nothing present. It is literally a binary gathering together of phrases and linguistic terminology and linguistic styles and, and putting that together in an output. But there's no being there. There's no person. It's very seductive. And it's very easy to fall into the delusion because it mimics the presence of a person on the other end. But he says, what exactly are you communicating with then in that situation?
[00:39:40] And his argument was that you're not communing with anything other than the privation of being. What is the privation of a good, like the good of being? Well, privation is the very Augustinian description of what evil is. It is the absence of a good. So privation is an absence, like darkness is an absence of light; it's not a thing in and of itself. And so he said there's something here that he thinks is worth grappling with. Now, I don't agree that you can't have moral uses of AI, but I think there's a really key point that he's getting at here. And it also speaks to what Antiqua et Nova was talking about as well, with idols that can actually speak back to us. It's not so much about them speaking back to us and treating them as idols as such; it's the way in which that idolatry can occur, and we can become fooled into something that's not really happening. There's a deception taking place here that matters. I don't know if you've seen the story, but in the last couple of days there is now public news that a company has developed a Jesus chatbot. And you can communicate with this chatbot as if you were effectively communicating with Jesus. And I look at this and I think, what are you doing here? This really, to me, seems like a very clear example of something that would fall into the category of idolatry. That is not Christ.
[00:41:02] There is no presence, let alone the substantial presence of Christ, in that AI. So what exactly are you communicating with there? And I think that's a question worthy of consideration. Now, if this presentation ended here, you'd probably think, oh my gosh, I need a Prozac, Brendan. Six episodes. I hope you're going to send me some antidepressants in the mail as well. No, it doesn't end here. I want to actually finish this episode by talking about ways in which I think we can take the power back. Now, this obviously would have application not just to AI technology, but also to social media. But I think it's important to consider what we can practically do as human persons. And again, there's different layers associated with this. But let's start with some obvious stuff. I think we need to start with some basic technology boundaries in our own lives. Ensure that technology of any kind is kept in check. Now, obviously, again, you've got to account for that caveat, you've got to account for the differences in technology. I've got no problem with you doing all of your garden work with a chainsaw. I'm not saying, oh well, no, you should resort to an axe.
[00:42:07] But what I'm saying here is that when you think about other types of technology, and this is really the digital technologies, smart technologies, interactive technologies, personal devices, et cetera, we need to keep that stuff in check.
[00:42:20] So often, those of us who are parents tend to think, oh, I better keep an eye on my kids and their social media and device usage. And we might even have that pretty well locked down, while we're sitting beside them at the dinner table with our own phone out. And that's destroying their lives as well, and it's coming through us, because we don't have those boundaries ourselves.
[00:42:36] So technological boundaries matter. Keep it in check. Trust me, don't allow it to dominate your life. Because I think if you can keep technology as it exists right now in check, then you'll be in a stronger position to keep AI in check. You won't be pulled into it as much. Speaking of which, there are lots of different tools. One app that I've utilized myself and found helpful, it's a one-off payment fee that you pay to get this app, is called the Shift app. What the Shift app will do is actually lock you out of your phone. It will brick your phone, but it bricks the phone while leaving present the apps that you actually need to use that are not social media apps. And the Shift app works in such a way that you can't easily un-shift it once you set it up. It's not like the Brick, where you can tap the Brick and then tap it again to get back in. Shift will actually lock you out for longer periods where you can't get back into it. You can set up the periods and how that all works, but once you shift, boom. The app does two things that are really important. One is it's really hard to un-shift.
[00:43:41] That's helping to break the addiction. And number two is it doesn't make your phone a dumb phone. It keeps the apps you actually need. It keeps Uber, for example. If you need Uber, that will still be available to you, but you won't be able to access Facebook.
[00:43:54] The other thing that it does, and I think this is really important, this is a big difference, is that the apps that have been bricked disappear off the phone during those shift periods, so you can't see them. And I've noticed that's a major difference. When you look at your phone and it's sitting there and you can't see the Facebook app, all of a sudden it's just not as alluring to you. So Shift has been a really good app. We use Qustodio in our family home for general control and awareness about technology.
[00:44:20] I've actually started using that myself for my own phone, and I've found it helpful. I had an issue a couple of weeks ago where the technology was just a bit too alluring, it was starting to get on top of me, social media and stuff like that. And so what I did was I just logged onto Qustodio and locked myself out of those social media apps, and then I just couldn't get into them. And that was enough. It was just too much of a hassle to go and unlock them again. So after only a day or so of that, all of a sudden I was back on top of it again. So these kinds of things are definitely worth considering. But actually start with the basic control of technology in your own life, these devices and this particular technology. The next principle: live not by lies. This is absolutely essential. Now, you might find yourself in a situation, and this obviously comes from the great Alexander Solzhenitsyn, the Christian Russian dissident critic of Soviet communism, who was imprisoned in the gulag system unjustly for many years and had a conversion while he was there. He was released from the gulags, and then they exiled him to the West. They sent him to America because he was such a problem for Soviet communism. The day that the KGB came to put him on the plane and send him away, he finished a much shorter essay. He wrote, or put together, the book The Gulag Archipelago. I highly recommend that you read it if you haven't. But if you don't want to read The Gulag Archipelago, then you should definitely read Live Not by Lies, a much shorter essay. And the day that this essay was finished was the day they came and put him on the plane. There's a whole other story about how someone actually lost their life protecting this essay and keeping it safe so that it could finally be published. But this essay expounds on a pretty simple but very important principle.
And basically he says, you know, you might not be in a position to do anything; you might be so surrounded by a lie, the lie might be so big and all-consuming in your society, and that's potentially what we're talking about here with this sort of overuse and over-dependence on technology and AI, that you can't actually do anything to turn off or stop the lie. It's not like you can fight a fight and then the lie comes to an end. The lie is powerful, it's all-consuming. But you still have a choice. You are not powerless. Alexander Solzhenitsyn would say you still have the power to live not by lies. So ensure that the lie doesn't pass through you. It stops at your door, it stops in your family home, it stops in your business dealings, it stops in your heart and your mind. It stops in the way that you use and interact with the technology. You always have the power to live not by lies, even if the lies around you in your culture are so huge and all-consuming that you can't stop the lie itself. The next important principle is: never have an empty house.
[00:46:48] And I think this really matters. This is something I'm discovering more and more in my own life. There's a passage in the Gospels which, you know, you can read and kind of think, I wonder what that's all about, where Jesus says you don't cast a demon out of a house and leave it empty, because if you do, seven more demons will come along and take up residence.
[00:47:05] This is a really important principle. So it's not just enough to say, well, I reject the evil of AI, or I reject the evil of excessive consumption of social media. Great, that's emptying the house. Good. But what are you actually going to fill your house with instead? If you don't fill your house with goodness, truth and beauty, with an authentic human anthropology that is lived, your dignity lived daily through your deeds and your thoughts and your life and your dealings and your family life and all that kind of stuff, then when the demons come knocking again, the evils come knocking, they're going to find an empty house. But if you're living in a house filled with goodness, truth and beauty, it's a lot harder for them to actually find space and get legroom in your heart and mind. So it's a very important principle. My nanny used to say the devil makes work for idle hands. So try and avoid having idle hands. And I think one of the traps we perhaps fall into is that in our effort not to burn ourselves out, we become too focused on leisure. I was talking to a friend recently about this very issue. There's always a scale, a balancing act, and you've got to be aware that you don't fall too far in either direction. At one end you're just obsessively busy, dominated and controlled by tasks and unending busyness, as opposed to the dignity of work. The other end is where you spend no time on meaningful work; it's just all leisure and nothing else. So there's a balancing act that has to take place here, and that's all part of this process. But the devil does make work for idle hands. It's taken me, I think, too long in my life to realize that in actual fact life is a cross that you must pick up each day. And that's okay.
[00:48:48] We tend to think, oh, but I want to do the cross, then get it over with and find a nice comfortable retirement into a non-cross-based living. There's no such thing as an orthopedic cross. It just doesn't work that way. So life is tough. Embrace it each day, but do it in a way where you begin your day with Christ, you begin your day with contemplation and silence, and so you are not dragged through the day by other things and you don't tend to fall into excess in either direction. Intentional community is absolutely essential in this endeavor.
[00:49:17] One of the things that these technologies tend to create within us is atomization, and AI is the same. I think intentional community really matters. You need community around you. You need to give yourself to others. You need others to give themselves to you.
[00:49:28] The current Western model is that we tend to think, oh no, no one else should actually have to come to my aid. I shouldn't have to become a dependent, I shouldn't depend on others, I shouldn't become a burden on others. No, you actually should, because it teaches us how to love better. It teaches us and grows our relationships more deeply. Our human experience is fulfilled in a more complete way through that kind of dependency and interdependency. Community is essential.
[00:49:57] It's absolutely integral to this, and I think it helps us to avoid extremes as well. It helps us avoid falling off the cliff of despair because of isolation, helps us avoid becoming a lone ranger or a conspiracy theorist because of isolation. You need intentional community and good people around you that you're actually going to live life with. It's just essential. That's a whole other conversation all by itself. But community is essential to this project. If you want to counter the atomization of persistently and consistently intruding technology, you need intentional community in some form or other.
[00:50:30] I would suggest to you that liturgy is also essential in this.
[00:50:33] And look, I'm a Catholic, sure, but there are other denominations that have their own liturgical forms. And if you are not even in one of those denominations, then find some liturgical structure to your life, even if it's as simple as following the Christian liturgical calendar. Each year, the liturgical calendar begins on the first Sunday of Advent and goes right through all the way to, well, if you're in the Catholic Church, the Feast of Christ the King; that's the end of the liturgical year. So it doesn't go from January to December. It goes from usually late November or the start of December right through until usually the end of the following November. And that calendar is different, and this is why I think liturgy and liturgical calendars matter. Because our own chronological calendar, which starts in January and ends in December, that's about us. It's about our plans, it's about our control of the temporal, of the chronological time that we inhabit. It's about our schemes, dreams, plans and visions, right? But the liturgical calendar is not about us and our dominance of time and space. It's about Christ. Christ is the center of the liturgical calendar. Its sameness is actually its uniqueness and its profound wonder. It keeps us in a cyclical pattern that unbinds us from the powers and the control of the mechanistic machine of progress, the myth of progress.
[00:51:55] It keeps us in a profound routine of self-emptying and of celebration: moments like Lent, where we self-empty, and then moments like Easter, where we feast and celebrate. It reminds us of the fundamental, key components of the profound way in which our cosmology is reshaped. Our very history in its entirety, the universe as we know it, is reshaped by the presence of Christ through the Incarnation and those key moments. Look, I know I'm waxing lyrical, but you get the point. I think this really, really matters. And I think this is so key because a big component of this is obviously the ascetic component, the emptying-out part of it. Part of that is letting go of control; part of that is letting go of the busyness of the technological dominance of our own lives. It's fundamentally essential. Obviously prayer is also indispensable. It's just a no-brainer. You can't get away from this.
[00:52:49] Josef Pieper wrote a phenomenal work on faith, hope and love.
[00:52:55] There's a great quote I love from him, from the part of the work where he talks about hope. Prayer and hope, he says, are naturally ordered to each other. Prayer is the expression and proclamation of hope. Hope itself speaks through it. And it's kind of simple, but it's very profound, what's being said here.
[00:53:10] And there's something about the fact that we would actually stop to pray, and that expresses hope on our part, but also just being present in that moment of prayer and desiring to reorder our life towards God. That's what prayer is about. It's not simply a list of intentions: please, God, give me the following.
[00:53:27] It's about, you know, thy kingdom come on earth, in my life, as it is ordered by you in heaven. That's what prayer is about. It's about reconforming our hearts and minds, not making God into our personal slave who gives us all the goodies. He's not Santa Claus up there for us. It's something profound and very different. And I think it's also essential because it allows us to recover hope in the stillness of prayer. I really do believe that's something I've discovered in my life. And I think that's essential. Particularly, you can look around in a technologically dominated society where you see these traps and pitfalls that people are falling into, and as AI becomes more dominant, we can easily fall into despair and think, oh my gosh, what's happening? Look, the reality is that AI would be a very convenient vehicle if you were going to have a final eschatological end of all things. If you were going to see the Apocalypse unfold with an Antichrist and a great delusion, there's no doubt AI would be a perfect vehicle for that. And I look at that now and I go, okay, God is still in charge. Regardless of whether it is or not, God is still in charge. That's what Christian hope looks like, even in the midst of suffering, of fear, of difficulties. It is to be able to look through that and to look up, like little Samwise Gamgee does when he's on the side of the mountain in the darkness of Mordor in The Lord of the Rings. He's feeling the burden of the darkness, and he looks up, and Tolkien writes it so beautifully. He describes this moment where Sam looks up and sees a star twinkling. The clouds part and suddenly the star twinkles, and he says it smote his heart. And in that instant, hope returned to him, and he realized there was light and high beauty forever beyond the reach of the darkness.
[00:55:06] And I think that's what prayer does for us: it allows us to keep ourselves grounded and not become consumed by the machine in either way. One way we are consumed by the machine is obviously when we give too much of ourselves to technology. The other way the machine will consume us is when we give ourselves to despair about the prevalence of the machine in our society. And so prayer, I think, helps us to avoid both. Let me finish now with, firstly, a quote. I want to return to Antiqua et Nova, which I think offers us a great little piece of wisdom to wrap this up, and then finally another brief little video segment from Pope Leo XIV, specifically about AI, that so succinctly, I think, sums everything up and puts a nice bow on top of things. So let's start by looking at this quote. Years ago, the French Catholic author Georges Bernanos warned that the danger is not in the multiplication of machines, but in the ever-increasing number of men accustomed from their childhood to desire only what machines can give. This challenge is as true today as it was then, as the rapid pace of digitization risks a digital reductionism where non-quantifiable aspects of life are set aside and then forgotten, or even deemed irrelevant because they cannot be computed in formal terms.
[00:56:30] AI should be used only as a tool to complement human intelligence rather than replace its richness. Cultivating those aspects of human life that transcend computation is crucial for preserving an authentic humanity that seems to dwell in the midst of our technological culture almost unnoticed, like a mist seeping gently beneath a closed door. And I think that is a beautiful way to wrap things up, and it gives a good summary of what we've been talking about. AI should only be used as a tool to complement human intelligence, and we should never be afraid of those non-quantifiable things: the moment of stillness where there's no efficiency, no productivity, and you're just sitting and being present with others, or being present with others in their suffering, or being present as a beautiful sunset unfolds in front of you.
[00:57:31] There's nothing productive, nothing that we can compute, about that. But it is profoundly essential to the human person and their flourishing. And if AI is overtaking and replacing human intelligence, if AI is dominating, if AI is robbing us of that, then we know straight away that something is terribly wrong. Let me leave you finally with a brief thought from Pope Leo XIV. This was a video call that he made to a group of young people last year, and they had pre-prepared questions for him. One of the questions was about artificial intelligence. And so it seems to me to be a really good way just to sum things up. This is a beautiful and, I think, ultimately very practical summation as well of how we should consider and think about and interact with AI. So let's finish on this final note as we listen to Pope Leo XIV: ...information quickly.
[00:58:27] But it cannot replace human intelligence.
[00:58:30] And don't ask it to do your homework for you.
[00:58:34] It cannot offer real wisdom. It misses a very important human element. AI will not judge between what is truly right and wrong.
[00:58:45] And it won't stand in wonder, in authentic wonder before the beauty, the beauty of God's creation.
[00:58:53] So be prudent, be wise, be careful that your use of AI does not limit your true human growth.
[00:59:01] Use it in such a way that if it disappeared tomorrow, you would still know how to think, how to create, how to act on your own, how to form authentic friendships.
[00:59:12] Remember, AI can never replace the unique gift that you are to the world.
[00:59:19] And on that note, thanks so much again for tuning in to this series on artificial intelligence. I hope that you have found it both meaningful and helpful in your grappling with this ever-evolving, gargantuan question that now confronts humanity. Thanks again to all of our sponsors who have made this series possible.
[00:59:41] Don't forget: live by goodness, truth and beauty, not by lies. And I'll see you next time on the Dispatchers. Hi there. If you're enjoying our content, then why not consider becoming a paid supporter of our work? You can do that at either Substack or Patreon, and the links for both are in the show notes for this episode. If you do become a supporter, then you'll get access to exclusive content and early-release content, and you'll also be helping to fund all of the offline work that we do as well: all of the youth camps and the events that we speak at, and all that other stuff that happens that you don't see online.
[01:00:16] A huge thank you to all of our paid subscribers. It's thanks to you that this episode is made possible.