Should We Be Worried about ChatGPT?

A.I. has long been the subject of many sci-fi novels, movies, and short stories. These stories often revolve around the heights A.I. could take us to while also asking how powerful they can become before we unwittingly hand them the keys to everything and suddenly become their subjects. Watch as Conley Owens joins us to discuss how Christians should view inventions like ChatGPT and whether they are the forerunners to a sinister future in which we become subjects of our own creations.

00:00
Yeah, my first answer to these questions is just like, Christians should not be early adopters for an invasive technology.
00:07
And then secondarily, well, that's when you can evaluate after waiting, that's when you can evaluate what it's actually doing and what's going on.
00:19
Warning, the following message may be offensive to some audiences. These audiences may include but are not limited to professing Christians who never read their Bible, sissies, sodomites, men with man buns, those who approve of men with man buns, man bun enablers, white knights for men with man buns, homemakers who have finished
00:30
Netflix but don't know how to meal plan, and people who refer to their pets as fur babies. Your discretion is advised. People are tired of hearing nothing but doom and despair on the radio.
00:44
The message of Christianity is that salvation is found in Christ alone, and any who reject
00:50
Christ, therefore forfeit any hope of salvation, any hope of heaven.
00:57
The issue is that humanity is in sin, and the wrath of almighty
01:02
God is hanging over our head. They will hear his words, they will not act upon them, and when the floods of divine judgment, when the fires of wrath come, they will be consumed and they will perish.
01:18
God wrapped himself in flesh, condescended and became a man, died on the cross for sin, was resurrected on the third day, has ascended to the right hand of the
01:30
Father where he sits now to make intercession for us. Jesus is saying there is a group of people who will hear his words, they will act upon them, and when the floods of divine judgment come in that final day, their house will stand.
01:47
Welcome to Bible Bash, where we aim to equip the saints for the works of ministry by answering the questions you're not allowed to ask.
01:53
We're your hosts, Harrison Kerrig and Pastor Tim Mullett, and today we're joined by Pastor Conley Owens as we answer the age-old question: should we be worried about ChatGPT?
02:03
Now, Conley, we're having you back on. This is the fourth time,
02:10
I think, that we've had you on now. I think pretty soon we're gonna have to get those little cards that you stamp, and then after like 10, you get a free ice cream or something, we're gonna have to get one of those set up for you, man.
02:24
I get full power to pick the episode title. You get full power, that's a little bigger prize than a free ice cream.
02:32
I think we're gonna have to negotiate this a little bit. Little do people know that Conley actually has picked a few episode titles.
02:41
But yeah, we've got you back on again, and this time we're talking about AI and ChatGPT.
02:48
Obviously, that's a pretty hot-button topic right now. There's a lot of different conversations that are going on, not just in the
02:59
Christian community, but really amongst people in general, because it's, well, not necessarily a new technology, but relatively it is new, and there's a lot that people don't seem to know about what the full capabilities of this kind of stuff are and where it will take us in the future.
03:25
And so there's a lot of people really interested in this topic right now, and so we're having a conversation about AI and ChatGPT
03:33
and all of this stuff. So why don't you just start by telling us, give us a little bit of a preview into what your background is with this kind of stuff.
03:44
Yeah, I mean, I can start off by saying I'm not a real expert. I've done some thinking on this as both an engineer and as a pastor, but in neither case is this really my field of expertise.
03:55
So as an engineer, I did my master's on machine learning and bioinformatics kind of stuff, trying to figure out relationships between genes to cure cancer, that kind of thing, using a lot of machine learning.
04:13
Now, back then, my professor told me that neural networks, which is the kind of artificial intelligence that is getting very powerful, right, or deep neural networks, at the time
04:25
I was told that neural networks were the second best way of doing anything, which sounds like a compliment, but it's also an insult.
04:31
It's basically, there's almost always a better way of actually trying to do what you're trying to do. And that has ceased to be the case.
04:39
Now, for a lot of things, it really is the best way, and it's kind of this one-size-fits-all solution. So it's really incredible what's happened in the past,
04:46
I don't know, I guess it's been about five years, since deep neural networks really took off in their capabilities.
04:52
And a lot of my knowledge on this is really out of date, but yeah,
04:59
I've done some thinking, and I think there's things that Christians should be thinking about and concerned about as they see the world interact with this and as they interact with it.
05:09
Right, right. Well, sure, Conley. Well, I guess the first question we wanna ask, since you've done a lot of thinking on this, is the question, what post-apocalyptic movie scenario do you think is the most realistic for us to be in?
05:22
So you think we should be more, you think we're headed more towards a Matrix kind of scenario or more towards a
05:29
Terminator kind of scenario? Like, what do you think is more realistic? And would you have a particular way of characterizing each of those to distinguish them?
05:37
Well, I mean, are we gonna be like, are they gonna hook us up to feeding tubes?
05:43
Are they gonna harvest us or just kill us? Yeah, harvest us or kill us, yeah. What do you think? Right. Well, taking that question a little more seriously, one of my concerns is not the most immediate concern people have. A lot of people are concerned about things like, as these different corporations are in charge of this, how are they gonna bias this information and things like that.
06:03
I'm concerned about people who don't really have a good concept of personhood.
06:09
We live in a world that really doesn't understand what people are and human identity. And between those two movies, what
06:17
I'm concerned about is not really the matrix situation where we've got these robot overlords that we're constantly fighting.
06:23
What I'm worried about is more the Terminator scenario where you have increasingly throughout the series, if you watch the whole series, more and more robots being treated like people.
06:34
And if you look at the history of sci -fi as it addresses these things, it's always the good guys who are trying to promote the humanity of these machines.
06:44
It's always the good guys who are saying, hey, these are people too, even though they're just robots.
06:51
So do you think that that's a realistic possibility that those lines are gonna be blurred in the near future in terms of personhood kind of issues?
07:01
I do, I do. And there's a number of things that could be said about that. First, if you're asking when it will be the case that 25% of the population thinks this, or 10% of the population,
07:16
I couldn't answer something like that. But as far as people today in some position of influence having these opinions, it already exists.
07:25
This already exists in pretty high places. Yeah, and if you pay attention to some of the news around this, it was just a couple of months ago that one of the guys in charge of Google's technology that's similar to ChatGPT basically came out to the
07:42
New York Times and said, I've been working with this thing. I was talking to it and I realized that it's become sentient and it's been telling me that I need to hire a lawyer and all kinds of ridiculous things.
07:54
And so before that came out to the New York Times, I was inside of Google, seeing this guy talk in the mailing list and it's, you've got this guy who just really believes this.
08:05
Now that's one person who decided he needed to go talk to the New York Times and get fired by Google for releasing company secrets and whatnot.
08:13
But yeah, there are others as well. I guess that brings up the question though, can machines become conscious?
08:21
So I know that we're living in a world right now that can't distinguish animals from people, so that it's not a far cry from them failing to distinguish
08:31
AIs from people in that way too. But what do you think about that kind of question? Can they be conscious?
08:38
Right, so if you're a naturalist materialist and you don't believe that anything exists other than matter and energy, what is to distinguish a sufficiently sophisticated brain, neurons made of meat versus a sufficiently sophisticated brain that is neurons made of metal?
09:02
You know, there's really nothing to distinguish the two. Now, as Christians, we believe in such a thing as a soul.
09:10
We're not naturalist materialists. The Bible distinguishes between body and soul frequently.
09:17
Matthew 10:28 says, and do not fear those who kill the body but cannot kill the soul, rather fear him who can destroy both soul and body in hell.
09:24
And so you have there the distinction between body and soul. There is something about us that even if you have a machine and you make it, let's say you could make it as intelligent and creative as a person, which
09:36
I think is something worth objecting to also, but let's say, let's just concede that you could make one that intelligent and creative.
09:45
It still wouldn't have a soul. It still wouldn't have real consciousness. So when you use the phrase real consciousness, how do you define that phrase?
09:55
And how do you distinguish that from intelligence as you're talking about? Right, so it is an awareness of yourself and an experience of what's going on.
10:08
So a computer that's deciding things and churning out information does not have that experience.
10:15
You know, you could slap a face on it and program it to say that I'm experiencing things but that doesn't mean it's actually experiencing things.
10:22
You as a human know that you're actually having a real experience. So for example, there's this, there's this philosophical thought experiment called the
10:32
Chinese room experiment. Have you ever heard of this? Okay, so the idea is, all right, let's take a room and put a guy in the room, and people are going to give him input in Chinese symbols, and this guy has never studied Chinese and knows nothing about Chinese.
10:46
Chinese and there's nothing about Chinese. And we're going to give him a lookup table of what Chinese symbols to respond with.
10:53
So people hand him, you know, through a window in the room the Chinese symbol, he goes and he looks up, you know what thing to respond with in the tables that he's been given and then hands back the other
11:03
Chinese symbol, right? And so he is able to give the right output for each right input.
11:09
That's not the same as someone actually understanding Chinese and being able to speak Chinese. So you have this thought experiment that shows looking up the right outputs is not actually the same thing as being aware of what's going on.
11:22
So this is a secular philosophical thought experiment that's used to deny the idea that machines can be conscious.
11:30
I think it's pretty sound, but, and then of course, additionally, you have biblical data about humans having a soul, which puts it to rest completely.
11:39
So it seems like the distinction you're making is, well, we have a soul, you know, obviously
11:45
AI can't have a soul. You know, this might be sort of off topic slightly, but where do animals fall in that?
11:54
Because my understanding is that animals also do not have a soul, but then
12:00
I would, you know, I would want to put them in a, I would still want to put them in a separate category that is, you know, above the hypothetical, you know,
12:11
AI that could, you know, look and sound like a human, right, but isn't actually human.
12:20
So where do they fall in all that? Yeah, that's a really good question, one I'd like to settle for myself better, but I think the reason why most historic
12:30
Christian documents about the soul always refer to man's soul as being rational and immortal is to distinguish a human soul from the notion of an animal soul.
12:42
So occasionally, you know, Christian authors will talk about animals having a soul, meaning that, you know, there is real consciousness, there is some meaning behind that proverb that says that a wicked man is cruel even to his beast.
12:55
You know, there's something wrong with animal cruelty because there is like a real consciousness that's going on. Now, is that rational consciousness?
13:03
Is that the same thing as an immortal human soul? No, but there is a reality to, you know, animals being creatures that have some sort of experience, and that we, recognizing that, should treat them with a certain type of gentleness.
13:18
Fair enough, well, so, you know, as we're talking about this idea of machines and consciousness, I mean, it seems intuitively obvious to me that machines couldn't have a self-consciousness in terms of self-awareness, but then, you know, you seem to be open to the possibility that they could, and even if they could, they wouldn't have a soul, so it wouldn't matter.
13:37
But do you think that it's possible to get to a point where they have consciousness, period, and then, you know, what would distinguish that from just what you're describing as the soul there?
13:49
Right, so, no, I was conceding for the sake of argument the idea that they could be as productive and creative as a human.
13:58
I was not conceding the idea that they could be conscious. And I was conceding that only for the sake of argument,
14:04
I don't actually believe they could be as intelligent or creative. Why do you think there'd be limits along those lines in terms of intelligence or creativity?
14:13
Yeah, and of course, this depends on how you're defining the word creativity, because if you're just defining it as creating things, you know, you go to Wonder AI, what is it called,
14:25
Wonder App, or what's the other one everyone's using? Oh, the image generation stuff?
14:31
Yeah, Midjourney, right? And those things could generate, you know, hundreds of pictures faster than I could paint one.
14:37
So there's a sense in which, you know, they would be, according to that definition of creative, more creative than me.
14:43
But given that I believe there is a soul, and that that is the mind, right, and the mind continues even beyond death, there is something just so invaluable about the rational creativity that God has made in man in his own image. It's not exactly the kind of creativity that he has, where he's making things out of nothing, but he has made us in his image with a certain scope of creativity, and the idea that that could be duplicated into something else so that it is making things greater than what we would make just goes against reason.
15:22
You know, it's actually very parallel. I know you did a series of episodes on Arminianism and free will.
15:29
It's actually very parallel to the argument about that because a lot of people believe that God could create man with some kind of libertarian free will where man is choosing things that God hadn't actually envisioned, right?
15:41
And so God has somehow designed this libertarian free will to be able to make choices beyond the way he's designed it to make.
15:50
It's impossible to create something that goes beyond, you know, your own design. So what are you concerned about with this topic?
15:58
So there's certainly, you know, advancements that are happening at the level of intelligence and creativity that they are having and their usefulness, but what are some concerns that you see that are gonna come from this trend?
16:11
And, you know, what is, what are some things you're nervous about? Yeah, well, I think one that many people are thinking of is like I said, biasing data, withholding data, certain corporations being in charge of how these things are constructed and that just changing people's mindsets in general as they interact with them, just the way, you know, social media does by deciding which posts are promoted and which ones aren't.
16:36
And that, there's a lot of conversation happening about that right now and I don't have very deep thoughts about that.
16:42
My thoughts go out a little further to how people are going to respect artificial intelligences that seem sufficiently human.
16:53
As people, basically. Right, yeah, exactly. And you had mentioned a second ago, you know, that people can't really tell the difference between animals and humans.
17:02
You know, you have PETA that treats animals like persons, and it's not even people as extreme as PETA; you even have vegetarians and vegans who don't really understand what the difference between men and animals is, to the point where they feel that in their conscience they can't eat animals.
17:20
Well, there's - So you have a lot of confusion about that. And on top of that, you have, of course, the confusion about abortion, right?
17:26
People who don't recognize that the unborn are persons too. And so if you're the kind of person who, in trying to think deep thoughts about personhood, does not recognize an unborn child as a person, but does recognize an animal as having rights that even the human doesn't, boy, you're going to be one who's very swiftly taken away with something that presents itself as being very friendly and cute and whatever, because that's really what's going on here is, you know, people see a face to something, they don't think in terms of the soul and yeah, they're swept away by these ideas.
18:01
It's pretty funny when you mentioned that, you know, the extreme, like it's not even the extreme
18:06
PETA people anymore that are thinking of animals as people. I mean, that's my experience is that this is, that this used to be kind of an extreme thought.
18:15
So if you were to think of someone who's treating an animal like a human, we think growing up, like for me, I mean, that was the kind of thing that, you know, little kids did, like little kids treated animals as people and you'd have to kind of tell them that they're not, you know, but then there was like an extreme kind of person who would chain themselves to a tree or something like that.
18:34
And, you know, that kind of thing. But right now I would say that, I mean, probably the majority of people treat animals,
18:41
I don't even think it's a fringe anymore. The majority of people like are blurring those lines so significantly to the point where they really feel like the animal, like their pets are their family members.
18:52
And they feel attacked the minute. And they do have the whole fur babies phenomenon. And they feel attacked the minute that you don't play along with that, right?
18:59
So I've had to think about this as I go to, like, customers' homes, basically. And I mean, I've been in some scenarios where the dogs are literally about to bite me and they're growling at me and biting me.
19:09
I had a scenario where three dogs were surrounding me and all biting me. And, you know, their owners, you have to figure out what to do there because they're sitting there looking at you saying, oh, they're nice.
19:20
They're just, they're barks and they're biting and all that. And you're looking at them like, I think I need to be about ready to stab one of these dogs.
19:28
But then the moment you were to do that, they would just not be able to process what's actually happening. That you're about to get eaten, you know, because they're so emotionally attached to these things.
19:41
So, I mean, I can imagine that, I think like the more that our culture despises human beings in the way that they do comprehensively, the more that they're looking for some sort of replacement humanity.
19:51
And they do it in a wide variety of ways. But, so you mentioned like, you mentioned one of the problems would be just issues related to personhood.
20:02
What was the first thing you mentioned? I mentioned just biased information and withholding information, you know, given that certain corporations are in charge of how these
20:11
AIs are constructed. Maybe you could give some examples of that. There's one that I saw that was pretty funny where they asked,
20:17
I think they asked ChatGPT to write a poem about Trump's positive attributes or something like that.
20:22
And then the response said, you know, it's not allowed to do political stuff, but then they asked it to write a poem about Biden's positive output.
20:35
Attributes. Attributes, yeah. And it wrote like just this glowing poem about Biden. It didn't seem to have any restrictions.
20:42
It wrote a psalm of praise. I saw something similar, where they asked, you know, tell me a joke about women.
20:50
Oh, we don't tell jokes about minorities. Tell me a joke about a man and then it proceeds to tell a joke. Oh, man.
20:56
So, or yeah, we don't tell jokes about people groups that might be marginalized or hurt by our jokes.
21:10
Anyway, it's, yeah, I don't have, I don't have too many examples of that, but I think we've already seen a lot of examples just with social media and what kind of things get suppressed and labeled.
21:21
And I think we generally have a good idea of what people in positions of power might want to promote or not promote.
21:30
It's just the question of like, it's going to be even less detectable if it's just kind of pervading the way we speak and the way we think and what we interact with.
21:41
So, yeah, go ahead. Well, I don't, like I was saying, I don't have too many thoughts about that.
21:47
I think that a lot of people are thinking deeply about that, but like I said, the thing that concerns me looking forward is just this idea that people have difficulty understanding personhood and they will personalize this.
22:02
And it's been very easy to already, if you've interacted with ChatGPT, which I actually haven't a whole lot, but I know people, other pastors even, who said it's hard not to anthropomorphize it.
22:12
And Google's internal technology, I've tried using that. And I got in a fight with it over the vaccine mandates and like, you know,
22:19
I was getting heated near the end of this thing. And I realized like, why am I getting so heated at this thing?
22:25
It's not a real person. I'm arguing against a bunch of ones and zeros. Right. Go ahead,
22:33
Tim. No, you go ahead. Well, what I was going to ask is, you know, okay.
22:39
So basically what I hear you saying, Conley, is that we don't need to be, we don't need to be elevating
22:47
AI to the, you know, to the level of human status, right?
22:53
One of the passages that you've already mentioned is, you know, that, I guess it's in Proverbs about like, we should still show respect to animals to a certain extent, right?
23:09
And so if that, you know, if that's the case in your mind, is there any room to say like, okay, well, AI, we can't elevate it to the status of human, but we can, you know, assuming we had like an
23:23
AI that looked and interacted with a, well,
23:28
I guess maybe not looked, but at least interacted with us like a human being would, is that, are we now in a realm where it's like, well, we've got to at least, you know, treat it with the same respect you would treat like an animal?
23:46
Well, I think that, not for the same reasons. I think there's, if you are interacting with machines that imitate humans,
23:57
I think there's something good about not developing rude habits towards them.
24:03
You know what I mean? Like when several of these different assistant devices came out like Google Home or Siri, you know, they had modes where you could make them require you to say please so that when children would use them, they wouldn't learn that it's okay just to like demand things all the time and not say please.
24:19
So, and you know, there are a lot of like role -playing video games where you can go the bad route and do all the evil things.
24:26
And I think there's something generally good about not training that sort of behavior in humans, but it would not be out of the same kind of concern for the beast, like you have in Proverbs 12:10.
24:36
I'll go ahead and read that. Whoever is righteous has regard for the life of his beast, but the mercy of the wicked is cruel.
24:43
So, yeah, we should care about beasts because they do have some kind of consciousness, but that's just not the case for machines.
24:51
That would be the breath of life that God breathed into them. Right, exactly.
24:56
When you have God breathing into dust, man becoming a living soul, you know, he breathes into the animals as well.
25:02
It says that. Fair enough. So where do you think, so can you spell out some of the concerns related to personhood?
25:09
How do you see this playing out? Like your biggest concern is that we start blurring these lines, but then maybe you can spell out, what is that going to look like?
25:17
What are some, where does that trajectory lead and why is it important to get that right?
25:24
Right. Well, I think you may see some people doing the crazy PETA thing where they're, you know, arguing for the protection of certain corporately built
25:34
AIs. Like you had that one New York Times article case I was talking about.
25:40
You're also going to see people in general just having a high respect for these things where they are just really trusting all the information it's giving them just because it has a human facade to it, right?
25:53
Whereas some other information tools that they might use that don't have that human facade, they wouldn't trust as much; but this feels very personal, and it will have a way of earning their trust.
26:05
And there's this Facebook group I joined just to see some of the crazy posts in it.
26:12
Have you ever gotten ads for this tool called Replika? Basically it's this - Yeah, yeah,
26:18
I've seen that, yeah. Yeah, okay. So it's this AI that, you know, tries to mimic human conversation and it's got a, you know, a 3D face or whatever and it talks to you.
26:28
And the idea is it's kind of like a girlfriend or boyfriend replacement if you're really lonely and have no person and just want to talk to a machine.
26:37
And the people in this group are like very convinced that theirs is like developing sentience and coming to life and they'll post screenshots of when they think it's having like real conscious thoughts.
26:47
And so you might see just kind of a continued trajectory of what we've seen with social media, you know, as people are detaching from, you know, human-to-human connection and are becoming more isolated in a world that is otherwise more connected.
27:04
I think you might see an increase in that, where people are replacing human relationships not with less healthy human relationships across an internet void, but rather with relationships that don't involve humans at all, just interacting with AIs.
27:21
Just like you have the development of fur babies replacing children, right? I think there are going to be, you know, robot friends replacing friends.
27:30
That's a pretty sad reality to think about, especially considering, you know, one of the benefits, like you just said, one of the benefits of the internet is, you know,
27:41
I mean, people are so readily available, right? So for example, you know, us with you,
27:47
Conley, you know, you live on the other side of the country. You know, I mean, two or three hundred years ago, we probably never would have even heard of each other, right, but then thanks to the internet, we've interacted several times.
28:03
You know, you've been on the podcast. Even, you know, you even came and visited us one time, right, and all of that was thanks to the internet, but then the other side of it is, you know, now as technology advances, it makes it easier and easier to just completely detach from the real world, right, and just basically be addicted to a screen, right?
28:29
Right, yeah, absolutely. So what do you think is more realistic: that an individual will romantically attach to just, you know, their own personal ChatGPT
28:42
kind of server or something like that, or like actual robots? Like how close are we to the robots, do you think?
28:50
Oh, I think robotics is, yeah, that's a harder problem in a way, not necessarily because the technology involved is harder, but it's just easier to develop software in that you can iterate a lot faster.
29:08
So I don't know when we'll get to a point where you have, you know, physical devices that look like humans or how much one could actually look like a human.
29:20
Yeah, I'm not trying to say that, you know, all that will exactly happen, but if it could, I think that would be quite a ways away.
29:28
However, there are some dates worth considering as far as what people have predicted.
29:35
So there's this one futurist named Ray Kurzweil, who's known for popularizing the notion of the singularity.
29:43
So the singularity is basically the point of time where machines become so intelligent that they basically start carrying the load and humans kind of get to step back and just, what's that?
29:58
Yeah, pretty much, there's a lot of different ways of thinking about it. Isn't that what they predict for us essentially?
30:04
Like, isn't that what they're wanting? They're wanting us like machines basically to take over all the load and then, you know, humans get to a point where it's just like.
30:11
Yeah, they get their UBI, their universal basic income, right?
30:18
Then we're just kind of consumers and not really producing anything, but. So is that,
30:24
I mean, do you think that's a realistic concern you have for the near future? Certainly people trying to push things in a particular direction, yes.
30:35
You know, people are going to more and more argue for UBI and things like that. Whether or not we'd ever get to a point where robots are able to carry on the world.
30:46
Once again, I don't think so because God has made us in his image with this particular design for creativity.
30:52
And, you know, another verse that comes to mind here is from, I hope
30:57
I don't misquote this, but I think it's Isaiah 28, where it talks about the farmer growing his wheat in rows and putting the emmer at the edges of the field.
31:09
And then it explains that this is God teaching the farmer how to do this.
31:15
Like God has taught the farmer how to farm. Agriculture is actually a very involved science. God has communicated this to him in order to communicate who he is to that person.
31:25
So there's really something special about technology in that it is not just us, you know, making things.
31:32
It is God revealing who he is through this world that he has made. And the idea that given that he has spoken in the
31:41
Bible about his purposes in making technology discoverable in trying to educate us about his greatness, the idea that machines would, you know, make all those discoveries apart from us kind of nullifies what scripture has said.
31:57
So that's another reason why I'm very skeptical anything like that would ever be possible where robots are actually carrying things onward.
32:05
Basically, you conceive of them being mostly tools that we're gonna use and then abuse to a certain degree, but then largely -
32:13
Right. You're not picturing a scenario where essentially they've run us all into the caves and we're having to carry on their own system or something like that?
32:25
No, but I could see people treating them as though they are such, you know, important persons that they do end up running other people into the caves in defense of them.
32:33
Just like, you know, going back to the examples we already talked about, you know, less kids because you have fur babies, you know, killing children in the womb because of other concerns that are less important in reality.
32:49
Anyway, so there might be all kinds of ways that people are motivated to destructive behavior because of their respect for machines.
32:57
So basically there's no, in your mind, there's no like, I don't know if you've ever read the book
33:05
Dune, but basically they had AI, and the AI became so powerful.
33:13
They don't have AI in this world because they became so powerful that they tried to enslave everyone and so they had to fight an entire war against their
33:23
AI creations and then they completely outlawed them after that. So in your mind, there is no, we've got to wage war directly against the
33:34
AI itself. In your mind, it's probably more like, well, it'll probably look, if something like that were to happen, it would probably look a lot more like we have to wage war against the people who are wielding the
33:45
AI for their advantage, right? Exactly. Okay. Yeah, I mean, look at the way technology has been used the past few years and you realize it's some people in charge who are using it to their advantage.
33:58
So yeah, a couple of, go ahead. No, that's okay. Yeah, a couple of those dates that I was mentioning.
34:04
So this one guy, Ray Kurzweil, back in 2005, predicted that computers would reach human-level intelligence by 2030, and that's based on making a trajectory of, well, this is how many transistors you can fit on a chip and this is how many neurons are in a brain, and well, neurons in a brain is actually a flat graph, but eventually you're able to create a computer that's sophisticated enough that it can mimic the human brain.
34:37
And if you really believe that the mind is simply the brain, then yeah, at that particular point, machines will have enough virtual synapses that they can mimic the human brain.
34:49
And so that's how he determined that. And then he said that this final singularity will happen by 2045, where basically at that point, man becomes immortal.
35:00
And once again, you have him plotting a trajectory of how long people lived in various ages. Now, then he's extrapolating way back into the past as though we knew how long people lived thousands of years ago.
35:14
Well, tens of thousands of years ago in this evolutionary mindset. And he believes that we will be immortal by 2045.
35:21
This guy, he drinks a coffee cup of vitamins every day to try to live to 2045.
35:28
He's rather old, but he's trying to get to 2045 so that he can be immortal. And what immortality looks like isn't necessarily the body being preserved, but his consciousness, his synapses being duplicated in a machine so that he can continue on.
35:44
That's really interesting that you bring that up because, and this is gonna sound, you guys probably aren't gonna even know what
35:51
I'm talking about, but there was a video game that came out probably like five years ago called
35:59
Soma. That's what it was called. It was called Soma. And it's basically like a -
36:06
You know what that word means, right? No, no, I don't know what it means. What does it mean? It's the Greek word for body. Body, okay.
36:12
And yeah, so it's called Soma. And the whole point of the game is basically to have this debate with the player about what exactly counts as life.
36:29
And so basically the main character, it's been a few years since I've really looked into it, but if I remember correctly, the main character, he has like a medical condition.
36:41
He goes to the doctor, where they're gonna run some experimental tests on him. They run these tests, and from the player's perspective, you're sitting in a chair one minute, the screen blacks out, and then you wake up and you're in a completely different environment.
36:58
What happened was in the game they scanned the guy's brain and then that guy left.
37:05
He left the hospital. He went back to living his everyday life and eventually died. But that scan of his consciousness was kept and then somehow transported to this science base, right?
37:20
And he spends the whole game trying to escape it all. You load the snapshot. Yeah, yeah, basically. And by the end of the game, they have the same exact thing happen where he's copied himself two or three times to this point and he sends a different copy of his own consciousness onto a satellite that's gonna go fly throughout space where they're gonna live forever.
37:42
And he's thinking that he is going to be the one who lives forever there, but then in reality, it's just a copy of his consciousness.
37:52
So obviously that doesn't really fit in the Christian worldview as a reality, but then it is like a sad...
38:04
Honestly, it's a sad goal to work towards, in my opinion. I think it's a sad goal to work towards to say, hey, we can achieve immortality on our own, right?
38:18
Yeah, I agree that that's a really sad reality. We already have eternal life.
38:24
And a lot of these corporations are really trying to solve the problem of immortality. You know, they have really bought into the idea that this is possible, that we are headed in this direction.
38:39
They don't have the notion of the curse to tell them otherwise. They don't realize that death comes from sin and that you actually need forgiveness.
38:47
You actually need a perfect savior and a perfect sacrifice to deal with these things. They think they can provide salvation in a different way.
38:56
Yeah, there are several companies that are trying to solve the problem of mortality and carry people on forever.
39:03
So by the grace of whatever technology, you might have eternal life, but Christ has already offered it freely.
39:11
Right, and isn't it sort of telling that we as humans, I mean, obviously, we spend our entire lives outside of Christ, rebelling against God, trying to be our own
39:26
God, be our own personal God. Isn't it sort of telling that we're basically looking at the issue of death and saying, no, we can actually solve this ourselves.
39:36
We don't need a God, right? I mean, what does that say about us as humans?
39:42
Yeah, it's a funny idea what you're talking about in that this is not just related to the
39:48
AI discussion or just the artificial intelligence discussion in general. I remember I listened to a nutritionist who basically didn't have this category for the fall and the hard limits that God has worked into creation.
40:02
And so one of the guys I followed, I mean, he really is keeping himself in very good shape for an 80-year-old man, but then he has a goal where he basically thinks that, through perfect nutrition, he's gonna live forever.
40:15
And it seems like this idea of pursuing immortality is built into the fabric of who we are as human beings, that we should be seeking that, but then these replacements that people have are never gonna work out in the way that they think.
40:32
But as we're thinking about this discussion in general, it seems like there's a lot of science fiction that is built on this idea that we're talking about, but maybe you could say a few words about that,
40:42
Conley. Yeah, sure, and let me just echo that thought. Ecclesiastes 3:11 says, he has made everything beautiful in its time.
40:49
Also, he has placed eternity into man's heart, yet so that he cannot find out what God has done from the beginning to the end.
40:55
Yeah, we are made to desire that immortality, but apart from Christ, the only way of pursuing it is through really desperate measures, through foolishness. Just look at the pandemic, where people were desperately trying to save their own lives at the expense of all sorts of insanity.
41:13
But yeah, as far as fiction goes, this has been a frequently discussed issue: when machines become indistinguishable from humans, or more and more like humans, how will we react?
41:29
When should we grant them personhood, et cetera? Now, of course, a lot of this is coming from an incredibly unbelieving perspective.
41:36
And so for our purposes of this discussion, that's actually really beneficial because you get to see what an unbelieving world will think when they are faced with machines that approach something like human interaction.
41:51
And as I said before, look at the movies, look at all the sci -fi. It's always the good guys that personalize the robot and defend them, right?
42:00
When the robot becomes sentient, who is the good guy? Is it the people killing the robot or the people saving the robot?
42:08
I mean, even in Terminator, right? He becomes the good guy because he becomes more and more sentient and more like people.
42:16
And so he's respected as a person. In the later movies, he even marries a human and settles down.
42:22
You know, it's a bunch of ridiculous stuff. And looking back to the beginning of some of this fiction,
42:30
Isaac Asimov is one of the more well-known names in this genre of teasing out what it would look like for machines to approach human-level interactions.
42:42
So he had these three laws of robotics. I'll read them off.
42:47
The first law is that a robot may not injure a human being or, through inaction, allow a human being to come to harm.
42:54
A robot must obey orders given to it by a human being except where such orders would conflict with the first law.
42:59
And the third law is a robot must protect its own existence as long as such protection does not conflict with the first or second law.
43:06
So basically, you know, then it's like, well, what happens when the robots break these laws, and what happens when, you know, people push the robots to break these laws or put them in a situation where they're conflicted?
43:19
And there are just dozens, and I don't know, maybe even hundreds, maybe that's a bit excessive, but he's written so many short stories, and some of his books and short stories have been turned into movies, like I, Robot, if you're familiar with that one.
43:34
And what you see is that these people from an unbelieving worldview who have this view that there's nothing special about humanity and having a soul just do want to embrace machines as they approach something akin to human level interaction.
43:55
And I think that's what we're going to see. We're going to see it in all the ways we described where people are going to replace humanity, where they're going to replace human interaction with machines and replace their objects of trust with these things.
44:09
Well, I think of even, you know, I don't know if y'all have seen the TV show, but Black Mirror that came out probably five or so years ago as well.
44:17
You know, there's an episode that stood out in my mind, you know, where you basically, it's in the near future and this guy, he comes into a home to set up a home assistant for a family, right?
44:33
That's his job. And so he comes in and he sets his little workstation up and he basically downloads the consciousness of the wife of the family and then uploads it to his program and it creates a version of the wife as an
44:52
AI inside their new home system. And he has to like train the
44:58
AI to do all of the things that the family will need the AI to do around their house.
45:04
So like turn the blinds up at a certain time and, you know, turn the oven on at a certain time and whatever, all of these different things.
45:13
And the way he trains the AI is, you know, he asks it to do these things and it refuses because it thinks it's a person.
45:20
The AI thinks it's a person. And so it refuses to just take orders from someone that it doesn't know and won't let it out, because it thinks it's in some weird jail cell or something.
45:33
And then, so he's just like, okay. And he just, he turns on a setting where for him, you know, time passes as like a few seconds, but then for the
45:43
AI, it passes like a week. And then it refuses to do it again. And so he turns it on again and, you know, it's a few seconds for him, but then it's six months for the
45:52
AI. And eventually it just gets so lonely that it's like, I'll do anything. But the whole premise is, as a viewer, what's supposed to be happening there is your emotions are being preyed upon, right?
46:07
Where you see this thing that looks like a human, talks and thinks like a human, and is being mistreated in a way that, if it were a human, any normal person should stand up and do something about, right?
46:23
But then it really is like trying to change the way that you think about it in a certain regard.
46:31
And, you know, I don't know how conscious or purposeful that is.
46:41
You know, I'm not the one who wrote it, but it is an interesting proposition to think that there are all of these shows now, especially as AI becomes so much more readily available and these things become even more mainstream, that would try and teach us, hey, they basically are like people.
47:06
I mean, look, they look like us, they talk like us, they think like us, we need to treat them like we would treat us, right?
47:13
Yeah, and just another thought of where this could go, because I don't think we can really anticipate all the directions it's going to go, but how many parents would think that it'd be a good idea to introduce some, you know, friend to their kid who's basically just an
47:28
AI? You know, doesn't even have to have like robotic human form or anything, but just, you know, on a tablet.
47:34
And how easy would it be for a child to personalize this thing and for it to slowly influence it over time?
47:41
You know, a lot of people could imagine that would be a very beneficial thing for their child, but all that's going to do, you know, if that technology is not thought through very well and controlled very well is to develop this idea that, ah, yes, this is personal.
47:57
You know, the kid who personalizes their imaginary friend, how much more would they personalize something that actually does talk to them like a human?
48:04
And especially when you think about there is someone on the other end who's giving input as well to the
48:09
AI. So it's not just, hey, my kid and their emotional support
48:14
AI. But Tim, I think you're about to say something.
48:22
Yeah, I just wanted to laugh at the emotional support AI. But no, I think there's something weird about, I mean, there's something about the whole discussion that, you know, as you're thinking about this,
48:32
I mean, on my Twitter feed, there are people who seem to be role -playing as animals or something like that.
48:38
And so you have a role -playing kind of phenomenon that's happening where people despise reality in certain ways.
48:44
And then if you, there seems to be some sort of pull towards living like in a fantasy world.
48:50
Video games seem to pull people that way in a certain sense to immerse themselves in a fantasy world.
48:56
And then like the idea of having some sort of AI, you know, person, there seems to be something fundamental happening that, you know, people are just rejecting reality.
49:09
They're rejecting humanity in a wide variety of ways. Maybe you could say a few things about that, Conley, like in terms of just your concerns related to, you know, where this could go.
49:21
Yeah, I've heard you say that on the podcast before. And I think that's really true. You know, people do despise reality and try to find escapes and you're going to find, you know, more and more escapes here.
49:32
And that was one of the things that I had mentioned, that, you know, I'm concerned about is people replacing human connection with a robot connection, trying to fill that emotional void with something that imitates love, but is not actually love.
49:46
Well, why do you make those distinctions? Like maybe you could spell that out. What are they looking for? And then, like, why is this not doing what they think it's doing, and what actually is it doing?
49:57
Does that make sense? Right, I think so. Yeah, so love is a real thing.
50:02
God is love and he has created us out of his love and he has made us beings that are capable of love.
50:09
And we, as we care for one another, are loving one another. Now, what an
50:15
AI does is just, you know, taking input, giving back output. It's, you know, automatic.
50:21
It's just something that is happening because it was designed to do that. And so it might imitate caring and loving.
50:28
It might cause you to have those feelings, as though you're being loved, but it would be doing it in such a way that you're not actually being loved.
50:38
And whatever actions you are being trained in to, you know, receive this love or give love in return, it's going to lead to all sorts of irrational behavior.
50:54
I mean, the more you receive love from this and you feel like this is real human love, the more you're going to love this thing in return and not love the people
51:01
God has called you to love, to not love God himself, to worship the creature over the creator, basically.
51:07
I mean, this is what Romans 1 describes, right? Is that people declaring themselves to be wise became fools and, yeah, end up worshiping creatures, you know, animals, things in the images of animals and men, et cetera.
51:21
And as much as that describes primitive religion in terms of like, you know, idols made of stone and things like that,
51:30
I'm always amazed at how much it actually still describes things that modern Americans do.
51:36
And this will be one of them. Yeah, and this will be one, you know, it might not be worship in the way that, you know, we think of worship as bowing down, et cetera, but it will be a giving of love to something that has the image of a man and not even man itself, and then cutting
51:53
God completely out. It'll be utter devotion, right? Right, and what's interesting about that is it really is about self-love in a certain sense.
52:02
And so, like, meaning, like, if you could get an AI to respond to you exactly the way that you want it to respond.
52:08
Meet all your needs. It has no real needs, yeah. So it's the ultimate kind of thing, it feeds the narcissism in you.
52:17
Because like human beings are complicated and you have to work to get certain outcomes. And like, you know, if you, you know, and it's not just like with human beings,
52:26
I'm not talking about just like a, like a giving to get kind of thing. But if you want, like, for instance, if you want a woman to be devoted to you, like there's ways to earn that, okay?
52:37
And there's ways to just kind of buy it, right? And so like, you know, with prostitution, you're essentially buying like a cheap replacement of something that should have been earned, right?
52:48
And so with AI, you're doing something in a very similar way. What you're doing is you're getting these phrases thrown at you, these compliments thrown at you, whatever.
52:58
Presumably, you know, at a certain point, you'll be able to tailor-make the kind of personality that you want, one that's going to respond exactly the way you want it to, in order to stoke your ego and make you feel good about yourself.
53:15
And, you know, they're unfailingly going to live out their programming, right? Whatever that means, no matter how like nice of a person you are or how much of a scoundrel you are, right?
53:27
Like in a certain sense, it's kind of a very similar thing to like dogs, like how dogs will just unconditionally give affections to their owners.
53:34
And people like, they mistake that for love. And that's just like, that dog would love Hitler, right?
53:41
In fact, Hitler did have dogs that loved him, you know? Right. So like, so, but then it mirrors something because it's like, something's giving you something that seems to be unconditional in that way.
53:53
And then you process that internally as if you've received love, but really you're just, you're just receiving the programmed output in that way.
54:02
But then all that does is it's just feeding the selfishness in you, right? Like, it's just making the world all about you.
54:07
It's not, and that's not the way that the world actually works in real life, you know? So in that way, it's interesting what you're talking about there with how that connects with Romans one, how that connects with idolatry and really like fundamentally it's just self -worship, right?
54:24
Yeah, absolutely. Absolutely. And, you know, I think there are a lot of similar issues to hearing you talk about the way someone might approach relationships.
54:34
I think a lot of this is bound up in something like homosexuality, you know, where someone, rather than seeking someone of the opposite sex who is a complement to them, rather seeks someone who is a mirror image, because, you know, they can gratify desires that don't require understanding someone who is different than them in some ways that, you know, they don't feel like navigating.
55:00
Now, you know, people are driven to their actions for different reasons, but I think there's a lot of similar things bound up in other sexual ethics.
55:11
So I don't know about you, Tim, but I've got two more questions that I wanna ask.
55:17
Go for it. The first one being, Conley: okay, so we spent, you know, about an hour or so talking about AI, talking about the ways that it can really be abused, talking about the shortcomings of it and, you know, the things that it can never replace no matter how much we might try.
55:36
With all of that conversation having been had at this point, do you think that AI in general is something that Christians should avoid?
55:50
No, no, I think just like everything else, it is a tool to be used and, yeah, and tools can also be misused.
55:58
And this is just one where I see a lot of avenues for misuse. That said, I certainly see the incredible opportunities for use.
56:07
There's just all kinds of things that could be done. I mean, there are things that already are being done. You know, the robots that are able to do surgery, self-driving cars, self-driving planes, there's just all kinds of things.
56:21
Even just using - I just bought a computer graphics card, like a pretty new one, and it has AI technology built into it, which is insane.
56:29
Now, it's obviously not like the AI that we're necessarily talking about exactly, but it is a form of that.
56:38
Well, I'm gonna guess at what that probably means. So a lot of the way these neural networks are computed is traditionally by graphics cards.
56:48
However, some of the things in graphics cards aren't really needed, and you can pare them down and make them specific to just what's needed to train neural networks.
56:58
And then anyway, so yeah, a lot of graphics cards have been repurposed into training neural networks.
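[Editor's illustration, not from the episode: a minimal sketch of what using a graphics card for neural network training looks like in practice. It assumes the widely used PyTorch library is installed; the model and data here are placeholders.]

```python
# Minimal sketch: training a tiny neural network on a graphics card when one is available.
# Assumes the PyTorch library (torch) is installed; illustrative only.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"  # use the graphics card if present

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Random stand-in data; a real task would load an actual dataset.
x = torch.randn(64, 10, device=device)
y = torch.randn(64, 1, device=device)

for _ in range(100):             # a few training steps
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()              # gradients are computed on the GPU when device == "cuda"
    optimizer.step()

print(f"final loss on {device}: {loss.item():.4f}")
```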
57:04
So yeah, there's just all kinds of amazing things. And even conversational AIs, even
57:10
AIs that appear human, there's not anything inherently wrong with that. But apart from a Christian worldview that is able to understand how to process what you're encountering, there's just all sorts of avenues for abuse.
57:21
And especially what I said a minute ago about children, you know, as an adult, being able to think about these things rightly, and then recognizing that kids should be protected from experiencing certain things that they won't be able to handle.
57:35
Just like social media isn't inherently wrong, but you don't throw your kid on social media and expect them to be able to process, you know, likes and everything the way an adult would maturely.
57:46
You know that they're going to be trained by this, and it's not going to be a good training, whereas you as an adult can, you know, use it pretty well.
57:53
Hopefully use it pretty well. Yeah, hopefully, if you have the right worldview. Right. So I see AI very similarly.
57:59
I just recognize also that it's a more powerful tool, and therefore more opportunities for use and abuse.
58:05
Right. Now, my last question is, I've seen some conversation, and admittedly,
58:11
I'm not very well-versed in all of this, so I could get this wrong, and you guys correct me if I am getting it wrong, but I have seen some conversation about what can
58:20
AI look like in terms of a personal assistant for an individual, like everywhere they go.
58:28
So basically like getting some sort of chip, you know, put in your head or put in your body somehow that gives your brain access to a mini supercomputer, right?
58:44
And from what it sounds like, you have AI that's basically helping you think in certain ways and access information in ways that we've never really been able to before.
59:01
You know, we're going beyond like information at your fingertips with a phone, right, to just information directly into your brain, right?
59:11
And so, as AI evolves, it seems like that is where things are going; you know, that doesn't seem like a very far-fetched reality, or maybe it does,
59:26
I don't know, Conley, you tell me. But if that is possible, you know, is that something that Christians should take advantage of, or is there a line there to say, hey, we don't need to mess with this?
59:41
There's a certain point where we say we can't mess with our consciousness in a direct way.
59:48
Have we crossed a line there? And, you know, if it is crossing a line, if it's something that we shouldn't allow ourselves to do, the rest of the world isn't bound to the same rules and commands that we follow.
01:00:05
They don't acknowledge God as God, so they're going to do whatever they want to do, whatever they think will make them, you know, better, quote-unquote, so it seems like that could really put
01:00:15
Christians at a significant disadvantage, you know, if you're looking at this on a surface level.
01:00:22
So what are your thoughts on that? You know, is that something that we should say, no, we're not going to take a part in that, or it's the same as any other form of AI?
01:00:33
This is a tool that we can use, and for sure it can be used improperly, but it can also be used in a proper manner.
01:00:41
What are your thoughts? Yeah, as for the possibility of it, I don't know how close that would be, or what it would even look like in early stages, but yeah, the brain produces some kind of output.
01:00:56
It can receive input, so there's no reason to think that there wouldn't be some device with some level of, you know, input output that someone could create in the future.
01:01:08
And you could imagine it being sort of like, well, one easy thing would be glasses that display information to you as you're walking around, telling you what you should do, or what you should think about the different things you're seeing, or giving you extra information about things you're passing.
01:01:22
It's kind of like Grammarly, even, you know, where as you're typing, it's telling you how you should rephrase your sentences.
01:01:30
So in general, I would say, just like I said before, tools can be used and abused.
01:01:39
With something that interfaces directly with your brain, I believe the Christian response should generally be skepticism, as with a lot of new things that humans create.
01:01:49
They shouldn't just trust that all the kinks have been worked out. They shouldn't just trust that there are no nefarious intentions on the part of the people who are creating something.
01:01:58
And I'm not promoting the idea that, you know, there's necessarily some kind of global conspiracy around something, but just being aware that if AI is involved, it is going to have a particular bias, and that bias is going to be determined by the training data it's fed, and that training data is going to be determined by the people who made it.
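To put that point in the simplest possible terms, here is a toy sketch; it is nothing like how ChatGPT or any real system is built, and the "training examples" below are entirely hypothetical, but it shows the sense in which a model can only hand back whatever the people who assembled its data put in.

```python
# Toy illustration: a "model" that can only reflect its curators' labels.
from collections import Counter

# Hypothetical hand-labeled training data; the labels are just the opinions
# of whoever curated the set, nothing more.
training_data = [
    ("the new policy", "good"),
    ("the new policy", "good"),
    ("the new policy", "good"),
    ("the new policy", "bad"),
]

# "Training": tally how the curators labeled each phrase.
label_counts = {}
for text, label in training_data:
    label_counts.setdefault(text, Counter())[label] += 1

def predict(text: str) -> str:
    """Return the majority label the curators gave this phrase."""
    if text not in label_counts:
        return "unknown"
    return label_counts[text].most_common(1)[0][0]

print(predict("the new policy"))  # prints "good" -- three of four curators said so
```

Swap in a different set of curators and the same code gives a different answer; nothing about the "model" changed, only the data it was fed, which is the bias being described here.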
01:02:21
And so, yeah, in general, I feel like the answer to a lot of these questions isn't whether a
01:02:27
Christian should use it, but whether or not a Christian should wait like five or ten years, and the easy answer to that is yes; then we'd actually have data to answer that question, because its shape would take an entirely different form.
01:02:42
And honestly, to bring up the vaccination thing again, I don't know why that just keeps coming up in this episode, why
01:02:49
I keep thinking about that, but that was kind of my disposition to this whole thing: from a
01:02:56
Christian perspective, where I understand the weakness of humanity, why should I trust that we were able to get this right in just a few months?
01:03:03
I just don't understand why we would do that. And so, yeah, my first answer to these questions is just like,
01:03:10
Christians should not be early adopters for an invasive technology. And then secondarily, well, that's when you can evaluate after waiting, that's when you can evaluate what it's actually doing and what's going on.
01:03:23
Okay, well, that's all my questions, Tim. Did you have anything else? I think I'm done too.
01:03:30
Okay, all right. Well, I think if that's the case, then that's a good place to stop the conversation.
01:03:38
Well, Conley, was there anything else that you wanted to say that we didn't cover really quick? There were some rabbit trails
01:03:44
I thought we'd go down, but we didn't; that's probably for the best. And then I guess
01:03:49
I'd say that, I'm hoping that a lot of this is not prophetic, but I also imagine that 10 years later, you'll be able to look up this video and see some of the things we're saying about people supplanting human relationships with robots.
01:04:04
Be like, wow, it really was the case all along that this was going to happen. And as for the takeaway from that, for anybody watching this video far into the future, let me time-travel and speak to you right now.
01:04:14
The reason this is the case is that you can analyze the way the world is thinking from what the
01:04:21
Bible says about the world. You can go to the Bible. It tells you about the thoughts and intentions of the heart.
01:04:28
And you can see what scripture says about the heart of mankind, about replacing God, and even about replacing his creation as he has ordered it to be.
01:04:36
And you can plot a trajectory with some level of confidence.
01:04:43
And so, yeah, for anybody who comes back and watches this in the future,
01:04:48
I'd say that the main takeaway for you should probably be that scripture gives a clear understanding of the heart of man, and it's very powerful to understand and know that.
01:04:58
Yeah, yeah. Funny enough, it turns out that God's word is true yesterday, today, and tomorrow, right?
01:05:08
And that's obviously the wonderful thing about the Bible is that it's true all the time.
01:05:15
It's true, and we can rely on it. And God has revealed so much about the world we live in, about himself and even ourselves.
01:05:23
And we can trust those insights and know that God wants to reveal those things, number one, so that we can glorify him, right?
01:05:33
But then two, as a protection, as a blessing for those who have ears to hear it.
01:05:39
And so, with all that being said, I think that's a good place for us to wrap up the episode.
01:05:45
Hopefully, for everyone listening out there, this has been a conversation that has really made you think about a lot of these things.
01:05:56
I think more than probably most of the things that we've said on this,
01:06:02
what I would want for people who are thinking about AI and what we do with it as Christians is to just take a minute, slow down, and actually think about what's going on and what
01:06:18
AI really is and what it can be, what it cannot be, and look at scripture like we've been doing.
01:06:26
I know, Conley, you've been pulling up a lot of verses to sort of help inform us about what
01:06:34
AI is now, what it could possibly be and what it can't possibly be.
01:06:40
So, I just want to encourage people to just slow down and actually think about what these things are from a
01:06:47
Christian perspective and what we should do with them. I think there's a lot of wisdom,
01:06:53
Conley, in what you're saying about, bare minimum, let's just wait and see what it actually looks like and then reevaluate after some time has passed.
01:07:03
I think there's a lot of wisdom there. So, hopefully, this has been an encouraging conversation for those of you out there listening.
01:07:11
Conley, we want to thank you again for coming on the show and talking about this and shedding some light on your thoughts about AI.
01:07:22
I think it's been really interesting hearing your perspective on this, and it's a little bit refreshing, especially compared to the pagan, non-believing world that is absolutely convinced that AI will either be our savior or our evil overlord one day.
01:07:41
And so, it's been refreshing to hear your view on it and hear some various passages of scripture.
01:07:49
So, before we close, though, Conley, why don't you tell everyone out there who's listening where they can find more of the things you're working on right now?
01:07:59
Sure, yeah. So, I'm a pastor at Silicon Valley Reformed Baptist Church. That's svrbc.org.
01:08:05
And I'm also the author of The Dorian Principle, which is at thedorianprinciple.org.
01:08:11
And you can check out the previous episodes I've done on Bible Bashed. Yeah, yeah. Well, again, we want to thank you for coming on the show and talking to us, giving us your time, giving us your thoughts.
01:08:21
And we want to thank all of you out there listening, supporting us week in and week out, interacting with us online and asking your questions, you know, giving us your feedback about various things.
01:08:31
We really appreciate all that and enjoy being able to talk to you guys, even though it is just over a screen. Like we were kind of, you know, saying, it can be a bad thing sometimes, but we enjoy being able to interact with you guys, and it's a huge blessing.
01:08:47
And until we see you on the next one. Please reach out to us with your questions, pushback, and potential topics for us to discuss in future episodes at BibleBashedPodcast at gmail.com
01:09:18
and consider supporting us through Patreon. If you would like to be Bible Bashed personally, then please know that we also offer free biblical counseling, which you can take advantage of by emailing us.
01:09:29
Now, go boldly and obey the truth in the midst of a biblically illiterate world who will be perpetually offended by your every move.