Should We Be Worried about ChatGPT?


A.I. has long been the subject of many sci-fi novels, movies, and short stories. These stories often revolve around the heights A.I. could take us to, while also asking how powerful it can become before we unwittingly hand it the keys to everything and suddenly become its subjects. Watch as Conley Owens joins us to discuss how Christians should view inventions like ChatGPT and whether they are the forerunners to a sinister future in which...


00:00
Warning, the following message may be offensive to some audiences. These audiences may include, but are not limited to, professing Christians who never read their
00:05
Bible, sissies, sodomites, men with man buns, those who approve of men with man buns, man bun enablers, white knights for men with man buns, homemakers who have finished
00:10
Netflix but don't know how to meal plan, and people who refer to their pets as fur babies. Viewer discretion is advised. People are tired of hearing nothing but doom and despair on the radio.
00:25
The message of Christianity is that salvation is found in Christ alone, and any who reject
00:31
Christ therefore forfeit any hope of salvation, any hope of heaven.
00:38
The issue is that humanity is in sin, and the wrath of Almighty God is hanging over our heads.
00:49
They will hear His words, they will not act upon them, and when the floods of divine judgment, when the fires of wrath come, they will be consumed, and they will perish.
00:58
God wrapped Himself in flesh, condescended, and became a man, died on the cross for sin, was resurrected on the third day, has ascended to the right hand of the
01:11
Father, where He sits now to make intercession for us. Jesus is saying there is a group of people who will hear
01:17
His words, they will act upon them, and when the floods of divine judgment come in that final day, their house will stand.
01:27
Welcome to Bible Bash, where we aim to equip the saints for the works of ministry by answering the questions you're not allowed to ask.
01:33
We're your hosts, Harrison Kerrig and Pastor Tim Mullett, and today we're joined by Pastor Conley Owens as we answer the age-old question: should we be worried about ChatGPT?
01:42
Now Conley, we're having you back on. This is the fourth time,
01:49
I think, that we've had you on now. I think pretty soon we're going to have to get, you know, like those little cards that you stamp, and then after like 10 you get a free ice cream or something.
02:00
We're going to have to get one of those set up for you, man. I get full power to pick the episode title for that one.
02:07
You get full power? That's a little bigger prize than a free ice cream. I think we're going to have to negotiate this a little bit.
02:15
Little do people know that Conley actually has picked a few episode titles. But yeah, we've got you back on again, and this time we're talking about AI and ChatGPT.
02:28
Obviously, that's a pretty hot-button topic right now. There's a lot of different conversations that are going on, not just in the
02:39
Christian community, but really amongst people in general, because it's, well, not necessarily a new technology, but it is relatively new, and there's a lot that people don't seem to know about what the full capabilities of this kind of stuff are and where it will take us in the future.
03:04
So there's a lot of people really interested in this topic right now, and so we're having a conversation about AI and ChatGPT and all of this stuff.
03:13
So why don't you just start by telling us, you know, give us a little bit of a preview into what your background is with this kind of stuff.
03:24
Yeah, I mean, I can start off by saying, like, I'm not a real expert. You know, I've done some thinking on this as both an engineer and as a pastor, but in neither case is this really my field of expertise.
03:35
So as an engineer, I did my master's on machine learning and bioinformatics kind of stuff, and, you know, trying to figure out relationships between genes to cure cancer, that kind of thing, and using a lot of machine learning.
03:52
Now, back then, my professor told me something about neural networks, which are the kind of artificial intelligence that is getting very powerful right now, or deep neural networks.
04:03
At the time, I was told that neural networks were the second-best way of doing anything, which sounds like a compliment, but it's also an insult.
04:10
Basically, it means there's almost always a better way of actually doing what you're trying to do. And that has ceased to be the case.
04:18
Now, for a lot of things, it really is the best way, and it's kind of this one-size-fits-all solution. So it's really incredible what's happened in the past.
04:25
I don't know. I guess it's been about five years since deep neural networks really took off in their capabilities, and, you know, a lot of my knowledge on this is really out of date, but, yeah,
04:38
I've done some thinking, and I think there's things that Christians should be thinking about and concerned about as they see the world interact with this and as they interact with it.
04:48
Right, right. Well, sure, Conley. Well, I guess the first question we want to ask, since you've done a lot of thinking on this, is the question, you know, what post-apocalyptic movie scenario do you think is the most realistic for us to be in?
05:02
Do you think we should be more— do you think we're headed more towards a Matrix kind of scenario or more towards a
05:08
Terminator kind of scenario? Like, what do you think is more realistic? And would you have a particular way of characterizing each of those to distinguish them?
05:17
Well, I mean, are we going to be like, you know, are they going to hook us up to feeding tubes?
05:22
Are they going to harvest us or just kill us? Yeah, harvest us or kill us, yeah. What do you think? Right. Well, you know, taking that question a little more seriously, my concern is not the most immediate concern people have. A lot of people are concerned about things like, you know, as these different corporations are in charge of this, how are they going to bias this information, and things like that.
05:42
I'm concerned about people who don't really have a good concept of personhood.
05:48
We live in a world that really doesn't understand what people are and human identity. And between those two movies, what
05:57
I'm concerned about is not really the Matrix situation where we've got these robot overlords that we're constantly fighting.
06:02
What I'm worried about is more of the Terminator scenario, where increasingly throughout the series,
06:08
if you watch the whole series, you have more and more robots being treated like people.
06:13
And if you look at, you know, the history of sci-fi as it addresses these things, it's always the good guys who are trying to promote the humanity of these machines, right.
06:23
It's always the good guys who are saying, hey, these are people too, even though, you know, they're just robots.
06:30
So do you think that that's a realistic possibility, that those lines are going to be blurred in the near future in terms of personhood kind of issues?
06:40
I do. I do. And there's a number of things that could be said about that. First, if you're asking, you know, when it will be the case that 25% or 10% of the population thinks this,
06:55
I couldn't answer something like that. But as far as people today in some position of influence having these opinions, it already exists.
07:04
Sorry, it exists in pretty high places. Yeah. And if you pay attention to some of the news around this, it was just a couple of months ago that one of the guys in charge of Google's technology that's similar to ChatGPT
07:18
basically came out to the New York Times and said, you know, I've been working with this thing.
07:24
I was talking to it, and I realized that it's become sentient, and it's been telling me that, you know,
07:30
I need to hire a lawyer for it, and all kinds of, you know, ridiculous things. And so before that came out to the
07:35
New York Times, you know, I was inside of Google, seeing this guy talk on the mailing lists, and, you know, you've got this guy who just really believes this.
07:44
Now, that's one person who decided he needed to go talk to the New York Times and, you know, get fired by Google for releasing company secrets and whatnot.
07:52
But yeah, there are others as well. I guess that brings up the question, though: can machines become conscious?
08:01
So, you know, I know that we're living in a world right now that can't distinguish animals from people, so it's not a far cry from them failing to distinguish, you know,
08:11
AIs from people in that way too. But, you know, what do you think about that kind of question? Can they be conscious? Right.
08:18
So if you're a naturalist materialist and you don't believe that anything exists other than matter and energy, what is there to distinguish a sufficiently sophisticated brain, you know, neurons made of meat, from a sufficiently sophisticated brain that is neurons made of metal?
08:41
You know, there's really nothing to distinguish the two. Now as Christians, we believe in such a thing as a soul.
08:49
We're not naturalist materialists. The Bible frequently distinguishes between body and soul.
08:56
Matthew 10:28 says, and do not fear those who kill the body but cannot kill the soul; rather fear him who can destroy both soul and body in hell.
09:03
And so you have there, the distinction between body and soul. There is something about us that even if you have a machine and you make it, let's say you could make it as intelligent and creative as a person, which
09:15
I think is something worth objecting to also, but let's say, let's just concede that you could make one that intelligent and creative.
09:24
It still wouldn't have a soul. It still wouldn't have real consciousness. So when you use the phrase real consciousness, how do you define that phrase, and how do you distinguish that from intelligence as you're talking about it?
09:39
Right. So it is an awareness of yourself and a, and an experience of what's going on.
09:47
So a computer that's deciding things and churning out information does not have that experience.
09:54
You know, you could slap a face on it and program it to say, I'm experiencing things, but that doesn't mean it's actually experiencing things.
10:01
You as a human know that you're actually having a real experience. So for example, there's this, there's this philosophical thought experiment called the
10:11
Chinese room experiment. Have you ever heard of this? Okay. So the idea is, all right, let's take a room and put a guy in the room, and people are going to give him input in Chinese symbols.
10:24
And this guy has never studied Chinese, he knows nothing about Chinese, and we're going to give him a lookup table of what
10:30
Chinese symbols to respond with. So people hand him, you know, through a window in the room, the
10:35
Chinese symbol, he goes and he looks up, you know, what thing to respond with in the tables that he's been given and then hands back the other
10:42
Chinese symbol. Right. And so he is able to give the right output for each right input.
10:49
That's not the same as someone actually understanding Chinese and being able to speak Chinese. So you have this thought experiment that shows looking up the right outputs is not actually the same thing as being aware of what's going on.
11:01
So this is a secular philosophical thought experiment that's used to deny the idea that machines can be conscious.
11:09
I think it's pretty sound. And then of course, additionally, you have biblical data about humans having a soul, which puts it to rest completely.
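To make the lookup-table picture concrete, here is a minimal sketch in Python; the specific symbols and scripted replies are invented placeholders, not anything from the episode. A program like this returns a plausible answer for every input it recognizes while understanding nothing at all, which is the point of the thought experiment.

    # A bare Chinese-room-style responder: nothing but a lookup table.
    # The entries below are made-up placeholders for illustration.
    lookup_table = {
        "你好吗？": "我很好，谢谢。",        # "How are you?" -> "I'm fine, thanks."
        "你叫什么名字？": "我叫小明。",      # "What's your name?" -> "My name is Xiaoming."
    }

    def respond(symbol: str) -> str:
        # Match the incoming symbol against the table and hand back the scripted reply.
        # Nothing here models meaning; correct output does not imply understanding.
        return lookup_table.get(symbol, "对不起，我不明白。")  # fallback: "Sorry, I don't understand."

    print(respond("你好吗？"))  # prints a sensible reply with zero comprehension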
11:18
So it seems like the distinction you're making is, well, we have a soul, and obviously
11:25
AI can't have a soul. Where do, you know, this might be sort of off topic slightly, but where do animals fall in that?
11:33
Because my understanding is that animals also do not have a soul. But, but then
11:40
I would, you know, I would want to put them in a, I would still want to put them in a separate category that is, you know, above the hypothetical, you know
11:51
AI that could, that could, you know, look and sound like a human, right.
11:58
But isn't actually human. So, so where do they fall in all that? Yeah, that's a really good question.
12:04
One I'd like to settle for myself better, but I think the reason why most historic
12:09
Christian documents about the soul always refer to man's soul as being rational and immortal is in order to distinguish a human soul from the notion of an animal soul.
12:22
So occasionally, you know, Christian authors will talk about animals having a soul, meaning that, you know, there is real consciousness.
12:27
There is some meaning behind that proverb that says that we should not, that a wicked man is even cruel to his beasts.
12:35
You know, there's something wrong with animal cruelty because there is like a real consciousness that's going on now. Is that rational consciousness?
12:42
Is that the same thing as an immortal human soul? No, but there is a reality to, you know, animals being creatures that have some sort of experience, and we, recognizing that, should treat them with a certain type of gentleness.
12:57
Right. Fair enough. Well, so, you know, as we're talking about this idea of machines and consciousness,
13:04
I mean, it seems intuitively obvious to me that machines couldn't have a self-consciousness in terms of self-awareness.
13:12
But then, you know, you seem to be open to the possibility that they could, and even if they could, they wouldn't have a soul.
13:17
So it wouldn't matter. But do you think that it's possible to get to a point where they have consciousness period?
13:24
And then, you know, what would distinguish that from just what you're describing as the soul there?
13:30
Right. So, no, I was conceding, for the sake of argument, the idea that they could be as productive and creative as a human.
13:38
I was not, I was not conceding the idea that they could be conscious. And I was conceding that only for the sake of argument.
13:45
I don't actually believe they could be as intelligent. Why do you think there will be limits along those lines in terms of intelligence or creativity?
13:54
Yeah. And of course this depends on how you're defining the word creativity, because if you're just defining it as creating things, you know, you go to Wonder
14:03
AI. What is it called? Wonder app? Or, oh, what's the other one that everyone's using?
14:10
All the image generation ones. Midjourney. Yeah, Midjourney. Right. And those things can generate, you know, hundreds of pictures faster than I could paint one.
14:18
So there's a sense in which, you know, they would be according to that definition of creative, more creative than me.
14:24
But given that, I believe there is a soul and that, that is the mind, right?
14:31
And the mind continues even beyond death. There is something just so invaluable about the rational creativity with which God has made man in his own image. Not exactly the kind of creativity that he has, where he's making things out of nothing, but he has made us in his image with a certain scope of creativity, and the idea that that could be duplicated into something else, so that it is making things greater than what we would make, just goes against reason.
15:03
You know, it's actually very parallel. I know you did a series of episodes on Arminianism and free will.
15:10
It's actually very parallel to the argument about that, because a lot of people believe that God could create man with some kind of libertarian free will where man is choosing things that God hadn't actually envisioned.
15:22
Right. And so God has somehow designed this libertarian free will to be able to make choices beyond the way he's designed it to make.
15:31
It's impossible to create something that goes beyond, you know, your own design. So what are you concerned about with this topic?
15:38
So there are certainly, you know, advancements that are happening in the level of intelligence and creativity that AIs are having, and in their usefulness, but what are some concerns that you see coming from this trend, and what are some things you're nervous about?
15:56
Yeah, well, I think one that many people are thinking of is, like I said, biasing data, withholding data, certain corporations being in charge of how these things are constructed, and that just changing people's mindsets in general as they interact with them, just the way, you know, social media does by deciding which posts are promoted and which ones aren't. There's a lot of conversation happening about that right now.
16:21
And I don't have very deep thoughts about that. My thoughts go out a little further to how people are going to respect artificial intelligences that seem sufficiently human.
16:35
As people basically. Right. Yeah, exactly. And you had, you had mentioned a second ago, you know, that people can't really tell the difference between animals and humans.
16:43
You know, you have PETA that treats animals like persons. And not even people as extreme as PETA, but you even have vegetarians and vegans who don't really understand what the difference between men and animals is, to the point where they feel that in their conscience they can't eat animals.
17:01
So you have a lot of, a lot of confusion about that. And on top of that, you have of course the confusion about abortion, right?
17:07
People who don't recognize that the unborn are persons too. And so if you're the kind of person who, in trying to think deep thoughts about personhood, does not recognize an unborn child as a person, but does recognize an animal as having rights that even the human doesn't, boy, you're going to be one
17:25
who's very swiftly taken away with something that presents itself as being very friendly and cute and whatever, because that's really what's going on here: people see a face on something, they don't think in terms of the soul, and yeah, they're swept away by these ideas.
17:42
It's pretty funny when you mention that, you know, it's not even the extreme
17:47
PETA people anymore that are thinking of animals as people. I mean, my experience is that this used to be kind of an extreme thought.
17:56
So if you were to think of someone who's treating an animal like a human, we think growing up, like for me, for me,
18:02
I mean, that was the kind of thing that, you know, little kids did like little kids treated animals as people. And you'd have to kind of tell them that they're not, you know, but then there was like an extreme kind of person who would chain themselves to a tree or something like that.
18:15
And, you know, that kind of thing. But right now I would say that probably the majority of people treat animals that way.
18:22
I don't even think it's a fringe anymore. The majority of people are blurring those lines so significantly, to the point where they really feel like their pets are their family members.
18:33
That's true. And they feel attacked the minute that you, you don't play along with that.
18:41
So I've had to think about this as I go to, like, customers' homes, basically. And I mean, I've been in some scenarios where the dogs are literally about to bite me and they're growling at me and biting me.
18:51
I had a scenario where three dogs were surrounding me and all biting me, you know, and, and you know, their owners, like, you have to figure out what to do there.
18:59
'Cause they're sitting there looking at you saying, oh, they're nice, their bark's worse than their bite, and all that. And you're looking at them.
19:05
I'm like, I think I need to be about ready to stab one of these dogs. But then the moment you were to do that, they would just not be able to process what's actually happening that you're about to get eaten, you know, because they're so emotionally attached to, to these things.
19:22
So, I mean, I can imagine that. I think like the more that our culture despises human beings in the way that they do comprehensively, the more that they're looking for some sort of replacement humanity.
19:32
Yeah. And they do it in a wide variety of ways. But so you mentioned like you mentioned, one of the problems would be just issues related to personhood.
19:43
What was the first thing you mentioned? I mentioned just biased information and withholding information, you know, given that certain corporations are in charge of how these, how these
19:52
AIs are constructed. Maybe you could give some examples of that. There's one that I saw that was pretty funny where they asked,
19:58
I think they asked ChatGPT to write a poem about Trump's positive attributes or something like that.
20:04
And then the output said, you know, it's not allowed to do political stuff, but then they asked it to write a poem about Biden's positive attributes.
20:17
Yeah. And it wrote like just this glowing poem about Biden. Yeah. I saw,
20:24
I saw something similar where they asked, you know, tell me a joke about women.
20:31
And: oh, we don't tell jokes about minorities. Tell me a joke about a man, and it proceeds to tell a joke. Oh man.
20:37
Or: yeah, we don't tell jokes about people groups that might be marginalized or hurt by our jokes. Anyway.
20:52
Yeah, I don't have too many examples of that, but I think we've already seen a lot of examples just with social media and what kind of things get suppressed and labeled.
21:02
And I think we generally have a good idea of what people in positions of power might want to promote or not promote.
21:11
It's just the question of like, it's going to be even less detectable if it's just kind of pervading the way we speak and the way we think and what we interact with.
21:22
So yeah, go ahead. Well, like I was saying, I don't have too many thoughts about that.
21:28
I think that a lot of people are thinking deeply about that. But like I said, the thing that concerns me looking forward is just this idea that people have difficulty understanding personhood, and they will personalize this.
21:43
And it's been very easy to already. If you've interacted with ChatGPT, which I actually haven't a whole lot, but I know people, other pastors even, who said it's hard not to anthropomorphize it. And Google's internal technology,
21:55
I've tried using that and I got in a fight with it over the vaccine mandates and like, you know, I was getting heated near the end of this thing.
22:02
And I realized, like, why am I getting so heated at this thing? It's not a real person; I'm arguing against a bunch of ones and zeros.
22:11
Right. Go ahead, Tim. No, you go ahead. Okay. Well, what
22:17
I was going to ask is, you know, okay. So basically what I hear you saying
22:23
Conley is that we don't need to be elevating
22:29
AI to, you know, the level of human status.
22:34
Right. One of the passages that you've already mentioned is, you know, I guess it's in Proverbs, about how we should still show respect to animals to a certain extent.
22:47
Right. And so if that's the case in your mind, is there any room to say, like, okay, well,
22:55
AI, we can't elevate it to the status of human, but we can, you know, assuming we had an
23:04
AI that looked and interacted with, well,
23:10
I guess maybe not looked, but at least interacted with us like a human being would, are we now in a realm where it's like, well, we've got to at least, you know, treat it with the same respect you would treat an animal?
23:24
Well, I think that not for the same reasons.
23:30
I think, if you are interacting with machines that imitate humans,
23:38
I think there's something good about not developing rude habits towards them.
23:44
You know what I mean? Like, when several of these different assistant devices came out, like Google Home or Siri, you know, they had modes where you could make them require you to say please, so that when children would use them, they wouldn't learn that it's okay just to demand things all the time and not say please.
24:00
And, you know, there are a lot of role-playing video games where you can go the bad route and do all the evil things.
24:06
And I think there's something generally good about not training that sort of behavior in humans, but it would not be out of the same kind of concern for the beast.
24:15
Like you have in Proverbs 12:10. I'll go ahead and read that: Whoever is righteous has regard for the life of his beast, but the mercy of the wicked is cruel.
24:23
So yeah, we should care about beasts because they do have some kind of consciousness.
24:29
But that's just not the case for machines. That would be the breath of life that God has breathed into them, then.
24:35
Right. Exactly. When you have God breathing into dust, man becoming a living soul, you know, he breathes into the animals as well.
24:42
It says that. Fair enough. So where do you think, so can you spell out some of the concerns related to personhood?
24:50
How do you see this playing out? Like, your biggest concern is that we start blurring these lines, but maybe you can spell out what that is going to look like.
24:59
Where does that trajectory lead? And why is it important to get that right?
25:05
Right. Well, I think you may see some people doing the crazy PETA thing where they're, you know, arguing for the protection of certain corporately built
25:15
AIs, like you had that one New York Times article case I was talking about.
25:21
You're also going to see people in general, just having a high respect for these things where they are just really trusting all the information it's giving them just because it has a human facade to it.
25:34
Right. Whereas some other information tools that they might use that don't have that human facade, they wouldn't trust as much. But this feels very personal, and it will have a way of earning their trust.
25:46
And there's this Facebook group I joined just to see some of the crazy posts in it.
25:53
Have you ever gotten ads for this tool called Replika? Basically it's this, it's like... I've seen that.
26:00
Yeah. Yeah. Okay. So it's this AI that, you know, tries to mimic human conversation, and it's got a, you know, a 3D face or whatever, and it talks to you.
26:09
And the idea is it's kind of like a girlfriend or boyfriend replacement if you're really lonely and have no person and just want to talk to a machine. And the people in this group are very convinced that theirs is developing sentience and coming to life, and they'll post screenshots of when they think it's having, like, real conscious thoughts.
26:28
And so you might see just kind of a continued trajectory of what we've seen with social media, you know, as people are detaching from human-to-human connection and are becoming more isolated in a world that is otherwise more connected.
26:45
I think you might see an increase in that, where people are replacing human relationships not with less healthy human relationships that cross an internet void, but rather with relationships that don't involve humans at all, and are just interacting with AIs. Just like you have the development of fur babies replacing children, right?
27:06
I think there are going to be, you know, robot friends replacing friends. That's a pretty sad reality to think about, especially considering, you know, one of the benefits, like you just said, one of the benefits of the internet is, you know,
27:23
I mean, people are so readily available, right? So, so for example, you know, us with you
27:28
Conley, you know, you live on the other side of the country. You know, I mean, two or three hundred years ago, we probably never would have even heard of each other.
27:39
Right. But then thanks to the internet, we've interacted several times.
27:44
You know, you've been on the podcast, even, you know, you even came and visited us one time.
27:50
Right. And all of that was thanks to the internet. But then the other side of it is, you know, now, as technology advances, it makes it easier and easier to just completely detach from the real world.
28:05
Right. And just basically be addicted to a screen. Right. Right. Yeah, absolutely.
28:13
What do you think? So what do you think is more realistic: that an individual will, like, romantically attach to just their own personal ChatGPT
28:23
kind of server or something like that, or, like, actual robots? Like, how close are we to the robots?
28:30
Do you think? Oh, I think robotics is, yeah, that's a harder problem in a way, not necessarily because the technology involved is harder, but it's just easier to develop software, in that you can just iterate a lot faster.
28:49
So I don't know when we'll get to a point where you have, you know, physical devices that look like humans, or how much one could actually look like a human.
29:01
Yeah, I'm not trying to say that, you know, all that will exactly happen, but if it could, I think that would be quite a ways away.
29:09
However, there are some dates worth considering as far as what people have predicted.
29:16
So there's this one futurist named Ray Kurzweil, who's known for popularizing the notion of the singularity.
29:24
So the singularity is basically the point in time where machines become so intelligent that they basically start carrying the load and humans kind of get to step back.
29:36
And now humans are, what's that? Yeah, pretty much. There's a lot of different ways of thinking about it, but isn't that what they predict for us essentially?
29:45
Like isn't that what they're wanting? They're wanting us, like machines basically to take over all the load. And then, you know, humans get to a point where it's just like, yeah, they get their
29:52
UBI and their universal basic income. Then we're just kind of consumers and not really producing anything.
30:04
So is that, I mean, do you think that's a realistic concern you have for the near future? Certainly people trying to push things in a particular direction.
30:15
Yes. You know, people are going to more and more argue for UBI and things like that.
30:21
Whether or not we'd ever get to a point where robots are able to carry on the world.
30:27
Once again, I don't think so because God has made us in his image with this particular design for creativity.
30:33
And, you know, another verse that comes to mind here is from, I hope
30:38
I don't misquote this, but I think it's Isaiah 28, where it talks about the farmer growing his wheat in rows and putting the emmer at the edges of the field.
30:50
And then it explains that this is God teaching the farmer how to do this.
30:55
Like, God has taught the farmer how to farm. Agriculture is actually a very involved science, and God has communicated this to him in order to communicate who he is to that person.
31:05
So there's really something special about technology, in that it is not just us, you know, making things; it is
31:13
God revealing who he is through this world that he has made. And the idea that given that he has spoken in the
31:22
Bible about his purposes in making technology discoverable, in trying to educate us about his greatness, the idea that machines would, you know, make all those discoveries apart from us kind of nullifies what Scripture has said.
31:38
So that's another reason why I'm very skeptical anything like that would ever be possible, where robots are actually carrying things onward.
31:46
Basically, you conceive of them being mostly tools that we're going to use, and then abuse to a certain degree, but largely you're not picturing a scenario where essentially they've run us all into the caves and they carry on their own system or something like that.
32:06
No, but I could see people treating them as though they are such, you know, important persons that they do end up running other people into the caves in defense of them.
32:15
Just like, you know, going back to the examples we already talked about, you know, fewer kids because you have fur babies, killing children in the womb because of other concerns that are less important in reality anyway.
32:31
So there might be all kinds of ways that people are motivated to destructive behavior because of their respect for machines.
32:38
So basically there's no, in your mind, there's no, like, I don't know if you've ever read the book
32:47
Dune. But basically, the
32:53
AI becomes so powerful... they don't have AI in this world, because they became so powerful that they tried to enslave everyone.
33:02
And so they had to fight an entire war against their AI creations and then they completely outlawed them after that.
33:09
So in your mind, there is no, we've got to wage war directly against the
33:15
AI itself. In your mind, it's probably more like, well, if something like that were to happen, it would probably look a lot more like we have to wage war against the people who are wielding
33:26
AI for their advantage. Right? Exactly. Yeah. I mean, look at the way technology has been used the past few years and you realize it's some people in charge who are, you know, using it to their advantage.
33:39
So yeah, go ahead, go ahead. Yeah. A couple of those dates that I was mentioning.
33:45
So this one guy, Ray Kurzweil, back in 2005, he predicted that computers would reach human-level intelligence by 2030.
33:58
And you know, that's based on making a trajectory of, well, this is how many transistors you can fit on a chip, you know, and this is how many neurons are in a brain, or, well, you know, neurons in a brain is actually a flat graph, but eventually you're able to create a computer that's sophisticated enough that it can mimic the human brain.
34:18
And if you really believe that the mind is simply the brain, then yeah, at that particular point, machines will have enough, you know, virtual synapses that they can mimic the human brain.
34:30
And so that's how he determined that. And then he said that this final singularity will happen by 2045 where basically at that point, man becomes immortal.
34:40
And once again, you have him plotting a trajectory of how long people lived in various ages. Now, he's, you know, extrapolating way back into the past as though we knew how long people lived, you know, thousands of years ago.
34:54
Well, you know, tens of thousands of years ago in this evolutionary mindset. And he believes that we will be immortal by 2045.
35:02
This guy, he like drinks a coffee cup of vitamins every day to try to live to 2045.
35:09
He's rather old, but he's trying to get to 2045 so that he can be immortal. And what immortality looks like isn't necessarily the body being preserved, but his consciousness, his synapses, being duplicated in a machine so that he can continue on.
35:25
You know, that's really interesting that you bring that up, because, and this is going to sound... you guys probably aren't going to even know what
35:33
I'm talking about, but there was a video game that came out probably like five years ago called
35:40
Soma. That's what it was called. It was called Soma and it's, it's basically like a, you know what that word means, right?
35:49
No, no, I don't know what it means. It's the Greek word for body. Body. Okay. And yeah, so it's called
35:54
Soma, and the whole point of the game is basically, you know, the game is trying to have this debate with the player about what exactly equals life.
36:10
Right. And so basically the main character... it's been a few years since I've really looked into it, but if I remember correctly, the main character has, like, a medical condition.
36:23
He goes to the doctor, where they're going to run some experimental tests on him. They run these tests.
36:30
And from the player's perspective, you know, you're sitting in a chair one minute, then the screen blacks out, and then you wake up and you're in a completely different environment.
36:39
What happened was, in the game they scanned the guy's brain, and then that guy left, you know, he left the hospital, he went back to living his everyday life, and eventually, you know, died.
36:51
But that scan of his consciousness was kept and then somehow, you know, transported to this science base.
37:01
Right. And he spends the whole game trying to escape it all. Yeah. Yeah. Basically. And by the end of the game, they have the same exact thing happen, you know, where he's copied himself two or three times to this point.
37:13
And he sends a different copy of his own consciousness onto, like, a, you know, satellite that's going to go fly throughout space, where they're going to live forever.
37:23
And he's thinking that he is going to be the one who lives forever there.
37:28
But then in reality, it's just a copy of his consciousness.
37:34
So obviously that doesn't really fit in the Christian worldview, you know, as, like, a reality.
37:41
But then it is, like... honestly, it's a sad goal to work towards.
37:48
In my opinion, I think it's a sad goal to work towards, to say, hey, we can achieve
37:55
immortality on our own. Right. Yeah. I agree that that's a really sad reality.
38:03
We already have eternal life and a lot of these corporations are really trying to solve the problem of immortality.
38:10
You know, they have really bought into the idea that this is possible, that we are headed in this direction.
38:20
They don't have the notion of the curse to tell them otherwise. They don't realize that death comes from sin and that you actually need forgiveness.
38:29
You actually need a perfect savior and a perfect sacrifice to deal with these things. They think they can provide salvation in a different way.
38:35
Yeah, there are several companies that are trying to solve the problem of mortality and carry people on forever.
38:45
So by the, you know, grace of whatever technology you might have eternal life, but Christ has already offered it freely.
38:53
Right. And isn't it sort of telling that, you know, we as humans, you know,
39:00
I mean, obviously we spend our entire lives outside of Christ rebelling against God, trying to be our own
39:08
God, be our own personal God. Isn't it sort of telling that, you know, we're basically looking at the issue of death and saying, no, we can actually solve this ourselves.
39:18
We don't need a God. Right. I mean, what does that say about us as humans?
39:24
Yeah. It's a funny idea what you're talking about in that this is not just, you know, related to the
39:30
AI discussion or just artificial intelligence discussion in general. I remember I listened to a nutritionist who basically, they didn't have this category for the fall and the hard limits that God has worked into creation.
39:43
And so one of the guys I followed, I mean, he really is keeping himself in very good shape for, you know, an 80-year-old man, but then he has a goal to basically, like, he thinks through perfect nutrition he's going to live forever.
39:56
And it seems like, you know, this idea of pursuing immortality is built into the fabric of who we are as human beings, that we should be seeking for that.
40:05
But then, you know, these replacements that people have are never going to work out in the way that they think.
40:13
But, you know, as we're thinking about this discussion in general, it seems like there's a lot of science fiction that is built on this idea that we're talking about, so maybe you could say a few words about that.
40:24
Yeah, sure. And let me just echo that thought. You know, Ecclesiastes 3:11 says, he has made everything beautiful in its time.
40:30
Also he has placed eternity into man's heart yet so that he cannot find out what God has done from the beginning to the end.
40:37
Yeah, we are made to, you know, desire that immortality, but apart from Christ, the only way of pursuing it is through really desperate measures, through foolishness. Just look at, you know, the pandemic, where people were desperately trying to save their own lives at the expense of all sorts of sanity.
40:55
But yeah, as far as fiction, this has been a frequently discussed issue: when machines become indistinguishable from humans, or more and more like humans, how will we react?
41:11
When should we grant them personhood, et cetera. Now, of course, a lot of this is coming from an incredibly unbelieving perspective.
41:18
And for the purposes of this discussion, that's actually really beneficial, because you get to see what an unbelieving world will think when they are faced with machines that approach something like human interaction.
41:32
And as I said before, you know, look at the movies, look at all the sci-fi. It's always the good guys that personalize the robots and defend them, right?
41:41
When the robot becomes sentient, who is the good guy? Is it the people killing the robot or the people saving the robot?
41:49
I mean, even in Terminator, right? He becomes the good guy because he becomes more and more sentient and more like people.
41:57
And so he's respected as a person. In the later movies, he even, like, marries a human and settles down. You know, it's a bunch of ridiculous stuff.
42:08
And looking back to the beginning of some of this fiction, Isaac Asimov is one of the more well-known names in this genre of teasing out what it would look like for machines to approach human-level interactions.
42:23
So he had these three laws of robotics. I'll read them off. The first law is that a robot may not injure a human being or, through inaction, allow a human being to come to harm.
42:35
The second law is that a robot must obey orders given to it by human beings, except where such orders would conflict with the first law.
42:41
And the third law is that a robot must protect its own existence, as long as such protection does not conflict with the first or second law.
42:47
So basically, you know, then it's like, well, what happens when the robots break these laws, and what happens when, you know, people push the robots to break these laws or put them in a situation where they're conflicted?
43:00
And there are just dozens, and, I don't know, maybe even hundreds... maybe that's a bit excessive, but he's written so many short stories.
43:08
And some of his books and short stories have been turned into movies, like I, Robot, if you're familiar with that one.
43:16
And what you see is that these people from an unbelieving worldview, who have this view that there's nothing special about humanity and having a soul, just do want to embrace machines as they approach something akin to human-level interaction.
43:36
And I think that's what we're going to see. We're going to see it in all the ways we described, where people are going to replace humanity, where they're going to replace human interaction with machines and replace their objects of trust with these things.
43:51
Well, I think of even, you know, I don't know if y'all have seen the TV show, but Black Mirror, that came out probably five or so years ago as well.
43:59
You know, there's an episode that stood out in my mind, you know, where basically, it's in the, you know, near future, and this guy, he comes into a home to set up a home assistant for a family, right?
44:15
That's his job. And so he comes in, and he sets his little workstation up, and he, you know, basically downloads the consciousness of the wife of the family and then uploads it to his program.
44:28
And it creates a, you know, a version of the wife as an
44:34
AI inside their new home system. And he has to, like, train the
44:40
AI to do all of the things that the family will need the AI to do around their house.
44:46
So, like, turn the blinds up at a certain time, and, you know, turn the oven on at a certain time, and whatever else, all of these different things.
44:54
And the way he trains the AI is, you know, he asks it to do these things, and it refuses to do them because it thinks it's a person; the
45:03
AI thinks it's a person. And so it refuses to just take orders from someone that it doesn't know, and it says, let me out, let me out,
45:11
'cause it thinks it's in some weird jail cell or something. And then, so he's just like, okay. And he turns on a setting where, for him, you know, time passes as, like, a few seconds, but then for the
45:25
AI, it passes for, like, a week, and then it refuses to do it again. And so he turns it on again, and, you know, it's a few seconds for him, but then it's six months for the
45:35
AI. And eventually it just gets so lonely that it's like, I'll do anything, you know. But the whole premise is, as a viewer, what's supposed to be happening there is your emotions are being preyed upon, right?
45:49
Where you see this thing that looks like a human, talks and thinks like a human,
45:55
and is being mistreated in a way that, if it were a human, any normal person would, or should, stand up and do something about.
46:05
Right. But then it really is, like, trying to change the way that you think, in a certain regard.
46:13
And, you know, I don't know how conscious, you know, that,
46:21
I don't know how purposeful that is. You know, I'm not the one who wrote it. But it is an interesting proposition to think there are all of these shows now, especially as AI becomes so much more readily available, that these things are becoming even more mainstream, that would try and teach us,
46:43
Hey, you know, we do need to... they basically are like people. I mean, look, they look like us.
46:49
They talk like us. They think like us. We need to treat them like we would treat us. Right. Yeah.
46:55
And just another thought of where this could go, because I don't think we can really anticipate all the directions it's going to go, but how many parents would think that it'd be a good idea to introduce some, you know, friend to their kid
47:08
who's basically just an AI, you know, doesn't even have to have, like, robotic human form or anything, but is just, you know, on a tablet.
47:16
And how easy would it be for a child to personalize this thing, and for it to slowly influence them over time?
47:23
You know, a lot of people could imagine that would be a very beneficial thing for their child, but all that's going to do, you know, if that technology is not thought through very well and controlled very well, is develop this idea that, ah, yes, this is personal.
47:38
You know, the kid who personalizes their imaginary friend, how much more would they personalize something that actually does talk to them like a human?
47:46
Especially when you think about the fact that there is someone on the other end who's giving input as well to the
47:51
AI. So it's not just their emotional support
47:56
AI, but Tim, I think you're about to say something.
48:04
I just wanted to laugh at the emotional support AI. But no, I think there's something weird about, I mean, there's something about the whole discussion that, you know, as you're thinking about this,
48:13
I mean, on my Twitter feed, there are people who seem to be role-playing as animals or something like that.
48:20
And so you have a role-playing kind of phenomenon that's happening, where people despise reality in certain ways.
48:26
And then there seems to be some sort of pull towards living, like, in a fantasy world. Video games seem to pull people that way in a certain sense, to immerse themselves in a fantasy world.
48:37
And then, like, the idea of having some sort of AI person... there seems to be something fundamental happening, that, you know, people are just rejecting reality.
48:50
They're rejecting humanity in a wide variety of ways. Maybe you could say a few things about that,
48:56
Conley, in terms of just your concerns related to, you know, where this could go.
49:02
Yeah. I've heard you say that on the podcast before, and I think that's really true. You know, people do despise reality and try to find escapes, and you're going to find, you know, more and more escapes here.
49:13
And that was one of the things that I had mentioned, you know, that I'm concerned about: people replacing human connection with a robot connection, trying to fill that emotional void with something that imitates love but is not actually love.
49:27
Well, why do you make those distinctions? Maybe you could spell that out. What are they looking for?
49:33
And then like, why is this not doing what they think it's doing and what actually is it doing?
49:38
Does that make sense? Right. I think so. Yeah. So, uh, love is a real thing.
49:43
God is love and he has created us out of his love and he has made us beings that are capable of love.
49:50
And we, as we care for one another, are loving one another. Now what an
49:56
AI does is just, you know, taking input, giving back output. It's, you know, automatic.
50:02
It's just something that is happening because it was designed to do that. And so it might imitate caring and loving.
50:09
It might cause you to have those feelings, as though you're being loved, but it would be doing it in such a way that you're not actually being loved.
50:19
And whatever actions you are being trained in to, you know, receive this love or give love in return, it's going to lead to all sorts of irrational behavior.
50:35
I mean, the more you receive love from this and you feel like this is real human love, the more you're going to love this thing in return and not love the people
50:42
God has called you to love, to not love God himself, to worship the creature rather than the Creator, basically.
50:48
I mean, this is what Romans 1 describes, right? That people, declaring themselves to be wise, became fools and, you know, end up worshiping creatures, you know, animals, things in the images of animals and men, et cetera.
51:03
And as much as that describes a primitive religion, in terms of, like, you know, idols made of stone and things like that,
51:12
I'm always amazed at how much it actually still describes things that modern Americans do.
51:17
And this will be one of them, right? Yeah. And this will be one. You know, it might not be worship in the way that we think of worship, as bowing down, et cetera, but it will be a giving of love to something that has the image of a man and is not even man itself.
51:33
And then cutting God completely out of the picture. It'll be utter devotion, right? Right. What's interesting about that is it really is about self-love in a certain sense.
51:44
And so, like, meaning, if you could get an AI to respond to you exactly the way that you want it to respond, meet all your needs, while it has no real needs of its own, you know. Yeah.
51:54
So it's the ultimate kind of... like, it feeds the narcissism in you, because, like, human beings are complicated and you have to work to get certain outcomes.
52:03
And, you know, it's not just with human beings, I'm not talking about just, like, a giving-to-get kind of thing, but, for instance, if you want a woman to be devoted to you, there's ways to earn that.
52:17
Okay. And there's ways to just kind of buy it. Right. And so, like, you know, with prostitution, you're essentially buying, like, a cheap replacement of something that should have been earned.
52:29
Right. And so with AI, you're doing something very similar. What you're doing is you're getting these phrases thrown at you, these compliments thrown at you, and presumably, you know, at a certain point, you'll be able to tailor-make the kind of personality that you want.
52:49
That's going to respond exactly the way you want it to, in order for you to stoke your ego and make you feel good about yourself.
52:56
And, you know, it's unfailingly going to live out its programming, right, whatever that means, no matter how, like, nice of a person you are or how much of a scoundrel you are.
53:08
Right. Like, in a certain sense, it's kind of a very similar thing to, like, dogs, how dogs will just unconditionally give affection to their owners, and people mistake that for love.
53:18
And that's just, like, that dog would love Hitler. Right. In fact,
53:24
Hitler did have dogs. Right. So, like, it's like, so... but it mirrors something, because something's giving you something that seems to be unconditional in that way.
53:35
And then you process that internally as if you've received love, but really you're just receiving the programmed output in that way.
53:43
But then all that does is it's just feeding the selfishness in you, right? Like, it's just making the world all about you.
53:49
And that's not the way that the world actually works in real life. You know? So in that way, it's interesting what you're talking about there, with how that connects with Romans 1, how that connects with idolatry, and really, fundamentally, it's just self-worship, right?
54:05
Yeah, absolutely. Absolutely. And you know, I think there are a lot of similar issues, hearing you talk about the way someone might approach relationships.
54:16
I think a lot of this is bound up in something like homosexuality, you know, where someone, rather than seeking someone of the opposite sex who is a complement to them, rather seeks someone who is a mirror image, because, you know, they can gratify desires that don't require understanding someone who is different from them in ways that, you know, they don't feel like navigating.
54:42
Now, you know, people are driven to their actions for different reasons, but I think there's a lot of similar things bound up in other sexual ethics.
54:53
So I don't know about you, Tim, but I've got two more questions that I want to ask.
54:58
Go for it. The first one being, Conley: okay, so we spent, you know, about an hour or so talking about AI, talking about the ways that it can really be abused, talking about the shortcomings of it and, you know, the things that it can never replace, no matter how much we might try. With all of that conversation having been had at this point, do you think that AI in general is something that Christians should avoid?
55:31
No, no. I think, just like everything else, it is a tool to be used, and yeah,
55:38
tools can also be misused. And this is just one where I see a lot of avenues for misuse. But no,
55:44
I certainly see the incredible opportunities for use. There's just all kinds of things that could be done.
55:51
I mean, there are things that already are being done. You know, the robots that are able to do surgery self -driving cars, self -driving planes that there's just all kinds of things.
56:02
I just bought a computer graphics card, a pretty new one, and it has
56:08
AI technology built into it, which is insane. Now, it's obviously not the
56:13
AI that we're necessarily talking about exactly, but it is a form of it.
56:20
Well, I'm going to guess at what that probably means. A lot of the way these neural networks are computed is traditionally by graphics cards.
56:30
However, some of what's in a graphics card isn't really needed, so you can pare it down and make it specific to just what's needed to train neural networks.
56:39
And so, yeah, a lot of graphics cards have been repurposed into training neural networks.
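To make that a little more concrete for anyone curious, here is a minimal sketch, assuming the PyTorch library (not something named in the conversation): the math behind a neural network is the same matrix-heavy arithmetic graphics cards were built for, so training code simply moves a small network and its data onto the GPU when one is available.

```python
# A minimal sketch (assuming PyTorch) of running neural-network math on a graphics card.
import torch
import torch.nn as nn

# Use the GPU if one is present; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A tiny fully connected network -- just matrix multiplications and nonlinearities,
# the kind of arithmetic graphics hardware is already optimized for.
model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Made-up random data, just to show the training loop running on the device.
inputs = torch.randn(128, 64, device=device)
labels = torch.randint(0, 2, (128,), device=device)

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)
    loss.backward()      # gradients are computed on the GPU when one is available
    optimizer.step()
```

Real systems are far more involved, but the hardware story is essentially this: the same chips built for rendering graphics are now doing the matrix math for neural networks.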
56:45
So yeah, there are just all kinds of amazing things. And even conversational AIs, even
56:51
AIs that appear human, there's not anything inherently wrong with that. But apart from a Christian worldview that is able to understand how to process what you're encountering, there are all sorts of avenues for abuse.
57:03
And especially what I said a minute ago about children: as an adult you may be able to think about these things rightly, while recognizing that kids should be protected from experiencing certain things they won't be able to handle.
57:17
Just like social media isn't inherently wrong, but you don't throw your kid on social media and expect them to be able to process, you know, likes and everything the way an adult would maturely.
57:27
You know that they're going to be trained by this and it's not going to be a good training. Whereas you as an adult can, you know, use it pretty well.
57:35
Hopefully. Yeah. Hopefully, if you have the right worldview. So I see AI very similarly.
57:40
I just also recognize that it's a more powerful tool, and therefore there are more opportunities for use and abuse.
57:47
Right. Now, my last question is, I've seen some conversation, and admittedly
57:53
I'm not very well versed in all of this, so I could get this wrong, and you guys correct me if I am getting it wrong, but I have seen some conversation about what
58:02
AI could look like in terms of a personal assistant for an individual, everywhere they go.
58:09
So basically, getting some sort of chip put in your head, or put in your body somehow, that gives your brain access basically to a mini supercomputer.
58:25
Right. And you have AI that, from what it sounds like, is basically helping you think in certain ways and access information in ways we've never really been able to access it.
58:43
You know, we're going beyond information at your fingertips with a phone, right,
58:48
to information directly into your brain. Right. And so, as AI evolves, it seems like that's where it's going. That doesn't seem like a very far-fetched reality, or maybe it does.
59:07
I don't know, Conley, you tell me. But if that is possible, is that something that Christians should take advantage of?
59:16
Or is there a line there where we say, hey, we don't need to mess with this?
59:23
Is there a certain point where we say we can't mess with our consciousness in a direct way?
59:30
Have we crossed a line there? And if it is crossing a line, and it's something we shouldn't allow ourselves to do, the rest of the world isn't bound to the same rules and commands that we follow.
59:47
They don't acknowledge God as God. So they're going to do whatever they want to do, whatever they think will make them better, quote unquote.
59:55
So it seems like that could really put Christians at a significant disadvantage, you know, if you're looking at this on a surface level.
01:00:04
So what are your thoughts on that? Is that something where we should say,
01:00:09
no, we're not going to take part in that? Or is it the same as any other form of AI:
01:00:15
this is a tool that we can use, and for sure it can be used improperly, but it can also be used in a proper manner?
01:00:23
What are your thoughts? Yeah. As for the possibility of it, I don't know how close that would be or what it would even look like in early stages, but yeah, the brain produces some kind of output.
01:00:37
It can receive input. So there's no reason to think that there wouldn't be some device with some level of input output that someone could create in the future.
01:00:49
And you could imagine what it would be like. Just one easy example would be glasses that display information to you as you're walking around, right?
01:00:56
Telling you what you should do, or what you should think about the different things you're seeing, or giving you extra information about things you're passing.
01:01:04
It's kind of like Grammarly even, you know, where as you're typing, it's telling you how you should rephrase your sentences.
01:01:11
So in general, I would say, just like I said before, tools can be used and abused. With something that interfaces directly with your brain,
01:01:23
I believe the Christian response should generally be skepticism toward a lot of the new things that humans create.
01:01:30
They shouldn't just trust that all the kinks have been worked out. They shouldn't just trust that there are no nefarious intentions behind the people who are creating something. Not that
01:01:40
I'm promoting the idea that there's necessarily some kind of global conspiracy around something, but just being aware that if AI is involved, it is going to have a particular bias, and that bias is going to be determined by the training data it's fed.
01:01:56
And that training data it's fed is going to be determined by the people that made it.
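To illustrate that point, here's a toy sketch in Python (purely illustrative, not anything from the conversation): a "model" that only ever echoes back whichever answer was most common in the examples its makers chose to feed it, so two differently curated datasets produce two different "opinions" from the exact same code.

```python
# A toy illustration: a tiny "model" that just counts how often each answer appears
# in its training data. Whoever picks the training examples picks the answers the
# model will tend to give back.
from collections import Counter

def train(examples):
    """'Training' here is just memorizing how often each labeled answer appears."""
    return Counter(label for _, label in examples)

def predict(model):
    """The model's answer is simply whatever was most common in its training data."""
    return model.most_common(1)[0][0]

# Two makers curate two different datasets for the same question.
dataset_a = [("Is this technology good?", "yes")] * 9 + [("Is this technology good?", "no")] * 1
dataset_b = [("Is this technology good?", "yes")] * 1 + [("Is this technology good?", "no")] * 9

print(predict(train(dataset_a)))  # -> "yes"
print(predict(train(dataset_b)))  # -> "no"
```

Real systems are vastly more sophisticated, but the dependence on whoever selects the training data is the same.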
01:02:02
And so, yeah, in general, I feel like the answer to a lot of these questions isn't whether a
01:02:08
Christian should, but whether a Christian should wait five or ten years, the easy answer to that being yes.
01:02:17
And then we'd actually have data to answer that question, because its shape would take an entirely different form.
01:02:24
And honestly, to bring up the vaccination thing again, I don't know why that just keeps coming up in this episode, why
01:02:30
I keep thinking about that, but that was kind of my disposition to the whole thing: from a
01:02:37
Christian perspective, where I understand the weakness of humanity, why should I trust that we were able to get this right in just a few months?
01:02:45
I just don't understand why we would do that. And so, yeah, my first answer to these questions is that
01:02:51
Christians should not be early adopters of an invasive technology. And then secondarily, after you've waited, that's when you can evaluate what it's actually doing and what's going on.
01:03:05
Okay. Well, that's all my questions, Tim. Did you have anything else? I think I'm done too.
01:03:12
Okay. All right. Well, if that's the case, I think that's a good place to stop the conversation.
01:03:19
Well, Conley, real quick, was there anything else that you wanted to say that we didn't cover? There were some rabbit trails
01:03:25
I thought we'd go down, but we didn't. That's probably for the best. And then I guess
01:03:31
I'd say that I'm hoping a lot of this is not prophetic, but I also imagine that ten years later, you'll be able to look up this video, see some of the things we're saying about people supplanting human relationships with robots, and think, wow, it really was the case all along that this was going to happen.
01:03:48
And as a takeaway for anybody watching this video far into the future, let me time-travel and speak to you right now.
01:03:56
The reason this is the case is that you can analyze the way the world is thinking from what the
01:04:02
Bible says about the world. You can go to the Bible. It tells you about the thoughts and intentions of the heart, and you can see what Scripture says about the heart of mankind, about replacing
01:04:13
God and even replacing his creation the way he has ordered it to be. And you can plot a trajectory with some level of confidence.
01:04:24
And so, yeah, for anybody who comes back and watches this in the future,
01:04:30
I'd say that, you know, the main takeaway for you should probably be that scripture gives a clear understanding of the heart of man.
01:04:37
And it's very powerful to understand and know that. Yeah. Funny enough, it turns out that God's word is true yesterday, today, and tomorrow.
01:04:49
And that's obviously the wonderful thing about the Bible: it's true all the time.
01:04:57
It's true, and we can rely on it. And God has revealed so much about the world we live in, about himself, and even about ourselves.
01:05:05
And we can trust those insights and know that God wants to reveal those things.
01:05:12
Number one, so that we can glorify him, right, but then, two, as a protection and a blessing for those who have ears to hear it.
01:05:21
And so with all that being said, I think that's a good place for us to wrap up the episode.
01:05:27
Hopefully for everyone listening out there, this has been, you know, a conversation that has really made you think about a lot of these things.
01:05:37
I think, more than probably most of the things that we've said on this,
01:05:44
I personally would want people who are thinking about AI and what we do with it as Christians to just take a minute, slow down, and actually think about what's going on, what
01:06:00
AI really is, what it can be, what it cannot be, and to look at Scripture, like we've been doing.
01:06:08
I know, Conley, you've been pulling up a lot of verses to help inform us about what
01:06:16
AI is now, what it could possibly be, and what it can't possibly be.
01:06:22
And so I just want to encourage people to slow down and actually think about what these things are from a
01:06:29
Christian perspective and what we should do with them. And I think there's a lot of wisdom,
01:06:35
Conley, in what you're saying about, bare minimum, let's just wait and see what it actually looks like and then reevaluate after some time has passed.
01:06:45
I think there's a lot of wisdom there. So hopefully this has been an encouraging conversation for those of you out there listening.
01:06:54
Conley, we want to thank you again for coming on the show, talking about this, and shedding some light on your thoughts about AI.
01:07:04
And I think it's been really interesting hearing your perspective on this, and it's a little bit refreshing, especially compared to the pagan, non-believing world, which is absolutely convinced that
01:07:16
AI will either be our savior or our evil overlord one day.
01:07:22
So it's been refreshing to hear your view on it and to hear various passages of Scripture.
01:07:31
So before we close, though, Conley, why don't you tell everyone out there who's listening where they can find more of the things you're working on right now?
01:07:41
Sure. Yeah. So I'm a pastor at Silicon Valley Reformed Baptist Church. That's svrbc.org.
01:07:46
And I'm also the author of The Dorian Principle, which is at thedorianprinciple.org.
01:07:52
And you can check out the previous episodes I've done on Bible Bashed. Yeah. Well, again, we want to thank you for coming on the show and talking to us, giving us your time and your thoughts.
01:08:03
And we want to thank all of you out there listening, supporting us week in and week out, interacting with us online, asking your questions, and giving us your feedback about various things.
01:08:13
We really appreciate all that and enjoy being able to talk to you guys. Even though it is just over a screen, which, like we were kind of saying, can be a bad thing sometimes, we enjoy being able to interact with you guys, and it's a huge blessing.
01:08:29
And until we see you on the next one, this has been another episode of Bible Bashed.
01:08:34
We hope you have been encouraged and blessed through our discussion. We thank you for all your support and ask you to continue to like and subscribe to Bible Bashed and share our podcast with your friends and on social media.
01:08:46
Please reach out to us with your questions, pushback, and potential topics for us to discuss in future episodes at biblebashedpodcast@gmail.com
01:08:55
and consider supporting us through Patreon. If you would like to be Bible bashed personally, then please know that we also offer free biblical counseling, which you can take advantage of by emailing us.
01:09:06
Now go boldly and obey the truth in the midst of a biblically illiterate world who will be perpetually offended by your every move.