The AI Sex Robot Interview | Jeremy Meeks

Room For Nuance


Join us for a conversation with Jeremy Meeks, the Director of the Chicago Course on Preaching at Simeon Trust. Listen to Jeremy explain the serious danger that technology and AI sex robots pose to our understanding of what it means to be made in the image of God.


00:10
All right, we are back with another episode of Room for Nuance. I'm here with my special guest.
00:16
Jeremy Meeks, who is the Director of the Chicago Course on Preaching and resident pastor for Ministry Apprenticeships at Christ Church Chicago.
00:25
Are you gonna talk that fast, this whole interview? No, of course not. Okay, and the center for what now, for preaching? The Chicago Course on Preaching.
00:33
Okay, and that's run through? Simeon Trust. Simeon Trust. Yep. Okay, but today we're here to talk about?
00:40
Whatever you wanna talk about, but I think we're talking about, believe it or not, sex robots. AI sex robots.
00:47
Yeah. And Augustine. Yeah. Or is it Augustine? Oh, it's Augustine. It's Augustine for sure. Augustine is a city in Florida.
00:54
Yes, I hear the nightlife there is great. I do not, but you can do whatever. All right, get us started with prayer.
01:01
Ask for the Lord's help, because I feel like we're gonna need it. Okay, all right. All right. Dear Lord, thank you for this opportunity to have an engaging and hopefully entertaining and enlightening conversation about a serious, fascinating, dark, yet hopefully very encouraging topic for the future of Christianity.
01:25
Help us to ask good questions and give good answers in order that those who listen might be encouraged and educated, in Jesus' name, amen.
01:35
Amen. I couldn't help but notice in that prayer, brother, that you said engaging, enlightening, and entertaining. Alliteration.
01:42
You just can't help it. I just can't help it. It's just built in. Yeah. All right, I'm a little nervous about this conversation.
01:48
I decided to do it. You and me both. Other people are nervous about this conversation. Right. The Simeon Trust guys are like, you're gonna talk about what now?
01:55
Oh yeah, no, they're all used to it. Okay. Mainly our friends who are like, oh my gosh, you two are gonna talk about this?
02:01
Well, we have Luke here to sort of act as a referee and we can edit, baby. That's right. The power of the edit.
02:06
That's right. All right, in all seriousness, I sent you this verse about a week ago to ask you if it was even appropriate for us to have this conversation.
02:17
So let me read the verse and then we'll answer that question. Ephesians 5:12. For it is shameful even to speak of the things that they do in secret.
02:32
So should we even be having a conversation about sex robots or is that unbecoming of Christians?
02:38
Yeah, so it depends. All right, well, moving on. No, here's what it depends on.
02:44
Here's what it depends on. It depends on what we're doing with the conversation. If we are merely having a conversation in order to tell jokes about it or to make fun of people who would do it or to get salacious with details for the sake of being salacious, then no, we should absolutely not be doing this.
03:04
But I've been asked this question ever since the first time that I told people this is what my research was.
03:09
Many Christians are like, this is ungodly. Now everybody seems to think, wow, you're a prophet.
03:15
And it's like, no, I'm just always trying to do this work for the sake of other people. So if this conversation is for the benefit of both
03:21
Christians and non-Christians in order to think hard about this, in order to figure out what it means to be a
03:26
Christian in the world and what difference the gospel makes and Jesus and the resurrection, that kind of stuff, then yeah, absolutely, it's worth talking about.
03:34
Yeah, there's almost a sense in which you've been, are you currently a pastor? I know you were a pastor. I was, I still am. Okay, so you're a pastor,
03:40
I'm a pastor. There's almost a sense in which it can't be avoided. Right.
03:45
Right, like someone in your congregation, I don't know how big your congregation is or what, but I mean, the internet makes the world a small place.
03:52
That's true. Most people in your church have probably heard about this to some degree and some people might even be toying with the idea of getting involved with it.
04:01
Right, yeah, and really, as I kind of lay out in my research, it doesn't actually matter if you've ever thought about these things or seen these things or fantasized about these things.
04:15
They kind of are part of a conceptual universe that you are inevitably wrapped up in that makes things like this incredibly appealing.
04:22
So by thinking about this, we think about a bunch of other things. We're forced to think about those things and we get to think about those things.
04:28
Were you nervous at all in writing about this subject, semicolon, because, did
04:35
I use the semicolon right? Because. Probably not. Probably not. I can imagine someone might uncharitably think this guy's writing about AI sex robots.
04:44
He must be some kind of strange pervert. I mean, who willingly, and by the way, reading your dissertation, you didn't get salacious, but it's dark.
04:54
Oh, it's dark, yeah. So like, have you been at all nervous about what people might uncharitably suspect about your heart?
05:00
Yeah, so I have been gifted by God one of the darkest senses of humor on the face of the planet.
05:06
So I welcome the people, like I actually want people to think that I'm weird in order to then provoke a conversation that hopefully leads them to the betterment of things.
05:16
But I do think it's like a legit question, like why would you do this? And my answer has always been, well, part of it, at least in relation to what you're talking about, is because somebody has to.
05:26
But like, how did you first decide like, ah, I'm actually gonna do a deep dive on this? Right, so I got a master's in bioethics.
05:36
Can you explain? Yeah, so bioethics being kind of decision-making on matters of right and wrong pertaining to life, death, and all of the decisions about living in between.
05:49
So kind of medical stuff and all that stuff. So I got a master's in that to prepare me to write something at a
05:57
PhD level for lots of different reasons. But I knew I wanted to write on something
06:02
I was interested in. And the dirty secret of a PhD is do something that not a lot of people have done.
06:08
I knew I wanted to do something on food or sex because they're the most kind of animalistic parts of our nature. And yet we are fascinated by them.
06:16
We write songs and fight wars about them. So why? And then I was like, okay, well, what's the most controversial yet ultimately beneficial thing that I could write on for the sake of the church?
06:27
Oh, sex robots. And yeah, that's why I did it. And then the more
06:32
I thought about it, the more I started working down that path, I was like, oh yeah, this is exactly what I should be doing. Yeah.
06:39
Wow, what a strange calling. Yeah, yeah, yeah. And to some degree, I'm doing it so you don't have to. Thank you.
06:44
Right, because pyramid of sins, right? Sex stuff, I'm not about to say it's not on the pyramid, but it's nowhere near the top of the pyramid.
06:50
So I can live in this world with relative ease. I become more of a Christian, more sanctified by thinking about this stuff and being in this world.
07:00
Whereas I think that for some people, it doesn't make me any better than anybody else. Like I can't be around money, for example, right?
07:06
So like, I'll leave that to you. But this is the world that I'm very comfortable in and I can actually be very helpful with Christians.
07:14
When you say you're comfortable, you mean you don't feel particularly inclined towards these sin struggles? Right, and I feel particularly inclined towards following Jesus and trying to help others do so as a result of being around this stuff.
07:26
Whereas if you got involved with like ethical questions about money and stuff like that, you did deep dives, you could very easily see your heart going astray.
07:34
Yeah, yeah, yeah. If I do like ministry to rich people. Yeah, I'm a greedy little pig, exactly, yeah. Yeah, okay, yeah, that, you know, and that is, yeah,
07:42
I think I'm the exact opposite. I think you could put me in a money ministry and I just don't know that I'd be very much fazed by it.
07:49
But like - Which is great. Doing what you did, doing a deep dive into the world of sex robots,
07:55
I literally, for the sake of my soul, don't think I, like halfway through the dissertation, I thought, I don't know if I can keep reading this.
08:01
Yeah, I get it, yeah. Okay, so sex robots, and obviously, the second half of that is
08:09
Augustine. I see that was sarcasm. Right. It was, it was poorly executed.
08:15
It was poorly executed. How on earth, is this, because this is just kind of what you have to do with PhD stuff, right?
08:21
You have to come up with a super weird angle on things and then people are gonna be like, this guy's a genius. Right. But like, seriously, what made you decide to interpret
08:29
AI sex robots through the ethical lens of Augustine? Right, great question. So let's talk about what a sex robot is first.
08:35
Just very briefly, it's a robot you can have sex with. So there you go. That was good.
08:40
That was good, right? So, and it can take a lot of different forms. I'm imagining something that essentially looks like a human and quacks like a human, that has the kind of evocative, not even like we want to kind of give it personhood, but what we found through human-robot interactions, the more like a human it looks, the more we're drawn to these things.
09:02
So I'm imagining something that is not a person, but actually, like, draws us to grant it personhood, just without us realizing it.
09:08
This isn't anything new. Like we do this with Roombas. Do you have a Roomba? Is that? That's a little disc thing that goes around.
09:14
Yes, I do. Okay. Have you gendered it? No, but I know what you're talking about.
09:19
We do tend to anthropomorphize animals, electronics. Yeah, yeah. So like, and that thing's just a disc on the ground, right?
09:26
People are often like him, her, give it a name, all that kind of stuff. Yeah, that's right. So we're just kind of prone to doing this.
09:31
It's kind of who we are. So to have sex with these things is something that is compelling for a lot of people, but it's not yet here.
09:40
Like as far as like, there are the initial stages of these things, but technology is always progressing.
09:48
It's gonna get more and more appealing. So, but because we don't have a lot of empirical evidence about whether these are good or bad for humans, then all we have are theoretical arguments.
09:58
And so as I was kind of looking at the landscape, I was like, ooh, there's lots of pros and lots of cons, very few religious people have done work in this area.
10:06
And so instead of going like, well, just here's my hot takes to add to the conversation. I was like,
10:11
I wonder if there is something that I, somebody I could use. And as I was thinking about robots,
10:17
I'm like, well, what makes them so desirable? And it's really their, what we call symbolic value. So they stand for something else.
10:24
So a good example is like the American flag. Like the American flag is just like a couple pieces of cloth that are sewn together in a particular shape.
10:31
And for some people it stands for like freedom and hope and all that kind of stuff. And for other people like tyranny and oppression, all this kind of stuff.
10:38
And it's interesting that it's not even clear that Betsy Ross wanted us to think all of those things when she originally sewed a flag together.
10:47
Yet we have imported, like you see it and you think about eagles and all that kind of stuff. So it's this kind of the persuasive nature of symbols that we attribute a lot of value to.
10:58
So I was like, oh, well, so it's really the rhetoric of sex robots that's particularly interesting. And this universe of kind of consumerism that we live in, it's always driving us to want something more.
11:09
So I was like, huh, well, Augustine thought a lot about rhetoric and Augustine thought a lot about sex and Augustine lived in a sexual time very much like our own in the 21st century.
11:20
So - And he also wrote and thought a lot about love. Exactly, yeah. Which I think is right at the heart of this as well.
11:25
Right, right at the heart of this. And he was a pastor and trying to help people. And I think the most helpful thing about Augustine is he was always trying to like, you know, in improv, they call it yes and.
11:36
It's like, okay, he was reticent to say no. He's always like, okay, there's something here. Let me just direct you maybe in a different direction to where you can find that thing.
11:47
The key thing that he was talking about was happiness. He's like, every human has always wanted to be happy, which he just stole from Aristotle.
11:53
And he's like, well, if God is the highest good in the universe, then we all just want God. We just all look for him in all the wrong places often.
12:01
So if everybody wants to be happy, then how can I help people find happiness? I think people wanna have sex with robots because they wanna be happy.
12:08
And so it's just like, oh, I can use this guy. You're right, super weird, who would have guessed, to help people kind of say, here's the pro arguments, here's the con arguments for sex robot use.
12:21
And Augustine says yes to both sides and kind of to both sides, orienting them towards something better.
12:27
Now I can imagine some of our listeners will be on the most uncharitable end of things, really bothered.
12:36
Sure. On the least charitable end of things, really bothered. On the most charitable end of things, a little confused.
12:43
When you talk about sex robots, you say there are pros and cons. Yes. Do you mean that one could imagine pros and cons?
12:51
Do you mean that there are really, actually, truly pros? Oh yeah, so there are, when
12:57
I said pros and cons, there's like pro arguments, like people who are for them, con arguments, people who are against them.
13:02
I'm trying to, like Augustine, say yes to all of the things that the pro side sees as a benefit.
13:11
So for example, let's just give a real simple one. The nice thing about sex robots is that they provide the user a rather large amount of, here's a fancy term, ontological security.
13:23
Ontological, it means the reality of a thing, the substance of a thing. Here's the nice thing, sex robots don't die. But you're gonna die, and I'm gonna die, and we'll see who wins.
13:33
But the crazy thing about a partner is they're going to die. And so it's like, well,
13:38
I could own this thing and have it be with me through my whole life and I never have to worry if it's gonna get sick and die.
13:45
Or if it's gonna leave me or be mad at me or deny me. Right, because I own the thing and it can serve me and everything.
13:52
And I want - It creates a sense of stability in my life, yeah. So I'm not saying that we should own the thing because I thought slavery was a bad idea and I think consent is good.
14:00
Bold, bold, bold. Yeah, I just got hot takes all day. But what I do think is the desire for ontological security, the desire to have something that's permanent and lasting is absolutely not only a good thing but a necessary thing for human life.
14:14
So it's - Just where do you find it? Yeah, where do you find it? And Augustine would say you find it in God. That's right, yeah. So saying yes to the thing, that's a pro thing.
14:22
You know, a lot of people, or some people, criticize Keller's A doctrine,
14:27
B doctrine thing. You know, where he says, oh, I see that you love justice. Well, let me show you that the way that you're thinking about justice is actually misguided.
14:35
But if you look to the gospel, you'll actually find what you've been looking for all along. I think that's a fantastic method of evangelism.
14:44
And you're right, it is as old as Augustine. I would say it's as old as Jesus. You better believe it.
14:50
Yeah. I do. Or like, yeah, you could even, this isn't even, like you guys don't believe in Jesus.
14:56
You know, I mean, I'm not about to like, you know, deny the eternal, whatever. But like the prophets are talking about this.
15:02
I mean, God is even like, here's the deal. Like you all want to be free, no longer slaves in Egypt or wherever.
15:08
I'm gonna give you a promise, man. I'm gonna make your life great. All you gotta do is do the things that I'm asking you to do, which is going to be the best way to live in the world because I made the thing.
15:16
So why don't you just do that and it'll be awesome. And the people are like, nah, bad idea. And then it goes bad for them.
15:22
But I think, yeah, I think that like, why not say yes to everything we can say yes to and go, yeah, that's a great thing.
15:28
I mean, how's that working out for you? Like maybe there's something else to consider here. Yeah. There's a germ, a whisper, a hint of pre-fall, right, in everyone's heart.
15:41
And their desires are just perverted, they're corrupted. So you want to just say like, no, the germ of what's true there,
15:47
I want to take that and just show you what the actual fulfillment of it is. And I think the sex robots help us go down like the deepest and darkest tunnels to think about, you know, call it what it is, the most perverted stuff you could imagine.
16:02
And how do you yes and that stuff? Yeah, you know, I was blown away. A lot of the craziest arguments that you quoted, the pro arguments in the book, excuse me, dissertation,
16:14
I'm imagining it will be a book at some point. Yeah, hopefully. They come from a bunch of people who
16:19
I think are trying their best to be compassionate. Yes, 100%. Can you just say, yeah.
16:25
So, you know, we could go down any line, but let's go down the most controversial, okay?
16:32
The people who are arguing for the use of child sex robots for those who are either pedophiles or minor-attracted persons, which
16:42
I'm not using some kind of like woke term for whatever. I'm just trying to distinguish the people who have harmed children.
16:48
That's right. And those who have not and desire to remain committed to not doing so.
16:53
That's right. And so some people are going like, well, let's at least experiment with clinical trials or whatever, child sex robots.
17:01
And there's a couple reasons for this. They're like, Christians really need to get their heads wrapped around this: the current ways of helping or treating, either carrot or stick, those who either have offended or desire not to offend are questionably constitutional and definitely inhumane.
17:20
Kind of the chemical castration and all this kind of stuff. It is like, it ruins people's lives.
17:26
And so they're going like, I don't know, like let's find some way to help these people out.
17:32
And some of those ways are incredibly controversial. And one of them is, well, let's just try child sex robots.
17:37
I don't know if it'll work. Now there's all kinds of reasons. So you mean a sex robot designed to look like a child that the person could have sexual relations with?
17:44
Yeah, then maybe that will help them remain committed to not abusing children. Which, I would guess, like, let's just do this.
17:51
Like, are you for the violent sexual abuse of children? I am against it. Hot take, yep. So if we had less of it in the world, it would be a better world.
18:00
So in one sense, it's like, well, then let's just do whatever. It's like, well, no, like the people who are advocating for this stuff are like,
18:08
I don't know how to help these people, but it comes from a place of compassion. There's all kinds of arguments on the con side.
18:13
Like, well, what about normalization? And I don't know if this is the greatest idea, but the con side is also, and I think this is important going like, we do need to help these people.
18:22
I just don't know how it's gonna work, but maybe child sex robots aren't the greatest idea. But I think it's important to notice that both the pro and the con sides of these arguments are
18:32
I think seeking to be legitimately compassionate. Yeah, you know, I was, the child sex robot stuff made me sick to my stomach.
18:44
The one that I thought a little more interesting was like for elderly people. Yeah. Not that I particularly want to imagine any of that.
18:53
And listen, I'm not even thinking about sex as part of the sex robot, but when they were saying like elderly people, you know, having some kind of companion.
19:03
Companion, right. But I, as soon as I considered that and I thought, oh, that's interesting.
19:08
I immediately was hit with a cascade of ways in which this is not only insufficient, but evil.
19:15
Right. You know, so let's just talk about that real quick. Why do you think fundamentally at the core, sex robots are bad for society, bad for people's souls?
19:27
Oh, right. Yeah, I think it's because it orients them toward something that's not
19:34
God and is so persuasive that it might just work in the sense of giving them what they're looking for, at least in the short term.
19:46
And as a result, leading them to kind of aim too low and not get something that will actually satisfy them for the rest of their lives.
19:56
Now, there are also the potential, you know, you talked about the elderly, like, is it actually just gonna drive them even further away from human companionship?
20:08
Is it going to lead to more confusion on their part? I mean, they're already confused in the society in which we live.
20:14
And so it's like, oh, well, is this just gonna further marginalize people who are already marginalized, who we already kind of just wish would die and go away.
20:22
So it's like, well, this isn't gonna help that thing. So it's not helpful for the individuals using them.
20:28
And it's also not helpful to the rest of the people who essentially will, I think, take moral license and go, well,
20:34
I used to go visit grandma twice a year, but now she's got the robot thing. So now
20:39
I feel good not visiting her at all. I can just tap in and like, look through the robot's quote unquote eyes and see grandma, maybe talk to grandma, and then
20:45
I'm out. She's got a companion. She doesn't know any better. So I think it really leads to, again, we like live in this world.
20:54
Even the idea of like, oh, my grandma's really hard to deal with. I know that there's these companion robots.
20:59
Sex is a part of it, but it's much broader than that. Just a care robot. Somebody to sit there and listen to her crazy stories every day.
21:06
Even living in that world, that kind of temptation to be like, oh, I gotta go see my grandma again. Man, it'd be nice to just give her the robot.
21:13
I know that robots are probably not a good idea, but it'd be so much easier. Like we have to live in that kind of world.
21:20
And so this is how these things are drawing on even our desires for just kind of selfishness or whatever, even if I'm not using one, my grandma's not using one, but I know that exists in the world, which is why we should be thinking about this stuff.
21:36
I have like 15,000 different directions I could go with what you just said. I actually have like a notes document full of notes that I took while reading, but like every other sentence you say,
21:46
I'm like, I'm trying not to chase that, you know? So I guess I'm just saying that to say if our viewers help me or bear with me as I try to figure out how to -
21:57
And this is partly why I did this work. Yeah. Because here's the thing, you go, well, why shouldn't we give elderly people care robots?
22:10
And you go, oh, that seems bad. We should just care for our neighbor, which is the
22:15
Christian response, right? And we have good arguments for it. What's so fun about this stuff, complicated, but fun, at least
22:22
I think it's fun, is that you, when you come to the end of it, having thought through all this craziness, you actually have a reappreciation for the most basic things.
22:32
Man, that's so true. But you would go like, oh yeah, of course. I'm not even gonna think about that. I mean, at the end of my dissertation, like here's the conclusion,
22:38
I'll give it away. Christians should patiently, in a world full of sex robots and the persuasive desires to have sex with them, provide a better argument.
22:50
Not just saying sex robots are wrong, but demonstrating the lack of necessity for sex robots as they pursue committed marriages and dedicated celibacy towards the service of God.
23:02
You're like, well, why'd you even write the dissertation? It's like, well, shoot, you would have not appreciated that nearly as much had you not had to think about elderly care robots.
23:11
When I started the dissertation, like maybe 20 pages in, I thought, this is ridiculous.
23:16
Like I'm reading 260 pages, I think, of dense scholarly material about, like, you know, through the lens of Augustine, why shouldn't people have sex with robots?
23:31
I mean, the answer is just like, because don't, right? And then I worked through it all and I really thought, oh, after working through all of that, because this will not help you love
23:42
God and this will not help you love neighbor. It just came back to the great commandment. Yeah, and it won't actually make you happy, which is all you've ever wanted to be.
23:50
Yeah, that's right. You know, when you said earlier, like it might actually work, that to me seems so unlikely.
24:01
I mean, work, it depends on what you mean by work, right? You've seen the movie, Her. Oh yeah. Okay, Joaquin Phoenix falls in love with an
24:07
AI voice, right, an incorporeal being who he perceives to be sentient because he ascribed sentience to her, right?
24:16
That's a big part of AI. We want, we kind of want these things to be real. But like at the end of the day, ontologically, this thing can never be real, right?
24:27
It can never be sentient. And human beings are made by God in such a way that we can never really find our satisfaction in these things.
24:36
So then that leads us to the conversation of idolatry, which is us constantly trying to find our satisfaction in things that aren't
24:42
God, that can't ultimately satisfy us. But it doesn't stop us from trying. Of course not, because we try really hard.
24:49
And instead of just having the argument of that's an idol, you're dumb, don't do that thing,
24:56
I think it's much more provocative to go, huh, why are you doing that? And what do you think that's gonna get you?
25:05
And those things that you actually want, I think my biggest concern, this gets to the, it working, my biggest concern is that you're going to persuade yourself and it's going to work for a minute.
25:17
The question is what happens when it stops working? And can you live in a world where you actually can't just buy your way into happiness?
25:27
Because being shaped in that way is a terrible way to be. And I'm not even talking about sex robots, I'm now talking about the people in your church and my church that want a new car.
25:36
Because they're persuaded that the new car will make their life better. Not because their car is like terrible or whatever, their car broke, they're just like, yeah, but I just want something more,
25:45
I just want it. And it's like, well, interestingly enough, by thinking about sex robots, you gotta think about all these other things before it that will hopefully shape you when you have that overwhelming desire to buy that new, like electric toothbrush or whatever, or that new
25:59
TV. You go, wait, why am I doing this? Why are these desires so compelling inside of me?
26:05
And the danger is that they work for a minute. You know, people, I hate having a cell phone.
26:13
But I remember when I got my latest cell phone, which was a while ago, I was so happy. I was like, this thing is so much better than the old piece of garbage.
26:21
And that worked for like, I don't know, a few days. And I was just like, oh, it's just like any other phone. And you feel that thing of like, well,
26:28
I wonder when the next phone's gonna come out. Or the next one does come out and you get mad at your phone, even though your phone's fine. It works for a minute and that's the problem.
26:35
It's that like little dopamine hit or whatever. This stuff could work too. And the question is like, how do we, how are we gonna actually change our whole ways of thinking or being in the world in order to make ourselves happy or just at least try?
26:49
Because we're so persuaded that it could work. And it's terrifying because it's not that much different than what we do with other objects.
26:56
So we think about this, hopefully we'll think better about things that are closer to us. If you really want to understand the world, you got to think about sex robots.
27:05
That's right. There's the hot take. Obviously, as I'm reading about this stuff,
27:13
I can't help but process it through the conjugal relationship that I share with my wife, okay?
27:19
All right. We've been, I say, he's like, all right, it wasn't obvious to me, right? Well, I'm thinking in one sense, in a very mere sense,
27:29
AI sex robots are just very expensive, very complicated masturbation machines. That is correct.
27:34
But in another sense, they're probably going to do better than other masturbation machines because people will believe the lie that they can find some sort of real intimacy, love connection to these things, whether they're deceived in that or not is one, is another thing.
27:51
But then I'm thinking about the lengths that people will go to experience what they consider to be a good orgasm.
27:58
Yeah. Then I'm thinking about, okay, I've been married going on 20 years, okay? And me and my wife both came into our marriage with broken pasts when it comes to sexuality.
28:07
And I'm thinking about the glory of the marriage bed as we've grown together in Christ. And the way that that has happened because we are ontologically, like we are real beings, we change, we grow.
28:22
The Sean that my wife is married to today is not the same Sean that she married. And we all thank God for that.
28:28
And we all thank God for that. But I was just thinking the worst sexual intimacy that I've ever had with my wife has to be better due to the ontological nature of who we are as human beings and the literal bonding that our souls have as the two have become one, has to be better than the best sex, quote unquote sex, that someone can have with a sex bot.
28:56
Depending on how you define that. Okay, well, that's, yeah, you do it. Yeah, I mean, I do believe and want to fully concede that while your wife may be incredibly talented or whatever, that a sex robot could blow your mind.
29:11
Well, I don't, I'm not just speaking of like the sort of raw physics of the thing. I'm saying this is why the better thing is really important and why we are even like careful when we talk about what do we mean by better?
29:24
I don't know, like, not blaming you for not being precise, but just like in one sense, it might be better. I don't know, but isn't, the question is like, is sex only the act?
29:33
Now, interestingly enough, we're really helped here by like very liberal feminists who are like, when are you guys gonna get over the fact that like sex is just like this action that lasts, let's be honest, not so long.
29:43
So like, and is usually not very good for us, but only good for men. And so the better stuff is because, you know, it's what we tell people in premarital counseling all the time.
29:54
You know, hey guys, just real clear, everything that happens outside the bedroom affects what happens inside the bedroom.
29:59
Sex starts when you help with the dishes. That's right. Well, I'm including that. That's what I mean. When I say better,
30:05
I mean, from the time that I started doing the dishes or giving a shoulder rub until we wake up together the next morning and we hear our kids being loud in the living room, whatever that is, it has to be infinitely better than the most carnally good orgasm you can have with a sex bot.
30:26
Would you still pick at that? No, I wouldn't pick at that. You've just made it better. Okay, thank you.
30:31
So, yeah. So here's the thing. I never get it first try. Yeah. Rarely get it second try.
30:38
See, jokes write themselves; I'm just gonna avoid them. So here's the thing. I'm really glad that that's true for you.
30:45
I would say that's true for me. Yeah. I think that - Not true for a lot of people. It's not true for a lot of people. It's not true for a lot of Christians, which is why one of the things that I'm arguing for is that Christians need to demonstrate the goodness of our convictions.
31:00
Right. Or we have no ground to stand on. Now, I hope that there's more people like us who essentially, again, live such
31:09
Christian lives that people go, oh, well, no, the sex robot,
31:15
I don't want a sex robot, I just wanna be like Sean. Well, I mean, hasn't the data proven that religious people in general are some of the most sexually fulfilled people there are?
31:26
Yeah, it does, but I - But how do we demonstrate that? Do I just need to roll out some spreadsheets for people?
31:32
Do you like - Yeah, no, I think that it's continually talking about trying to identify what people are really looking for and then saying, yeah,
31:44
I think that that's a good thing. Now, let me tell you how I've found that thing and how that thing is actually much broader than the thing you think it is.
31:51
That kind of the overpowering, mind-blowing orgasm. It's like, okay, the orgasms are great, but let me tell you what's really great.
31:59
The whole waking up in the morning next to a woman, secure in yourself and in her, and hearing your kids play in the living room, getting up and making pancakes.
32:10
You gotta understand, especially you 21-year-old weirdo, that's part of it, and that's just part of the duty of it.
32:19
That's part of the joy of it. And then actually living it out. And so people are like, oh, well, this is rather provocative.
32:27
It's not just provocative words. It's a provocative way of being in the world. You keep using that word provocative.
32:33
Yeah. How come? Because I think that we live in a society where everybody is just like, let's just go to heaven.
32:41
We've been saved or whatever the heck that means, and we're just gonna be fine, so let's just chill and then stay away from those people over there and try to shut that thing down over there if we can.
32:52
But then we'll just go to heaven and it'll be fine. So Christianity is by its very nature a very provocative thing.
33:00
There's a guy who came who's also God who died and then made us right with the God of the universe, and then we get to go be with him when we die.
33:07
That's the dumbest thing I've ever heard of on one level. So it's like that whole keep Christianity weird thing.
33:13
This religion isn't normal. Quit pretending that it is. And it doesn't provide, interestingly enough, for a normal way of life, and yet it's the most normal way of life.
33:23
Interestingly enough, Christianity provides not only the argument but the grounds and the hope for living in the most human way possible, the most fulfilled and fulfilling way possible, not only for you but for your neighbor.
33:35
So it's just like, just embrace the fact that this is both the most normal and the most weird thing at the same time.
33:42
And as we talk both to Christians who are very kind of just man their faith and the non-Christians who are like, what are you people?
33:50
There needs to be something other than like, this is all normal, come join our club because it's all normal. No, there's this kind of provocative element of like, you really enjoy things but you don't need them.
34:00
You're looking forward to like a new heavens and new earth, but you're also like taking your kids to the park and enjoying today for whatever it is.
34:07
You know you're gonna die and yet you enjoy the living, but you don't put too much stock in it. Like, that's the coolest thing in the world.
34:15
But it is provocative and it should be. So you are bent towards, you want to provoke people.
34:22
Yeah. Whenever you talk about things, you want people to stop and go, huh? Yeah. Yeah. You gotta be careful with that.
34:28
Of course. Should we cuss on stage? I think it depends on the context. Okay. So.
34:38
Mark Driscoll is totally vindicated. Nope, that's not what I said, but thanks for putting words in my mouth.
34:43
I imported meaning into you. If you edit this to make me say yes to that statement, I will roast you forever.
34:50
So, which I will anyways. But the fact of the matter is, like always provoking people toward faith and discipleship.
34:59
Yes. What that means in each individual context is going to differ. I'm not going to speak the same way when
35:06
I'm doing inner-city prison ministry as I am when I get invited to a ladies' tea.
35:12
For some weird reason, they invited me. Don't know why that would happen, but I'm going to talk differently. But always, in both those cases -
35:19
Let your speech be seasoned with grace as fits the occasion. Absolutely. But always trying to essentially, the way
35:25
I talk to my students about this when I talk about doing application, is your job is to kill everybody.
35:32
Not just one certain sector of the - Even the way you said that was provocative. Exactly. What do you mean, kill everybody?
35:40
Help everybody understand that they are the ones who need to hear this word in order that they would lose trust in themselves and have all the trust in the world in Jesus.
35:50
In order to do that, you have to kind of get under people's skin, get through their armor, pierce their consciences, and go, yeah, no, maybe you thought you needed
35:59
Jesus when you walked in here today. Maybe you didn't. But the Jesus you need is this Jesus, and you need him far more than you could possibly imagine.
36:07
So that hopefully the strong believers are stronger, the weak believers are strengthened, the unbelievers are provoked towards faith in Jesus.
36:14
Instead of just like half the congregation going like, I'm sure glad Billy Joe Bob was in this sermon, and Billy Joe Bob goes like,
36:21
I just got wrecked today. And either he's really excited about it, because like, oh,
36:26
Sean, I just love it when Sean punches me in the face. Or, why didn't he talk about everybody else in the room?
36:32
Why'd you pick on me? You gotta kind of kill everybody instead, sort of like hit everybody in the room with the truth and the exhortations and all of it.
36:41
Yeah, I'm thinking about in 1 Thessalonians, Paul says, you know, help the weak, exhort the idle.
36:49
Yeah, know who's here and make sure that everybody gets a dose. Right, yep.
36:54
Back to sex robots. There we go. I was trying to think through this.
37:02
You're writing as an ethicist. Yeah. I'm reading this through the lens of a husband, a father, a pastor.
37:08
I know you are those things too, but those are the primary things. I'm trying to think, what is a scenario that I can imagine happening in my church?
37:17
Here's what I came up with. You walk me through how you would respond to this. I thought a wife in the church asked for a meeting.
37:28
You sit down and you meet with her, and she says that her husband is trying to get her to bring a sex robot into the bedroom.
37:35
She has disagreed with that, but he is insistent that anything that happens in the marriage bed is sanctified, therefore it's not sin.
37:46
Oh, gotcha. How would you respond? Yeah, so I would first respond that if somebody's listening to this and going, this is just Sean being a weirdo.
37:59
These are the kinds of things that are happening. They will continually happen in greater numbers.
38:06
And actually, the reason I did all this work is to help the pastors who have these questions and have never thought about them.
38:14
So what do you say? I think you say, the first thing you say is, well, let's have a conversation together.
38:22
You should never have a conversation with the wife and then have a separate conversation with the husband, unless there's a really good reason to have this split up.
38:29
It's like, well, let's have a conversation together about this. Step one, get everyone in the same room. And then be like, okay, why is it that you think, husband, just always question the premise.
38:39
Why do you think that everything that happens in the marriage bed is sanctified? And then when he gives some kind of crazy answer that makes no sense, because it doesn't make any sense, demonstrate how wrong he is.
38:51
Okay, so if you chop your wife's arms off in the marriage bed, is that sanctified?
38:57
Well, of course not. Why not? Oh, so there are limits to this. So the question isn't, are there limits?
39:05
It's what are they? Because you just demonstrated that you don't even believe your own premise. So now let's actually work towards what this would be.
39:12
Now, all you're trying to do at the end of the day is like, quit being a perv, dude. Like, what do you even want this thing for?
39:18
And how dissatisfied are you? Don't you understand that like doing this is so dishonoring and shaming to your wife?
39:25
Like, what kind of man are you? What kind of husband are you? What kind of Christian are you? But like, you know, what you want to do is, again -
39:33
You're shepherding them. Yeah, you want to provocatively get to that place. So you don't even have to say that. He feels that and says, thank you for it.
39:40
Which is way better. It is better. Having said that, let's just imagine that there's a pastor out there who's not trained in the art of rhetoric.
39:50
And he just sits down and he says, brother, what you're doing is dishonoring to your wife. This is shameful. You should stop this immediately.
39:57
This is sin. He's not wrong for saying that either. Not at all. Good, better, best.
40:04
Is there a wiser, more winsome way to Socratically lead him along the path? Right.
40:09
Yes. Okay. You know, there's something about like the 1970s
40:18
Southern Baptist biblicist impulse lives strong in me. I get that.
40:23
I want to sit down and say like, let's open up our Bibles and let me show you exactly why you can't bring the sex doll right into the bedroom.
40:37
Proof texting is going to be a little difficult there, but the way that we do all theology is usually by way of implication.
40:44
So for example, someone in your congregation is shooting up heroin. There's no Bible verse about heroin, but there is a
40:50
Bible verse that says, be filled with the Spirit, do not be drunk with wine. And so you take them there and you say, and by way of implication, anything that's filling you up that's not the Spirit of God is sin.
41:02
It's making you not sober minded, X, Y, Z. So what would be like the Bible verse, one, two, three
41:08
Bible verses that you might take them to? Yeah, so I would probably take them somewhere like Jesus' commandment to love neighbor and then say like, all the world will know that you're my disciples on the basis of the love you have for another.
41:31
And you go, so here's the thing, please explain to me how that's a loving action. So I mean,
41:37
I do, I understand. Yes, I'm trained in philosophy, I'm trained in rhetoric and all that kind of stuff. It's one of my deep and abiding laments that pastors are not very good at this.
41:47
And I think that if we have any hope of doing serious damage for good in this world, we need to get good at this.
41:55
You have a Paul and a Peter for a reason. You're the Paul, I'm the Peter. But you go like - Uneducated, obviously.
42:02
You said it. So I'm just affirming you, I'm trying to be a Barnabas. Thank you. Yeah. You're a
42:07
Paul and a Barnabas. Wow. You got all the gifts. I got it all. That's right, split personality. So the point is that like, you just go like, okay, this
42:16
Jesus says that other people will know us on the basis of our love. How is this a loving action? That feels so benign to me.
42:23
Okay, that's fine. I'm not saying that's the only Bible verse. Sure, sure, sure. The other Bible verse would be something -
42:28
Swing and a miss. Okay, swing and a miss, good luck. Obviously the commandment to love your neighbor is not a swing and a miss.
42:34
So I would also say something like this. I mean, again, it's like kind of scratching. It's like, well, God says don't have sex with animals.
42:43
So why not? Why shouldn't we have sex with animals? Well, that's disgusting. It's like, well, hold on, what's the difference?
42:50
So again, getting people to think, but like, actually, there are, in fact, the Bible is rather limiting on what you can have sex with.
42:59
And it goes out of its way to name things you can't have sex with. Therefore, you've got some pretty tight parameters here.
43:07
You can't just have sex with whatever you want. Right. You know, I was thinking, this was interesting.
43:13
I was doing like all these thought experiments in my mind as I was reading. And I thought the direction
43:19
I might go is, you know, Jesus, when he uses the phrase porneia, or the word porneia, right?
43:25
It's like the junk drawer for sexual sin. Definitely the junk drawer. And it's in light of the fact that sex was created to be between one man and one wife in union for life, right?
43:35
So anything that falls outside of those parameters, sex with three people, sex with a dude, sex with a chick, if you're a chick, sex with a pole or with a robot, you know, like anything that is not one man and one wife in union for life, anything outside of that is sin.
43:52
That was kind of like my love your neighbor answer, kind of the all encompassing one. I think it's good.
43:58
And I think it's important to be like, and by the way, I'm not just talking about physical acts, right, that's why the porneia thing's helpful.
44:04
Because, you know, I've had this brought up to me either by a pastor or by Joe Blow Christian, it's like, well, as long as it just happens in my head, then no harm, no foul.
44:17
Jesus says something about that, doesn't he? Indeed he does. So it's just like, yeah, but it is,
44:23
I do want to affirm the fact that it is more complicated to talk about something like sex robots than it is to talk about something like,
44:32
I don't know, doing drugs or whatever, go back to your example. Now talking about doing drugs is also complicated, but it's much easier than this, especially if the sex robot looks like my wife.
44:47
Yeah, which you can have the robot designed to look like your wife. That is correct.
44:53
But you know, man, I just kept, I kept taking, okay, I said, okay, you know, they're coming along with this really advanced
44:59
AI and at certain points in your dissertation, you talk about how they have like warming technologies and responsive elements that can be built into the robot.
45:09
And I thought, you know, if you spent a billion dollars on a robot, I still just,
45:14
I mean, maybe once it would be this really amazing experience, but I just thought even then, because it's new, it's novel, even then it cannot compare with what you experienced with a body that was designed by God for sexual pleasure.
45:33
Like as nice as a, let me be very careful here. Here we go. Very careful.
45:39
As much engineering power as you can put into a simulated vagina, it will never be as good as a properly functioning,
45:52
God-designed vagina or phallus, to use the other end of the spectrum, right? I believe that's true, but not everybody knows what that's like and not everybody likes that.
46:05
Sure. Yeah. So I agree with you on the kind of platonic ideal level, but there are people who are looking for other stuff.
46:15
When you say the platonic ideal, it's like, it makes it feel like you're saying like, it's basically unattainable, but I think it's the most normal thing.
46:25
I would say, yes, I would agree with that statement. It is the most normal thing. Yeah. Not everybody is normal.
46:31
Like above 50% of people who get married are going to have, yeah, they're going to have a good time with one another's bodies.
46:37
I agree with that, especially if they work at it. Yes. Right? Yeah. And it does. It's like, you know, the nice thing is it's free and it's plentiful.
46:46
And you can practice as much as you want. You can practice as much as you want. So, you know, it's that thing of like, you both should seek to outdo one another.
46:52
Yes. Man, that's such a good point. The more you go with a sex robot and it doesn't take very long, the less good it is.
47:04
Like the novelty wears off immediately. Right. But like, if you are in the context of, and I know there are other people who aren't, but I'm just saying, let's say you are in the context of a basically happy marriage and you practice for 20 years together.
47:19
Right. It's just going to get better and better. Oh, yeah. Yeah, so I - Until things stop working. Right. But I, you know, I used to work in an afterschool program with a bunch of hood rats.
47:28
And yeah, so I'm a real good person. But they would regularly ask me, because my wife was there too, and they'd regularly ask me like, dude, how is it that you like have sex with just like that one lady?
47:39
And I would regularly tell them like, I do not at all lament, or whatever, the next like five sexual experiences you're going to have, because they're all going to be terrible.
47:51
Because I've got it good, because we've actually worked and practiced this thing. And they were all kind of like super shocked and like, whoa,
47:58
I never, I thought what they were expecting was, oh man, I wish I could be young again. It's like, nah man, being like an old middle-aged dude that's excited about lawnmowers rocks.
48:09
And they were just like, again, the provocative thing, they're like, what kind of world is this? I don't have to have sex with the same woman,
48:15
I get to. I get to. Things get better as you go. Now, this is where the sex robot stuff is so challenging, is that we live in this consumeristic universe that persuades us that yeah, it might not last forever, but there's always upgrades.
48:29
Yeah, that's right. There's always just this one next thing. Now, I think it's a secular version of the prosperity gospel, right?
48:36
Just like, your problem is that you didn't have enough; and in Christianity, or the kind of religious sphere, it would be faith.
48:44
In the consumeristic world, it's that you didn't have enough buying power, or you just had poor decision-making skills.
48:50
So just the most pernicious lie of consumerism is we might have created the problem, but we can also fix it.
48:58
Just buy the next thing, swap the next thing out. And so it's this perpetual kind of discovery machine in search of a potential utopian happiness that you're always just almost reaching, but can't quite reach because of course, if you reached it, then you'd stop buying things.
49:14
But what I wanna say is Christian marriage is great and you and I seem to have decent
49:20
Christian marriages that are continually improving, which means that you and I both, given what
49:25
I know about you and you know about me, have married extraordinary women. The Lord has been very kind. Very kind and humorous.
49:31
And so as a result, stuff is going pretty well for us. But a lot of people aren't in that situation.
49:37
A lot of people aren't like this. And it's good that we have good models because I think you have to have models in order to know what you're aiming at.
49:43
But I still wanna say to those who are out there who are like, well, I don't have a marriage like Sean or Jeremy's. First of all, my marriage is great, but it's still like, she's gotta be married to me.
49:52
So there's that. Very imperfect. And the second thing is no marriage can bear the weight of eternity and eternal happiness and the stuff that only
50:01
God can give. So it's super great and I'm more excited to be married today than I was when I got married more than I was 10 years ago.
50:08
But like, it's a very tempting thing to be dissatisfied because you're putting so much weight on it.
50:13
So I wanna say like, yeah, Christian marriage is great. I mean, it's not perfect. Sure. It sounds like when you say but, it sounds like you're about to say something sympathetic towards sex robots.
50:24
Just that I get it. Oh, okay, yeah. And I do too, actually. Yeah, I think we all should.
50:30
One of the things that you deal with at length later in your dissertation is like incels.
50:36
Yeah. Involuntary, involuntary celibates. White guys in a basement on their computer who can't get a lady to lay down with them.
50:45
They don't have to be white, but. Right, they're predominantly white and they're predominantly males, yes.
50:50
That's right. Another pastoral situation. Okay. You got an incel as a member of your church.
50:59
Great. He's not at peak incel or else he probably wouldn't be showing up to church on Sunday morning. Probably not. Okay, but you got somebody who's on the path.
51:05
They're trending. I think those guys are in our churches. That's right, yes. Tons of godly women, if they would just go get them.
51:15
Anyways, let's say you see a big old box on his porch one day. Yep. You say, hey, what'd you get in the mail?
51:23
And he lies to you, he comes back a little bit later. You can tell I got a lot of pastoral experience. This is probably how it would really go.
51:29
He lies to you, comes back later under conviction and he says, actually man, I ordered a sex robot.
51:36
And as you start trying to work through this with him, his argument is, I don't want to burn with lust.
51:43
Right. I don't wanna watch pornography. This allows me to not watch pornography. And or this keeps me from going and seeing a prostitute or having sex with a woman that's not my wife while I'm trying to find my wife.
51:57
What would you say to him? Yeah, the first thing I would say is I understand to some degree your thinking on this in the sense that you think the sex robot is mitigating other sins.
52:13
However, lesser of evils. Yeah, please explain to me how Christianity is a race to the bottom of lesser of evils.
52:21
So I like, let me help you understand that the pursuit of Christianity is the pursuit of the good, not the avoiding of the baddest.
52:30
And therefore, I think that again, like this is just the kind of match that lit a fuse that you should be both lamenting as a pastor and very, very thankful for.
52:40
This is why I think that we should thank God for sex robots in some like weird way. Cause it's like, man, this guy really needs some help.
52:49
He doesn't just need help in regard to his sexual desires. He needs help with understanding what
52:54
Christianity is. He doesn't get it. And so hopefully if I get to help him in this area and help him on the level of what we're pursuing, then great, then this should spill out into all areas of his life, what he does with his money and what he does at work and all that kind of stuff.
53:09
Because it's just, we're not trying to pursue the lesser of evils. That's not what
53:15
Christianity is about. Now you can't just say that cause you also have to help him and be like, but it does sound like you need friends.
53:25
Incels are incels because they're incredibly lonely because they think society is against them.
53:30
And they think that women are terrible, but women also are victims of this because they've been shaped by a society to want men that are not like them.
53:40
So they feel unwanted. Well, guess what? Christianity has the resources to want unwanted people and be thankful and grateful that they're around, even if they're incredibly challenging to have around.
53:51
But you better be able to have them around instead of just like, don't do that thing. We'll put Covenant Eyes on your phone and you can just hang out all by yourself, and get off the internet.
54:00
You got to provide something better. You can't just say no. The expulsive power of new affections.
54:07
Okay, so I'm kind of doing like a little checklist in my mind if I'm sitting with this guy. Step number one, you're saying like, immediately help him see, we're not trying to choose between the lesser of two evils here or five evils.
54:20
Number two, we need to seriously think about how to get him involved deeper with significant communities so that he feels loved and cared for.
54:27
I'm also thinking of just kind of like the base passion of the thing. This is a young man who's burning.
54:34
We have to help him understand how to fight lust. Right. Right. What else?
54:41
What else would you - Yeah, so like to get to that, because we don't even talk about this with sex robots, but a lot of people are like, well,
54:48
I need to get married quickly because I don't want to burn with lust. And you're like, you do know that you're called to self-control as well.
54:55
Like this is, don't ever treat your future potential wife as the cure to your lack of self-control.
55:01
I don't want to masturbate, so I got to get married. Right, stupidest argument I can ever imagine. You put the weight of your fidelity on your wife, which is -
55:09
She can't bear that. No, she can't bear that. And it's also like, yeah, oh, totally wrong. So I think that it's exactly what you talked about.
55:19
And then always trying to bump it up to the higher level of saying like, and listen, man, this is actually affecting all of your life.
55:27
Because it's going to always, we want to just delimit the thing and be like, as long as we can solve his lust problem.
55:33
But like, Augustine's really helpful on this because he's like, you know, everybody thinks he's like this weird perv or whatever.
55:38
What that demonstrates is people have only read Confessions and they've only read to book four. It's like, well, you should just keep reading.
55:45
He actually found giving up sex pretty easy because you don't have to have sex to stay alive. But where lust showed up greatest in his life was both a desire for power, because everybody wanted to talk to him.
55:56
And because he was a genius, but also food and drink, which is why he was so ascetic. He would have been a terrible dinner partner.
56:04
But the reason why was because he didn't know any other way to like stay away from the stuff. Because he's like, but I have to do this because I have to stay alive.
56:12
So his lust was mainly in the realms of like music, food and alcohol. But like, you can, this is the whole
56:20
John Owen thing with like mortification of sin. I think sometimes we get myopic as pastors and leaders and go as long as we help with his porn problem and his lust problem, then we'll fix him.
56:32
And it's like, well, guess what? That thing is going to evidence in all kinds of other areas of life.
56:37
So if we help him understand what it means to follow Jesus at all that's involved in that and then give him good examples because he's, it's one thing to say like love your neighbor and then go like, what the heck does that mean?
56:48
And I go, well, it's complicated. Or like, I don't know, it involves a lot of things. You gotta be like, well, I don't know. Why don't you come with me as I seek to love some of my neighbors.
56:54
And then we'll have a conversation about what that was. But it's going to take a lot of time and a lot of dedication, which is great.
57:03
Cause that's what we're called to do. Like what else are we alive for? But it's just a much, it's a much bigger problem but also a much simpler problem,
57:09
I think. It's like encompassing. It seems like that's a pretty common theme here is it's more complex than you think it is but it's actually simpler than you think it is.
57:19
And so we come to appreciate the simple by thinking about the complex. Yeah. Kind of like physics, the standard model versus like quantum, like they're kind of at odds with one another, but they have to fit together even if you don't see it.
57:32
See, look who's the smart guy now. Like I don't know anything about this though. That's true. Yes. Point for Sean.
57:38
I'm thinking this guy probably wants to be married one day. Right?
57:44
The incel? The incel, yeah. Maybe. Maybe, let's assume it. Yeah, let's assume that the most normal thing is that he wants deep companionship with a lady.
57:54
Great. Helping him understand that he's actually training himself to be a bad lover and a bad husband to his future wife.
58:03
Right. Yeah, and at the same time affirming that what he wants, if that is what he wants, is a great thing.
58:11
So the question is, how do you prepare yourself well to make yourself the best kind of lover in all the aspects, the best kind of spouse?
58:24
And do you think, brother so-and-so, that engaging in all of this stuff with the sex robot or porn or anything like that is going to train you to actually be not just satisfied yourself, but satisfying to somebody else?
58:40
The best lovers are the most giving, the most self-sacrificing. So if you spend a decade ejaculating into an aluminum can with a little bit of rubber around it, thinking that that is good sex, when you actually do get a lady who is kind enough to let you lay down in the same bed with her, you're not gonna serve her well at all.
58:59
No. And listen, sex is hard enough as it is for somebody who hasn't had all those messed up experiences.
59:05
You gotta give it some time. You gotta figure some stuff out. You do. And by the way, that's a word for anybody listening who is using the fact that they're not using sex robots and only using pornography.
59:19
That's a word of condemnation against those brothers or sisters who are going like, oh yeah, I'm not engaging in that.
59:25
Therefore, I'm better than those people. I'm only doing this thing. So I'm only, maybe I'm not preparing myself well for this, but I'm staving off lust or whatever.
59:34
No, you're not. Like, all this stuff is shaping you into a pretty terrible kind of human. Again, not just for somebody else, but also for yourself.
59:41
I think one of the things we need to help people understand when it comes to kind of sexual, what we might call sexual deviancy, right?
59:48
Anything outside of the norms is like, this is actually bad for you. And it's bad for other people.
59:54
To some degree, we're always having sex in public. Like, because again, what happens in the bedroom affects outside of the bedroom.
01:00:01
Then we're always being shaped into particular kinds of people. So like, don't think that you're better than the guy who uses a sex robot or the woman who uses a sex robot because you just use porn.
01:00:12
It's all deforming. And I get it. I get why you would want to pursue those things.
01:00:19
But I think we really have to help people go like, it's bad for you.
01:00:24
And it's bad for other people. And I know you might not believe that. You might be thinking you're killing it in life right now.
01:00:30
So I can't persuade everybody. Let me at least give you something, some reasons to think about what you're doing, some reasons to consider the alternative, and hopefully come over to dinner at my house and see that just living life with a woman over time is a possibility.
01:00:50
I was thinking another thing with this guy might be, I think sometimes we're afraid to have like, cause we don't want to be the red pill guys.
01:00:58
You know, like the - At least I don't want to be. Yeah, don't want to be like the male Oprahs, which is kind of how I view like David Goggins and like Jocko.
01:01:05
Not that I disagree with everything that they say, but I think they're like male Oprahs. There's the hot take for the day.
01:01:12
Yeah, a good take, I would say. You look like one of them. Thanks, man. You're welcome. But I'll take that as a compliment.
01:01:17
I'll put meaning to it that I don't want. Barnabas, round two. There is something to say to that young man, like, hey, stop playing video games.
01:01:29
Like 18 hours a day, you know what I'm saying? Pursue an education, get a good job, work out.
01:01:35
You don't have to be buff, but you're a fat, lazy piece of crap. Of course, no one wants to marry you. You know, you smell bad.
01:01:42
Your hair looks like a bird made its home in it. Like it's a nest on top of your head. I think pastors should not be afraid with some of these young men to say like, you need to make yourself be the kind of man that a woman would want to actually unite herself to.
01:01:56
And having sex with a robot's not gonna help. No, because you know what? The robot doesn't care if you're musty.
01:02:03
Right. The robot doesn't care if you want to play video games for six hours a day. Right. As long as,
01:02:09
I agree with you, as long as that doesn't become one kind of person, like you have to attain this ideal that is either what you are or some kind of fictitious thing.
01:02:21
And the other thing is, you're gonna tell it to young men, you better say, and I'm going to help you, not just kind of rail on you in the office, but like,
01:02:30
I'm gonna have you just over. You're just gonna hang out with me. And like, listen, you might be thinking,
01:02:35
I don't like hanging out with those guys. Well, especially if you're in ministry, tough nuts. Like that's what you signed up for.
01:02:43
Yeah. I mean, as long as it's someone in the local church. Yeah, yeah, yeah. Men in the church should be ready, available.
01:02:48
And it probably should not be the guys who are oriented towards the kind of red pill thing. Right? Not the manly men and like, we're gonna burn stuff.
01:02:56
This is what men do. And the guy's like, you know, he's like, I just want to go play on my computer again. You're like, no, that's for women or whatever.
01:03:03
It's like, okay, unless you're watching videos of like that one dude who eats liver all day long. Right? Like, come on.
01:03:09
His name is Liver King. How could you not know that? Lots of reasons why I couldn't know that. I'm not surprised you do.
01:03:16
So the thing is like, we do need to challenge men. We need to challenge women.
01:03:22
To just be like, look, be the kind of person that is admired for all the right reasons.
01:03:32
Is desired in the best kinds of ways. That is essentially a really good neighbor.
01:03:37
And if you do that, it's not just gonna be good for you with the ladies or whatever. It's gonna make you a better citizen.
01:03:45
And it's gonna actually be fulfilling to you. And you go, and I bet that you probably don't believe that.
01:03:51
Cause you think you're doing fine or whatever. And this thing's gonna fix you. I get why that's an attractive alternative.
01:03:56
Just get a robot. Then you can just go along with your life. But really like, how are you doing? Like, are you really satisfied right now?
01:04:03
And then hopefully over time, what happens is you have one of those guys who did change into something else.
01:04:10
Who's kind of got more self-confidence and is serving people. And you go, okay. Now your task is to tell the other guys that come up behind you.
01:04:19
Like, listen, I know I was where you were and it can get better. And you don't have to become like me.
01:04:25
Who knows what the thing is or how it all works. But like, come hang out with me because I wasn't happy.
01:04:31
But I get why you think you're gonna be happy with the sex robot or whatever. Yeah. Bring them along the way. Last thing on the checklist.
01:04:40
Whether you eat or drink or whatever you do, do all to the glory of God. One of our elders, when we were thinking through the question of masturbation and how to deal with it pastorally, he very wisely just said, there's no way you can finish masturbating and say like,
01:04:56
I think I did that to the glory of God. Right. Right. So this is me doing what
01:05:01
I think you've been doing throughout this interview is trying to just anchor this in something a lot deeper. Not that some of the other superficial things aren't real or that they don't matter.
01:05:10
They do. But we're trying to anchor this in a ballast that will not be moved.
01:05:17
And I think that this is what's really helpful. If you kind of were to ask the person who's really struggling with something like masturbation, for example, this is where sex robots can be really helpful.
01:05:32
It's like, do you think you could have sex with a robot to the glory of God? Oh no, no, no, no, no, no, no, no.
01:05:40
What's the difference? And then have the conversation. You're actually pulling it back from there and demonstrating that like, oh, you think you've somehow convinced yourself that this is fine.
01:05:52
But then also like, if you can get people thinking down that path, this is why something like that's so helpful.
01:05:57
Because it also applies to what you do with your money. It also applies to like how you serve the church.
01:06:03
It also applies to how you treat your wife. It also applies to how you work at your job.
01:06:09
Are you doing it to the glory of God? Are you just like, just phoning it in every day or trying to be as rapacious as possible to get the money so you can buy the thing to make you happy?
01:06:22
It's like, well, that's not to the glory of God either. And I've heard this with guys, women who are really wrapped up in sex stuff.
01:06:29
It's like, well, you know the other people in the church. It's like, well, whataboutism doesn't give you the license to do the thing you wanna do.
01:06:36
But you are right. It's not just sex stuff. It is everything to the glory of God.
01:06:43
So let's think about like, what does it mean to be a Christian disciple? Okay, let's get a little bit back into theory.
01:06:51
Great. On page 144 of your thesis, I think I found at least one of the hearts of what you're doing here.
01:06:58
Your thesis might have two or three hearts. I found one of them. You say that the core of Augustine's concern with sex has less to do with the act itself and more to do with how it affects those who engage in it.
01:07:12
So how does sex with a robot affect us?
01:07:19
First of all, actually, can you have sex with a robot? Depends on what you mean by sex. Okay, biblically.
01:07:26
No. So I would go back to your comments that in one sense, it's just a very advanced masturbation machine.
01:07:35
In another sense though, to getting back to the symbolic value thing, if essentially, if someone were to walk by and see you interacting sexually with the robot, they would think you were having sex with it.
01:07:49
And in order to, this is one of the things that trips Augustine out. Augustine trips out about human bodies all the time.
01:07:56
And he's like, what's super weird is you can kind of do all kinds of stuff with your body. He talks about like a guy who can just like fall asleep like he's dead.
01:08:05
A person who can sweat on command, move his ears. He tops out with like the pinnacle of it is a guy who can make music out of his anus, which is amazing.
01:08:14
It's like Augustine had jokes. One of the less quoted Augustine. Right, yeah, it's in City of God. And then he also says like, but the weirdest thing is, the very part of a human male body that you can urinate through at will, you actually have to be provoked by lust in order to use for sexual purposes.
01:08:34
Now, there's technology today where that's not necessarily the case, but still like, quote unquote, naturally, you have to have that, which is strange.
01:08:42
And so you can have sex with a robot, at least in the kind of, it's symbolic of the act, but it is not the fulfillment or the totality of the act in any way, shape or form.
01:08:58
The telos of sex is not there. Right, because there's a lot that goes into sex. Because here's the one thing we really have to figure out.
01:09:06
Like, well, is rape sex? Right, I mean, there's two humans there. At least one is convinced that the other one probably wants it.
01:09:16
And they want it. So they're just gonna go take whatever they want. So I think it's important to distinguish it.
01:09:24
Like, as long as there's two humans and there's penetration going on, we got sex going on. It's actually really - You're saying that's wrong.
01:09:29
That's wrong, yeah, for sure. But then you go, okay, well, what is sex? And sex is kind of more complicated than that.
01:09:37
And there's lots that goes into it. And it's kind of when special attention is being paid in a gracious way to the sexual parts of one person and another person.
01:09:48
So I'm paying special attention to your parts. You're paying special attention to my parts. So it could include penetration.
01:09:55
It doesn't necessarily have to. Most normally does. Most normally. Yeah, it's the loving giving of oneself physically.
01:10:04
Right. Most normally, okay, yeah. Yeah. Okay, so how does this simulacrum?
01:10:11
Oh, that's Latin, by the way. Yeah, that's in my dissertation. With a robot.
01:10:16
How does that affect us? Yeah, so I think that, at least conceptually, because we don't have a whole lot of empirical evidence on this, but it persuades us to believe something that's not true.
01:10:32
That either, it's good enough for now. I don't know if I'll be persuaded about this forever.
01:10:39
But humans have these sexual scripts, these kinds of ways of thinking. It's like, because there's a real weird question about why does masturbation work?
01:10:47
And there's all kinds of research into why it might work or whatever. But one reason is because we have this ability to kind of fantasize and create these whole worlds in our head, which you have to do with a robot, too.
01:10:58
And the complicated thing about the robot is you have to read it as a sexual partner one moment, and then just like an object you own the next moment, just so you're not weirded out by it, especially if it's then gonna go clean your dishes or whatever, so you're actually messing with your own way of viewing the world.
01:11:12
But on the sex side of it, I think that you can persuade yourself that we're in a loving relationship. There's testimony from sex doll users that have kind of persuaded themselves that these dolls have histories, that they have essentially what we would consider souls, that they've swapped parts out on these things and all that kind of stuff, but still the essence of the thing remains.
01:11:32
You say that in Japan, it's actually really easy for people to believe that because Shintoism teaches that all objects do have a sort of soul in them.
01:11:39
Yeah, and not even like sexual objects, like needles. They have funerals for sewing needles and stuff. And so I think it affects us by kind of conning us towards something that is kind of right, but absolutely wrong, which is the real bummer of the thing.
01:11:57
One of the things that I kept thinking about in relation to this was G.K. Beale's, well, he got it from the prophet, so it's not his thing, but he has expounded on it probably more than any theologian, the whole you become what you behold thing.
01:12:12
Do you see any connection? Yeah, because, so we can use this in the realm of just social robotics in the workplace.
01:12:21
So if I'm working a job and it's a job that a robot can also perform, and we're already finding this, then
01:12:33
I will be persuaded to work longer, faster, harder to keep up with the robot.
01:12:41
So I actually will change myself to keep up with this thing.
01:12:48
This is the real kind of danger of sex robots for the Christian church that's like healthy church or whatever, that doesn't even engage in this stuff.
01:12:56
You still live in a world where there's objects that are able to be made to look just like you, but better.
01:13:02
Like what does that do to the way we view ourselves? Which is why people, I tell people, whether you wanna talk about sex robots or not or think about them,
01:13:10
I understand why you might not want to but because you live in a conceptual universe in which they exist, you're going to be affected whether you know it or not.
01:13:19
We already deal with living in a world with plastic surgery and just body modification.
01:13:27
It's like the temptation to be like, well, let me just like cut this off or tack this on. And even if you're like, well, of course
01:13:34
I'm not gonna do that because I'm pursuing faith in Jesus or whatever. It's still there. You still like think about it every once in a while.
01:13:41
And yeah, and by the way, people in your church are thinking like that without a doubt. The walls of the church are porous.
01:13:48
It's in the air you breathe. We watch too much TV, we're on social media too much. Right, so the kind of living just even in a world of sex robots is going to affect
01:13:58
Christians who think like, even if it's to the level of just being slightly less satisfied with what you got on the basis of what you think you could have with a robot.
01:14:10
Like this is a world we live in. You know, it's really sad. Like one of the things that I love about growing old with my wife is that both of our bodies are changing for the worse.
01:14:23
Hers at a much slower pace than mine. She's still basically glorious. But like, I love that I get to love her despite that.
01:14:30
And I love that she loves me despite my body very much showing the evidences of aging.
01:14:36
There's something glorious in her ability to look past that and love me despite the effects of sin on my body.
01:14:45
Yeah, so this is one of the really interesting things. One of the things
01:14:51
I get into is how sex robots work in a like transhumanist world. So this idea of the biggest problem we have is our bodies, these kind of meat puppet suits that we just are addicted to and need to get beyond.
01:15:03
So sex robots essentially function like as we head towards this future of infinity, and we can just get out of our bodies, but use them whenever we want.
01:15:11
Then sex robots are an experimentation with our future selves. And one of the really interesting things that people have noted is, while that sounds very progressive, it's actually the most conservative thing humanly imaginable.
01:15:24
Because you want no change, everybody the same, everybody happy all the time.
01:15:30
You can't deal with aging and difference and diversity. It's all just like gotta be exactly the same.
01:15:37
And I think Christians have a real strength in this area to kind of be like, no, getting old is both really miserable and really great.
01:15:46
And you should be excited about it. There's the both and again. Yeah. There's a section where you quote one of these transhumanist authors who says that once we do, he's very optimistic obviously.
01:16:00
Once we do transcend this, this meat bag, this flesh bag with juices and oozing stuff.
01:16:08
Once we do that, we're gonna experience a kind of sex that is infinitely more enjoyable than anything we could ever imagine in these bodies.
01:16:21
You want my response to that? Yeah, it's real simple. You don't know what sex is like, except for the kind of sex you have with this meat suit.
01:16:30
So it is not only wrong and super religious, by the way, like you'd think that it takes faith to believe
01:16:39
Jesus rose from the dead. That's a whole new level of faith you gotta have in the future.
01:16:44
But it's also patently ridiculous because you don't know what sex is like without your body.
01:16:51
Therefore, to imagine that that's gonna be great is pure rhetoric and like fanciful imagination.
01:16:58
That's like what a seven-year-old says. Like any rational human being would be like, well, as complicated as human sex is sometimes, it's the only kind of sex we know.
01:17:08
So there's no way to talk about the goodness of the future in a bodiless world because you've never experienced that.
01:17:16
Yeah, I also, whenever I read stuff like that, I know
01:17:21
I keep using this word, but I don't think God will ever allow us to ontologically transcend.
01:17:30
Oh, right. Like we are corporeal beings. Yeah, we are corporeal beings. And I understand, in some sense,
01:17:37
I'm already a little bit transhumanized. I have this thing beside me that takes my brain power and amplifies it 10 times.
01:17:47
Easily. Yeah, easily, what it's normally capable of. And my memory has expanded. And isn't it funny?
01:17:53
I grab it first thing when I wake up and it's the last thing I put down before I go to bed at night. It's almost like I have an external brain pack connected to my body.
01:18:00
Having said that, that is, I think, in a universe entirely apart from being able to take the nefesh of a person and take it outside of the flesh of a person.
01:18:13
Now that rhymes, but I don't think the Hebrew people planned for that to rhyme with English. Also, there's no way to know if that's actually the way you should pronounce the word.
01:18:21
So you're just making it up. Yeah, there you go. Yeah, but it rhymed, so I guess it works. Here's the thing, we really can't do that because we don't even know what that thing is.
01:18:31
This is why the transhumanist project is so hilarious. Because it's like, oh yeah, once we get to upload our minds, you're like, what is a mind?
01:18:38
You don't even know what that is. And none of us do. What is consciousness? Wait, stop right there on the mind thing.
01:18:44
Because this blows me away. On the one hand, we need to get our mind out of our brain and put it into something else.
01:18:50
On the other hand, all I ever hear is that there is no such thing as a mind. Correct. It's just neurons like levers and pulleys.
01:18:56
Right. So which one is it? So this is the thing. You should be so thankful.
01:19:02
I keep thinking back to like Calvin's thing of like, when you see like idols and temples of other religions, on the one hand, it should break your heart.
01:19:10
And on the other hand, you should thank God that humans are just like irredeemably religious. This is the same kind of thing where you kind of go, oh, that's weird.
01:19:19
So you have a mind thing. So you exist apart from your body, at least hypothetically. We can get to a place where you could exist outside of your body.
01:19:26
That's curious. How do you think that happens? Which is why there's some people, secular people, who are like, the transhumanists are idiots because all we are are bodies.
01:19:33
But this is something that we can make use of in an incredible way of going like, I agree with you that there is something beyond our bodies.
01:19:41
The question is, what is it? Where is it going? And why does it matter? Which is fantastic.
01:19:47
This is something you should be super grateful for, that there are people in the world who think that exists. It's great.
01:19:53
See, man, that's so funny. And I'm so helped by that because our inclinations are so different.
01:19:58
I am such a curmudgeon. When I listen to the transhumanist, my first thought is, you frigging idiot.
01:20:05
And your first thought is, oh, I actually really like that you think about that. Let me, I use this illustration all the time.
01:20:12
Let me climb inside of your car, take the steering wheel, and then crash it into a pole for you. You know? Yeah, for your good, right?
01:20:19
For your good, yeah. But like, and I'm the curmudgeon too, because I'm just like, how do I do that? Like, essentially, how do
01:20:24
I cruise down the road? This is what I've had to try to do, which actually writing the dissertation for a bunch of British academics really helped me with.
01:20:32
How do I drive down the road a few miles instead of just jump off of the bridge right here? And how do
01:20:38
I go as far down the road as I possibly can before I wreck the car? Always for your good. Yeah, always, yeah.
01:20:44
Like, let me just try and do this and enjoy it all the way down. That's really good. I can't imagine having to write for British people.
01:20:51
It's the worst. I'm so uncultured. I used to think that the British Channel was a station on TV and not a body of water between England and France.
01:21:01
Okay. That was great. Thanks, man. Back to how this affects us.
01:21:10
Yeah. You talk about the enslaving gaze.
01:21:16
Oh, yeah. Not G-A-Y-S. G-A-Z-E, how having sex bots can train us basically to like hyper-objectify the human body for sexual purposes.
01:21:34
Yeah, which again, this is the thing with like, well, how is this actually gonna affect other people? Because the libertarian argument is like, well, if I'm not hurting anybody, then why does this matter?
01:21:46
Oh, it's gonna hurt people. It's gonna hurt people. But one of these things is like, well, it turns you into the kind of person that looks at everything.
01:21:53
We already live in an objectified society. This just ramps that through the roof. And I wanna be very clear that I can imagine a potential world where Christians are like, well, as long as we don't hump the thing, then we'll have it in our house to do the dishes and everything.
01:22:11
We'll have the thing in and we'll teach our children to respect it. I mean, you're already dealing with this thing of like when people talk to Alexa or whatever, you're like, no, say thank you.
01:22:20
They say please and thank you. Yeah, that's right. Because you need to treat the thing right. And it's just like, ooh, this is getting us into some weird territory.
01:22:28
But the enslaving gaze is this thing. It sounds like all this kind of like sci-fi thing, which is this idea of every time
01:22:34
I look at anything, I'm just thinking, well, I could just modify it though. Which then leads to the question: if that's the way you deal with robots, then what's the backlash of how you view humans?
01:22:47
What's the kind of way where it comes back and you become just, we're already dissatisfied with other humans.
01:22:53
They're late. They're lazy. They're whatever. So how much more is that kind of just botheredness with humanity expanded when you're like, well, why don't
01:23:04
I just fill my life with robots? Forget all this. Yeah. I'm even thinking you have one section in there where you talk about how it's already been demonstrated that violence towards sex dolls as they are now.
01:23:19
Yeah, happens. Happens. And how that has been shown in some way to translate into it's easier to be violent towards real women.
01:23:28
Because you just get so used to doing it. You have this thing that you've come in your mind to think of as a human person and you treat it however, and then you get a real person.
01:23:37
And it's just like with pornography, right? You know, you're trained to think this way. This is what sex is. And then when you actually have a real person, you go to do it.
01:23:44
You just do what you've been trained to do so far. Yeah. It's one of the benefits of being human is we can be trained in certain ways.
01:23:54
This can go real well, but that powerful thing can just go sideways real quick. When you say this, you don't mean sex robots.
01:24:01
No, just being trained in anything. Yeah, that's right. Which is why like self-discipline is a possible thing.
01:24:07
But if self-discipline is possible in anything, finances, working out or whatever, it's also a degenerative thing that if you head down the wrong path, you disciple yourself into something else.
01:24:19
It's not bad, it's just powerful. Yeah. Can sex robots ever gain sentience?
01:24:34
Well, to get back to something I mentioned earlier, we don't know what sentience is. So that's complicated.
01:24:41
So I don't think that that's a possibility. However, I'm not really concerned with that.
01:24:47
Here's what I'm concerned with. I'm concerned with robots continually improving. By that,
01:24:52
I just mean like better interactors with human beings in human space. And the better they get at it, the more we're going to persuade ourselves that they're sentient and treat them as if they were.
01:25:06
Whether like not even thinkingly. So back to kind of the Roomba deal. Like people have modified their homes so that the
01:25:15
Roomba doesn't get stuck. So it can go all the way through the house. You're already modifying your human space for a disc on the ground that sucks things up.
01:25:23
When the robot gets stuck under couches, the whole family freaks out: go save the robot. There's not even eyes on the thing.
01:25:31
Cynthia Breazeal, who made Kismet, which is like one of the first like humanoid robots.
01:25:37
It was essentially like an erector set with eyes and eyebrows. And the big thing was this robot voice. The eyebrows moved and eyes kind of just went back and forth.
01:25:44
But it was super basic. When she left the thing behind because she was changing institutions and the robot stayed behind, she cried.
01:25:52
And she was like, this is the dumbest thing ever. I made the thing. And yet I don't want to leave this behind because I've grown so attached to it.
01:26:00
It's crazy. But it's also part of just like what it means to be human and create things and be drawn to things.
01:26:06
The more human these things are, the more we're drawn to them. So I'm not concerned about what's going on inside the robot.
01:26:13
I'm concerned with how humans approach the robot and how that then forms or malforms them in the world.
01:26:22
You deal a lot with consent. I do, in one chapter, yeah. Yeah. I thought it kept coming up.
01:26:30
But I mean, hey, I just read it. You wrote it. You probably haven't looked at it in a long time. I don't even know if I want to talk about it.
01:26:40
I brought it up. I don't know if I want to talk about it. Is there anything about consent that you think would be edifying for us to discuss? I think this is what's edifying.
01:26:46
I think that consent is a good thing. Right? Another bold statement.
01:26:53
Bold statement. Slavery is bad. Yep. Consent is good. Yep, that's in the same chapter. So I'm just, yeah,
01:26:59
I'm just like really firebrand. But what I do want to say is that, you just talked about forming people in particular ways.
01:27:06
Imagine the deformative aspects of living in a world where consent, which is complicated, and there's lots that goes into it, and it is like a baseline, it is not sufficient for sexual interaction, but it is important.
01:27:20
Although we have made it the baseline in our society. And it's unfortunate. But it's better to live in a world that's pursuing consent than a world that doesn't.
01:27:30
But what happens, not only to consent, but even the desire to pursue consent when you don't need to get it from a robot.
01:27:36
Or, and people go, oh, well, no, we'll put it in robots. But it's all gamified. You just got to hit the right buttons or whatever, and you can just go through it.
01:27:42
It's always going to be this problematic thing that we're already struggling with human interaction.
01:27:48
Robots will inevitably make that kind of thing worse. Even on a conceptual level, if I've never used the robot, but I just know that like,
01:27:55
I don't have to do that with a robot. And pursuing like legitimate sexual intimacy with my human spouse is so complicated, because so much goes into it.
01:28:07
Man, robots seem so much better, which will lead to just like, maybe I won't use them, but I might be pretty dissatisfied with what quote unquote
01:28:15
God gave me, which is a bummer. Yeah. Also interesting in there, you talk about how robots can be programmed to require consent, but also how like rape play can be programmed into that sort of a thing.
01:28:36
So this is a problem with the consent thing, right? So if I program consent into the robot and the robot says no, then like, this was very quick.
01:28:45
People were going, hold on a second. That sounds like a manufactured rape setting. So yes, I will program the consent through the roof so the robot like does everything within its power to try to get me to not have sex with it.
01:28:57
Well, great. That's just a rape setting because I want them to struggle. You know? Yeah.
01:29:02
That doesn't sound great. No, but in a consumeristic society, which you deal with at length where the dollar, and by the way,
01:29:10
I'm a free market guy, you know, but every system has its flaws because we live in a sinful world. Where, you know, you got enough money, you can do what you want.
01:29:19
Yeah. You can build a bear, but just way more perverted. Yeah. Well, and you could build a bear if you wanted to.
01:29:25
Yeah, you have like a mermaid section in there and you can build your sex mermaid. Well, so the mermaids are a conceptual category of human beings that lack something that they need in order to be fulfilled.
01:29:36
That's right. So Ariel needs legs. Oh yeah. But you could build a mermaid. You could build a mermaid. Yeah. Or you can build like an elephant or whatever, or like a, you know, a centaur, you can do whatever you want.
01:29:46
Yeah. I think it's centaur. I think so too. But let's be honest. You totally embarrassed yourself just now. I know as much about mythology as you know about physics.
01:29:54
So we're on equal grounds. You got quarks and gluons. Cool. Right? You know, you glue the atom on.
01:30:01
It's a gluon. Okay. We do live in the future. This dystopian.
01:30:07
The heck are you talking about? This dystopian vision. Yeah. Like I remember when I was a kid,
01:30:13
I watched, what was that movie with Arnold Schwarzenegger where he goes to? Sixth Day. No, Total Recall.
01:30:19
Total Recall, yeah, yeah, yeah. Total Recall or anything like that. And there's, or Sylvester Stallone in that movie with Sandra Bullock.
01:30:30
Judgment Day? It's about where he goes to the future. Yeah, yeah, I know which one you're talking about. Anyways, and they have sex by putting on.
01:30:35
She goes, do you want to have sex? And he's like, yeah. He's like getting ready. They put on these helmets and they put on these gloves and they sit and they go, mm.
01:30:43
And when you're a kid, you're sitting there and you're thinking like, whoa, the world will never be this crazy.
01:30:48
That's exactly what we have going on. But it is. We live in a dystopian future. If we can't have babies, we freeze a bunch of them and keep them for as long as we may possibly need them.
01:31:00
There's a million frozen babies in our country. You can have sex with robots. We have flying cars.
01:31:06
They're called helicopters. Just not really what we thought that they were gonna be. And I have to tell you, brother, after working through this,
01:31:15
I am depressed. Oh, that's too bad. I know, I know. Again, I told you before, I'm a curmudgeon.
01:31:20
I know that Jesus wins. I have hope. But thinking about a world, the conceptual universe, in which this is not only possible, but plausible and desirable, and to think about all the people who are going to incorporate this into their life, it's just soul crushing.
01:31:40
You know? So, and I'm imagining that's gonna be the case. I think some of our listeners and our viewers are gonna be saddened.
01:31:47
They're gonna be sickened. They're gonna be frustrated. They're gonna be righteously angry. All of that is appropriate.
01:31:53
Help us end on a note of hope. Yeah. Joy. Sure. You know, you only get one go around on the rock.
01:32:07
If God exists, then you were placed here in a time and place such as this for the purpose of living out, you know, good gospel witness in this society.
01:32:20
And you might not like it, but it's exactly where God wants you to be. And should
01:32:26
God be in charge of the world, and not at the whim of the world, then the world as we currently have it is exactly the kind of world that God wants it to be, which is beyond my understanding.
01:32:39
So God doesn't just have the future figured out, but the now figured out, and he's in control of it all. So instead of just being sad or discouraged, try and find the cracks in the universe with even the darkest stuff and go, hold on a second.
01:32:56
We could actually see people really become Christians on the basis of this. If I lean in, not to the goodness of the thing, but the good that's behind, underneath, buried in the thing, and see how those things are actually found in the gospel of Jesus Christ in a fulfilling way.
01:33:14
And just like I try to do in my dissertation, reorient people towards the good and go, that thing, always be trying to find the good.
01:33:22
You need to come to a place where you're both sickened and heartbroken, and also find a way to thank
01:33:29
God for the world in which you live, because it provides unique opportunities to preach the gospel, demonstrate the beauty of the
01:33:37
Christian life, see people come to Jesus. And like, you got no other option, because we do not live in the future.
01:33:44
We only forever live in this moment. So just, the question isn't, what are we gonna do in 10 years from now?
01:33:50
That doesn't even exist. All that exists is, what does it mean to be a Christian today, right now? And do that to the best of your ability.
01:33:58
And Jesus wins. Yeah, exactly. That's right. All right. Let's move on to a little lighthearted fare as we wrap things up.
01:34:06
I ended on like such a dope note. You were like, okay, lightning round. What's your favorite cheeseburger?
01:34:12
That's right. Let's go. Yeah, that's the way I'm gonna do it. Get your own show. That's right. I have one.
01:34:18
Okay. Tea or coffee? Coffee. Favorite sitcom? Oh, my favorite sitcom is
01:34:26
The Office. Yeah, pretty good pick. You're stuck on an island. You can only have the books of one of these men,
01:34:33
Dever, Piper, Keller, Sproul, John MacArthur. That's the most ridiculous list I've ever heard.
01:34:38
I know, but it's not your show. Okay. Or you can only have the sermons of one of those men.
01:34:47
I would go Sproul. Still Sproul. Lewis or Tolkien? Lewis. All of the collected works, not just the fiction stuff.
01:34:53
Lewis. Easy choice, I would say. No question. Yeah. He's in the minority, for sure. Not even close.
01:35:00
Not even close, so much. The minority of everybody who's been on this show. That's true, but we've also interviewed a bunch of idiots.
01:35:05
There you go. You said it, not me. Favorite fiction? My favorite kind of fiction?
01:35:11
I want you to take that however you want. My favorite fiction is probably really well-done, character-driven, especially over a series,
01:35:22
Murder mysteries. Ooh. Did you read Ride, Sally, Ride by Doug Wilson? Yes, next question.
01:35:29
Oh, okay, I was gonna ask you more about it, but I won't now. You can. Well, I mean, he's taken a lot of heat for it.
01:35:35
Yeah. But I haven't read it. Okay. He wrote it, he says, to help people, to be provocative and to help people see how unfulfilling sex robots are.
01:35:45
Yeah. Did he do a good job? It's easy to build straw men, light them on fire.
01:35:54
Okay. So not recommended reading? No. Okay. Mountains or beach?
01:36:01
Oh, mountains, no question. Beach is for psychos. Champagne or wine? Bad champagne is terrible, but champagne.
01:36:13
Least favorite candy? All of it. Really, you don't like any candy? I will, no. Your body looks like it was built by candy though.
01:36:23
Android or iPhone? iPhone. Macaroni salad or potato salad? That was such a good one.
01:36:30
Potato salad, but only if my wife makes it. Ooh, I like that. Is it a mayonnaise base or a mustard base? Mustard base. Yeah, baby.
01:36:36
Get mayonnaise out of here. You're really fancy. Imagine that in your mind. Foie gras or escargot?
01:36:42
Foie gras, not even a question. Too easy. Seared. Yeah. Night out or night in?
01:36:49
If it's a night out, do I get to spend some good money? Whatever you want.
01:36:55
Okay, then it's probably a night out. If you had more money than you have now. But I'm swinging for it.
01:37:01
Yeah, it's either hole in the wall, just hood, ethnic place, or three-star dining.
01:37:09
Otherwise, we make way better food at my house. We're not doing Applebee's. Yeah, no. Applebee's can die. I agree.
01:37:17
Concert or football game? Football game. Morning person or night owl? Night owl. Burger King or McDonald's?
01:37:24
McDonald's. Oh, okay, good. A lot of people go, oh, I'm too good for either. Yeah, no, McDonald's all day. Mexican or Italian?
01:37:31
Before you answer, I'm asking, which is your least favorite race? Yeah, so Mexican. That's your favorite food as well?
01:37:42
Burgers or barbecue? Okay, another thing. Like, I think that the best barbecue is better than the best burger, but it is way harder to do good barbecue than good burgers.
01:37:55
Therefore, it's less accessible, so you're probably gonna say burger. Yeah, if somebody goes, let's go out for burgers or barbecue, what do you want?
01:38:01
I'm going burgers. Burger, but if we're talking about like the platonic form of a thing. Then it's barbecue. Barbecue. Chinese takeout, bad
01:38:08
Chinese takeout. Or sushi? Well, just, okay, you didn't qualify sushi.
01:38:14
I know. Sushi. Cold or hot? Unless we're in Alabama. Cold or hot?
01:38:20
I'm going hot. Hot, baby. Rock or rap? Rap. What? What?
01:38:25
Classical or jazz? Jazz. I knew you were a freak. That's right.
01:38:31
I'm trying to both and everything. Come on. I play on a minor scale. Yeah. Careful now.
01:38:37
Trapped on an island with only one systematic theology, which one do you choose? Or do you even read, are you just reading a bunch of dead
01:38:43
French philosophers? Do you actually, do you know what systematic theology is? Yeah. I do know what systematic theology is.
01:38:48
I'm probably taking, Calvin's Institutes doesn't count, right? Ah, probably
01:38:54
Calvin's Institutes. If not that, then it's a toss-up between Horton and Bird.
01:39:00
Michael Bird. Yeah. Interesting, how come? Because he doesn't suck at writing about theology.
01:39:06
And most systematic theology books are the most boring things on earth. They can be. And they're also like completely divorced from real life.
01:39:14
And you think Bird does a better job of that? Better job, yeah. Interesting. I've not heard anyone ever say that. You've intrigued me.
01:39:20
There you go. A provocateur strikes again. Nailed it. What hymn do you want to be sung at your funeral?
01:39:26
Oh, Be Thou My Vision. Not even close. Wow, I like that. You got it preloaded. Yeah. Yeah. Does a straw have one hole or two?
01:39:35
Well, that is begging the question about what a hole is. So. I don't think it is.
01:39:43
Begging the question means you assume the conclusion in the premise. You assume that there is the possibility of having one hole or two.
01:39:49
Yeah. You know, I think people who are into philosophy are the worst. Like, I'll say like, what is,
01:39:57
I'll say, what is this? Right. And you'll say, well, what do you mean by is? Right, yeah. Yeah.
01:40:03
Yeah. Well, that's all I got, Jeremy. Thanks for being on the Room for Nuance podcast.
01:40:08
I think genuinely this will be a helpful podcast for people, well, actually, how do
01:40:17
I want to say that? I don't know. I didn't have anything preplanned. I do think that this will be helpful. I think it'll be. I hope so.
01:40:22
I think it'll be difficult for some people. Yeah. My hope is that by the time this comes out, you will have already begun to have like a popular version of your, is that the plan?
01:40:32
Kind of do like a popular version. When is that coming out? Who is it coming out with? Well, so I start talking with publishers in a few months.
01:40:39
So you're planning to plan. Finishing another book. So I've told everybody like, you know, I didn't finish too long ago.
01:40:45
So I was like, I need to put that on the shelf for a minute and come back and think about the popular treatment. I imagine you're going to have a hard time with this.
01:40:52
You don't think so? Hard time? What's this? Yeah, right, good. Getting a publisher. No, there's at least four that want to talk to me right now.
01:40:59
Oh, you're a hot commodity. I'm a hot commodity. I would imagine that this would be something that they'd be nervous about. I think that, well, that's the real question is, can
01:41:06
I write the book I want to write? Or do I have to write the book they want me to write? And I have no interest in writing the book they want me to write.
01:41:13
So then maybe they won't publish it. And you'll self-publish and dozens of people will read it. That's right. That's right. Well, I do hope that you popularize it and make it more accessible.
01:41:23
Thanks. Even if it's nothing, I don't assume it will just be this. But I would like, as a pastor,
01:41:29
I'm always thinking about like resources, right? So something comes up, this guy emails me and says, hey,
01:41:35
I got this issue in my church. I would love to be able to be like, oh, here, read this book. You know, like you have with so many other things.
01:41:41
Right, I would love to write that book too. I think the complicated thing, as even this interview, which was entertaining, has demonstrated, is that it spirals out into so many different areas.
01:41:52
So it's kind of gonna, whatever book I write, I'm gonna have to perpetually be answering the question, well, why didn't you do it about, like you didn't touch this topic or that topic.
01:42:01
Well, you chose it. That's right. Did I ever tell you about my friend who has a big pumpkin head? Nope. Do you wanna hear about it?
01:42:08
Not really. But because it's your show, yes. So I have a friend who has a really big pumpkin head.
01:42:16
Yep, cool, brag. Here's how he got it. He found a genie in a lamp one day.
01:42:23
You know what I'm talking about, that old trope? Yep. Three wishes. Great. First wish, he wishes for all the money in the world.
01:42:30
Makes sense. Gets it. Yachts, penthouse in the sky.
01:42:36
He's got everything he could possibly want. Except what's the one thing money can't buy? Love, right? Wish number two, he says he wants the most beautiful woman in the world.
01:42:44
Sex robot. Gets it. So now he has all the money in the world, most beautiful woman in the world.
01:42:51
What else could he want? One wish left. Yeah. He totally drops the ball.
01:42:58
He wishes for a big pumpkin head. Okay. Isn't that weird? Not really.
01:43:04
All right, let's pray. Lord, thank you so much for my brother, Jeremy. Lord, what a difficult task it is to engage with the idea of sex robots from a
01:43:14
Christian perspective. How does the gospel apply to this? We thank you that you've raised up someone who will truly help us think about how all of Christ applies to all of life.
01:43:24
We know that this conversation will generate a lot of light, but perhaps also some heat.
01:43:30
We pray that people will interact with Jeremy and his work now and in the future, charitably, that they will think the best, and that if they disagree, they will still appreciate the desire to help the church think well about things that we very much have to deal with.