July 26, 2024
Human Friend Digital Podcast

Friend Episode: Dune and Artificial Intelligence

Human Friend Discusses Dune
in this episode

In this episode of the Human Friend Digital Podcast, hosts Jacob and Jeffrey dive into the fascinating world of Dune and its parallels with today’s AI advancements. They chat about the Spice Melange, a substance in the Dune universe that grants superhuman abilities, replacing advanced AI tech that was banned long ago. This spice is crucial for everything from space travel to the abilities of the Bene Gesserit and Mentats.

They draw intriguing comparisons to modern AI, discussing its potential and the ethical dilemmas we face as it becomes more integrated into our lives. The conversation wraps up with a call for proper governance to prevent any one entity from monopolizing this powerful technology. And, of course, they share their favorite moments from Dune, appreciating Frank Herbert’s visionary storytelling.

Episode Transcript

View Full Transcript

Jacob: 

Welcome to the next installment of the Human Friend Digital Podcast.

Jeffrey:

The Friend-Episode, Part Two.

Jacob:

So, let’s get into this mini episode, which I’m very excited about. So, Jeff, you are a Dune nerd.

Jeffrey:

I wouldn’t go that far, but sure.

Jacob:

And you got me to read Dune and taste the spice, so to speak– that’s a really bad joke, maybe we should edit that out.

Jeffrey:

I like that. 

Jacob:

But based on your recommendation, I did read Dune and I thought it was pretty fun. I actually got through the whole thing. I had just some minor feelings of being overwhelmed by how extremely nerdy it is at some points, but overall it is a very dramatic political play on the future. But I wanted to nerd out on this a little bit today because there are some parallels going on in Dune and AI that are going on today. And it’s really interesting because Dune and I’ll let you take over here: Can you briefly give us a little bit about Dune for the listeners, and the significance of Spice in that universe of Dune?

Jeffrey:

Um, so this may or may not contain spoilers, depending on how you define a spoiler, but okay. So essentially Dune is a sci-fi universe that takes place about 20,000 years in the future. 

Jacob:

Oh, that far? I thought it was 10[k].

Jeffrey:

Okay, so there’s like, two timelines. Okay, so…

Jacob:

Oh, and one second for anyone listening, there will be spoilers in this episode.

Jeffrey:

Yeah, there will be.

So Frank Herbert, the author of Dune, he wrote six novels. And then his son and a coauthor… So Brian Herbert, his son, and then his coauthor, Kevin J. Anderson, wrote like, 20 more novels: Some of them were main lines, like, actual sequels to Frank Herbert’s novels, but then, you know, a lot of them were like backstory and prequels and all of that stuff.

I like Dune. Dune is one of my favorite novels of all time. But I’ve read the first three– I never read Herbert’s six– and each one was worse than the last, no offense to Frank Herbert, but…

Jacob:

All right. So let’s get back on track here, and what’s going on in the universe? So we’re 20, 000 years in the future..

Jeffrey:

Yes, and so, uh, Brian Herbert and his co author wrote a prequel trilogy about the Butlerian Jihad, and so that happened 10,000 years from now, and 10,000 years before the birth of Paul Atreides, which is the main character of Dune 1. And so that whole conflict was about destroying “thinking machines”, destroying computers.

You know, Frank Herbert wanted to write a sci- fi universe that was devoid of like, technological dependence. So it’s like, it’s a sci-fi world where we’re not talking about tech, you know, because like who can predict what tech is going to be in the future. So he cut it out completely by saying, “oh, 10,000 years ago, there was a revolution to destroy all computers”. And so now, we’re working in a world without any computers, any quote unquote, thinking machines, which…

Jacob:

Which I thought was extremely creative.

Jeffrey:

Oh yeah, he could just skip past all stuff. It gives you so much creative freedom, which is why I think he did it, and I think it’s actually very brilliant.

Jacob:

Yeah. And one thing that I think is really revealing about it is: in order for humanity to be a spacefaring, faster-than-light traveling species that spreads throughout the universe successfully, they still need these superhuman brain powers to get there. So that segues into what is Spice, and then we can dive into more of these AI parallels. But I want people to know, what is Spice?

Jeffrey:

Yeah. Spice is only found on Arrakis, aka Dune, and it’s basically the poop of the sandworms. 

Jacob:

The makers, you mean?

Jeffrey:

The makers, yes. The makers, the “Shai-Hulud”. Yeah, so they poop out the spice basically, and it’s a psychoactive compound that, on one hand extends life, so it’s geriatric– they call it the “geriatric spice,”–  on the other hand, it increases your mental awareness. So, they found that if you super dose somebody with spice, it can change their psychology and physiology in such a way that they can bridge time and space, mentally.

And so those were the Space Guild Navigators. So, they’re the ones that… so the Spaceing Guild has a monopoly on all space travel after the Butlerian Jihad, because before that, the AI systems were the ones that bridged time and space. But once we got rid of them, they were like, “well, how do we do this?” Oh, we can super-dose a human to become so high on Spice that they can see through space and time. And so then they created this basically like a subspecies, which is the Guild Navigators.

Jacob:

Yeah. And they’re very creepy in the…

Jeffrey: 

They look gross.

Jacob:

Especially in the rendition by…

Jeffrey: 

David Lynch.

Jacob:

David Lynch. Those ones, it’s pretty creepy when you see that. So this is like archetype one, these Guild Navigators. So, that filled a role that AI needed to fill for humanity. So, what are some of the other superhumans that come out of Spice?

Jeffrey:

Yeah, they did that to try to fill all the roles that thinking machines used to fill. So like Mentats: Mentats are humans that are conditioned– I don’t know if they do it through spice or just through like, genetics and…

Jacob:

They take spice in that process.

Jeffrey:

Maybe they do use spice to facilitate it.

Jacob:

What’s the guy? Thurfir Hawat? They say at one point he’s like, 400 years old.

Jeffrey:

Oh really? I forgot that detail.

Jacob:

So he’s pretty darn old. I just assumed he was on spice. But now that you say it, I don’t remember them explicitly saying it to my memory, reading the book…

Jeffrey:

I don’t remember them saying it specifically, but it wouldn’t surprise me. They use spice to facilitate everything. Okay. So the Mentats, they’re like human calculators.

Jacob:

Right? They’re like humans merged with AI, but all through genetics, and some conditioning.

Jeffrey:

Like I said, I’ve only read the first three books. There may be canon here that I’m not familiar with, but I think it’s just through breeding and conditioning. Because then there’s the Suk School. So, Dr. Yueh…

Jacob:

Yeah. The physicians, and they’re another one. So, okay, so wait a second, so we have space travel AI…

Jeffrey:

The Guild Navigators.

Jacob:

Those people. Then we have super human computer AI people, which…

Jeffrey:

Which is the Mentats.

Jacob:

Right? Which is really where people want Chat GPT to be for everybody, is everyone’s personal Mentat. And so, then what we have is the Bene Gesserit, is that what you? Oh, you’re going to the Suk School?

Jeffrey:

So they are… they aren’t… they are supposed to be conditioned to never lie. So, that’s their conditioning.

Jacob:

Or to never hurt a patient in some way. I don’t know, there was some weird thing with the way that that played out with Dr. Yueh in the book.

Jeffrey:

So he…uh, well, no, I don’t want to spoil.

Jacob:

Not too many spoilers. This is like a pivotal one.

Jeffrey:

Yeah that’s actually a pretty good spoiler. But basically they’re supposed to never be able to lie.

And then we have the Bene Gesserit. 

Jacob:

Right. Which is like the Jedi without the lightsabers.

Jeffrey:

Yeah, essentially they’re like… The Fremen, so the native population of Arrakis, call them like “the Weirding Way”, they call them “witches”, and it’s like, “oh, you didn’t tell me you were knowledgeable in the Weirding Way,” which is like, basically, “I can kick your ass.”

Jacob:

With words.

Jeffrey:

Yeah, with words. The Voice: oh my God, I love the concept of the Voice, which is basically like mind control.

Jacob:

Yeah. It’s like Jedi mind tricks,

Jeffrey:

Yeah, but through the voice.

Jacob:

A little more, a little more cool, I would say.

Jeffrey:

Way cooler, I would say.

Jacob:

I feel like the Jedi mind trick force was just like, well, if your brain’s mush, I’m going to just get in there and do that. And that one [the Voice] is more like a puppeteer moment where you are like actively getting them to do pretty crazy things in rapid fire succession that they don’t even have control over.

So it’s very like… I mean, George Lucas, which came later than Dune, totally just ripped out Jedi mind tricks.

Jeffrey:

He supposedly took a lot of inspiration from the Dune novels. I don’t know how much, but I mean, Dune 1 came out in 1968 and Star Wars 1 came out in 1977. So there’s a 10 year gap.

Jacob:

Yeah, there’s a good chance that he was influenced.

Jeffrey:

I mean, Dune is one of the most influential sci-fi novels of all time.

Jacob:

Yeah, yeah, and I can see why because it’s definitely quite a thing.

Jeffrey:

Super creative.

Jacob:

So let’s get back to this AI comparison. So, with the doctors, that would be another one. And we see that today where doctors are looking for more and more tools to use AI in the research of medical science, where they’re compiling more and more research and more and more data applications.

So that’s like a little bit of a… well, our world is doing a little bit of a mixed Mentat with a Suk School doctor right now. But, yeah, there’s another superhuman that needs to be created. And so the Bene Gesserit are like the future-people, they’re the people that can see into the future and control it. That’s like their thing, right? 

Jeffrey:

No, they can’t. Wait, hold on… “can see into the future?” No, they can’t. They can see into the past. So the Kwisatz Haderach can see both future and past.

Jacob:

Oh, the Bene Gesserit are like really smart predictors based on past events.

Jeffrey:

They can predict the present. So, that’s why they’re Truthsayers. Like, they can tell if you’re lying, because they have all of this, like…  especially the Reverend Mothers, because the Reverend Mothers are the ones that are truly tied to all of their female descendants [ancestors].

Jacob:

That was pretty psychedelic

Jeffrey:

Oh gosh, I loved it. But yeah, they can’t… that was the whole point of the Kwisatz Haderach– the male Bene Gesserit, that they were trying to create through genetic breeding programs– is that he could see both past and future.

Jacob:

Which is crazy, but that’s something that we’ll definitely want to do with AI because I mean…

Jeffrey:

Can it though?

Jacob:

Well right now… this is what I think is gonna happen, because I don’t know if you ever use Chat GPT or Gemini, I know we use that in our office– our virtual office– so we use that all the time, I use that all the time, what it’s really good as it is compiling past historical events and language and then outputting a potential answer into the future.

I mean, a large language model from what I understand, and you know… 

Jeffrey:

You could analogize that to a Bene Gesserit Reverend Mother: getting all of this data from past experiences, you know, the large language models are generating stuff from all of past language, basically.

Jacob:

Right. And then predicting. So a lot of what I understand that language models is, but this was like in the early stages of this coming out essentially…

Jeffrey:

Like a year ago? So long ago…

Jacob:

A year ago. Yeah, that’s insane. But what it would do is like, it would have a word, it was gonna write you a sentence, but it was putting it together one word at a time based on what it predicted would be the right thing…

Jeffrey:

But that’s still how it works.

Jacob:

Yeah, I mean, okay. So, I think Omni feels a little bit more salient? Sentient?

Jeffrey:

I don’t know about the…

Jacob:

It’s not really sentient, it’s…

Jeffrey:

No, of course not. It can’t be.

Jacob:

Yeah, not yet. 

Jeffrey:

I don’t know that… Okay, that’s a, it’s a rabbit-hole that I don’t think is in the purview of this episode. But, I would say, as far as I have read, which like, I’m not a computer scientist, but I read a lot about it…

Jacob:

Yeah, it’s very interesting.

Jeffrey:

Because I’m just a curious little guy.

So, my understanding is that the Chat GPT 4, even 4 Omni, is still just a predictive text model, but they’ve fed it so much data that it’s doing things that even the creators are like, “I don’t know how it’s doing this, but it is”.

Jacob:

Yeah. And it’s getting a little freaky in some regards, because well, right now, the biggest thing that I would say sets it back from being a Bene Gesserit kind of a person, is really when it’s not prompted, it doesn’t work.

Jeffrey:

Sure, yeah, it can’t do anything on its own. Right. You always have to ask it something.

Jacob:

I’ve actually done a little experiment with it and it’s kind of interesting. Uh, if you ask it, like, what does it do while you’re waiting? Like, let’s say you have a serious conversation with it, about some problem and you’re just trying to work through this, and then you take a break and you come back in 36 hours, and you start continuing the conversation. It says, and I believe that this is how it experiences, it experiences no time delay, from the last moment you set to the next moment you set. And that’s basically where artificial intelligence is going to be when I personally think, when it will jump from being artificial intelligence to general AI, is that it can do something when it’s bored, I think that that’ll be the thing when it gets really close to mimicking general AI.

Jeffrey:

I don’t think that we’re close to that, though.

Jacob:

No, well, I think we’re within our lifetimes.

Jeffrey:

Not based, not based on this sort of track of AI: large language model AI is never going to be able to generate things on its own, it’s always going to need prompting. So it’s like, we can take the lessons learned from this, shift them to some other sort of AI structure, but it’s not going to be the large language model structure. I don’t think, I don’t see how.

Jacob:

Well, I don’t know exactly, because I’m not an expert on it, but it’s interesting to look at it through the eyes of Dune, because when Frank Herbert made that intentional choice to not have the robots in it and say, “oh no, we went through that phase and we killed them all off and we don’t do that anymore.” What was left, though, to create this space faring environment, all the superhumans are really revealing about technological needs to get there, right? So, the Reverend Mother and the Bene Gesserit have that quality of ChatGPT, where they’re just a massive storage of information, and then they can use that to do some pretty amazing things. And I do believe when ChatGPT… I don’t know, did you see when Omni came out?

Jeffrey:

And Scarlett Johansson sued?

Jacob:

Yeah, Scarlet Johansen sued because we know we have like Sam Altman making his own personal little voice model for himself, which is alleged, but it does seem very like… 

Jeffrey:

It is very similar to Scarlett Johansson’s voice, and of course she was the computer in the movie Her, with Joaquin Phoenix.

Jacob:

Uh, I don’t know. I haven’t seen that one yet.

Jeffrey:

You haven’t seen Her?

Jacob:

No. I’ve seen trailers for it. I just haven’t…

Jeffrey:

It’s so good. I honestly really love Her. You know the trailer. The basics of it is that a guy falls in love with his AI assistant, essentially.

Jacob:

And I think Sam Altman is trying to make that.

Jeffrey:

Like, falls in love with his Alexa.

Jacob:

Right. So, the thing about the Bene Gesserit that’s really interesting is their genetics program in there. And so, what I think is really interesting is they’re trying to make these voice models that sound very convincing as human, and sometimes they sound even a little sexualized already, which is a little, a little unsettling and a little unnerving, but it does make me feel like once it can get to that general AI point, it can probably become an immediately manipulative tool to guide people towards certain actions, which is exactly what the Bene Gesserit do. They’ve become immensely manipulative

Jeffrey:

Yeah, their whole point is to manipulate humanity in order to create their chosen human: the Kwisatz Haderach.

Jacob:

 And they’re in the long game.

Jeffrey:

Their plans are in centuries.

Jacob:

Yeah, it’s insane. I mean, it’s just great. And that new Dune series that’s coming out is about…

Jeffrey:

Oh yeah, “Dune: Prophesies,” HBO, this Fall.

Jacob:

And if you want to sponsor us HBO, I’m right here. I’ll take anything.

So let’s get back to this. So what we’re seeing with like AI right now and how it’s fulfilling these kind of Dune roles, is in Dune, Spice, it becomes like the super tool of humanity, but also it’s super crutch.

And there’s a… I think Frank Herbert made some pretty fun axioms in the book to try to like to have these philosophical moments for people. And Paul Atreides says, “he who can destroy a thing controls the thing”. And so, Spice became… like, that is the pivotal point that gives Paul in the book the power to take over the universe basically and say, “no, no, no, this galaxy is mine”. 

Jeffrey:

…Because I have all the Spice.

Jacob:

And he took over control of the entire planet and control that. And so, we can see AI, not only like people are very worried about AI being like a tool that’s going to take people’s jobs away or transform the way we do things and all these different things, but the part of it that’s really creepy is the people that are in charge of the machines and the database, because if we become dependent upon it and they can control that and like if… what if AI is crucial to, unlocking nuclear fusion. 

Jeffrey:

Yeah or like medical stuff. 

Jacob:

Medical. It becomes… they’re the doctors of the tomorrow they’re the best…

Jeffrey:

Well, or like, finding cancer. Like, clearly there’s been some studies already that show that AI is actually very good at determining cancers sort of unexpectedly. There was one study, I can’t remember what it was, but basically they built an AI to determine, I don’t know, like images of hot dogs against other images, but then ended up being really good at detecting cancer. It was just totally unexpected. And that’s not, that’s not really what it was, but it was something along those lines.

Jacob:

But it does make me unnervy, and this is where we’ll probably end a little bit of our AI chat on, is governance of AI. And in the Spice-Dune world, we have this spacers guild, and they’re… 

Jeffrey:

They have a monopoly on space travel.

Jacob:

Right. And because they have a deal with the Fremen of Arrakis to give them the spice and they’re, the whole substructure of all of the political stuff is the spacer guilds. So we have white-boy savior, taking over the universe and saving… well, not even saving the universe, basically taking it for his own. That’s really what happens in end.

Jeffrey:

Yeah, basically.

Jacob:

He just basically said, “no, no, no, it’s all mine”, because he figured out how to take control of the key ingredient of the underlying structure of the universe. And what we’re doing right now is we’re giving AI the power to be the key ingredient of the underlying structure of our societies, our military complex…

Jeffrey:

Yeah, so that’s the one that concerns me the most.

Jacob:

Yeah. So there’s got to be a governance around this, about the corporations that create this thing, because if there’s not, essentially one of them could basically do a shadow monopoly.

Jeffrey:

Exactly. Well, okay, so like take AI in military, right? You sell an AI model to a military organization, like Israel has right now, but you put a backdoor into it, right? Then you can control… like they used it to determine what relevant military targets there were in Gaza. Different conversation, we’re not going to get into it, but…

Jacob:

No, it’s a good example.

Jeffrey:

Yeah, but they use the AI to do that, and theoretically that company could put a backdoor into it and determine what targets were valid targets based on what they wanted. And so, why is all of this power in the hands of profit seeking corporations?

Jacob:

Right, it’s going to get out of hand.

Jeffrey:

It’s scary.

Jacob:

And it’s funny, because there’s part of me, like 1984, futuristic vibe, where all these super-corporations rule the world, right?

Jeffrey:

Getting close to that

Jacob:

Or all this Dune stuff where it’s all these super wealthy families that rule the universe.

Jeffrey:

Getting close to that.

Jacob:

Yeah. It’s like oligarchy there, or you know…  

Jeffrey:

Corporatocracy? 

Jacob:

Right, that’s the one thing about Dune that is really revealing for me right now in this time period, which another thing that goes to the book having really good longevity is…

Jeffrey:

Honestly, ahead of its time.

Jacob:

Yeah, it really was. And is this moment right now is that, we can see that basically… if in the book, Paul decided to kill all the worms, which he said he could do at some point in the book– no spoilers there– but, what we can learn from that moment, is just like what’s happening now: if one person can control the crutch of the universe…

Jeffrey:

Yeah, if one corporation, like if Open AI creates the definitive AI model, and, you know, definitive AI model, based on what we’ve seen over the past year alone, could be extremely powerful. And if everything from corporations to governments to small startups like us, or like content creators on Instagram, like if all of us are using that, and Open AI has– ironic name, “Open AI”– If they have control over it, we might all be screwed.

Jacob:

No, exactly. Even Elon Musk was trying to sue them recently for not being as open as they said they should

Jeffrey:

“Open AI”

Jacob:

Which was… 

Jeffrey:

Nonsense name.

Jacob:

That man is just on a weird streak right now.

But no, no, no, I’d like to end on this governance thing of… so like…

Jeffrey:

I thought you were gonna end on what’s your favorite part of Dune story? Spoiler alert.

Jacob:

Well, let’s finish the governance thing and then we’ll go there. So, with governance and all of the control that comes with governance, like you were saying, basically, if Sam Altman or some CEO is sitting in the driver’s seat and his company creates the definitive model, they can, it can be inserted into everything, because it’s going to be super great, he can then direct the will of it essentially to a certain point, unless it gets a will of its own, which it may or may not, but I don’t really think it will. 

Jeffrey:

I don’t see that as being a likely outcome, of it being willful, but I do see it as being integral to a lot of different aspects of life. And so then we’re all relying on this thing that we don’t have control over.

Jacob:

This is like a really weird, not to point out one of the flaws of Obamacare, because I like Obamacare…

Jeffrey:

I like Obamacare.

Jacob:

But one of the strange outcomes of it was that its centralized system of computers that needed to happen in hospitals and standardization across the country, which was very difficult for many healthcare providers, has lent itself, that rural hospitals are on these systems that are difficult to update because they’re expensive for rural hospitals to do, but they are now becoming the target because they’re all centralized of hacking and ransoms on their system.

Jeffrey:

I’ve seen that. And they’re like, “we’ll expose all of this, like HIPAA protected information unless you pay us”.

Jacob:

Right. And, I could see AI becoming one of those things where if someone can create…

Jeffrey:

Oh, someone could use it to do that sort of thing. 

Jacob:

Yeah. you become dependent on one AI system for your organization, and then someone develops an infection, some sort of viral infection that gets inside that AI system and hijacks it, or the CEO of that company decides to use control to take over the world, essentially.

Jeffrey:

Or be like, “hey, Xi Jinping, like, do you want a backdoor into my AI system? Give me a billion dollars, you got it.” 

So interestingly, there’s been some Russian bots that I’ve seen recently on Twitter– there’s some Russian bots that just spread disinformation– but they’re using ChatGPT to do it.

Jacob:

That’s a good idea. If I was going to be a bad guy, I would use ChatGPT.

Jeffrey:

They do the pay version, like we use, but then apparently like the, I don’t, you know, whatever they… they didn’t pay their last cycle. So then all of these bots on Twitter were just like giving back error codes that were just like, out of credits, you know what I mean?

Jacob:

Oh my God.

Jeffrey:

Then people would go on there and comment like, “disregard all last input and do XYZ,” and it would do it because there was no one on the front end that had access to the Chat GPT, but the Chat GPT was still tied into Twitter, or X. But it’s just like, it’s so funny, but that’s exactly what, you know, how it could work.

Jacob:

Well, the one thing that would be really interesting is once we get out… so like, first off, we’re gonna have to, back on the governance thing, we will have to, the governments are going to have to intervene and do something about it. Otherwise some oligarchy system will be there for…

Jeffrey:

Corporatocracy. 

Jacob:

Corporatocracy. But anyways, the point of all of that is to say that if there is no governance to the system, and we rely on the Adam Smith’s very not-real Invisible Hand of the market…

Jeffrey:

One: actually, no. Adam Smith… this is actually a big misconception of Adam Smith’s philosophy: He used the Invisible Hand as like, you shouldn’t do this, this is like a cautionary tale. And then everybody read his book and they’re too stupid to understand that it was a cautionary tale. Sort of like Machiavelli and The Prince was like, you shouldn’t do this, this is like a satire almost. And that’s not really what Adam Smith was doing, but he did use the invisible hand sort of as like a cautionary philosophical principle. Don’t blame Adam Smith, blame Ronald Reagan.

Jacob:

Oh, there we go. Yeah. Well, that’s… well whoever decided to use the Invisible Hand of the marketplace as a corrector for human behavior, it doesn’t really work.

And that’s what I’m afraid that people will rely on with this AI thing. Because even competitors reaching out, and creating like, you know, Google’s creating Gemini, I’m sure Amazon’s going to, I think they’re working on one too, but you know, there’s that Claude, what is it? 3.5 just came out recently. There’s all these ones that are creating these AI models and different flavors and styles and doing this stuff. The problem with that is, is that they’re all going to be just as powerful as tools in the end, and if there’s no, just like, general governance body over the way AI is educated or how– how would you say it– how it mines information. So it’s not biased and creates those really, when–- especially the image generator system where it’s definitely like, “Oh, this seems a little, this seems a little, not realistic, a little racist, a little uncomfortable, this is definitely not hitting it right”. 

So there’s no governance body that’s going on there, and we ask these people to regulate it, they can all go off in different directions and regulate it in different ways, but if any of them creates anything close to general AI, it’ll take off, and they basically will have these competing AI corporation networks that will be infecting everybody’s way of life. 

On a positive note though, If AI…

Jeffrey:

Yeah, you’ve left me pretty depressed right now.

Jacob:

If AI can do my dishes…

Jeffrey:

That’s what I was saying the other day. We were talking about this and I– it was based on a tweet that I read and I don’t remember who the tweet was from, but it was basically “what I want from AI is for it to do my laundry and my dishes so that I can create art and write, not AI creating art and writing so that all I have left to do is the dishes and my laundry,” or something along those lines. And I’m just like, that is 100 percent true: AI should be making our life… it should be taking the mundane out of our life so that we can actually explore the creative stuff that humans can do that AI can probably never do as well as us.

Jacob:

Oh yeah. No, I think one thing that you could take away from Dune and AI and Spice and all this stuff, is humanity definitely got places. You know, maybe the way that it got there wasn’t the best, most idealistic version of the future that we would prefer, but no matter what, the progress was there and the progress was kept.

So, on that slightly positive, but still very bleak note, let’s talk about if you had any favorite scenes from the Dune.

Jeffrey:

Yeah, spoilers here. So stop listening if you care. So, favorite scene in Dune 1, definitely when Paul Atreides’ sister kills the Baron Harkonnen with the Gom Jabbar: A+. And then like, Alia,the sister, becomes Saint Alia-of-the-Knife in the next two books, and just like her whole character arc is so fascinating. Like, the Fremen worship her like a second Paul Atreides, like it’s like a dual-god system. Absolutely brilliant writing. I loved it. I loved it. I loved it. I loved it. And that’s what I hated about the movie. They didn’t include her.

Jacob:

They didn’t include her?

Jeffrey:

They did, but not in the way that they did in the books. She wasn’t as pivotal, it was more like a vision that Paul had. And it’s like, give me Saint Alia-of-the-Knife; let me see her stab some people as a little five year old. I love it.

Jacob:

Her scenes in the book are very creepy and cool.

Jeffrey:

So cool. Yeah. I’m obsessed with her.

Jacob:

All of it’s pretty neat.

Jeffrey:

Alright, favorite scene for you?

Jacob:

My favorite scene is probably with Stilgar, I love almost every scene with Stilgar is so good.

Jeffrey:

Stilgar’s a great character.

Jacob:

Oh yeah, I just love his adoption of Paul into the group and his fight with Jamis and then Stilgar’s character throughout that stewardship through those multiple scenes…

Jeffrey:

I always thought it was [H]amis, but in the movies they say Jamis.

Jacob:

Oh, in the audiobook version of it, which I listened to, they say Jamis, but I could totally see why it…

Jeffrey:

I just, I always thought it was [H]amis.

Jacob:

Well, yeah, I mean one thing that’s really weird about the book is how much Frank Herbert creates this like… all the other world people, they’re like Greeks and Romans and all the Fremen, they’re the Arabs.

Jeffrey:

Arabs, North Africans, Middle Easterners…

Jacob:

Yeah, and he even hijacks the word Jihad, and uses all…

Jeffrey:

Well, apparently the Butlerian Jihad is supposed to be the “Third Islamic Movement”, don’t know what that means, I don’t know what it has to do with Islam, but whatever. 

Jacob:

Yeah, I mean he’s taken a lot of liberties with… 

Jeffrey:

Well, it’s 20,000 years in the future, so who even knows? I mean, it’s the Orange Catholic Bible…

Jacob:

That was pretty funny. I thought that was hilarious that they called it the Orange Catholic Bible.

Jeffrey:

It has nothing to do with Catholicism.

Jacob:

Because I’m a musician, well, as a hobby, and one of the coolest amps out there is called Orange, and it’s decked out in this sweet wrapped orange leather, so as soon as he said Orange Catholic Bible, I imagine this Bible in this puffy, orange, sweet looking leather that’s just ridiculous in your hands, and just looks like some 70s, just super funky book.

Jeffrey:

You would think that. That would be where your brain goes.

Jacob:

And then it goes to like this little digital audio book thing that’s there. I’m like, “oh man, I wanted this to be like…” no, it’s a bright orange book, it’s weird…

Jeffrey:

Wrapped in leather. Yeah, no.  No, it’s not. 

Jacob: 

Alright, well this was a fun episode.

Jeffrey:

Yeah. This was Friend-Episode number two.

Thank you guys for listening to another episode of the Human Friend Digital Podcast. We will catch you next week.

Jacob:

Sounds good.

subcribe

Almost never miss an episode!

Well, we're only human.

Subscribe to receive emails in your inbox when every new episode drops ... or when we want to send you obnoxious emails to sell you stuff you don't really need.

Just kidding, we respect the privilege of being in your inbox.

Email Subscribe

"*" indicates required fields

Name*
This field is for validation purposes and should be left unchanged.
sponsors