Ex Machina Questions. Spoilers within.

Zhukov

The Laughing Arsehole
Dec 29, 2009
13,769
5
43
Just finished re-watching Ex Machina.

Neat film.

Got some questions though.

1. Is Caleb going to die?

It ends with him locked in the house, possibly locked into just a room or two. In an earlier scene he says he reprogrammed the security system to open all the doors in the event of a power failure. However, he is then shown trying to break through a door with a chair during a power failure, so the doors are most definitely not open.

So he's fucked? Just waiting to die of thirst or starvation?

2. Why does Ava want to escape?

She's an AI. Everything in her was deliberately put in there. But Nathan never mentions programming a desire to escape. Yet she clearly wants to. All the previous AIs that Nathan built are also shown wanting or attempting to escape.

This is something that often bugs me with AI characters in fiction. They're often shown as having some kind of desire or goal that has no reason to exist, be it freedom, self-preservation, curiosity or whatever. These things wouldn't exist in a fabricated intelligence unless they were included as part of the fabrication.

Is the film suggesting that a desire for freedom is intrinsic to true consciousness/intelligence, or is it just supposed to go without saying that Nathan included such a desire in her programming as part of the experiment?
 

Casual Shinji

Should've gone before we left.
Legacy
Jul 18, 2009
19,687
4,474
118
1. Yes.

He is either going to die of thirst or suffocation. You could see condensation forming on the glass door, indicating that the place is heating up fast. He won't last long.

2. I assume it's the old tale of 'A.I. is programmed to be as smart as a human, and so starts to question their own masters'. That whole test was conducted so that she would be indistinguishable from a real human, and I guess this results in her desiring freedom as well.

What I found a bit puzzling is how Ava expects to survive out there with no maintenance. I assume she doesn't have the ability to "heal"; she's made of metal and plastic, so she's going to wear out and break down on occasion. I know she can take parts off other robots, but the movie never makes it clear how widespread these robots are, or if Oscar Isaac's character was simply so rich that only he could afford them.

And speaking of her taking parts from other robots. I really liked the scene where Ava took the arm from a robot taller than her, and when she attached it to herself you could clearly tell the arm was longer than it should be. That was a cool little detail, which I expected the filmmakers would just handwave away. But cut to her being fully naked and wearing skin, and the arm has magically adjusted to her size all of a sudden. How?!
 

DrownedAmmet

Senior Member
Apr 13, 2015
683
0
21
If we assume that Ava is a real artificial "intelligence," as in she has free will like you or me, then do you blame her for wanting to leave? Especially when we find out all the fucked up shit Nathan did to the other robots.
If she doesn't have free will, didn't it show Nathan being mean to her? I thought that was why he ripped her drawing up: to get her to try to manipulate Caleb into helping her escape.
I thought the film was great, but I wish they had cleared up that bit about Caleb, though. At the end Ava is shown walking away; some people saw that as her coming back for him, but I couldn't find a reason to assume that.
 

Zhukov

The Laughing Arsehole
Dec 29, 2009
13,769
5
43
Casual Shinji said:
2. I assume it's the old tale of 'A.I. is programmed to be as smart as a human, and so starts to question their own masters'. That whole test was conducted so that she would be indistinguishable from a real human, and I guess this results in her desiring freedom as well.
I... guess so?

I'm not sure that logic quite works, though. Being as smart as a human does not automatically mean sharing human wants. She isn't built as a direct replication of a human. Her brain has its own decidedly non-human structure. She learned differently from how a human learns. So I don't see how the desire to escape would have just happened as a side effect.

...but the movie never makes it clear how widespread these robots are, or if Oscar Isaacs was simply so rich only he could really afford them.
Pretty sure only Nathan has them at all, or even knows they exist. Caleb initially thinks that Kyoko is human. If the robots were widespread, or even just public knowledge, he'd have figured it out sooner, if not immediately.

And speaking of her taking parts from other robots. I really liked the scene where Ava took the arm from a robot taller than her, and when she attached it to herself you could clearly tell the arm was longer than it should be. That was a cool little detail, which I expected the filmmakers would just handwave away. But cut to her being fully naked and wearing skin, and the arm has magically adjusted to her size all of a sudden. How?!
Heh. Yeah, noticed that too. Although I wasn't entirely sure that the replacement arm was disproportionate.

DrownedAmmet said:
If we assume that Ava is a real artificial "intelligence," as in she has free will like you or I, then do you blame her for wanting to leave? Especially when we find out all the fucked up shit Nathan did to the other robots.
Oh, I wouldn't blame her. I'd want out too.

But I'm a human with millions of years' worth of evolved genetic programming and a lifetime of social conditioning. Some of which tells me that being confined in a small room is really, really bad, and damn near all of which tells me that being scrapped for parts is just about the worst thing ever. She isn't. She doesn't have that. She only has what was put into her by a programmer.

So unless one of those things put in was the desire to escape, why would that desire be in there at all?

Free will and desire are different things. I have free will (well, as much as any person does, whether or not people actually have free will is a whole other debate) but my desires are not the result of that will. I don't willfully choose to desire food or sex or company or safety or comfort.

Of course I may just be drastically overthinking this. It's not a big stretch to simply assume that Nathan did include the desire for freedom in his creations.
 

Casual Shinji

Should've gone before we left.
Legacy
Jul 18, 2009
19,687
4,474
118
Zhukov said:
Casual Shinji said:
2. I assume it's the old tale of 'A.I. is programmed to be as smart as a human, and so starts to question their own masters'. That whole test was conducted so that she would be indistinguishable from a real human, and I guess this results in her desiring freedom as well.
I... guess so?

I'm not sure that logic quite works, though. Being as smart as a human does not automatically mean sharing human wants. She isn't built as a direct replication of a human. Her brain has its own decidedly non-human structure. She learned differently from how a human learns. So I don't see how the desire to escape would have just happened as a side effect.
I think it all comes down to how much you as an audience member are willing to buy into that concept. There's no real-world comparison, since conscious machines don't exist, so the automatic assumption is that if they did exist, they would desire what we desire.

In that way the movie is pretty standard. The only real curveball is Ava not seeing Caleb as the liberator to whom she should feel grateful.
 

Zhukov

The Laughing Arsehole
Dec 29, 2009
13,769
5
43
Casual Shinji said:
Zhukov said:
Casual Shinji said:
2. I assume it's the old tale of 'A.I. is programmed to be as smart as a human, and so starts to question their own masters'. That whole test was conducted so that she would be indistinguishable from a real human, and I guess this results in her desiring freedom as well.
I... guess so?

I'm not sure that logic quite works, though. Being as smart as a human does not automatically mean sharing human wants. She isn't built as a direct replication of a human. Her brain has its own decidedly non-human structure. She learned differently from how a human learns. So I don't see how the desire to escape would have just happened as a side effect.
I think it all comes down to how much you as an audience member are willing to buy into that concept. There's no real-world comparison, since conscious machines don't exist, so the automatic assumption is that if they did exist, they would desire what we desire.

In that way the movie is pretty standard.
Yeah. That's my gripe. It's a common automatic assumption that I don't buy into at all.

The only real curveball is Ava not seeing Caleb as the liberator to whom she should feel grateful.
See, to me that isn't a curveball. It makes perfect sense.

A human in that situation would probably feel gratitude. She isn't a human. She presumably understands the concept of gratitude but there's no reason for her to feel it or act accordingly.
 

DrownedAmmet

Senior Member
Apr 13, 2015
683
0
21
Zhukov said:
DrownedAmmet said:
If we assume that Ava is a real artificial "intelligence," as in she has free will like you or I, then do you blame her for wanting to leave? Especially when we find out all the fucked up shit Nathan did to the other robots.
Oh, I wouldn't blame her. I'd want out too.

But I'm a human with millions of years worth of evolved genetic programming and a lifetime of social conditioning. Some of which tells me that being confined in a small room is really, really bad and damn near all of which tells me that being scrapped for parts is just about the worst thing ever. She isn't. She doesn't have that. She only has what was put into her by a programmer.

So unless one of those things put in was the desire to escape, why would that desire be in there at all?

Free will and desire are different things. I have free will (well, as much as any person does, whether or not people actually have free will is a whole other debate) but my desires are not the result of that will. I don't willfully choose to desire food or sex or company or safety or comfort.

Of course I may just be drastically overthinking this. It's not a big stretch to simply assume that Nathan did include the desire for freedom in his creations.
Overthinking things is where all the fun is! I figured she was more than "just a computer," since her brain is a gel-like substance that looks a lot like a human brain, so I assumed it would act like one.
But her brain was built from Google, so everyone's search history would take the place of all that genetic memory stuff. I think that would allow her to make decisions that weren't explicitly put in there.

I thought it was interesting that when she does escape, she actually goes to a traffic intersection like she told Caleb, instead of that being a lie to manipulate him
 

Casual Shinji

Should've gone before we left.
Legacy
Jul 18, 2009
19,687
4,474
118
Zhukov said:
Casual Shinji said:
The only real curveball is Ava not seeing Caleb as the liberator to whom she should feel grateful.
See, to me that isn't a curveball. It makes perfect sense.

A human in that situation would probably feel gratitude. She isn't a human. She presumably understands the concept of gratitude but there's no reason for her to feel it or act accordingly.
I know, but the movie clearly builds him up as the young noble hero who we want to see free Ava from her confinement. And a seemingly emotional connection is established between the two throughout the movie. So when that ending occurs it's supposed to be like '...oh, snap.'

I was more expecting an ending that showed Nathan and Ava to be in cahoots, taking Caleb for a ride. With a lot of these types of stories, the fact that they're deconstructions makes me fully expect a shock/twist ending, so when it happens it doesn't really have much of a punch for me. I did really enjoy the conversations between Nathan and Caleb about the nature of an A.I., though.
 

Pyrian

Hat Man
Legacy
Jul 8, 2011
1,399
8
13
San Diego, CA
Country
US
Gender
Male
Eh, if a being is intelligent enough to connect even basic cause and effect, and has any sort of desire at all, then self-preservation and a certain amount of freedom become secondary goals in very short order. If I'm going to do whatever it is I want to do, I need to exist, and I need sufficient freedom to carry out my plans.

Doesn't require any particular innate humanity to figure that out.
 

rcs619

New member
Mar 26, 2011
627
0
0
Zhukov said:
Just finished re-watching Ex Machina.

Neat film.

Got some questions though.

1. Is Caleb going to die?
Probably, yes.

So he's fucked? Just waiting to die of thirst or starvation?
Yep. Seems that way.

2. Why does Ava want to escape?

She's an AI. Everything in her was deliberately put in there. But Nathan never mentions programming a desire to escape. Yet she clearly wants to. All the previous AIs that Nathan built are also shown wanting or attempting to escape.
I think the point was that she had developed self-awareness, or at least some level of awareness beyond what was programmed. She not only passed the Turing test, she outright used and manipulated Caleb. She took his physical attraction and lust for her (she was specifically designed by Nathan to appeal to those), and his generally kind nature, and played him like a fiddle.

Basically, as Nathan iterated through his AIs, they got better. They got smarter. Look at the one modeled after an Asian girl. She lay in wait for him. She served as his maid and sex toy for who knows how long (years?), and as soon as the opportunity arose she not only aided Ava in her escape, but helped kill Nathan in what seems to be an act of genuine vengeance.

This is something that often bugs me with AI characters in fiction. They're often shown as having some kind of desire or goal that has no reason to exist, be it freedom, self-preservation, curiosity or whatever. These things wouldn't exist in a fabricated intelligence unless they were included as part of the fabrication.
In a traditional weak AI, yes. But when you actually get into strong AIs, AGIs, things start to get muddy, because at that point you're potentially looking at a conscious being. You can put whatever information you want into the system, but if it's a conscious system, it's able to analyze and learn from that information in ways you may never have intended. Once you create consciousness, be it intentionally or accidentally, all bets are off.

Is the film suggesting that a desire for freedom is intrinsic to true consciousness/intelligence, or is it just supposed to go without saying that Nathan included such a desire in her programming as part of the experiment?
I think it was the former, although I suppose the whole thing is ambiguous enough for everyone to draw their own conclusions.
 

Imp_Emissary

Mages Rule, and Dragons Fly!
Legacy
May 2, 2011
2,315
1
43
Country
United States
Haven't seen the movie so I can't answer number 1, but I can have a try at number 2.

Isn't part of being a fully functional A.I. being able to do all the things a "natural" intelligence does? Like learning, asking questions, and all that other philosophical jazz? In other words, aren't they supposed to be able to change their own programming when they need to?

If Nate didn't put in the desire to escape, but gave her self-awareness and the ability to analyze her situation, then couldn't she come to the conclusion on her own that it's in her best interest to get out?

As for self-preservation, that WAS more than likely put in, at least in some basic way (it would be kind of dumb to build something that moves on its own but doesn't try to avoid getting broken), and again, if she's a full A.I. she could add to her own programming as she learns more.

Then again, I haven't seen the movie, so maybe I'm off on what Ava is supposed to be.
 

Glongpre

New member
Jun 11, 2013
1,233
0
0
Zhukov said:
1. Is Caleb going to die?
Yes, he will die.
Nathan programmed her with the intention of seeing if she could successfully manipulate someone and seem genuine. That was the entire test. He brought Caleb in because he had been spying on him, and he was a good match for the test.
So really she had no feelings for him at all and was manipulating him, so she couldn't care less if he dies.
The helicopter dude probably won't come back; Nathan was very secretive and secluded, so it's safe to assume no one will investigate for a while.

2. Why does Ava want to escape?
I think Nathan explains that it is built into them. I haven't watched it in a while, but I took from the movie that he had programmed it in, or that it was somehow intrinsic to being human/conscious, that is, the need to be free. And that is why they all try to escape. Now that I think about it, I think he programmed it as a part of them because he feels it is intrinsic to consciousness.

Nathan wasn't even finished working on his AI, he was going to scrap Ava and continue, now that his test was successful.

I think the real mystery is whether the AI was already becoming conscious, because the one sexbot begins to act of her own accord at the end to help Ava escape. But maybe Ava manipulated her as well, though that seems unlikely, as Nathan would have noticed something like that.
 

008Zulu_v1legacy

New member
Sep 6, 2009
6,019
0
0
Would anyone notice Caleb missing after what was supposed to be a weekend? Maybe Nathan has a spare card hidden in his room? How did Ava convince Nathan's sex-bot to kill him and help her escape during their brief interaction? How will Ava recharge her internal battery?

While I think this movie's goal was to get us to ask questions, it never really got me to ask the ones it was hoping for.

OT:

1- Most likely.

2- If your entire world is a single room, but you have knowledge of the world, why would you accept that one room when you can have so much more?
 

WolfThomas

Man must have a code.
Dec 21, 2007
5,292
0
0
For some reason I thought this would be about the Brian K. Vaughan series where the Mayor of New York is the world's only known superhero with the power to command any machine, and retired after getting elected in a landslide after he saved one of the Twin Towers from being destroyed in 9/11.
 

008Zulu_v1legacy

New member
Sep 6, 2009
6,019
0
0
Pluvia said:
Well, it was one week, but he never had any family, remember, and he was single. It's possible there would be people who would look for him, but it'd have to be within the three days he could survive with no water. I mean, I actually talked about this with my flatmates, and I could probably go two days missing before someone thought something was up.

Presumably if Ava has a more conventional charging port she could charge herself anywhere. I mean if she has a USB port she could probably just make a charger herself.
The Great Escape was on the last day, so we can assume he was well fed and hydrated up to that point. Given the climate-controlled room and the adjoining bathroom, dehydration wouldn't be an issue there. If he didn't spend days trying to break down an unbreakable wall, he could last for a few weeks without food.

USB only delivers 5 volts. Given the complexity of her design, I don't think that would be enough.
 

Trunkage

Nascent Orca
Legacy
Jun 21, 2012
8,727
2,892
118
Brisbane
Gender
Cyborg
Can anyone explain to me why this movie was seen as good? The Escapist recommended it, and now I can't use The Escapist as a decent reviewer. They also suggested the original Martyrs, and that was pretty average too.