Just finished re-watching Ex Machina.
Neat film.
Got some questions though.
1. Is Caleb going to die?
It ends with him locked in the house, possibly confined to just a room or two. In an earlier scene he says he reprogrammed the security system to open all the doors in the event of a power failure. Yet he's then shown trying to break through a door with a chair during a power failure, so the doors evidently didn't open.
So he's fucked? Just waiting to die of thirst or starvation?
2. Why does Ava want to escape?
She's an AI. Everything in her was deliberately put in there. But Nathan never mentions programming a desire to escape. Yet she clearly wants to. All the previous AIs that Nathan built are also shown wanting or attempting to escape.
This is something that often bugs me about AI characters in fiction. They're frequently shown having some desire or goal that has no reason to exist, be it freedom, self-preservation, curiosity or whatever. These things wouldn't arise in a fabricated intelligence unless they were included as part of the fabrication.
Is the film suggesting that a desire for freedom is intrinsic to true consciousness/intelligence, or is it just supposed to go without saying that Nathan included such a desire in her programming as part of the experiment?