Interview: 'Ex Machina' director Alex Garland on the inevitability, ethics of AI

"Ex Machina" director Alex Garland (Photo by Tim Mosenfelder/Tim Mosenfelder)

"Ex Machina" director Alex Garland (Photo by Tim Mosenfelder/Tim Mosenfelder)

After several weeks in limited release, the heady science fiction film "Ex Machina" opens in theaters nationwide April 24. The sci-fi drama focuses on a trio of characters examining where the boundary exists between an advanced artificial intelligence and genuine consciousness.

In short: Reclusive genius and tech CEO Nathan (Oscar Isaac) invites Caleb (Domhnall Gleeson), a young programmer, to participate in a groundbreaking experiment: test the human qualities of the advanced artificial intelligence Ava (Alicia Vikander).

I recently sat down with "Ex Machina" writer-director Alex Garland to discuss his directorial debut. He has made a career of writing interesting, probing scenarios ("28 Days Later," "Sunshine" and "Dredd") that analyze humanity and what humans are capable of doing.

Warning: Spoilers discussed during this interview

Q: How did you approach the physical design of Ava - both from a technical level and an aesthetic?

Garland: Initially, by figuring out what she didn't look like. Robots in films have a very long history - Maria, Robby, C-3PO, tons of them. I didn't want the audience, the first time they saw Ava, to think about another movie. Ideally, they'd be locked into the same kind of space that this young man is in - just reacting to her.

Then there were some other considerations. This is both an aesthetic and a thematic consideration, I would say: When you first see her, she has to unambiguously be a machine. You can't leave room in the story that this might actually be a girl dressed up as a machine. It has to be a machine, no question. The way to do that is a bit like the magician's trick of sawing someone in half: you demonstrate it by separating the top and the bottom, creating this gap in her midriff and in her arms and legs. Where there should be organs and a skeleton, there's instead a metal skeletal structure and some surrogate organs.

Then the thing was - and this was the breakthrough in terms of the design, from my point of view - this mesh that follows the exact contours of a female form. What it did was, the second you see her, she is a machine. And then, a few shots in, she steps out of the silhouette state into the light - and light just glances against her body, creates a sort of sense of an ephemeral form - a bit like a spider web catching the light in a certain way: it's invisible under some conditions and visible under others. So the second you've established her as a machine, you then start to pull her away from being a machine. That becomes the starting point of the trajectory that she goes on over the whole movie.

Q: Along those lines of femininity, how would the film be different if you flipped the genders?

Garland: It was a question I posed myself - partly because I've been writing for years and always pose that question, as a sort of thought experiment. It would have been completely wrong for me, on several levels. From my point of view, it would have genuinely been a misogynistic film. I'm aware some people have stated the film, in its current state, is misogynistic - I have heard that accusation leveled. I can certainly say that in intention, it is not. But if you flipped the genders, it would be quite easy to think through certain scenes and realize what statement you would be making about women at that point - and you would find them profoundly uncomfortable, in my opinion.

But there's another thing as well - the film is proposing a bunch of questions and having a discussion about various things. Some of them are about AI, some of them are about going through the established problems of mind and consciousness that have to do with humans. Some of it has to do with gender, such as where gender is established and also how genders interact with each other. Now, if there are things within this movie that relate to the world we live in - and you then flip them - the statements that, in theory or in intention, would be true would stop being true. So what's the point? Why bother making the argument? Those two things together become, for me, a compelling reason not to do it.

Q: It also really would change the sort of triumvirate dynamics between Caleb, Nathan and Ava.

Garland: It would have been a different one, but the question for me is "where are the big tech companies that are run primarily by one gender and that gender is women?" Where are they? Who runs the tech companies? Is there, in this culture that we live in - a very, very big question of "the objectification of men in their early 20s"? I find it very hard to find the objectification of men in their early 20s. But if you said, find examples of "the objectification of women in their early 20s," I could do a quick search and find you a few examples pretty sharpish.

Q: Nathan is very mysterious. He's guarded but then he's the most brutally honest character in the whole film.

Garland: We're quite big on what feel like "easy truths" that don't necessarily stand up to too much scrutiny. The idea was, I suppose, that Caleb - the young man - often sounds like he's speaking the truth or taking a reasonable position. But maybe the position doesn't actually stand up to too much scrutiny. Nathan, conversely, usually sounds like he's not speaking the truth - but if you look really hard at what he's saying, he's saying something that is uncomfortable, but actually quite accurate. That was sort of the underlying thing there. I'm sort of examining my own instinctive prejudices. I don't expect anyone else to be interested in that, but somewhere in the process, I'm a bit self-involved. To an extent, I got fixated on uncomfortable truths.

Q: Nathan refers to the "inevitability" of AI - how does that idea affect the judgment of the creator?

Garland: It's partly what disturbs him. He is saying stuff that might be difficult, but true - but he's also very damaged. He's quite a broken, damaged character, in many ways. The thing that lies behind that is an idea - we spend a lot of time feeling anxious about AIs; people like Stephen Hawking and Elon Musk are literally demonstrating that at the moment by making public statements of concern and alarm about them. The underlying question seems to be "should we or shouldn't we?" But the way humans are, if we can do something, we will. In which case, "should we, shouldn't we" is kind of a waste of time, and the right question to ask is "how do we deal with it when it happens?" In a way, you don't need to ask "how do we deal with it when it doesn't happen?" because if it doesn't happen, it doesn't happen - it's not an issue. The real thing to get your head around is how you actually deal with it being a reality. That's embedded within that Oppenheimer parallel - because Oppenheimer was clearly very, very conflicted about what he was doing. And he knew that there was - he quotes the Bhagavad Gita and he talked about "I am become Death" and all that kind of stuff - but it didn't stop him. It's that conflicted, gravity-like suck towards this thing - and also being consumed by and damaged by it at the same time.

Q: Could you argue that Oppenheimer, because of the circumstances of the world, was kind of compelled to help with the creation of the bomb?

Garland: You certainly could, but that would then presuppose that if we weren't in the Second World War, we wouldn't have developed that technology. What I will do is make you a bet that I never have to worry about you cashing in - because it's impossible to figure out - but I bet you we would have (developed the atomic bomb). It's just in our nature.

Cloning, for example. The ethical complexities around cloning are staggering. I sometimes wonder if the ethics surrounding the cloning debate aren't so complex that it is actually impossible to fully understand and get to the bottom of. In some respects, it's a bit like this AI thing: it doesn't matter in terms of "should we, shouldn't we" - we've done it. Dolly the Sheep exists - or existed for a while. There are some technical problems with regard to humans - is somebody going to do that at some point? Again, I'd take any bet you wanted to give me - you could just keep stacking it up: you could go $10 million, $20 million, $50 million or $100 million - obviously I can't pay you if I lose, but I would take it.

Q: Historically in film, AI used to be the monster - HAL, Skynet, etc. - but recent movies like "Her" or "Interstellar" present AI that is more human. How is that a reflection of our current attitude toward AI?

Garland: I guess it contains a sort of ambivalence that comes from the fact that our lives are just bound up these days with machines that aren't sentient, but feel like they're getting closer and closer. And certainly in something like (that iPad), there are very clearly the trappings of a relationship that humans get with these things, and an odd power imbalance. I'm going to assume you're a layman like myself - I have no idea how that works. I don't know how my finger interacts with that screen. I can start making some guesses about electromagnetism or something - but I would be talking bullshit because I don't know. And I don't know how the software inside it works either, and I don't know how the search engine systems work, which are so good at predicting things. And it seems to know quite a lot about me - knows when I want to shop, points me to the right news articles and so forth. Our relationship with these machines has changed, and I think we feel slightly more trans-humanist - they're sort of part of us, part of our psyche, and eventually will be a sort of physically incorporated thing as well, I'm quite sure.

Q: Assuming then, that AI is inevitable, do you think - I'm going to butcher the quote - but Ava says to Nathan, "How awful to be hated by your own creation?" Do you think it's inevitable that a conscious AI is going to view us as an impediment?

Garland: You'd have to ask a conscious AI. A lot of the stuff in this film is an attempt not to frame these things in the traditional, quasi-religious creationist terms of "don't mess with the work of God" by creating new consciousness. To me, that moment was like an adolescent talking to their parent - you don't need to create an AI to get someone saying that. It's that sort of "fuck you, dad." Teenagers say that to their parents - or at least many of them do. That's the subtext to a lot of the arguments and tensions that exist. That may just be involved in the thing of being "parent-child." The key, to me, is I see AI as children. The configuration that you sometimes get in those older films is that the AIs are like a parallel life force that we are then involved in some kind of race with. We fear they are smarter than us and they're overtaking while we're being left behind. But if you take them off the parallel tracks and put them on the same tracks, which is where children are, then the dynamic changes. We feel less bad about them moving away from us, because that's what we want of our children - they outlive us and hopefully have better lives. But they, on the return journey, looking back at us, may feel a degree of contempt or a need to move on.

Q: Given how the movie ends, how would you feel about Ava - her personality specifically - as an ambassador of AI?

Garland: Nobody's actually asked me that. ... When I say no one's asked me that, I mean at no point in the production did I have that thought. Some of these things you think about in the writing, and people pose them to you or whatever. Partly, it's because I'm slightly allergic to the idea of working on a sequel - when the story ends, it ends. I didn't really have any thought, except "Good luck, Ava." That's what I really felt, because I'm kind of on her side. If Ava ever really exists, we would cherish her - so I think she would probably be a good ambassador. And if she said, "Thanks for cherishing me, but I did just stab one guy to death and I trapped another one," we would go, "Ah, don't worry about it. You're alright, Ava - these things happen."

"Ex Machina" opens in theaters nationwide April 24 and is rated R for graphic nudity, language, sexual references and some violence.
