Alex Garland is the man behind some of the greatest science fiction films of the last few decades. His first foray into film was adapting his novel “The Beach” into a screenplay for Danny Boyle. Now he’s sitting in the director’s chair and continuing to pen his own stories, this time venturing into artificial intelligence. Garland spoke with The Cougar by phone last week to discuss his film “Ex Machina,” artificial intelligence, and what is revealed about humanity when we interact with A.I.
The Cougar: Thank you for doing this. If I were still in Seattle, I would be there in person.
Alex Garland: No worries, man. It’s a pleasure.
TC: First off, congratulations on the film. I caught it at SXSW…
AG: Ah, you were in Austin?
TC: Yes! Really loved it up there and I absolutely loved the film. I actually just saw it again a couple of days ago.
AG: I think, honestly, on a second viewing– if anyone is patient enough to do that… the characters that change the most are Ava and Nathan. The two of them are playing a game the whole way through and you don’t really get to see that, in some respects, until the end of the movie.
TC: This film does reveal a lot about the difference between humanity and artificial intelligence. Regarding Ava– I’ve had some people tell me that after watching the trailer, they asked if there was any sex in the movie. I thought that was such a weird first observation, but there is something enticing about Ava on some level.
AG: I do think that she’s seductive, that’s correct.
TC: We’ve seen in “Her” that you can find a certain type of love for an A.I. that you can’t see, and in Neill Blomkamp’s “Chappie,” you grow attached to this character who doesn’t know any better and you start feeling something for him.
AG: I haven’t seen “Chappie,” but I totally get the child-like innocence.
TC: What do you think of the concept of becoming emotionally attached, on some level, to something that we clearly know is artificial intelligence? What do you think that says about our emotion, love, and how we decide which forms of that are acceptable?
AG: It really doesn’t seem strange at all, if the machine has sentience. What humans react to is sentience in other things… Animals do have sentience. They have a much more limited intellect and don’t have language in the way that we do, but if you show a dog its reflection in a mirror, it knows that it’s looking at itself… I would have to assume that the same would be, at the minimum, true with a sentient machine. In fact, you’d probably get an equal relationship to one that you could have with a human, because of the extra sophistication with things like language and humor.
TC: Certainly… If you look deep enough into this film and what it means to program something for your own use or to behave the way you think it should, you can look at our society just in the last 100 years and see how we’ve programmed certain groups of people to behave the way that we believe is socially acceptable.
AG: Yes. I actually have to say that I agree with that. I increasingly feel out-of-step with the world, I think, or at least out-of-step with the world I live in… (With this film), what I do hope for is that it will cause (the viewers) to pose some interesting questions. That they approach it as an idea that could provoke a conversation, maybe between them and their friends. Maybe even just with themselves after they get home after watching the movie. I guess I don’t expect to be able to prescribe the answers to the film, but I’d like to prescribe the questions.
Editor’s note: This interview has been edited for clarity and length.