Wright: Encyclopedias for a consistent understanding of the parameters of the place, so we’re all on the same page. Certain rules we had to understand. But again, building it from the ground floor and having this wonderful, surreal, poetic writing to work with — I took a different tack with it. I didn’t go up here and hang out with the guys. What was more interesting to me were the levels of this question of consciousness, in terms of replicating human behavior and human-ness.
What I found to be the kind of mirror reflection was what we go through as actors. That became an interesting meditation. As we sit and either manipulate or retrieve memories that evoke certain ideas or certain emotions, likewise it’s the same thing that we as creators are trying to embody, if you will, in the host, in the creations. Playing with that reflection was the more interesting thing to me.
Question: So you’re saying you didn’t come up here and hang out with tech employees and do a lot of reading about the themes.
Wright: This isn’t an anthropological study. We don’t specifically say what time period we’re in. But no. Particularly, again, we’re creating this anew. It’s fiction. I was more interested in just the idea of imagining, as opposed to referencing. But still with a level of authenticity.
Newton: Do you think it feels like a real projection into the future, how it would be? Does it feel — not, is it really going to be like this, but do you think we got it right?
Question: I’m almost on the fourth episode, and I’m torn between a bunch of questions. One is, is there someone who’s mysteriously actually not a robot, or is a robot? It makes me think about how real A.I. can become in the future. And then the other thing is, what is everyone’s intention? You’re freaking me out. I’m not totally sure about where you’re coming from. Same with everyone else. You’ll get that gleam in your eye. [laughter] But then I think, maybe A.I., or even people….
Question: It’s like she says in episode two. If you can’t tell the difference, what’s it matter?
Newton: Part of what’s so important are boundaries, incredibly important. We’re all talking about boundaries, about divisions in nations and all sorts of things. That’s what’s crucial to Westworld, that what happens in Westworld stays in Westworld. You literally go through a door, get on a train, and transition into a place. What’s going to be fascinating is when those boundaries become messy, which you already see in these breakthroughs of dreams.
Wood: There’s a bigger picture of what you would want to be using A.I. for. It’s mentioned in the pilot. What is the real interest here, and in the hosts? When you think about all the possibilities and how they could be manipulated and what you could use them for, good or evil, there are endless possibilities.
Question: It seems like one of the questions you’re grappling with is the morality of — to use the Grand Theft Auto example, at what point does it turn over? It seems like the show takes a position that to treat these people in this way, with so much violence and sexual violence, is an immoral act. Those are the black hats.
Newton: To lie to them, to make them feel that they’re human when they’re not. That’s the biggest betrayal. It’s been so incredible in the last few days, talking to people and having genuine conversations, as opposed to just, “How is it playing this role?” The contribution of the viewer is critical to it. I feel like we’re both the same in how we’re appreciating and relating to this material. The conversations that can be had as a result are really of value to us. We’re not just talking about the show. We’re talking about real life, real people, real values, real boundaries.
Question: I’m curious about that. Watching the show, at the point where you wonder why this black-hat guy, the bachelor party guy — why is he so vile? And then you realize that this is what you’d do in a game like Grand Theft Auto.
Wood: This is what people do in real life. It happens every day. It’s happening right now. It needs to be explored, and that’s why it’s amazing to see from an objective standpoint, from something that’s like a human being, but not. What would we look like to other beings? What do we look like? We’re not making this up. This is stuff that’s very much a part of the world. All the good and the beautiful love stories we’ll explore in the show, the people that choose to be heroes, but also the people that choose, whether in Westworld or not, to be sadistic and vile.
Question: But we don’t put that in a theme park.
Newton: Don’t we? Wouldn’t you say sex tourism is like a fucking theme park? If you have enough money — just because it’s not above board, just because people don’t talk about it, then apparently it isn’t there?
Question: In terms of theme parks, though, and especially how the world has moved since the original Westworld, it’s more about — I don’t want to say infantilization, but a sort of Disneyfication of everything. Everything is Disney now. You have Harry Potter and Universal and so on.
Lisa Joy: It’s an interesting thing, to mention Disney in relation to this. We thought about Disney a lot. When we started thinking about Dolores’s character, in some ways she’s the quintessential Disney princess. But with a twist. [laughs] For us, and I know you guys all know this feeling — we just had a daughter. You work so hard to try to find books and movies that won’t traumatize the hell out of her. Something where the exciting element isn’t literally death. “Oh my God, I just saw my parents die in front of me!” And it’s the start of a movie about a deer.
There’s a certain amount of trauma and violence that seems to be coded into iconic fiction, in big stories throughout time. The question becomes, why? There are different answers to it. Some of them are cynical and some of them are idealistic. The cynical answer is that there’s a part of us that’s sick and twisted at heart, that wants to see that. Some kind of wish fulfillment. For some people, perhaps that’s the case.
But there’s another side to it, in which fables and stories are cautionary tales. We’re not robots, but there are ways of running simulations around scenarios that we haven’t personally experienced yet. We want to gauge what the proper reaction would be. In doing so, it helps us model worst-case scenarios, best-case scenarios, everything in between. I don’t think it’s an accident, sometimes, that children’s stories are so violent and so dark. It comes from parents preparing their children with a story that won’t hurt them – they can say it’s just pretend – but some kernel of truth from that becomes coded into their psyche and serves as a platform for learning.
Jonah Nolan: Stories are our oldest form of simulation. Storytelling is something that the show is interested in. It’s something we’re interested in. It appears to distinguish human consciousness from other animals that we share this planet with, our ability to tell a good fucking story.
Newton: And code information within that story, so it’s memorable.
Wood: Look at the Bible, all the violence and craziness. That makes Westworld look tame, you know?
Newton: Fear is a great way of remembering something. When your kid gets lost in the supermarket and you find them and immediately you shout, “WHAT HAVE YOU BEEN DOING?” Because you’re terrified. That fear will impact them and make them remember in a way that, “Hey, is everything cool? You just got lost for a second, did you?” doesn’t. It’s not a particularly clever response, but it works. It can be misused. Again, the awareness factor is key.
Wright: Another angle, another facet, is this idea of story as consciousness, which we play with. Whether consciousness exists or not, it certainly exists for us through story, as a collective consciousness. The history of literature, and whatever sources you consider our collective consciousness — it’s something I hadn’t considered until we began telling this story.
Jonah Nolan: We’ve figured it out. [laughs]
Wright: As I say, this is why we rely on Jonah and Lisa to handle all the existential questions.