Westworld showrunners on AI's threat and promise: 'This is imminent'

Westworld quickly hurls a heady number of near-future quandaries at viewers. Like:
What is consciousness? Does an artificial intelligence have the same rights as a person? Is AI a threat? How will it change us?
Below, married showrunners Jonathan Nolan and Lisa Joy took some of our questions about their provocative technology-based morality play. Don’t worry, there are no spoilers. We’ll also continue this conversation after the show’s debut Sunday night where we’ll get into some specifics about the first episode.
ENTERTAINMENT WEEKLY: I’m going to start with a hard question that goes right to the heart of your show: If we create something that thinks and feels, what responsibility do we have toward it?
LISA JOY: That’s a great question. When we were creating this show and this pilot, we actually created organic life — we had our first child. So we both thought a lot about the responsibilities of creating something. I see parallels between the two. When you’re responsible for bringing something into this world, they absorb some of the things you give them — the good and the bad. Our child watches what we do — how we interact with each other, whether we treat each other with respect. With an AI, it’s the same thing. Computers have this kind of recursive system where they can learn more and more things from the Internet. But if you look at the Internet, it’s a lot of cats, pornography, and gossip. A machine is what you put into it. So I think the obligation that we have is to put in good, as well as bad. I think also, like children, computers can get a mind of their own. They should become better than us, and in some ways transcend us. [A computer] could leapfrog human intelligence. So our responsibility, as a creator who might be outpaced by our creation, is to teach them a code and parameters and values and hope that those values can guide their future choices. That’s practically happening right now in the discussions in Silicon Valley about what you do with AI technology and what kind of fail-safes you implant.
You must have seen the stories about Microsoft’s AI Twitter bot that became a racist sexbot within 24 hours. It reflected the culture it was raised in.
JONATHAN NOLAN: I’ve been dealing with AI on different projects for many years now. There’s an urgency here because we really think this is imminent. The only lens for looking at it is that of the parent-child relationship, because this is a completely new phenomenon. When you think of the history of our planet, the only one we’re familiar with, there hasn’t ever been an instance of a species creating another species. We’re replaced by other species. We disappear and something else rises up to fill the food chain, but there hasn’t been another instance before of one organism creating another one and then taking over that god-like power of design. So we feel a certain amount of responsibility, and Stephen Hawking and Elon Musk and others are trying to hold up a sign and say, “Let’s take a beat, just a beat,” as we did with genetic research in the 1970s, and just talk about it. Let’s just talk about what we’re doing, because it’s happening very quickly. At some anonymous office park in Mountain View or in Shenzhen or somewhere, someone is trying to very urgently deliver what we would consider a true artificial intelligence. And we’re not culturally talking about this because we’ve thought of it for so long that we haven’t bothered to have a conversation about what this thing will be before we make it, which seems like a mistake.
Do you think that AI is the threat to humanity that some of the great minds you mentioned seem to think it is?
NOLAN: I think it’s both [a threat and a benefit]. Those are things we try to explore. Film and television have played with this question for decades. The vast majority of it has been a dystopian [vision] — AI is going to kill or enslave us. You started to see for the first time [the AI relationship] done beautifully in Spike Jonze’s film Her, which detailed a love affair and the ways that that would be qualitatively different. We also tried to thread some of this into Person of Interest.
I’ve always been more interested to look past the apocalyptic scenarios — although Westworld does take a look at them — but then look past them and look at the ways in which AI will be childlike at first and very influenced by our culture and our values, and then might come to help us. That was the question for Westworld: to get all the way through the question and into the other side, and ask not what will we think of [AI creations], not what will they do to us, but: How will they feel? How will they think? And really try to step inside the question — not to the point where the audience falls asleep, but try to step inside that question of consciousness and explore it from the inside out. What does it mean and what would it feel like for something whose consciousness had been built?
It’s one thing to empathize with the android hosts in your story, but it’s another to ask the viewer to be invested in them the way they would be in a human main character in a TV show that could run for years. Was there concern about whether viewers would accept non-human protagonists?
JOY: We were very conscious of it when we were writing the scripts. We’re going up against a hurdle of that not having been the historical norm. And our concerns faded away when we saw the performances of our incredible actors. Because when I see Evan Rachel Wood perform, when I see Thandie Newton on screen, my heart bleeds for them.
NOLAN: This is a grey area for a lot of people. What’s interesting about human beings is that if I took a broom handle and painted a smile and two eyes on it, you would [feel] for it a little bit. You would feel bad if I snapped it over my knee. Tom Hanks built an entire film [Cast Away] out of this [with his character’s relationship with Wilson, a volleyball]. It’s very poignant, and when Wilson is swept away, it gets you. Humans are these incredible empathy machines. We’re incredible at applying it to things that are almost absurd. I watch my daughter with her stuffed animals, her toys. We’re very good at empathy.
But there’s a flip side to that empathy?
NOLAN: There is. And we’re all quite good about selectively turning it off. You need an incredibly talented cast, because you’re asking them to approach consciousness, get close to it, and then just shy away from it. With all of the actors, 99 percent [of playing an android] was in their performance. Every so often we’d do a thing where we shoot part of their face in slow motion or manipulate the way their eyes or mouth closes — just to give it a little uncanniness. We were watching early cuts of the pilot and found if we went too far with it — and “too far” is just a hair too far — people immediately stopped caring about the character. They were so lifelike, but the second you saw [too much] uncanniness, you stopped seeing them as the heroes. It’s an interesting intellectual challenge.
It’s a graphic show. Do you have certain boundaries in terms of what you’ll show a guest doing to these hosts?
NOLAN: For us, it’s not about fetishizing it, but it is about exploring this question of, “If you created a universe where there are no apparent consequences, how far would you go?” One of the questions the show is asking is why do we watch these things in the first place? What is this human fascination with violence? In the games that we play, the movies that we watch, violence is a common feature of everything you’re looking at. We wanted to get underneath that and find out why people come to this park in the first place. So there is a fair amount in the series of exploring that question. We’ve tried to do it in a way that’s as tasteful and compelling as possible.
A coworker who watched the premiere was talking about an act of violence that one character does to a host and said it was disturbing, because if the person would do that to this likable, lifelike android, then they’d probably be inclined to do it in real life too. I countered that I’d commit acts of violence in a video game that I’d never consider in real life. But that’s a debate that’s generated by the episode.
NOLAN: When [Michael] Crichton made the original film, there were no video games, nothing. Since then we’ve seen the rise of sophisticated video games, and the arrival this year of fully immersive, commercially viable VR games. We demoed VR last year and came away from the experience really shaken. On the way home we just talked about how much time our daughter is going to spend in these virtual worlds: a significant portion of her life. It’s a real phase change for humankind. The rules for these virtual spaces are flexible. When people play these games, they don’t think about the moral repercussions of what they’re doing. So one of the questions the show asks is, as you approach consciousness within these creatures, what is the point at which you start to feel bad? You don’t feel bad when you turn the phone off and the Pokémon disappear. You don’t feel bad when you turn off the Xbox and those characters go away. In the most sophisticated RPGs, you can be completely immersed in the narrative. A huge amount of effort goes into creating non-player characters in these video games, and that’s really what our hosts are; they’re non-player characters. At what point does that line blur? We loved approaching artificial intelligence from that perspective, of how it creeps in. You arrive at this moment in which it gets really complicated.
It seems particularly challenging for actors playing the hosts — every little movement you make gets scrutinized.
NOLAN: The conversations are endless, especially for the actors who are playing hosts. They did a tremendous amount of research themselves.
JOY: Just watching the different ways in which they worked and channeled these characters was kind of incredible. Evan is so young and strong and full of vitality. She can be sitting there, chewing gum, listening to her headphones one second, and the next second you yell “action” and she’s sobbing hysterically. She just transforms into this incredible heroine.
Part 2 of this interview will be posted Sunday night after the Westworld premiere, where we’ll discuss the events in the first episode. We’ll also have a deep-dive recap and an interview with Evan Rachel Wood. Follow @jameshibberd for live tweeting during the episode and updates.