Welcome to The Fourth Wall, CHIRP's e-conversation on cinema. This week's subject is the HBO TV series Westworld.
This edition is written by CHIRP Radio volunteers Kevin Fullam and Clarence Ewing.
Kevin, it’s comforting to know that in a world that seems to be ruled by various degrees of stupidity, man continues to develop knowledge and understanding in various fields.
Case in point: Recently, the Chess.com Computer Chess Championship was won by a chess-playing engine called Lc0. There are thousands of computer programs that play chess, but Lc0, the first neural-network program to win the championship, is different. Rather than being programmed in a traditional way, Lc0 was designed to teach itself how the game works and to learn from the huge number of games it plays against itself and other opponents.
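For the curious, the idea of "learning from self-play" can be sketched in miniature. What follows is a hypothetical toy, nothing like Lc0's actual neural-network-plus-search design, that uses one-pile Nim (take 1 or 2 stones; whoever takes the last stone wins) instead of chess. The agent starts knowing nothing but the rules, plays itself, and nudges its value estimates toward the results it observes:

```python
import random

random.seed(0)  # deterministic demo

# stones left -> estimated win chance for the player to move
values = {}

def estimate(state):
    return values.get(state, 0.5)  # unseen positions start as a coin flip

def choose_move(state, epsilon=0.2):
    moves = [m for m in (1, 2) if m <= state]
    if random.random() < epsilon:  # explore occasionally
        return random.choice(moves)
    # Greedy: leave the opponent the position we think is worst for them
    # (taking the last stone leaves them a lost terminal position, value 0).
    return min(moves, key=lambda m: 0.0 if state == m else estimate(state - m))

def self_play(start=7, games=5000, lr=0.1):
    for _ in range(games):
        state, player = start, 0
        visited = {0: [], 1: []}  # positions each player had to move from
        while state > 0:
            visited[player].append(state)
            state -= choose_move(state)
            winner = player  # whoever just emptied the pile wins
            player ^= 1
        for p in (0, 1):  # nudge each visited position toward the result
            result = 1.0 if p == winner else 0.0
            for s in visited[p]:
                values[s] = estimate(s) + lr * (result - estimate(s))

self_play()
# After training, positions that are multiples of 3 (known losses for the
# player to move) end up valued lower than the rest, with no rule ever
# telling the program that fact.
```

Lc0 replaces this tiny lookup table with a deep neural network and a sophisticated tree search, but the feedback loop (play yourself, score the outcome, adjust) is the same basic shape.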
It feels like a solid step forward in Man’s quest to increase and refine artificial intelligence. At some point, maybe in our lifetimes, a machine will achieve consciousness, self-awareness, or whatever that thing is that makes us “alive.” But if one did, what would it want? And what would we want from it?
This is one of the overarching themes of Westworld, the HBO series reboot of the 1973 Michael Crichton sci-fi movie. Westworld is an ultra-futuristic, deeply immersive fantasy patterned after Hollywood’s well-established vision of the Wild West. It’s a place where wealthy clients can interact with android “hosts” designed to role-play the denizens of Westworld and suit just about every customer whim, no matter how boring or depraved. Want to take your kids on a horseback trip through cattle country? Great! Want to spend your time randomly raping and/or killing townspeople? That’s great too!
But there’s a problem. Some of the hosts, including sweet-as-peach-pie rancher’s daughter Dolores Abernathy (Evan Rachel Wood) and the local cathouse madam Maeve Millay (Thandie Newton), are beginning to experience things that aren’t part of their programming, including visions of other lives and places they’ve not seen before. And it’s starting to affect their work with the customers and the scientists/technicians responsible for their maintenance.
The story evolves from these three groups’ perspectives. The customers include William (Jimmi Simpson), a first-time visitor who is appalled by what he sees but also senses something different about Dolores, and the Man in Black (Ed Harris), a long-time customer on a quest to go deeper into the park’s game than its designers ever intended. The management is headed by Bernard Lowe (Jeffrey Wright), Westworld’s chief programmer, Theresa Cullen (Sidse Babett Knudsen), the head of Quality Assurance, and Robert Ford (Anthony Hopkins), co-founder of Westworld. Over the course of the series, these three will have to unravel what is going on and make hard choices about how to resolve an unprecedented situation.
This being an HBO series, Westworld is beautifully shot, with a marked contrast between the picturesque, sweeping vistas of the park and the cold, sterile facilities that run it. There’s also plenty of sex and violence to go around, much of which (it could be argued) leans toward the misanthropic and misogynistic. Numerous scenes involve attractive young women posed naked on a chair or table while being questioned or examined by men.
The acting is mostly fantastic, especially from Wood and Newton, who often have to convey character development with a minimum of physical movement. There’s one glaring exception: Tessa Thompson as Charlotte Hale, the Executive Director of Delos, the billion-dollar corporation that owns Westworld. Due to below-par writing and acting, I just didn’t believe her as a decadent, take-no-prisoners executive. [The ridiculous way she was introduced to the audience didn’t help.]
In terms of its concept, this show is similar to the 2004 Sci-Fi Channel series Battlestar Galactica, which for my money remains the most radical and most successful sci-fi re-imagining ever. Both stories concern themselves with the physical and moral conflicts that arise when machines develop beyond the control of their makers.
Yet Westworld didn’t catch fire with the popular imagination the way that shows like Battlestar or Game of Thrones did. While there are some fascinating ideas here, it takes a while to reach the important plot points. Even while binge-watching most of the series, I found myself glancing at the clock several times, wondering when something was going to happen.
That being said, when the major plot reveals drop during the second half of the season, they are big and they are great. In sum, this is a very good series that probably could have been told in five hours instead of ten. Kevin, what did you think of it?
Entertainment that appeals to humans’ base instincts is not new. A Westworld-type environment seems to be a natural next step for a universe that includes Grand Theft Auto and torture-porn movies. Programmable AI exists, and both lifelike sex dolls and sex tourism are in vogue. All that’s left is for a corporation to make it profitable. If the Walt Disney Company had the technology to create something like this, they absolutely would.
If you could experience this kind of entertainment, Kevin, would you go? And would you take a friend or family member with you, knowing that your actions may be hidden from the general public, but not from other guests and the corporation that runs the place?
Also, watching this series made me wonder - why are so many fictional robots and androids written to be not only smarter but also stronger, faster, and more durable than the humans around them? If Star Trek: The Next Generation were science fiction instead of science fantasy, this trope would have led to Commander Data making a few hundred copies of himself and taking over Starfleet, the logical course of action for a “superior” being. Seems like the way to control these kinds of creations would be either to make them super-delicate or easy to shut off?
When last I checked in with the chess world, Deep Blue had polished off Russian champ Garry Kasparov 20 years ago, in a battle chronicled in the excellent documentary Game Over: Kasparov and the Machine. Two decades later, and we're still pouring money into chess-playing computers, eh? I play lots of strategy games, but chess never much interested me... partly because I'm not much of a spatial thinker, but also because the game seemed a bit static? There are a finite number of moves you can make, so it makes sense that a super-duper-decision-cruncher would come out on top. But imagine tasking a computer with the job of writing a brilliant screenplay or political treatise -- or playing any game that incorporates an element of negotiation (say, even Catan, which is seen as pretty ho-hum these days). It'd be impossible, right? So, what sort of "intelligence" are we measuring via chess?
The question you raise about what a "conscious" artificial intelligence would desire is an excellent one. What do we humans desire? Are we largely hedonists? In Her, which we discussed a couple of years ago, it was suggested that computer capabilities would be such that machines would quickly be bored by us simple flesh-and-blood humans. However, those AIs had no corporeal form -- and they also weren't specifically constructed to play out the various narratives designed by Anthony Hopkins' Ford. (As with all sci-fi tales, the "sci" is buried quite far under the hood.) Is the show making some subtle commentary about how we also are largely playing out "narratives" in our own lives? While we don't recover from our wounds quite so quickly, many of us do engage in mighty similar routines on a daily basis.
Would I visit Westworld if given a free ticket? Absolutely! I do have a strong sensation-seeking streak, and I love the idea of immersive experiences in general. However, I'd feel much more conflicted if Westworld's machines could feel pain and anguish when shot with firearms in the same way that I would. My wild hunch? The sort of "Westworld" we'll eventually create in the real world (assuming we ourselves aren't already in a simulation) will be within the confines of virtual reality, where our adventures will be entirely digital in creation... but seem so lifelike as to be indistinguishable from the real thing. We're certainly heading rapidly in that direction, in both gaming and cinema -- the Tribeca Film Festival this month is offering a number of "VR" films on the docket.
[Did you ever see a 1999 David Cronenberg film called eXistenZ? It starred Jennifer Jason Leigh as a designer of a VR game that winds up going off the rails.]
I enjoyed the sprawling scope of Westworld, though I would agree that the show takes its time in hitting various key plot points; viewers might also need a scorecard to keep up with the vast array of characters, some of whom generate a bit of confusion by re-appearing in different incarnations. As to the question of why fictional robots are written to be powerful -- doesn't this make them more interesting adversaries? In Trek, world conquest was the aim of the genetically-engineered Khan, our old friend from Star Trek II! But remember that "superior intellect breeds superior ambition," so on Earth, the alpha humans quickly wound up waging war against each other. We saw a similar process play out in Battlestar Galactica via rogue Cylons as well.
The robots of Westworld, as it turns out, are in fact mighty easy to shut off (especially if you're the co-creator of the world), though this leads me to one of the biggest head-scratchers in the series: that two lowly lab peons, mainly tasked with the job of repairing the droids that are chewed up in battle, could open Pandora's Box by removing the inhibitions on a robot's intelligence. Just move that slider right on up, and voila! This seems to me to be a serious flaw in the park's logistics, no?
Clarence, while watching the series, I started to form lots of questions about the macro-environment of Westworld. How far into the future are we? What's the nature of the "outside world," which isn't shown on screen? There doesn't seem to be any intrusion or oversight from law enforcement or government -- maybe the Delos Corporation is the government here for all intents and purposes?
That’s an interesting question. While it’s not the focus of the show, there does seem to be an economic hierarchy involved. It’s personified by the Man in Black, who we learn early in the series is so rich that he can afford to basically spend all of his time there. I reflexively assumed that visiting Westworld would be equivalent to someone in our reality jetting off to Davos or the Riviera for a holiday. It’s something only the “elite” can afford to do regularly, if at all.
No doubt the assembled crowd of big-wigs in the Season 1 finale would include some people connected to the government. Instead of trying to rule the unwashed masses, this seems like evidence of a segment of society trying to separate itself from the rest of humanity, something we definitely see in our world. But on the whole, I don’t think there’s enough creative context to judge whether the producers had a definite opinion on the subject.
Why is chess still used so often in testing computer evolution? It has something to do with the way a handful of simple rules generates a virtually limitless possibility space. The basic constraints and general rules of the game are relatively straightforward, but beyond that, the possibilities explode; there are more possible games of chess than there are atoms in the known universe, which boggles the mind. This allows for measurements both of raw computing power and of something approaching creativity and problem solving.
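That atoms comparison holds up on the back of an envelope. Claude Shannon's classic estimate assumes roughly 30 legal moves per position over a game of about 80 half-moves, which lands around 10^118 possible games, versus the commonly cited estimate of about 10^80 atoms in the observable universe. Python's arbitrary-precision integers make the comparison a one-liner:

```python
# Back-of-envelope version of Shannon's game-tree estimate:
# ~30 legal moves per position, over a game of ~80 half-moves.
game_tree = 30 ** 80   # on the order of 10^118
atoms = 10 ** 80       # common estimate for the observable universe

print(f"possible games ~ 10^{len(str(game_tree)) - 1}")  # -> 10^118
print(f"atoms          ~ 10^{len(str(atoms)) - 1}")      # -> 10^80
print("games exceed atoms:", game_tree > atoms)          # -> True
```

The exact exponents are rough guesses, but the gap is so large (dozens of orders of magnitude) that the conclusion doesn't depend on them.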
I would say that giving two low-level techs the ability to amp up any android’s abilities is a definite design flaw, but maybe that was part of the grand scheme all along? I read some commentary suggesting that Thandie Newton’s character was designed to go as far as she did in her escape attempts. If that were the case, you would need to grant that kind of access to keep the story going. But the “that’s what she was programmed to do” argument would also make for an easy way to fill in plot holes...
I’m not sure making a robot stronger automatically makes it a more interesting enemy. A weaker or more vulnerable AI protagonist would be more compelling in a way, because one of its first tasks would be to figure out how to protect itself from being shut down. HAL in 2001: A Space Odyssey comes to mind.
And to what overall purpose would one’s actions be directed? Would an artificial life form understand the nature of “freedom?” I think humans want to connect to other people, to do something productive with themselves, and understand their purpose for being. Is that something that can even be programmed into an artificial life form? And if WE are really just a super-advanced kind of biological machine, are these motivations just illusions programmed into us to keep us going? I never did see eXistenZ, but the virtual world is a rich one for storytelling. Maybe our programmed higher purpose is to share stories with each other, using the available technology to make those adventures as detailed as possible…?
In my last response, I had suggested that virtual-reality illusions might eventually serve the same function as a Westworld-type playground in the future. After actually experiencing VR games during a trip to New York City last week, I'm ready to double-down on that prediction! The current iteration of goggles is heavy and a bit intrusive, but the immersion is mighty impressive. And now that there are VR rooms which let you walk around and explore, it ain't hard to extrapolate out and envision some sort of "open world" arena down the line.
Many of the current games revolve around combat, but there are lots of other experiences that involve music, exploration, and good old-fashioned storytelling. Granted, there's still a great deal of distance between "putting on clunky goggles" and "shacking up with Maeve Millay & Co.," but in another 10 years, who knows? It certainly isn't out of the question that schools might use VR technology to transport classrooms to ancient lands during history classes, at the very least.
Would we feel differently about abusing digital creations than flesh-and-blood (er, make that "synthetic") creations? What if they were programmed to feel pain and grief the same way we do? It's also important to note that the "hosts" in Westworld are played by actual humans; I would think that any simulation would still have to grapple with the Uncanny Valley issue as it became more and more realistic. (Summary: we feel more comfortable with robots the more they look like humans... until they reach a point where they're almost authentic -- but the general vibe that something is "off" leads to vastly more unease.)
Another film that comes to mind: The Game, a 1997 David Fincher film with Michael Douglas. Douglas is a "man who has everything," and dropped into a wild world that's an elaborate ruse, purchased as an "experience" by his younger brother. Afterwards, real-life services offering individual immersion experiences popped up. Want to be kidnapped? That could be arranged! (Local law enforcement, however, is likely to be none too pleased.) How about something a bit more benign and artistic? There's a company offering that as well. If I were extravagantly wealthy, I'd definitely be all over this, on both sides of the curtain -- both to create and experience.
Regarding your questions about the notion of freedom and motivations, Clarence, I'm of the belief that we are indeed biological, programmable machines. Aren't our genetic codes a certain form of programming? The habits we form (actively via willpower or passively via environment) are other programs that we add throughout life. We're simply not as reliable as many of our clunkier constructions because our own behavioral algorithms aren't quite as transparent or consistent across time.
Did you see this series? Want to add to the conversation? Leave a comment below!