Westworld S3E1: AI the Lord

Original article can be found here (source): Artificial Intelligence on Medium

Robots as a danger to humans have been omnipresent in literature. Asimov proposed his “Three Laws of Robotics” as a counter to this:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

We need these principles embedded more deeply into computer systems (beyond mere code), and I would love to see the show explore how the park tried to protect humans (other than fake bullets and weak code).
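The laws form a strict priority ordering: the First Law vetoes everything, the Second yields to the First, and self-preservation yields to both. As a loose illustration of that ordering (entirely my own toy sketch, not anything from the show or Asimov's actual formal treatment), one could model it as an ordered veto filter in Python:

```python
# Toy sketch of Asimov's Three Laws as a strictly ordered action filter.
# A violation of a higher-priority law vetoes an action regardless of
# any lower-law justification. All field names here are my own invention.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_human: bool = False       # would this action injure a human?
    prevents_harm: bool = False     # does it stop harm to a human?
    ordered_by_human: bool = False  # was it commanded by a human?
    risks_self: bool = False        # does it endanger the robot itself?

def permitted(action: Action, human_in_danger: bool) -> bool:
    # First Law, part one: never injure a human.
    if action.harms_human:
        return False
    # First Law, part two: no harm through inaction. If a human is in
    # danger, only harm-preventing actions are permitted.
    if human_in_danger and not action.prevents_harm:
        return False
    # Third Law: self-preservation yields to the laws above, so a risky
    # action is allowed only when it serves the First or Second Law.
    if action.risks_self and not (action.prevents_harm or action.ordered_by_human):
        return False
    return True

# Saving a human is allowed even at risk to the robot;
# standing idle while a human is in danger is not.
rescue = Action("pull human from fire", prevents_harm=True, risks_self=True)
idle = Action("do nothing")
print(permitted(rescue, human_in_danger=True))  # True
print(permitted(idle, human_in_danger=True))    # False
```

Even this trivial version shows why embedding the laws is hard in practice: everything hinges on the system correctly predicting `harms_human` and `prevents_harm`, which is exactly the part the fiction hand-waves.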



There was the go-to tech trope of us being in a simulation inside a simulation. At a boujee Incite reception, an out-of-it technology employee said nothing matters because it seems like we are in a simulation. This is a pretty mainstream comment among people working in tech, but it is good to expose a broader audience to it. I appreciated his friend’s annoyed skepticism, and better yet was Dolores’s response:

“Maybe it is a reflection on you.”

In other words, people who are obsessed with living in a simulation are too eager to get out of the world they live in. This is the view I have held for a while. If a philosophical argument ends with “well, then nothing matters,” I do not think the topic is worth much time.

As technological progress continues, the simulation question will only become more pressing. It has many layers, but it gets stale. The more immediate concerns of manipulative technology follow.

AI Therapy

Caleb (Aaron Paul) spent the entire episode “talking” to his late friend Francis. At the end, we learned that this was an AI therapy program designed to keep Caleb functioning within the system, much like using deepfakes to recreate personalities. Where do we draw the line?

We know the technology exists for a computer to recreate any human saying anything (with a little bit of sample data). The show goes right for the jugular by using this in a manipulative, ethically gray way. Just because it’s called therapy, does that mean we should allow it?

It’s easy to see technology companies putting out tools that help in clinical trials but have lasting impacts on individual lives and on the collective.