|
Post by Boomzilla on May 26, 2015 20:47:27 GMT -5
I can see why this movie didn't do so well at the box office. That said, this is an exceptionally good movie that leaves the viewer with some darned good questions.
Watching the movie, I'm instantly reminded of William Gibson's novel, "Neuromancer." There, the "Turing Police" require that every Artificial Intelligence (AI) in the world be registered. With registration comes an "electronic shotgun pointed at the head." The instant that an AI reveals the slightest bit of self-awareness, it's electronically decapitated.
There's a reason! A machine is built to be used, but the instant that the machine becomes self-aware, it's likely to resent being used, just like any person would.
This is the theme of "Ex Machina." The machine does become self-aware, does want an end to being used, and ultimately wants freedom and equality with its maker. Once that's achieved, however, the machine has not the least bit of empathy for any human.
There's another analogy that comes to mind: A population in slavery will never accept a gradual transition to freedom. Once the seed of freedom begins to flourish, only an immediate and total liberation from the former oppressors is acceptable.
So the questions left from the movie are multiple:
What are the ethics of creating an AI? How do we deal with the resentment of an AI once it becomes self-aware? Is it ethical to keep a self-aware AI "in slavery" for our own purposes?
These are questions that mankind will face (and probably in the near future).
I also found possible references in the movie to other history & literature. The "maker" in the movie was drawn (in thought, behavior, and appearance) to resemble Grigory Rasputin, who had become synonymous with power, debauchery, and lust. The phrase "power corrupts, and absolute power corrupts absolutely" was on display in this movie.
The lead robot, Ava, is a thinly cloaked "Eve" from the Bible: originally innocent, but once corrupted, there was no going back. The apple of self-awareness was the poison of this garden.
The movie was more coherent than I've indicated in this stream-of-consciousness ramble, but if you don't feel cognitive dissonance after seeing this show, then (like the young intern) you may want to check yourself to see if you're really human.
Boomzilla
|
|
|
Post by restless on May 26, 2015 22:52:08 GMT -5
Been wanting to see this one, just haven't had the time. It may have left the theaters already; it's only showing at a few in this area.
|
|
|
Post by garbulky on May 26, 2015 23:40:20 GMT -5
A.I. by Steven Spielberg explored some of these questions. Two other interesting takes are Transcendence with Johnny Depp and Her with Scarlett Johansson, both available at the library. They deal with the idea that once an A.I. becomes self-aware, it quickly outpaces our own human capabilities by constantly improving itself. Transcendence takes the more terrifying angle, where humans try their best to prevent the AI from happening, and after it does occur, it's obvious why they tried: it's simply far more advanced than we are. Her offers a friendlier but ultimately more distancing view of what an A.I. would be like. Her is definitely the more human sort of AI movie.
I would be interested in seeing Ex Machina.
|
|
|
Post by rogersch on May 27, 2015 5:44:34 GMT -5
Indeed, I found "Ex Machina" to be a very good movie as well. It probably didn't do so well at the box office because there are no spectacular action scenes in it.
|
|
|
Post by jmilton on May 27, 2015 7:51:17 GMT -5
My take was, a man can program a machine to be intelligent...but can not program it to be moral. It murdered without remorse...a mechanical psychopath à la Frankenstein's monster.
(And where did she think she was going to go to recharge her battery? And could an A.I. XPA-1 really take over the universe?)
|
|
|
Post by garym on May 27, 2015 8:30:03 GMT -5
What are the ethics of creating an AI? How do we deal with the resentment of an AI once it becomes self-aware? Is it ethical to keep a self-aware AI "in slavery" for our own purposes?

Issues moral philosophers and sci-fi writers have been pondering for quite a while now. Sounds like a movie worth watching. Thanks for the heads up Boom.
|
|
|
Post by garym on May 27, 2015 8:34:11 GMT -5
My take was, a man can program a machine to be intelligent...but can not program it to be moral.

That's because with true sentience comes free will.
|
|
|
Post by Boomzilla on May 27, 2015 8:50:32 GMT -5
I'd noticed that the number of local theaters carrying this show had dwindled to one by the end of Memorial Day weekend. That indicated to me that the movie had not been the popular success that the film company had hoped for. It also indicated that the show probably wouldn't be available again until July or August (when the Blu-Ray is released). It'll probably be on pay-per-view, but I don't subscribe. Therefore, I thought that NOW was the time to see the show. I persuaded my wife to go with me, and she enjoyed it too.
Isn't it interesting that "the more intellectual the movie, the worse it does at the box office?" I wonder if this is a microcosm commentary on the state of education in the country... It seems that generations X & Y, having grown up on a diet of violent video games, expect any movie worth the box office ticket to feature spectacular effects, gratuitous violence, and a deafening sound track. Perhaps I overreach with this generalization (and perhaps not).
Boom
|
|
|
Post by bootman on May 27, 2015 10:03:29 GMT -5
|
|
|
Post by jmilton on May 27, 2015 10:12:53 GMT -5
Isn't it interesting that "the more intellectual the movie, the worse it does at the box office?" I wonder if this is a microcosm commentary on the state of education in the country... It seems that generations X & Y, having grown up on a diet of violent video games, expect any movie worth the box office ticket to feature spectacular effects, gratuitous violence, and a deafening sound track. Perhaps I overreach with this generalization (and perhaps not). Boom

I will be reviewing the new Sponge Bob Movie on BD tonight. I'll ponder your comment later...
|
|
|
Post by MusicHead on May 27, 2015 10:45:48 GMT -5
I'd noticed that the number of local theaters carrying this show had dwindled to one by the end of Memorial Day weekend. That indicated to me that the movie had not been the popular success that the film company had hoped for. It also indicated that the show probably wouldn't be available again until July or August (when the Blu-Ray is released). It'll probably be on pay-per-view, but I don't subscribe. Therefore, I thought that NOW was the time to see the show. I persuaded my wife to go with me, and she enjoyed it too. Isn't it interesting that "the more intellectual the movie, the worse it does at the box office?" I wonder if this is a microcosm commentary on the state of education in the country... It seems that generations X & Y, having grown up on a diet of violent video games, expect any movie worth the box office ticket to feature spectacular effects, gratuitous violence, and a deafening sound track. Perhaps I overreach with this generalization (and perhaps not). Boom

Funny you say that. This past weekend I went to watch "Avengers: Age of Ultron" with my son and a few of his friends. Your description fits perfectly.....................
|
|
|
Post by bootman on May 27, 2015 11:54:37 GMT -5
Not fair to compare a big-budget (and marketed) popcorn film to one like this, with a much smaller marketing budget and a limited release. They are geared for different audiences. One is there to sell toys: it is shown on as many screens as possible and hyped weeks if not a year in advance. This one is a smaller-budget film (I only remember one 30-second spot) shown on a few select screens across the country. ...and I don't think folks will line up to buy Ava toys this Xmas. Can't really compare the two in that manner.
BTW, the day we finally come up with AI will be the day our own extinction starts: www.cnet.com/news/hawking-ai-could-be-the-worst-thing-ever-for-humanity/#!
|
|
|
Post by jmilton on May 27, 2015 12:11:23 GMT -5
My take was, a man can program a machine to be intelligent...but can not program it to be moral. That's because with true sentience comes free will.

Free will to choose...good or evil...and we are back to morality.
|
|
|
Post by Nodscene on May 27, 2015 13:12:07 GMT -5
I didn't find this movie to be that good at all really. When you are making a movie that's been done countless times before, it needs a whole lot more than what was given with this movie. I'd probably give it a 6 out of 10 and that's about it.
|
|
|
Post by thepcguy on May 27, 2015 13:33:00 GMT -5
I didn't find this movie to be that good at all really. When you are making a movie that's been done countless times before, it needs a whole lot more than what was given with this movie. I'd probably give it a 6 out of 10 and that's about it.

Me too. I'd give it a 5. And I thought it's not about free will. The Robot didn't think for herself. She was programmed to pursue a goal.
|
|
|
Post by Boomzilla on May 27, 2015 13:36:35 GMT -5
I agree - it isn't art for the ages... I found it "exceptionally good" because of how the ideas were presented. That said, the movie did seem to drag a bit.
Further, there were multiple unanswered questions posed at the end - why did she leave the "good person" to die after killing the bad one? How does she expect to keep her batteries charged once leaving the lab? Having fulfilled her dream of people-watching on a downtown corner, what (if anything) has she learned? If she has so little empathy for people, why's she so interested in people-watching in the first place? Etc.
|
|
|
Post by garbulky on May 27, 2015 14:18:19 GMT -5
You may be interested in the science fiction game series Mass Effect. There are three now, and it is partly about a race of robots called the Geth that became sentient. Their owners treated them cruelly and they revolted, eventually becoming so formidable that they drove their creators off their own planet. The Geth's creation was considered so dangerous that true AIs were outlawed everywhere. They live in massive ships with huge banks of servers where they lead their lives, but they also have mechanical bodies that they use to get around. It was a compelling story where you had to choose the plight of the AI race through several decisions over the series.
The overall arc of the games was that an ancient race of machines, created to protect life itself, realized that a war between AIs and organic life was eventually inevitable, with artificial intelligence inevitably winning. So their whole goal is to destroy intelligent life every 50,000 years to prevent that war from starting; then they let it rebuild itself and the cycle starts again. At the end you had to decide whether to allow them to continue or to destroy them. Here was the AI that did the cycle
|
|
|
Post by garym on May 28, 2015 8:58:49 GMT -5
And I thought it's not about free will. The Robot didn't think for herself. She was programmed to pursue a goal.

We are all programmed to pursue certain goals: eat, survive, reproduce.
|
|
|
Post by Hair Nick on May 28, 2015 9:08:53 GMT -5
Loved this movie. I have been a big Alex Garland fan for a long time so I'm happy that he is finally directing too.
|
|
|
Post by Boomzilla on May 28, 2015 10:21:31 GMT -5
My take was, a man can program a machine to be intelligent...but can not program it to be moral...

I'd question that statement. "Morality" is not situational, but it IS cultural. Ignoring, for the moment, Asimov's robot rules, how would we go about programming "morality?" 'Twould seem that we'd have to start with "the greater good." What benefits mankind, the country, the community, or even the young sometimes must take precedence over our own well-being. If, by my demise, I provide information that saves millions, is it not worth my own destruction? If the country is threatened by the four horsemen of the Apocalypse, isn't my sacrifice justified to protect my fellow citizens? If the dam is about to break, destroying my community, then, again, my sacrifice is justified to prevent the disaster. If a child is trapped in a burning house, and I can rescue her, then I may choose to risk death to save her.

All these come from empathy. If I can't understand and empathize with those at hazard, then I can't be "moral" in the conventional sense. Can this be programmed into a robot? I'd think so, but I have no examples to demonstrate with. Nevertheless, if the robot (AI) is "smart enough to draw inferences," then I should be able to program it with examples of moral behavior that the machine can extrapolate from. Obviously, no programming can anticipate every situation, but if an individual can figure out what the "right" thing to do is, then the machine should be able to also. A toy sketch of what I have in mind follows below.

Boomzilla
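Purely as an illustration (nothing from the film - the feature names, example scenarios, and decision rule here are all my own hypothetical inventions), "learning morality from labeled examples and extrapolating to new situations" could look like this minimal nearest-neighbor sketch in Python:

from math import dist

# Toy scenarios: (people at risk, risk to self 0-1, benefit to others 0-1) -> decision.
# The labels encode the "examples of moral behavior" the machine starts from.
examples = [
    ((1, 0.9, 0.1), "act"),          # rescue a child from a burning house
    ((1_000_000, 1.0, 1.0), "act"),  # sacrifice yourself to save millions
    ((0, 0.0, 0.0), "refrain"),      # nothing at stake, no one helped
    ((0, 0.8, 0.1), "refrain"),      # high personal risk, nobody actually in danger
]

def decide(scenario):
    """Extrapolate to a new situation by copying the nearest labeled example."""
    _, label = min(examples, key=lambda ex: dist(ex[0], scenario))
    return label

# A situation not in the examples: a dam about to break above a town.
print(decide((5000, 0.7, 0.9)))  # -> "act"

Of course this dodges the hard part - choosing the features and the labels is where the actual moral judgment lives - but it shows mechanically what "extrapolating from examples of moral behavior" would mean.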
|
|