Thinking as a Hobby


Ay, Robot

Went to see I, Robot this weekend, against my better judgment. I liked it better than I thought I would, mostly because my expectations were extremely low.

[Spoilers ahead]



First of all, why do the heroes of futuristic stories always pine for the "good old days" of our present? Will Smith's character lives in an apartment with "old-fashioned" electronics, and he starts out by ordering a pair of Converse shoes ("vintage 2004"). How many people do you know who are obsessed with '60s paraphernalia?

Anyway, if I understand the plot...and that's a big "if"...basically a robot engineer helped create a sentient control system, "VIKI", that he knew was eventually going to "evolve" into a dangerous entity (according to Dr. Lanning, random chunks of computer code always end up amalgamating into mysterious, superintelligent entities we can't understand). So in order to stop it, he created a new kind of robot with a tougher chassis and the ability to disobey the 3 Laws. Then he embedded a cryptic dream in the robot, put a copy of Hansel and Gretel on his table, and told the robot to kill him? Have I got this right so far? He also put a hologram in his pocket that would summon a policeman he knew hated robots, so that the policeman would unravel VIKI's plans for world domination and, with the prototype robot's help, destroy her.

By the way, the reason Spooner hates robots is pretty ridiculous. When he and a girl were drowning, a robot saved him instead of her because it calculated that he had the better chance of survival (something like 43% vs. 11% for the girl). Why was this such a horrible thing for the robot to do? Don't human emergency workers and health care professionals make the same sorts of decisions every single day? Wouldn't a human have made the same choice?

Later in the film, when the robot Sonny has a chance to either obey Spooner and save a woman, or destroy the system that will enslave all humankind, he presumably "chooses with his heart" and saves the woman. Um, does being more human mean putting the needs of a single person over those of all humanity? If so, sign me up for robothood.

But back to the whole convoluted suicide setup...why didn't Dr. Lanning just inject the nanites into VIKI himself? He wanted to educate the world about the dangers of robots taking over? How about a press conference? The whole scheme is several steps more elaborate than it needs to be.

I did like the fact that it was VIKI, and not the super-ambitious CEO, who was behind the takeover, and her reasoning made a clever sort of sense. But the premise was far too shaky. What are we supposed to make of the ending, anyway? All the NS-5s are "put into storage", but Sonny, the new one that doesn't have to obey the 3 Laws, apparently becomes some sort of messiah figure for them.

If we make sentient beings, do we have the right to make them slaves? And if we give them freedom, there's the danger that they'll kill or enslave us, right? These are interesting issues the movie could have explored, but it sidesteps them in favor of absurd, poorly shot action sequences.

Still, I thought the Sonny character was well done. Like Gollum, he was a fully realized character, even more so than some of the humans in the movie. And it was a decent way to kill an afternoon, even if the movie was bloated, silly, and mostly incomprehensible.

