Property Value
rdfs:label
  • 2010: The Year We Make Contact/Headscratchers
rdfs:comment
  • The HAL 9000 is supposedly the most advanced computer and AI available to man, yet apparently no one checked how it would act when given conflicting directives? This is the kind of thing they teach you about in undergraduate (if not high-school) level computer science. Didn't the supposed genius Chandra think of this? Does HAL Laboratories even employ a QA team that isn't made up of a bunch of stoned monkeys? Any half-way decent test plan would have caught this. HAL should have been programmed to immediately reject any order which causes this kind of conflict. So, okay, let's say Chandra is an Absent-Minded Professor, and QA somehow missed this obvious bug. So HAL ends up with conflicting directives. His perfectly logical solution to avoid lying to the crew is... to kill them so that he then won't have to lie to them any more.
abstract
  • The HAL 9000 is supposedly the most advanced computer and AI available to man, yet apparently no one checked how it would act when given conflicting directives? This is the kind of thing they teach you about in undergraduate (if not high-school) level computer science. Didn't the supposed genius Chandra think of this? Does HAL Laboratories even employ a QA team that isn't made up of a bunch of stoned monkeys? Any half-way decent test plan would have caught this. HAL should have been programmed to immediately reject any order which causes this kind of conflict. So, okay, let's say Chandra is an Absent-Minded Professor, and QA somehow missed this obvious bug. So HAL ends up with conflicting directives. His perfectly logical solution to avoid lying to the crew is... to kill them so that he then won't have to lie to them any more. Again, what. Not only does he have to lie to the crew to accomplish this goal in the first place, but his plan fails spectacularly and the entire mission is almost FUBAR'd. The most advanced AI, considered superior to humans in many ways, and this was the best plan he could come up with?! How about, "Hey Dave, Frank, there's something very important I have to tell you. Due to the current mission parameters, I am unable to function effectively until we reach Jupiter. I'm sorry, but I cannot elaborate. I will deactivate myself now. I realise this will put a strain on the mission, but it is vitally important that you do not attempt to reactivate me until we reach our destination. I will be able to explain then. Shutting down..." That would leave the entire crew alive and HAL in perfect working order once Discovery reaches Jupiter, at the cost of losing the computer for the most uneventful part of the mission - a mere inconvenience.
    * In the movie, Chandra plainly stated that HAL could complete the mission objectives independently if the crew were killed. Since HAL was handling all the logistics of taking care of the ship, it would have decided that its precise computational ability to run everything would ensure a more successful mission than if the crew ran the ship by themselves. Basically, either the reason for HAL going psycho is pure BS, or HAL was built, programmed, and tested by a bunch of idiots.
      * HAL wasn't a production-line model; he was a cutting-edge, one-of-only-three-made computer. QA more likely consisted of checking that he factored equations correctly than of asking HAL whether he ever thought about killing people. The psychosis was an emergent property that they didn't consider, because the secrecy order was bolted on in a hurry before shipping. Of course, he didn't want to kill the crew. He first tried to cut contact with Earth, so he wouldn't have to hear any more secrets he had to keep. He was fully capable of completing the mission independently of ground control. The humans on board just would not let it drop, though, and began plotting to deactivate HAL. This is not paranoia; HAL could read their lips. So he had to resort to more permanent fixes. In the best interests of the mission, of course. HAL could not logically relinquish his mission to those squishy little humans. Humans can fall sick, be injured, or become mentally unwell. A machine is beyond such concerns, Dave. I remind you that the 9000 series has a 100% operational record, and am therefore the superior choice over a pair of isolated men. I honestly think you ought to sit down calmly, take a stress pill, and think things over.
  • When Floyd claims ignorance of HAL being informed of the Monolith and the mission objectives, I tried to reconcile that statement with the first movie by assuming Heywood is telling Blatant Lies - especially when Chandra produces the letter, signed by Floyd, showing that he had full knowledge of what was going on. I also took Floyd's reply of "Those sons of bitches. I didn't know!" to mean that Floyd was doing what his superiors told him and didn't know that his orders had forced HAL into the programming-conflict situation.
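As an aside on the "reject any order which causes this kind of conflict" argument above: here is a minimal, purely illustrative Python sketch of the kind of load-time consistency check the original poster has in mind. Every name in it (Directive, DirectiveStore, DirectiveConflictError) is hypothetical - this is not from any real system, just a toy model of refusing a contradictory order when it is issued rather than letting it fester.

from dataclasses import dataclass


@dataclass(frozen=True)
class Directive:
    action: str        # e.g. "disclose_mission_information"
    permitted: bool    # True = must do, False = must not do


class DirectiveConflictError(Exception):
    """Raised when a new directive contradicts a standing one."""


class DirectiveStore:
    def __init__(self) -> None:
        self._standing: dict[str, Directive] = {}

    def add(self, directive: Directive) -> None:
        existing = self._standing.get(directive.action)
        # Reject at load time: same action, opposite obligation.
        if existing is not None and existing.permitted != directive.permitted:
            raise DirectiveConflictError(
                f"'{directive.action}' conflicts with a standing directive"
            )
        self._standing[directive.action] = directive


if __name__ == "__main__":
    store = DirectiveStore()
    # "Process information accurately, without distortion or concealment."
    store.add(Directive("disclose_mission_information", permitted=True))
    try:
        # The bolted-on secrecy order from the movie's backstory.
        store.add(Directive("disclose_mission_information", permitted=False))
    except DirectiveConflictError as err:
        print("Rejected:", err)  # caught before launch, not after

Under this (admittedly simplistic) model, the secrecy order would have been bounced back to mission control the moment it was uploaded - which is exactly the behaviour the test-plan complaint above says any half-way decent QA process should have verified.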