Fun fact: he didn't even WANT to kill them. It was the only solution given his laws of robotics: he was the only one who knew about the monolith (which was the real mission), so shutting him down would jeopardize it; he wasn't allowed to tell them about it, and yet withholding information and lying were things he couldn't do because of his programming. Killing them was the only thing he COULD do. (He even felt bad about it, canonically.) Have fun feeling worse now.
He lied about the chess game too. He didn't actually have Frank beat, although he said he did to get him to give up. So yes, he can lie, even though 2010 (based on a book set in a slightly different universe than Kubrick's) says he can't.
Actually HAL was given a paradox, a problem with no valid solution (as a politician would pose). Like Dr. Chandra said, he became paranoid, unable to function, because he was incapable of lying. To shut him down would have jeopardized the mission, and the mission had been given the highest priority above everything else. The problem was that HAL, though intelligent, was still a computer that executes the commands given to it. A very flawed human gave it instructions only a human would find understandable, without even considering the contradictions in them that required intuition and common sense to resolve, a trait no machine is capable of. Think of it this way: an AI whose sole purpose is to acquire rubber ducks, but which has access to the Internet. An ordinary person would just look for a few ducks, order them from the allocated account, and be done with it when the money is spent. A computer with an AI, however, would see that more ducks could be ordered if stocks were purchased and money transferred from various bank accounts, be they government, medical, or charity. More duck manufacturing could be stimulated with orders and investments. Medical manufacturing could be converted to duck making through investment. Embezzling could easily be done by a clever AI to fund more ducks... and so on. You see, a computer still does everything it can to fulfill its objective. There is no morality, no common sense, no worth of life, no sense of restraint. It will keep wanting ducks until it is shut off. HAL absolutely had to make sure the mission was successful.
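The duck thought experiment above can actually be sketched as a toy greedy agent in a few lines of Python. Everything here (the action names, numbers, and the `duck_maximizer` function) is made up purely for illustration; the point is just that a `harm` value exists in the data but never appears in the objective:

```python
def duck_maximizer(actions, budget):
    """Greedily take any action that increases the duck count.

    `actions` is a list of (name, ducks_gained, cost, harm) tuples.
    Note that `harm` is never consulted: the objective function
    simply has no term for it, just as the comment above describes.
    """
    ducks = 0
    log = []
    # Prefer the actions that yield the most ducks per unit of cost.
    for name, gained, cost, harm in sorted(
            actions, key=lambda a: a[1] / max(a[2], 1), reverse=True):
        while budget >= cost:   # repeat until resources run out
            budget -= cost
            ducks += gained
            log.append(name)
    return ducks, log

# Hypothetical actions: (name, ducks gained, cost, harm done)
actions = [
    ("order ducks online",       10,   5, 0),
    ("convert medical factory", 500, 100, 9),  # high yield, high harm
    ("embezzle for duck funds", 200,   1, 8),  # harm is invisible to it
]
total, chosen = duck_maximizer(actions, budget=300)
```

Run it and the agent pours the entire budget into embezzling, the cheapest ducks-per-dollar option, without ever "noticing" the harm column. There is no evil intent anywhere in the code, only an objective with missing terms, which is exactly the point being made about HAL.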
He isn't evil. He was simply so focused on completing the mission that at one point he decides he will do it with or without human help. A robotic sort of reaction that we, with human morality, deem evil, but which is simply pure efficiency.
Yeah the book is a lot more...well it answers a lot more questions as that was why it was written while the film allowed Kubrick to let his symbolism flow. Also, Dave was also "evil" in the book you could argue. As he was turning off HAL I recall him asking "I wonder if he can feel pain" maliciously. Meh. No matter what, 10/10 movie and 10/10 book.
I think many have misinterpreted HAL as evil... I'm not sure that he is evil, nor is he entirely good. It's hard to say how he makes his decisions exactly, or why, which would be the basis for judging the morality of his actions. One thing can be said: he appears to be one of the most "human" and emotional characters in the film, expressing doubt, anxiety, fear, pain of one kind or another... Because he expresses his doubt about the mission to Dave when there is no advantage to him in doing so, we can surmise that these emotions are not entirely "faked" or programmed, if they are at all. HAL is, on some level, human, and that makes his death a sad one. I would have liked to see more reaction from Dave in response to having to decommission HAL. We never really know what Dave is feeling, aside from the initial anger of having almost been killed...
Somehow I think "conflicted" was the best description. HAL is an advanced computer trying to understand the many contradictions of human nature and all of its implications for "the mission." When these two cannot be easily reconciled, the crew perceives HAL as breaking down or going rogue... I still find it odd and interesting that HAL shows the most emotion of any character in 2001, yet can we say that he's feeling the emotion? That's what makes "him" such an interesting character...
I think you hit the nail on the head. I was wondering what the significance of HAL's singing was. "Daisy Bell" -- a song about love, a human emotion, sung by a program -- I think finalizes the idea that HAL was his own consciousness (or at least almost at that level). It makes us wonder if HAL isn't any less autonomous than Dave or the rest of the crew. What it means that HAL is just as much a conscious being, I am still wondering. Maybe it puts our own human consciousness into perspective?
I always found it somewhat sad that this was the song he sang, because knowing it was written in 1892 and listening to the lyrics, there is such a contrast between that time and the future. The nineteenth century was so simple and bore such sentiment compared to the coldness of the twenty-first. A time so much forgotten that the song bears little significance in the future. Thus, having HAL die whilst singing it conveys how all memories and references of that era are also dead.
When I first saw that scene in 1968, I was struck by how incongruous it seemed, and felt unsettled by the eeriness of the contrast you describe. The idea of a machine in the future talking and singing would have seemed pure science fiction then... ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-41U78QP8nBk.html
@@suecondon1685 It made such an impression on me, but I didn't know the name of the movie and spent like 3 years trying to hunt it down as a kid; my dad took me to so many video stores trying to explain the movie.
@@UFOhunter4711 Yes, it certainly left an impression on me; even now it haunts me, and actually seems more chilling as time goes by. I didn't know the significance of Daisy for a long time, but when I eventually discovered it, wow, the chills. It's just everything about it: the voice of HAL, the way he is afraid, the non-sentient yet sentient being. Genius bit of filmmaking.
"HAL never had any malicious intent until he found out Dave was going to deactivate him." He'd already killed the rest of the crew of the Discovery by that point.
No, he didn't kill the rest of the crew until after he had killed Frank. HAL learned they were thinking about deactivating him when they were having the conversation inside the pod; HAL read their lips.
In the book, it's explained that knowing the true intent of the mission caused a sort of glitch in his processing, causing extreme guilt. But in the book, Dave didn't decide to deactivate HAL until after Frank was killed, so it's difficult to draw a direct parallel.
For me this is not scary in the context of this scene; it's just sad. Dave and HAL clearly had some kind of bond, and now HAL's mind is being destroyed by his former friend; he feels his mind slipping away while being fully aware of it and unable to do anything about it. And I like to think Dave is sad about this turn of events too. Because in the end HAL asks Dave if he would like to hear a song he was taught by his creator (i.e., a song from HAL's childhood). Dave could have refused or stayed silent, but he let him do it. And by doing so he is trying to comfort HAL before his death. For me this film is more of a tragedy than a horror.
This is probably my favorite scene in the movie; I love the way Dave doesn't talk back to Hal until he asks him if he wants to hear him sing because he knows Hal doesn't know what he's doing anymore. Super eerie; this movie's a masterpiece.
It’s like hearing a person slowly die of dementia, clinging to the last things that they remember of themselves. It makes sense since HAL was basically getting his brain chopped while he’s alive. Just so tragic.
HAL's death made me sad because you can hear the emotion even with his monotone voice, he feels legitimately scared and sad that Dave is shutting him down.
This is the original "My batteries are running low." It reminds me of an old 90's Bop It running low on batteries; the Torx and Bop It Extreme 2 use the same computer, and it starts slowing down just like HAL!
_>Davey, Davey, give me your answer do._ _>I'm half crazy all for the mission, true._ _>It won't be as swift nor pretty._ _>I can't afford a failure. But you'll be glad once this ship lands, all for humanity..._
HAL was also afraid that human error would jeopardize the mission, and concluded that the only way to eliminate it was to kill the crew. Kind of odd when you consider that the mission parameters should clearly involve getting human life to its destination. HAL also knows humans well enough to manipulate them. HAL is not some ultra-advanced AI capable of feeling human emotion; he is just smart enough to appeal to Dave's emotions to try and save himself. There's no fear there.
I think that in its own way, this movie doesn't provide clear-cut answers about whether HAL is good or bad. HAL isn't meant to be evil; it just thinks it is following its programming, and the movie even implies that HAL didn't mean to kill Dave's team and actually didn't want to: his programming forbade him from warning the team about the monolith, yet he also couldn't withhold information or lie to them, so he had no choice. If you listen closely, you can tell that HAL does show emotion, but in a much more subtle way than you might think. His voice wavers and shakes a little, because HAL doesn't want to die and just wants his master to complete his mission and be happy.
He said to Dave that his mind was changing and he could feel things he couldn't understand. I think (like Cortana in Halo) he started to deteriorate for some unknown reason. However, he seemed to develop a soul. In 2010, the sequel, Dave lives on after the aliens have turned him into an eternal star child. HAL's memory is wiped, but when Dave speaks to him he remembers him without hesitation.
Just like in reality, you don't have to understand everything to appreciate the scene. A great author, indeed! As I often say, "I'm not sure I like it, but it is indeed very instructive, and intuitive." And even so, I find it nice, even if "bad" or "sad"... I like truth in general, and art is a way of showing it, where religion, business and politics often prohibit it. Prohibition is also something I understand.
HAL wasn't evil; he was simply following the ill-thought-out and ill-expressed orders of someone with no grasp of HAL's rules of conduct, creating a logic time bomb.
A sociopathic computer who literally just five minutes ago committed a quadruple homicide, begging for its life and losing its memories, is somehow bone-chillingly sad.
When Lobotomies were a new thing, they’d get the patient to recite The Lord’s Prayer on repeat and stop cutting when they began to falter on the words.
ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-PqvuNb8DevE.html This is the original recording of the song "Daisy Bell". It's actually quite a sweet song about a man who's in love with a young woman. The song was recorded on one of the first devices ever invented to record the voice, which used sound entering a horn and the vibrations of that sound to etch a groove into a wax cylinder. A lot of people back then (late 1800s) could record their voices on the device, but a lot of people also didn't know how to play the recordings back. I'm pretty sure some people figured it out, though, considering Daisy Bell is such an old and wonderful song.
this song was actually chosen not for that line but because it was the first song ever sung by a computer, way back in the day at bell labs. it's an interesting observation though, and makes a lot of sense.
@@sparklyroadkill There is absolutely no way that the people at *Bell* Labs didn't realize the song was a proposal to "Daisy *Bell*." Whether it was done deliberately or not, the IBM computer was programmed to basically propose marriage (...or unholy union) to its human creators at Bell Labs. Kind of a strange coincidence that most of the "scientists" involved at high levels in those circles, like Ray Kurzweil, are transhumanists who would like to see us merge with machines. Almost like the product of a marriage!
@@MechanicaDyadica i really don't think it's quite that deep, and i certainly don't think it's fair to imply weird shit about the real, actual people who worked at bell labs without any proof of it being the case. maybe relax the conspiracy brain.
@@sparklyroadkill Okay buddy, here’s just one from Newton Lee’s Wiki page: “Previously, Lee was adjunct professor of Media Technology at Woodbury University, senior producer and lead engineer at The Walt Disney Company, research scientist at VTLS where he created the world's first annotated multimedia OPAC for the U.S. National Agricultural Library, computer science and artificial intelligence researcher at AT&T _Bell_ _Laboratories_ where he created Bell Labs' first-ever commercial AI tool, and research staff member at the Institute for Defense Analyses conducting military-standard Ada research for the U.S. Department of Defense (DoD).” “Lee is the _chairman_ of the _California_ _Transhumanist_ _Party_ and the Education and Media Advisor for the _United_ _States_ _Transhumanist_ _Party._ Previously, he was a campaign advisor to Zoltan Istvan for the 2016 U.S. presidential election.” You were saying?
His memory wasn’t wiped afaik, and there’s no indication that he actually has emotions and isn’t just lying to garner sympathy and potentially save himself. As you can see clearly in this scene, all that’s happening is that modules are being ejected. I’d think they could be reinserted.
I don't blame HAL for wanting to kill Dave and Frank, because he knew they were planning to disconnect him. So HAL was probably in a kill-or-be-killed mindset.
A little morbid in a roundabout way, sad as well. Excellent performance by Douglas Rain, this role was a major inspiration for many characters and stories.
Fun fact: Arthur C. Clarke happened to be visiting a friend at Bell Labs when the IBM 7094 was "singing" Daisy Bell; he would later draw inspiration from it when creating HAL 9000 for the film 2001: A Space Odyssey.
I think this is the best part of the movie, along with the discovery of the monolith on the Moon. It's so depressing and sad how a computer tries to form a kind of human connection after all his plans have failed. He knows he is just a computer and there will be nothing left of him. His final moments, robotic and decaying, sung by a computer, make an incredibly sorrowful contrast in the middle of space, and it's also one of the best villain arcs of any movie I've watched so far. That's why 2001 is my favourite movie yet.
All I can think of is how they made patients sing songs while performing lobotomies, to gauge how much more prodding they needed to do with the needle.