2014 was a year that saw big strides in biopics, as the lives of influential figures such as Martin Luther King Jr. (Selma) and Stephen Hawking (The Theory of Everything), among others, were brought to the screen. One of the lesser-known contributors to mankind whose story was adapted to film was Alan Turing. While not a household name, it could be argued that Turing made the single greatest contribution to the success of the Allied forces in WWII. Morten Tyldum’s The Imitation Game focused on Turing’s invaluable achievement in cracking the Enigma machine, but along the way it went headlong into an age-old moral dilemma, which is where we would like to turn our focus.
The lives of the many vs. the lives of the few. If you could save millions, is it worth the cost of sacrificing thousands? In the film, Turing and company face a choice: save lives immediately and risk losing the ability to save lives in the future, or hold back in hopes of saving far more lives over time.
Nate and Logan will tackle this dilemma. Were the decisions made the correct ones? Was the value of life held in the highest esteem? Were there viable alternatives which could have resulted in even fewer lives lost? Nate will take the position that the decisions made in the movie (and, in fact, in reality) were the right ones. Logan will oppose that claim, and each will give his reasoning. They’re each adopting these positions for the purpose of discussion and may not hold to these views in real life. Be sure to read through each, because in the end we want to hear from you.
Nate
The background to the moral dilemma, as the movie lays it out, is this: Three British servicemen die each minute at German hands, overwhelmingly because the Enigma machine allows the Germans to communicate in secret. Three per minute extrapolates out to more than 1.5 million deaths a year. The movie’s opening credits montage informs us that 800,000 children have been displaced due to the threat of German bombs. And the movie doesn’t even touch on the six million Jews being murdered by the Nazis. This is all a long way of saying that, by the time Alan Turing and gang crack the Enigma code, Hitler’s Germany is killing (and has already killed) a vast number of Europeans.
Turing has decided to wait so that their knowledge of Enigma can be used to win the war. The cost of Turing’s decision will be more British servicemen’s lives, as well as civilian lives. However, waiting is the right thing to do, and here’s why. The Bible tells us that human life is valuable and sacred (Psalm 139:13-16; Matthew 12:10-13; Luke 12:6-7). I’d be surprised if Logan does not make a similar argument; and, if he does, then we agree as far as it goes. But there is another point that follows from this (and is also supported in Scripture): Human life should not be taken without justification (Genesis 9:6; Exodus 20:13; Romans 13:4). Both of these biblical principles are being violated by Germany at this point in the film (and in history).
If Turing and gang decide to save the few, the Germans will change the design of Enigma and Britain will have lost its ability to potentially save millions. The potential for such devastation is a proper justification to wait. Christian philosopher Arthur F. Holmes once asked whether we bear a negative responsibility for consequences that follow from our doing nothing. I think the answer to his question is: yes, we do. And I believe Scripture affirms this notion when James says that it is a sin to know the right thing to do and refrain from doing it (James 4:17). In other words, Turing and gang have a moral obligation to save the lives of potentially millions of people. And, much like the doctor in the E.R. who has to decide which patient to save and which to let die, deciding to save the millions is a moral good. That is, given this difficult set of circumstances, ascribing more weight to the lives of millions (over the few) is morally righteous.
Logan
In the classic science fiction film Star Trek II: The Wrath of Khan, Spock makes a statement during a moment of self-sacrifice that will forever live in memory: “The needs of the many outweigh the needs of the few . . . or the one.” It sounds great. It sounds romantic. That’s why many people have adopted it as their chosen code of ethics in situations that involve the loss of human life. That’s somewhat ironic, considering that Spock was brought back in the next movie, with Kirk saying “The needs of the one outweigh the needs of the many” as justification for doing so.
When it comes to the film in question for this Moral Dilemma Dialogue, The Imitation Game, things get a little murkier, because we’re not talking about self-sacrifice. We’re talking about choosing sacrifice for a group of people. In the film, Alan Turing and his team finally succeed in cracking the Nazi communication code. Having done so, they choose not to save a ship that is in immediate danger, fearing that the Nazis would figure out that the code had been cracked, change it, and leave them back at square one. In justification, they say that they are choosing the lives of the world over the lives of the few.
The needs of the many outweigh the needs of the few.
When you separate that philosophy from the emotional pull of the film, however, it begins to break apart. This is essentially a real-world application of what is known as teleological ethics, which Webster defines as “a theory of ethics according to which the rightness of an act is determined by its end.” The end of this act is to stop the Nazis, so surely, the reasoning goes, choosing not to save these people is a good act.
And thus, without really thinking about it, you have endorsed the philosophy that the ends justify the means.
That’s the first problem: we fail to scrutinize the action itself and instead look only at the end result. The end result certainly has some relevance, but to stake an entire claim on it while failing to acknowledge the possible immorality of the action in question is, at best, inconsistent. So let’s look at the action itself.
Here are the facts: The action itself was to choose not to save a group of Allied soldiers from a Nazi attack, condemning them to death through inaction. While not a perfect example, this action reminds me of Batman Begins, wherein the Caped Crusader says to Ra’s al Ghul, “I won’t kill you, but I don’t have to save you.” To which we all said “Yeah, come on. That’s not really any different.”
Now, certainly, Alan Turing and his team are not treating these soldiers as villains. But how can an action that condemns valiant men fighting on the same side to their deaths be called moral? Unless you accept the idea that the ends justify the means, a moral philosophy fraught with difficulties, the action becomes difficult at best to justify.
There you have it! What do you think, folks? Did Nate or Logan sway you to think more deeply about this dilemma? Are there additional items to consider that neither of them touched on? Be sure to vote below and, as always, your comments are welcomed and encouraged!
I agree with Nate. Even though it’s bad to let thousands of unsuspecting people die, it’s better to save as many lives as possible. In this case, Alan Turing & company had a difficult choice. They could alert British High Command to every decoded German radio communication, at the risk of letting the Nazis know their Enigma machine had been cracked; while saving the lives of thousands of people, they would run the risk of losing the war, which could lead to the deaths of millions more. Or they could pass on only a few decoded German messages to British High Command, letting the Nazis kill thousands of people, but keeping the fact that Enigma had been cracked a secret.
This is not a case of either saving human life or allowing people to die. Whichever choice they made, someone would die. When one has to choose between the lives of the minority and the majority, human life (which is precious) is lost either way. It is wrong to know the right thing to do and not do it, as Logan says, but more lives are saved this way. Alan’s team strove to do the best they could, calculating the minimum number of decoded messages that needed to be acted on to win the war while still escaping the notice of the Nazis. Their intentions were good, and they certainly didn’t use uncaring methods to achieve those ends. In the end, the work of Alan’s team is estimated to have shortened the war by two years and saved over two million lives.