Along with my views on the utility of miniaturized modular nuclear reactors ("Nuclear Batteries," June 15), I suppose the opinion that provokes the most hostile reaction is my belief that free will is a boondoggle. While the free will versus determinism debate has been going on in philosophers' ivory towers and beery college dorms for just about ever, the accelerating quest for artificial intelligence has given the controversy new meaning and bite.
If you've been following Westworld, HBO's series about android "hosts" catering to every whim of guests (including getting insulted, screwed and shot), you'll know the dilemma. The hosts sure think they're alive and conscious — that's the whole attraction of the Westworld environment, the fact that the hosts are the emotional equals of the guests and when they're hurt, they don't act hurt, they are hurt. What happens when they start realizing they're "only" robots? You'll have to watch the show for that.
But that "only" is the key to our existence here in "Realworld," where we confidently live out our days. What would happen if we realized that we are, in fact, robots? If our innate belief that we have free will, that we're captains of our ships, that our decisions have real meaning — if all that were successfully challenged? That is, what happens when we adopt the materialist position that we're mere bundles of atoms assembled following the dictates of nearly four billion years of evolution "using only one tool: the mistake," to quote Westworld's fearsome leader, played to the hilt by Anthony Hopkins? If atoms don't have free will (they don't, right?), then assemblages of them don't, not even a body's worth of 7 × 10^27 of them. So goes the materialist argument.
And, of course, any case you can make for free will is countered irredeemably by the simple response: "How do you know?" Whatever your position, how do you know you're not just reacting, mechanically and automatically? (Which doesn't change anything — as Samuel Johnson famously summed up the human situation, "All theory is against the freedom of the will; all experience is for it.")
And in a sense the free will vs. determinism debate is a lost cause, an apples-and-oranges argument. My free will — whatever rational analysis tells me — is a feeling. Determinism, on the other hand, is a process. Meaning I'll blithely carry on in this life acting as if I'm free, even while logic assures me I'm essentially an 86-billion-neuron machine and a prisoner of my genes.
As we rapidly approach the era when the likes of HAL of 2001: A Space Odyssey and Lieutenant Commander Data of Star Trek: The Next Generation become actual possibilities (not to mention Westworld's Maeve and Dolores), we're going to be forced to confront our own intuitions on what it means to be alive. Do androids have the same rights as humans? Do chimpanzees? Is the pain you and I feel of any greater consequence than that of a machine claiming to be in distress?
And do you have any choice about your response?
Barry Evans (firstname.lastname@example.org) claims to have no option besides writing the above.