by TAMSIN SHAW
‘The CIA contracted Bruce Jessen, left, and James Mitchell to design, lead, and direct harsh interrogations of a Qaeda operative,’ The New York Times wrote of the two psychologists in December 2014. ‘The men are pictured in images from ABC News.’
In 1971, the psychologist B.F. Skinner expressed the hope that the vast, humanly created problems defacing our beautiful planet (famines, wars, the threat of a nuclear holocaust) could all be solved by new “technologies of behavior.” The psychological school of behaviorism sought to replace the idea of human beings as autonomous agents with the “scientific” view of them as biological organisms, responding to external stimuli, whose behavior could be modified by altering their environment. Perhaps unsurprisingly, in 1964 Skinner’s claims about potential behavior modification had attracted funding from the CIA via a grant-making body called the Human Ecology Society.
Skinner was extremely dismayed that his promise of using his science to “maximize the achievements of which the human organism is capable” was derided by defenders of the entirely unscientific ideal of freedom. When Peter Gay, for instance, spoke of the “innate naïveté, intellectual bankruptcy, and half-deliberate cruelty of behaviorism,” Skinner, clearly wounded, protested that the “literature of freedom” had provoked in Gay “a sufficiently fanatical opposition to controlling practices to generate a neurotic if not psychotic response.” Skinner was unable to present any more robust moral defense of his project of social engineering.
In spite of the grandiosity of Skinner’s vision for humanity, he could not plausibly claim to be a moral expert. It is only more recently that the claims of psychologists to moral expertise have come to be taken seriously. Contributing to their new aura of authority has been their association with neuroscience, with its claims to illuminate the distinct neural pathways taken by our thoughts and judgments.
Neuroscience, it is claimed, has revealed that our brains operate with a dual system for moral decision-making. In 2001, Joshua Greene, a philosophy graduate student, teamed up with the neuroscientist Jonathan Cohen to analyze fMRIs of people’s brains as they responded to hypothetical moral dilemmas. They inferred from looking at neural activity in different regions that moral judgment involved two distinct psychological processes. One of the processes, a fast and intuitive one, took place by and large in areas of the brain associated with emotional processing, such as the medial prefrontal cortex and the amygdala. The other process, which was slow and rational, took place by and large in regions associated with cognitive processing, such as the dorsolateral prefrontal cortex and the parietal lobe.
Greene interpreted these results in the light of an unverifiable and unfalsifiable story about evolutionary psychology. Since primitive human beings encountered up-close dangers or threats of personal violence, their brains, he speculated, evolved fast and focused responses for dealing with such perils. The impersonal violence that threatens humans in more sophisticated societies does not trigger the same kind of affective response, so it allows for slower, more cognitive processes of moral deliberation that weigh the relevant consequences of actions. Greene inferred from this that the slower mechanisms we see in the brain are a later development and are superior because morality is properly concerned with impersonal values—for example, justice—to which personal harms and goals such as family loyalty should be irrelevant. He has taken this to be a vindication of a specific, consequentialist philosophical theory of morality: utilitarianism.
But as the philosopher Selim Berker has pointed out in his important paper “The Normative Insignificance of Neuroscience,” the claim here is that personal factors are morally irrelevant, so the neural and psychological processes that track such factors in each person cannot be relied on to support moral propositions or guide moral decisions. Greene’s controversial philosophical claim is simply presupposed; it is in no way motivated by the findings of science. An understanding of the neural correlates of reasoning can tell us nothing about whether the outcome of this reasoning is justified. It is not the neuroscience but rather our considered moral judgments that do all the evaluative work in telling us which mental processes we should trust and which we should not.
Many of the psychologists who have taken up the dual-process model claim to be dismissive of philosophical theories generally. They reject Greene’s inferences about utilitarianism and claim to be restricting themselves to what can be proved scientifically. But in fact all of those I discuss here are making claims about which kinds of moral judgments are good or bad by assessing which are adaptive or maladaptive in relation to a norm of social cooperation. They are thereby relying on an implicit philosophical theory of morality, albeit a much less exacting one than utilitarianism. Rather than adhering to the moral view that we should maximize “utility”—or satisfaction of wants—they are adopting the more minimal, Hobbesian view that our first priority should be to avoid conflict. This minimalist moral worldview is, again, simply presupposed; it is not defended through argument and cannot be substantiated simply by an appeal to scientific facts. And its implications are not altogether appealing.
…
Recent developments in the profession of psychology have been discouraging in this respect. In July 2015 a team of investigators led by David Hoffman, a lawyer with the firm Sidley Austin, published a report, commissioned by the American Psychological Association in November 2014, into the collusion of APA officials with the Department of Defense and the CIA to support torture. The report details extensive evidence of collusion. The APA had revised its own ethical guidelines to facilitate this collusion, providing only a very loose set of moral constraints on psychologists’ participation in interrogations. In doing so, the APA leaders were apparently motivated by the enormous financial benefits conferred on the profession in the form of Department of Defense funding. The episode demonstrates well the fragility of morality.
The authors of the report say in their conclusion:
We have heard from psychologists who treat patients for a living that they feel physically sick when they think about the involvement of psychologists intentionally using harsh interrogation techniques. This is the perspective of psychologists who use their training and skill to peer into the damaged and fragile psyches of their patients, to understand and empathize with the intensity of psychological pain in an effort to heal it. The prospect of a member of their profession using that same training and skill to intentionally cause psychological or physical harm to a detainee sickens them. We find that perspective understandable.
It is easy to imagine the psychologists who claim to be moral experts dismissing such a reaction as an unreliable “gut response” that must be overridden by more sophisticated reasoning. But a thorough distrust of rapid, emotional responses might well leave human beings without a moral compass sufficiently strong to guide them through times of crisis, when our judgment is most severely challenged, or to compete with powerful nonmoral motivations.
See The New York Review of Books for the full article.