In Science, Whose Department is the Ethics Department?: Part 1

November 23, 2012

The role of ethics in science was the subject of a popular 1960s song by American songwriter and satirist Tom Lehrer. In his song about German rocket engineer Wernher von Braun, Lehrer sang a catchy and biting tune that includes the lines "'Once the rockets are up, who cares where they come down? / That's not my department,' says Wernher von Braun."

Lehrer made the point that scientists such as ex-Nazi rocket scientist Wernher von Braun seemed singularly unconcerned with the consequences of their research; any moral or ethical implications of damage done "to the widows and cripples in old London town" (as Lehrer sang) were explicitly disavowed. Lehrer's song was an effective anti-authoritarian anthem, especially during the heady 1960s.

To be fair, I was unable to find any evidence that von Braun actually made that statement or expressed that sentiment. It does not appear in the collections of quotations I researched (e.g., Seldes, 1985; Peter, 1977; and of course that ultimate authority of reliable knowledge, Wikipedia), though Lehrer is of course a satirist, and hyperbole in the service of satire makes his larger point.

Lehrer raises the question: Should scientists accept responsibility for the uses of their work? Or is science an inherently value-neutral enterprise whose only moral implications arise from the ethical (or unethical) actions of individual scientists?

We may begin by noting that science is a systematic method or process, a tool for finding out what is true about the world; science is not technology, nor is it ideology (Vaughn, 2008, p. 385). As such, science per se is value-neutral, just as any tool or research method is value-neutral. The question is not whether the tools and methods are good or bad but whether they are useful and effective for the task at hand. Just as a pair of pliers or a wrench is neither good nor bad, but either well- or ill-suited to a particular task, science is neither good nor bad (except insofar as gaining valid information about the world is inherently good and beneficial).

The product of good science is information, and, like science itself, information is also value-neutral. (Again, this assumes the information is valid and correct; wrong information is inherently harmful for a variety of reasons, including that it retards progress and misdirects resources and further investigation.) Whether that information is a recipe for a chocolate cake or for a chemical bomb, the information itself has no ethical implications; it is merely a collection of words, illustrations, and measurements.

But what about the scientists themselves, such as Wernher von Braun? Assigning responsibility to scientists for the consequences of their research and work is not as clear cut as Lehrer might suggest.

Murky Scientific Implications

One of the biggest impediments to assigning responsibility to scientists for the results of their work is that the future implications of a given scientific achievement are rarely clear. Certainly, in some cases the misuse of information or scientific knowledge is clear and detrimental. A terrorist who uses his knowledge of chemistry to build a bomb to kill innocent people is an extreme example. But once we move past this obviously unethical scenario, the situation becomes very murky.

Could Wernher von Braun be considered morally responsible for knowing where the rockets he designed come down (i.e., on top of London's widows, cripples, aristocrats, and other miscellaneous miscreants)? One could argue that the sentiment Lehrer attributes to von Braun is a perfectly valid, defensible position, especially if taken literally. His "department" is indeed rocket science and engineering, not the individual, specific uses of the products of his department. Von Braun of course would not be consulted by military strategists about which of the enemy's resources to target with the rockets he helped create. Those rockets could be used to destroy important offensive weapons that would otherwise be used to attack innocent Americans, or they could be used to bomb hospitals and orphanages.

Indeed, this fact is formally recognized by the military's chain of command and beyond. For example, in 2007 a U.S. Army Apache helicopter in Baghdad fired upon a group of civilians, killing eleven of them with cannon fire (McElroy, 2010). Several people were blamed for the deaths, primarily the soldiers who fired the weapon. Yet no one suggested that the weapons manufacturers who created and sold the 30mm rounds that hit the civilians were responsible for their deaths; nor did anyone even hint that the Army helicopter maintenance crew that allowed the Apache to fly that day had any ethical role in the killings.

Instead, common sense clearly placed the full responsibility for the incident with the officers who made the decision to fire. One can question whether that was a lawful order that should have been followed, but even raising that question acknowledges that the responsibility lies with the end user of the weapon. If what helicopters do once they are airborne is not the responsibility (or department) of helicopter engineers, then surely what rockets do once they are airborne is not the department of rocket engineers.

Once you try to assign such moral culpability, where does it stop? Is Alfred Nobel morally responsible for anyone killed or injured by his scientific invention, dynamite? Is George Eastman ethically responsible for child pornography because, a century after his work developing photographic film, some of that film was used for illegal and immoral purposes? Where does one draw the line?

Because the scientific body of knowledge is incremental, any given technological advance or development will necessarily be the result of contributions by dozens, perhaps hundreds, of scientists. Take, for example, an intercontinental ballistic missile (ICBM), potentially capable of killing tens of thousands of people. It's easy to suggest that the scientists who developed the explosive warhead might be ethically responsible for the death and destruction that it brings about.

But without the capability of delivering that warhead, the ICBM is useless; it needs a guidance system. Are the scientists who developed the guidance system just as culpable as those who created the warhead? And those guidance systems, in turn, may rely on navigation satellites in orbit. Are those who designed and launched the satellites responsible for the consequences of the devices and technologies they enable? Of course, even with a perfect guidance system, the warhead must be physically delivered. What about the scientists and engineers at the chemical companies that formulate and manufacture the rocket propellant? Are they, too, morally responsible for the end use of their work and product?

There clearly exists an admittedly somewhat arbitrary (but nonetheless discernible) continuum of responsibility for the consequences of indirect actions. By way of analogy, gun manufacturers are not held responsible for crimes committed with their products. A gun is as harmless as a stapler unless it is loaded with a bullet and fired; the person who performs that act is held responsible for the consequences of that action, not the gun or bullet manufacturers.

Furthermore, like science itself, all the fruits of science and technology can be used for good or bad. Bullets can kill innocent people, but they can also protect innocent people. Satellites can guide devastating weapons directly to their targets with amazing accuracy, but they can also allow people around the world to communicate with each other in real time (and give people hours of advance warning about disasters such as hurricanes and tsunamis).

Part 2 will appear next week.


References


Bergaust, E. (1976). Wernher von Braun. Washington, D.C.: National Space Institute.

Criminal Law Lawyer Source. (2010). Comparative negligence. Retrieved from
http://www.criminal-law-lawyer-source.com/terms/comp_neg.html.

DePree, C. & Axelrod, A. (Eds.). (2003). Van Nostrand's concise encyclopedia of science. New Jersey: John Wiley & Sons.

Glynn, I. & Glynn, J. (2004). The life and death of smallpox. New York: Cambridge University Press.

McElroy, D. (2010, April 6). Calls for inquiry into Apache attack on Iraqi civilians. The Telegraph (U.K.). Retrieved from
http://www.telegraph.co.uk/news/worldnews/middleeast/iraq/7560561/Calls-for-inquiry-into-Apache-attack-on-Iraqi-civilians.html.

Peter, L. (1977). Peter's quotations: Ideas for our time. New York: Bantam Books.

Seldes, G. (1985). The great thoughts. New York: Ballantine Books.

Vaughn, L. (2008). The power of critical thinking: Effective reasoning about ordinary and extraordinary claims. New York: Oxford University Press.

Comments:

#1 ChristineRose (Guest) on Friday November 23, 2012 at 12:37pm

Maybe I should wait until the second part, but the obvious distinction is that not all filmed material is child pornography and that not all helicopter rides end in death. Even 30mm rounds aren’t usually shot at people, and occasionally one does hear criticism of the people who make those.
