In Science, Whose Department is the Ethics Department?: Part 2

December 4, 2012

There clearly exists an admittedly somewhat arbitrary (but nonetheless discernible) continuum of responsibility for the consequences of indirect actions. By way of analogy, gun manufacturers are not held responsible for crimes committed with their products. A gun is as harmless as a stapler unless it is loaded with a bullet and fired; the person who performs that act is held responsible for the consequences of that action, not the gun or bullet manufacturers.

Furthermore, like science itself, all the fruits of science and technology can be used for good or bad. Bullets can kill innocent people, but they can also protect innocent people. Satellites can guide devastating weapons directly to their targets with amazing accuracy, but can also allow people around the world to communicate with each other in real time (and give coastal populations hours of advance warning of disasters such as tsunamis).

The Good and Bad of Smallpox Research

Let us examine another case of scientific responsibility, this one in the biomedical field. One interesting question arises in the realm of potentially harmful viruses such as smallpox.

Smallpox has been described as "the most devastating pestilence in human history" (DePree & Axelrod, 2003, p. 682), and one that caused incalculable misery and death. As Ian and Jenifer Glynn note in their book The Life and Death of Smallpox, "Most children born in eighteenth century London had smallpox before they were seven, and more than 6,000 people died in the epidemics in Rome in 1746 and 1754." In an unvaccinated population, between one and three out of ten smallpox patients would be expected to die. There were several forms of smallpox; "In severe cases the pustules could be so crowded that they fused together, and the number dying rose to about 60 percent; in the rare and most horrific form there was severe bleeding and over 90 percent died." Those lucky enough to survive were often badly disfigured and blinded. Indeed, "At the end of the eighteenth century about a third of all cases of blindness in Europe are thought to have been the result of smallpox; and the death toll was terrible: about 400,000 a year in Europe, excluding Russia" (Glynn & Glynn, 2004, p. 4).

At first glance, the ethical implications seem clear: a virus that is responsible for so many deaths should be destroyed so that it cannot do further damage. Yet the smallpox example serves as an interesting case study in the ethical ramifications of science.

Due to the research of scientists such as Edward Jenner, smallpox was globally eradicated: the last naturally occurring case was recorded in 1977, and the World Health Organization certified eradication in 1980. It was a remarkable scientific and public health achievement. Yet there's a darker side to the science of smallpox, a sort of smallpox paradox. The same processes that were used to eliminate smallpox can be used to recreate it: "though [smallpox] eradication was recognised by all countries as a stunning success, a few countries, and in particular the USSR, saw it as an opportunity. In a world in which most people have either never been vaccinated, or have not been vaccinated for many years, we are likely to be as vulnerable to smallpox as the Native American Indians were in the seventeenth century, which makes smallpox a horribly tempting weapon for biological warfare" (Glynn & Glynn, 2004, p. 228).

Again we see that science is value-neutral, and can be used for good or ill. The smallpox virus is itself of course not inherently evil; it is merely another form of life (or infectious agent) competing for resources and carriers in its evolutionary niche. If Edward Jenner is to be honored and credited with laying the foundation for the annihilation of one of the world's most dreaded diseases, is he also to be blamed and vilified for the potential malicious re-introduction of smallpox as a biological weapon? Surely not.

Legal Assignments of Responsibility

In my research I came across another quote from von Braun which offers a somewhat more nuanced view of the role of ethics in science and technology: "If the world's ethical standards fail to rise with the advances of our technological revolution, the world will go to hell. Let us remember that in the horse-and-buggy days nobody got hurt if the coachman had a drink too many. In our times of high-powered automobiles, however, that same drink may be fatal...." (Bergaust, 1976, p. 166). This von Braun (in contrast to Lehrer's happily amoral and agnostic caricature of scientific irresponsibility) seems to acknowledge that ethics and science do indeed go hand in hand. In fact his example of the drunk driver is especially relevant.

Though these issues are complex, there is a precedent we can look to for guidance in matters of personal responsibility: the law. Imperfect though it may be, our legal system routinely assigns relative responsibility for the consequences of specific actions. In some states the law holds bars and restaurants partially responsible for the consequences of drunk drivers. While the person who chooses to drink past the point of inebriation is clearly the most proximate cause of an accident (and thus both morally and criminally liable), the law recognizes that other agents contributed to the problem. As one law source explains, "Comparative negligence laws spell out how the responsibility for an accident will be shared between the parties directly involved in the accident.... When both contributed to the accident, comparative negligence statutes will determine who will receive compensation for their losses and how much they are eligible to receive. [If] it is determined that both have some degree of fault in causing the accident, comparative negligence laws will determine how much compensation each party is eligible for based on their percentage of fault in the accident" (Criminal Law Lawyer Source, 2010).


There are no easy answers to these issues, but perhaps a code of ethics for scientists, based on legal principles and common notions of personal responsibility, would be most likely to succeed. One crafted too broadly would unfairly demonize innocent scientists, while one that was too limited in scope would allow all but the worst offenders to avoid responsibility.

Previous attempts to codify ethical scientific behavior, such as the Nuremberg Code (adopted in 1947) and the Belmont Report (1979), are widely recognized and generally successful. Like any set of rules attempting to guide moral behavior, they are of course incomplete and will never be perfect. The process is ongoing, and it is certain that in the years and decades and centuries ahead, other codes of ethics will be drafted and passed, and ignored by some high-profile miscreants such as those who participated in the Tuskegee experiments (and, more recently, the American syphilis experiments conducted in Guatemala between 1946 and 1948).

We must not make the mistake of discarding or rejecting a code of ethics merely because it is not uniformly or universally followed; something is better than nothing. As von Braun apocryphally (but correctly) noted, where the rockets came down was not his department. But raising the question of the role of ethics in science (and keeping it in the public's mind) is not only Tom Lehrer's department, but indeed all mankind's department.




Bergaust, E. (1976). Wernher von Braun. Washington, D.C.: National Space Institute.

Criminal Law Lawyer Source. (2010). Comparative negligence. Retrieved from

DePree, C., & Axelrod, A. (Eds.). (2003). Van Nostrand's concise encyclopedia of science. New Jersey: John Wiley & Sons.

Glynn, I., & Glynn, J. (2004). The life and death of smallpox. New York: Cambridge University Press.

McElroy, D. (2010, April 6). Calls for inquiry into Apache attack on Iraqi civilians. The Telegraph (U.K.). Retrieved from

Peter, L. (1977). Peter's quotations: Ideas for our time. New York: Bantam Books.

Seldes, G. (1985). The great thoughts. New York: Ballantine Books.

Vaughn, L. (2008). The power of critical thinking: Effective reasoning about ordinary and extraordinary claims. New York: Oxford University Press.