On Fads and Funding in Science

December 4, 2014

The current model for funding "Research and Development" (R&D) descends largely from the experience of the Second World War and its successful employment of "big science" in the Manhattan Project. Building on that success, the National Science Foundation and, eventually, the National Institutes of Health came to account for most of the federal funding of science and technology. These organizations were the brainchild of Vannevar Bush, whose work with the Manhattan Project convinced him that great things could be achieved in science and technology given enough money and people. They fund much of the basic research in both academia and the private sector, and now work cooperatively with industry thanks to the "public-private partnership" model developed in the 1980s to help make up for declining public will to fund basic research without clear payoffs. Next year's US federal budget allots about 3.5% to "R&D," but a good chunk of that is earmarked for various defense departments (with the bulk of that going to the Air Force). Moreover, about half of the total R&D budget goes to development rather than "basic" research. Over the past decades, these trends have meant the US spends less on potentially groundbreaking programs such as the Superconducting Super Collider, which was started but never completed and which might have beaten CERN to the punch in detecting the Higgs boson. Space exploration, too, has suffered: the US currently has no viable crew-capable launch vehicle, and the next one is a decade away from completion. Research in basic physics, mathematics, and other non-medical fields, where "return on investment" is either not guaranteed or in some cases immaterial, must scrounge for access to a declining pool of funds. Science is now beholden to the promise of profit, and takes a back seat if it cannot justify itself.

The same is true of academia in general. A liberal education was once deemed necessary to be a civilized member of the upper classes. Access to a university education was a luxury and treated as such; "career" came second to the well-rounding of a person's experience and personality. Over time, universities became more accessible, and the benefits of a complete education more widely available. In the US, with the advent of the GI Bill and the formation of numerous affordable and sometimes free state university systems, universities and colleges flourished, even as money for basic science abounded. Along the way, the drive to create, produce, innovate, and thrive economically in a burgeoning capitalist system began the slide toward turning basic knowledge into "useful" things, which often meant "marketable" things. As the recessions of the '70s and '80s put pressure on funding sources, and arguments were made to cut funding for sciences whose "development" could not be assured in favor of technologies that could be marketed to make the US more competitive in the international marketplace, competition for funding in the basic sciences (not to mention the humanities) heated up. There is much to say about how this has corrupted basic science, corrupted the academic environment, and denuded the notion of a liberal education for its own sake, but here I wish only to point out a most unfortunate effect on how academics seek funds, and how this tense and competitive atmosphere cheapens research: scientific (and philosophical) fads.

For a time it was everything "cyber," then anything "genomic" or "nano," and now "neuro-": these fads can be traced through the funding successfully won with those prefixes attached. To get funding, to get attention, to achieve success (which universities often measure by dollars brought in), one must show that one is at the forefront of the "next big thing." Currently, "neuro"-anything reigns, as witnessed by the two big projects announced in the US and Europe: the US BRAIN Initiative and the Human Brain Project in Europe. These are conceived as Manhattan projects for neuroscience, both intended to pour money into one of the most perplexing problems science (and philosophy) faces: understanding the human brain. Both huge projects operate on the assumption that by mapping all of the structure of brains we will understand them better, perhaps even develop working models of them, and, who knows, perhaps finally succeed in achieving artificial intelligence. Setting aside the enormous philosophical leaps involved in the assumptions underlying the science (which may yet prove useful scientifically for other reasons), these projects are consuming a tremendous amount of resources from a declining pool, perhaps in part due to the faddishness of the study of anything "neuro." But as Sally Satel and Scott Lilienfeld have exposed in their recent book, Brainwashed: The Seductive Appeal of Mindless Neuroscience, what we know or could hope to know in the near future through observation of brains is very little, largely because our tools are still quite crude; meanwhile, these two massive projects are attempting to model systems about which our knowledge remains rudimentary. Recently, a supercomputer was able to model one second of human brain activity (for one percent of the brain) in a simulation that took 40 minutes to calculate. Even assuming the most optimistic figures and drawing the most optimistic conclusions, there are significant reasons to ask whether this immense thrust of funding is premature, whether the science would have been better carried out incrementally, the funding allotted more diffusely, and our expectations not raised so high by this particular fad.
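To put that figure in perspective, here is a rough back-of-the-envelope sketch of my own (in Python, using only the numbers reported above and assuming the computational cost scales linearly with the fraction of brain simulated, which is almost certainly too generous):

```python
# A rough, hypothetical back-of-the-envelope estimate, using only the figures
# reported above and assuming cost scales linearly with the fraction of brain
# simulated (almost certainly too generous an assumption).

simulated_seconds = 1        # seconds of brain activity simulated
fraction_of_brain = 0.01     # one percent of the brain
wall_clock_minutes = 40      # time the supercomputer reportedly needed

# Slowdown relative to real time for that 1% slice:
slowdown_1pct = (wall_clock_minutes * 60) / simulated_seconds   # 2,400x

# Naive linear scaling to the whole brain:
slowdown_whole = slowdown_1pct / fraction_of_brain              # 240,000x

days_per_second = slowdown_whole / (60 * 60 * 24)               # ~2.8 days

print(f"1% of the brain runs about {slowdown_1pct:,.0f}x slower than real time")
print(f"Whole brain (naive scaling): about {slowdown_whole:,.0f}x slower,")
print(f"i.e. roughly {days_per_second:.1f} days of computing per simulated second")
```

Even on those generous assumptions, a single second of whole-brain activity would take nearly three days to compute, which suggests how far we remain from the working models these projects envision.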

The fact is, "neuro" research is hot and marketable. Neuroscientists now provide marketable (but not necessarily valuable) services to advertisers, for instance, and philosophy departments are in on the act too, focusing studies on "neuroethics," etc. The brain is one of the most complex things in the universe as far as we can tell, and uncovering its secrets will provide us with enormous knowledge, but even realists like me know that understanding the structure doesn't necessarily mean uncovering the function. John Searle's famous Chinese Room objection to artificial intelligence suggests that we may never know whether an artificial intelligence is actually intelligent, whether it actually "knows" something or merely simulates knowing. For that matter, the objection holds for other minds, as philosophers still debate. None of which is to say that the research shouldn't be done. But a large part of the appeal of this research is, I'm afraid, based on faddishness, not on realistic expectations about the scientific value of this line of inquiry at this time. Meanwhile, other basic sciences suffer for lack of funding and perceived appeal. Perhaps we should decouple our funding of research: separate research from development, provide a pool of funds for basic science alone, pay no heed to potential return on investment, and let scientists themselves apportion those funds apart from any and all political considerations or public appeal. Perhaps then we might defuse the appeal of fads and their potentially pernicious effect on basic research in all fields. It would be interesting to see how the money and interest among scientists would flow once uncoupled from the need to market and sell in order to make a case for scientific value. It's just a thought.
