I found a great quote and analogy in an essay in the current issue of Current Biology by Peter Lawrence, titled The mismeasurement of science. The essay examines how science is measured, in particular the use of impact factors and other metrics to gauge scientific progress for individuals, departments and institutions.
The quote is actually from Leo Szilard, the famous Manhattan Project physicist. When asked by a wealthy entrepreneur, who believed that science had progressed too quickly, what could be done to retard this progress, Szilard replied:
"You could set up a foundation with an annual endowment of thirty million dollars. Research workers in need of funds could apply for grants, if they could make a convincing case. Have ten committees, each composed of twelve scientists, appointed to pass on these applications. Take the most active scientists out of the laboratory and make them members of these committees. …First of all, the best scientists would be removed from their laboratories and kept busy on committees passing on applications for funds. Secondly the scientific workers in need of funds would concentrate on problems which were considered promising and were pretty certain to lead to publishable results. …By going after the obvious, pretty soon science would dry out. Science would become something like a parlor game. …There would be fashions. Those who followed the fashions would get grants. Those who wouldn't would not."
The analogy is Lawrence's own and imagines song writers being assessed the way scientists are, an analogy I can appreciate, having come to science from the music industry.
"It is fun to imagine song writers being assessed in the way that scientists are today. Bureaucrats employed by DAFTA (Ditty, Aria, Fugue and Toccata Assessment) would count the number of songs produced and rank them by which radio stations they were played on during the first two weeks after release. The song writers would soon find that producing junky Christmas tunes and cosying up to DJs from top radio stations advanced their careers more than composing proper music. It is not so funny that, in the real world of science, dodgy evaluation criteria such as impact factors and citations are dominating minds, distorting behaviour and determining careers."
A Scientific "Audit Society"
Lawrence suggests that our scientific "audit society" has put meeting the demands of the holy impact factor above understanding nature and disease. He predicts that citation-fishing and citation-bartering will increase. Citation-fishing is putting your name on author lists when you basically did nothing, such as provide a reagent. I am not entirely sure what citation-bartering means, but I suspect it refers to journals "encouraging" submitters to cite more articles from that journal to facilitate acceptance of the paper.
One problem I have thought about with using standardized metrics to evaluate scientific progress is the concentration of high-citation-potential research in only a handful of the "top" journals. Instead of a more even spread of the literature across journals appropriate for each paper, the trendy research of the day is published in high-profile journals, while the more topical journals are left in the dirt. This prevents lower-"impact" journals from escaping a certain impact factor range, making them less appealing to new researchers whose papers fit those journals' interests better and could have been read by more people in their field. Getting your paper read by the right people is the real impact, in my opinion. I recently submitted a species description to a journal in which several descriptions of species in the family have been published, in hopes it would reach the broadest audience. I hope my next paper will be open access, though.
"Impact" and Taxonomy
Standardized metrics, namely the impact factor, have a tremendously negative impact on taxonomy and taxonomists. This has in part to do with the behavior of most scientists toward the field. Taxonomic works are virtually never cited in the references of papers. This is most pervasive in ecology, because ecologists often use detailed keys and species descriptions in their work to confirm identifications. The species is the fundamental unit of biology, especially in ecology. Yet you never see statements in the materials and methods sections of such papers like "polychaetes were identified to genus using Fauchald 1977". Meigen 1830, which described Drosophila melanogaster, should be the most cited publication that ever existed, based on the amount of research published using that species.
Ecologists often ID specimens on hearsay. By this, I mean someone told them what a species was, so that is what it gets called. This is how I started out. Senior grad students had done most of the identifying work for me; I just needed to remember the species list or compare what I found with what was on the shelves. The problem with this was that the preliminary IDs were sometimes wrong, or closer examination revealed one species to be two or more. The former resulted in the shrimp description now in review, and the latter in several anemone descriptions still in progress. I can't overstate the importance of checking the facts for yourself!
The profile of taxonomy would greatly benefit if people included citations to the works they used, whether individual species descriptions, large monographs, revisions or identification keys. Taxonomic works are consistently the publications most used yet most systematically ignored by the "big science" community, including medicine, molecular biology & biochemistry, and evolutionary biology in addition to ecology. This attitude toward taxonomy, and the managerial approach of modern science, has devalued the stature of the taxonomist to that of a service provider, working essentially for free for the greater community without due recognition. This is exploitation. While jobs become scarce and funding even scarcer, demand increases because of the top-heavy age structure of taxonomists. This is especially true in the U.S., where most biology majors will never see a systematics class in their undergraduate handbook.
I would also like to highlight a blog I recently discovered, DC's Goodscience, run by David Colquhoun at University College London, which
"...exposes the idiocy of ‘metrics’ for the assessment of research quality by means of actual real-life examples. The aim of the blog is to provide a forum for discussion of the effects of the cult of managerialism on universities in general, and on science in particular."
Update: Christopher Taylor at the Catalogue of Organisms gives a perspective on the importance of taxonomy using an example from harvestmen (Opiliones, arachnids).