Use and Abuse of the Impact Factor: Scientists Rebel!
Yves Desjardins, ISHS Board Member Responsible for Publications
For many years, horticulture scientists have expressed doubts and dissatisfaction with the use of the impact factor (IF) as a criterion for their promotion and the evaluation of the quality of their research. Dr. Jules Janick, former Board member and ISHS editor, wrote appropriately about the "Tyranny of the impact factor" and the misuse of this bibliometric parameter for assessing the impact of research (Janick, 2008). It seems that horticulturists are not alone anymore in rejecting the concept of IF and its use and abuse. Actually, the entire scientific community may be on the verge of a small rebellion against IF. The echoes of this uprising come from San Francisco.
Before talking about San Francisco, let's backtrack. The IF was conceived by Eugene Garfield in the 1970s as a tool to help research libraries judge the relative merit of scientific journals and spend their limited, shrinking subscription budgets intelligently. The index was, and still is, calculated as the number of citations a journal receives in a given year to the articles it published during the two preceding years, divided by the number of articles it published in those two years. It is considered a measure of overall journal quality, but it is not a measure of the quality of individual papers. Indeed, Seglen (1992) reported that only 15% of the articles in scientific journals account for more than 50% of the citations, and that 90% of papers are cited less often than the journal average! Over the years, use of the IF has drifted from its original purpose: international scientists are now ranked by the weight conferred by the IF of the journals in which they publish. This conceptual shift is absurd, and the IF has been appropriately criticized for it for many years. Rating papers on the average ranking of the journal in which they were published is simply inappropriate.
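For readers who want a concrete picture, the two-year calculation described above can be sketched in a few lines of Python. The journal figures used here are invented purely for illustration:

```python
def impact_factor(citations_to_prev_two_years, items_published_prev_two_years):
    """Two-year journal impact factor for year Y: citations received in Y
    to items the journal published in Y-1 and Y-2, divided by the number
    of citable items it published in those two years."""
    return citations_to_prev_two_years / items_published_prev_two_years

# A hypothetical journal receives 320 citations in 2013 to its 2011-2012
# articles, of which there were 210 citable items.
print(round(impact_factor(320, 210), 3))  # 1.524
```

Note that this is a journal-level average: a single heavily cited review can lift the figure for every other article published alongside it, which is exactly why the IF says little about any individual paper.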
Indeed, the IF concept has been denounced by many scientific organizations for its shortcomings and lack of objectivity. For one, the index is discipline-dependent: it is largely influenced by the size of the community in a specific field of science, so small scientific communities offer limited opportunities for a paper to be cited. The IF is also affected by the speed at which a paper is published and by the vitality of the domain. For example, a paper on AIDS stands a far better chance of being cited than a paper on the effect of nitrogen on postharvest storage of mangosteen. Thomson Reuters itself recommends against comparing IFs between disciplines, but unfortunately many universities and organizations simply ignore this basic rule.
The IF can also be tricked and manipulated by an aggressive editorial policy: publishing a large number of review articles, which attract more citations; restricting the number of articles published (e.g., Nature and Science, with acceptance rates below 10%); releasing higher-impact papers early in the year so they can be cited over a longer period; or applying a coercive policy of self-citation. The IF has likewise been blamed for promoting "me-too" science and discouraging high-risk, potentially ground-breaking work, out of fear of being cited rarely or not at all.
It is in this context that many scientists (more than 8,800) and reputable science organizations (more than 350) have signed the San Francisco Declaration on Research Assessment (DORA, 2012), an outcome of the annual meeting of the American Society for Cell Biology in December 2012.
The declaration aims to correct the distortions in the evaluation of scientific research quality and to stop the use of the IF for this purpose altogether!
The declaration, which has sent shockwaves through the science community, calls for
- a ban on the use of journal-based metrics such as journal IF in funding, appointments and promotion considerations;
- judgement of research on its own merit rather than on the basis of the journal where it is published; and
- use of the new opportunities conferred by on-line publication to loosen limits on article length, word count, figures and references, while exploring new indicators of research significance and merit.
Many of the 18 recommendations listed in the declaration stand out and are of direct relevance to us in academia. For instance, "when involved in committees making decisions about funding, hiring, tenure or promotion, researchers should make their assessment based on scientific content rather than publication metrics and should challenge research assessment practices that rely inappropriately on journal IF". Scientists should be aware of, and use, the range of article-based metrics ("altmetrics") available for assessing the quality of research output. Publishers are also invited to "limit their emphasis on IF as a promotional tool and make available a range of article-based metrics". Other recommendations ask funding agencies and institutions to be explicit about the criteria they use to evaluate scientific productivity.
During its last meeting in Matera, Italy, the ISHS Board enthusiastically signed the DORA, a declaration that conforms to what the Society has been claiming for years: the IF does not properly assess the scientific significance of papers published in the field of horticultural science, as reflected in the low IF (< 1.5) of most horticultural journals. This is particularly important for the recognition of Acta Horticulturae®, which is not considered a scientific journal by ISI-Thomson Reuters' Web-of-Knowledge®, even though it is fully referenced in the ISI conference database. Acta is referenced by Scopus and is thus used in the calculation of the H-index (the largest number h such that h of a scientist's papers have each been cited at least h times) and of altmetrics.
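To make the h-index definition concrete, here is a minimal Python sketch; the citation counts below belong to a hypothetical author and are invented for illustration:

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Six papers cited 12, 9, 7, 4, 2 and 1 times: four papers have at
# least four citations each, but not five with at least five.
print(h_index([12, 9, 7, 4, 2, 1]))  # 4
```

Unlike the journal-level IF, the h-index is computed from an individual author's own papers, which is one reason it pairs naturally with article-level altmetrics.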
The recent decisions by the ISHS Board to assign a digital object identifier (DOI) to each published Acta Horticulturae® article, to invest in the infrastructure to fully cross-reference articles and bibliographies, and to provide the capacity for on-line publication of articles open new opportunities to comply with the San Francisco DORA declaration. Opening the door to Web 2.0 through PubHort will allow the generation of new altmetrics, including conventional ones such as the number of times an article is viewed, downloaded and cited, but also new ones based on social media (Twitter, Facebook and LinkedIn), online reference managers (CiteULike, Zotero and Mendeley), collaborative encyclopedias (Wikipedia), blogs (both scholarly and general-audience), and scholarly social networks (ResearchGate or Academia.edu). At this stage, we are ready to report the number of tweets generated by an Acta Horticulturae® article (the so-called Twimpact Factor), a reflection of the "buzz" an article generates in the scientific community and the media, and a metric that is gaining popularity as an early predictor of a paper's impact in the community at large.
As the ISHS Board member responsible for publications, and on behalf of the Board, I stand for and promote the publication of quality research. Articles in Acta Horticulturae® and other horticultural publications will be better recognized through the H-index and altmetrics than through the IF. We support the DORA, and so should you. I invite you to read the Declaration and sign it in support.
ISHS no longer wants to play the IF game and is taking steps to use altmetrics whenever and wherever possible.
We hear the echo of San Francisco DORA... and retweet it!