An Avalanche of Low-Quality Research?
An interesting article was recently published in the Chronicle of Higher Education. The authors note that the number of scientific researchers, and therefore the number of research publications, has increased dramatically in the past few decades, and claim this is indicative of an "avalanche of low-quality research" that must be quelled for the good of science. With all due respect to the authors, I'd like to address this cynical and short-sighted view of scientific research in the modern information-centric world. The article is linked above and worth a read if you are so inclined, but at its heart is the conclusion that:
The amount of redundant, inconsequential, and outright poor research has swelled in recent decades, filling countless pages in journals and monographs.
The argument goes something like this:
- According to Ulrich's Periodicals Directory, the number of publications is increasing at a rate of 3.26% per year, doubling in merely two decades (a quick check of that arithmetic follows this list)
- Studies show that the percentage of scientific publications cited within five years of publication appears to have dropped by 4.4% over the last two decades
- Therefore: more research is being published while a smaller share of it is being cited, so an increasing proportion of it must be low quality
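
As an aside on the first point, the "doubling in two decades" claim follows directly from the cited growth rate. Here is a minimal back-of-the-envelope check, assuming steady compound growth at 3.26% per year (the only figure taken from the article):

```python
import math

# Back-of-the-envelope check: with steady compound growth of 3.26% per year,
# how many years until the number of publications doubles?
annual_growth = 0.0326
doubling_time_years = math.log(2) / math.log(1 + annual_growth)

print(f"Doubling time at 3.26%/year: {doubling_time_years:.1f} years")
# Prints roughly 21.6 years, i.e. about two decades, as the article claims.
```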
Let's start with what I agree with. The statistics are accurate: more researchers are creating more publications, and citation percentages are going down. But does this necessarily indicate an increasing proportion of low-quality research? Researchers cite less when they are exploring ideas outside those previously published. According to a report from the United Nations Educational, Scientific, and Cultural Organization, the bulk of new researchers are coming from developing countries as they create new scientific research programs. A culture entering the research community en masse may produce researchers who approach their experiments in entirely new ways, working outside of current research trends.
It is also likely that the decrease in citations indicates that researchers are reading a smaller percentage of published findings, not that the papers out there are low-quality. The authors are correct that it is often impossible for researchers to read all of the publications in their field today. But rather than presume that the pool of publications needs to be culled, the forward-thinking response is to recognize that the system by which researchers find and consume publications relevant to their work needs an overhaul. Research is not considered credible unless it is published in a peer-reviewed journal, yet journals publish findings in walled gardens: charging for access, spreading content across innumerable digital and physical distribution mediums, and neglecting to employ modern digital search techniques that would allow researchers to easily locate relevant work. The truth is that the growing number of journals and other publication venues, coupled with the limited distribution system of most journals and conferences, makes it more and more difficult for researchers to find and consume pertinent previous work, leaving them with less knowledge of their field and therefore fewer citations to make. High-quality research is still being created and published; researchers are simply finding it harder and harder to locate and consume.
This is a significant problem to be overcome by the scientific research community as a whole, and the answer is not to reduce the amount of research being generated. How might we best get significant research results in front of those in the field who need them to guide their own work? Research findings need to be more open, and modern distribution, search, and even crowdsourcing techniques must be applied to them in order to properly stimulate research communities by effectively disseminating important advances.
Let's examine another claim from the article:
Too much publication raises the refereeing load on leading practitioners—often beyond their capacity to cope.
While this statement is absolutely true, the implication that the solution is to keep publication at a scale the present small group of "leading practitioners" can referee is ludicrous. Rather than shrinking the number of publications, and thus lessening the possibility of innovation, why not increase the size of this group of "leading practitioners"? This can be done not by loosening the requirements on expertise, but by creating expertise. Open research, easily and freely accessible, will propagate knowledge through the community at a higher rate. Building online communities where researchers can congregate will create an engaging dialog and a motivational environment by challenging members to gain standing among their peers. This will lower the barrier to entry for esoteric research communities, allowing more individuals to gain understanding and contribute significantly both as practitioners and as reviewers. Collaborative, crowd-sourced models such as those employed by Wikipedia (general knowledge) or Stack Overflow (computer programming) have proven successful at disseminating knowledge and at both attracting and creating expertise...why not apply similar techniques to scientific research communities?
Some steps in the right direction have been made. Most notably, recently proposed legislation in the U.S. attempts to make all federally funded research publications available to the public free of cost. New journals (e.g., the Midas and Insight journals) have adopted the concept of "open access", attempting to leverage crowd-sourcing techniques for review and rating of publications and requiring submissions to be completely open to the public. These journals also encourage reproduction and validation of published work, something woefully ignored in most modern research fields. Though these implementations may not catch on beyond small communities within computing research, their existence demonstrates a desire for easier, more open access to research, and for review by the entire research community rather than by small committees of anointed experts. It is imperative that communities be formed within research fields, outside of individual labs and journal/conference communities, if findings are to gain appropriate visibility in the modern research environment.
Like any force of nature, this "avalanche" of research is powerful. We can focus our efforts on the task of stifling it, or we can harness its energy to benefit the world.