Year: 2017 | Volume: 10 | Issue: 3 | Page: 219-221
Beall's list vanishes into the blue… what next?
Department of Community Medicine, Dr. DY Patil Medical College, Hospital and Research Centre, Dr. DY Patil Vidyapeeth, Pune, Maharashtra, India
Source of Support: None, Conflict of Interest: None
How to cite this article:
Banerjee A. Beall's list vanishes into the blue… what next?. Med J DY Patil Univ 2017;10:219-21
There are two models of scientific journal publication: the traditional and the open access. In the traditional model, readers and libraries pay to sustain the journal. In open access, the cost is recovered from the authors, though sometimes indirectly, as when the fee is paid by a funding agency. While the traditional model limited readership to those who could pay, in the open access model anyone with access to the internet can read the paper free of cost. Briefly, either "readers pay" or "authors pay." There is also a third model, "platinum open access," which is free to readers as well as authors. Such journals are usually financed by universities or associations; this journal, for example, is financed by the Dr. DY Patil University (Vidyapeeth) Society, Pune, India.
The open access movement is being driven by developments in information technology and by trends toward information sharing that reflect the ethos of present-day digital networks. Open-source sharing of information has vindicated Moore's Law in the field of computer science, and the outcome of data sharing has been very rewarding in information technology, driving innovation at a rapid pace through collaborative efforts across the globe. As Friedman states, "There is something wonderfully human about the open-source community. At heart, it's driven by a deep human desire for collaboration and a deep human desire for recognition and affirmation of work well done, not financial reward… Millions of hours of free labor are being unlocked by tapping into people's innate desire to innovate, share, and be recognized by it."
In view of this phenomenon, which has yielded remarkable results by using the net to share information, the open access model in scientific publishing is here to stay. It is a strong tool with immense potential to drive research. In developing countries, where under the "readers pay" model academics or their institutions may not be able to afford access to the exponentially growing medical literature, the "authors pay" model may be a necessity. However, like all strong tools, the open access model is double-edged. An emphasis on quantity over quality tends to overload the published literature, and an important paper may get buried under the clutter. A more serious concern is the unethical publication practices that have grown with the open access model.
Enter Beall. Jeffrey Beall, a scholarly initiatives librarian at the University of Colorado, Denver, coined the terms "predatory journals" and "predatory publishers" for those who exploit the open access model for monetary gain at the expense of scientific integrity. These prey on novice researchers by soliciting manuscripts with the promise of quick publication. Such questionable journals and their publishers follow dishonest publication practices without proper peer review of manuscripts. Fast publication is assured as long as the authors pay, often a hefty sum demanded after acceptance of the manuscript. The terms and conditions of the payment model are fraudulent.
Academics may be lured onto the editorial boards of these fake journals, sometimes without their knowledge. These journals are also a convenient outlet for faculty who are not serious researchers but need publications for tenure and promotion; such authors thrive in the academic anarchy unleashed by these questionable journals and publishers.
In 2008, Jeffrey Beall began monitoring these predatory journals and publishers in his blog, "Scholarly Open Access." Beall called a spade a spade and listed all journals and publishers he judged to be shady. Over time, his list grew to more than 1000 such publishers, and many more journals.
Beall woke the academic community from its slumber, alerting serious academics and institutions around the world to the menace of predatory publishing. Some Indian universities, such as Savitribai Phule Pune University (formerly University of Pune), referred to Beall's list while framing guidelines for research publications. A number of papers on publication ethics cited his list. Beall's contribution to exposing fraud in scientific publishing also featured in the lay press, both in the USA and India.
At the peak of the list's popularity, when it had gained acceptance by academics and institutions globally as a means to identify questionable journals, it abruptly disappeared around mid-January 2017. No reasons were given; Beall subsequently mentioned "pressure and politics" without offering details. Beall's list just vanished into the blue.
Beall has had his share of critics. Some alleged that he was hostile to the open access movement. This is debatable: Beall himself has published in an open access journal, and he has reviewed articles for bona fide open access journals.
Other criticisms of Beall's list may have more substance. The list's black-and-white categorization misses the nuances necessary to judge the quality of a particular publication. Beall has been criticized for relying on the websites of suspect journals rather than direct discussion with the publishers concerned, which may have led to premature conclusions in some cases. His approach may have failed to differentiate between low-quality (but not necessarily unethical) journals and blatantly dishonest ones. His list had high sensitivity, but perhaps also a high number of false positives.
No system is foolproof. Therefore, one should not expect any single method of judging questionable journals to be perfect. Jeffrey Beall's list raised awareness among serious academicians about unethical publication practices. What next?
A number of people and associations have picked up the cue. Some have suggested that authors review their experiences with journals and publishers on a crowd-sourced, "author-reviewed" journal evaluation website, somewhat like Zomato (formerly Foodiebay), which displays customers' reviews of restaurants. It has been suggested that authors evaluate a journal on factors such as the time taken for reviews, editorial and review fairness, and the constructive suggestions given by the review and editorial team. A website from the Netherlands, Quality Open Access Market, has such a mission: journals are scored by crowdsourcing on four critical aspects, namely editorial information, peer review, governance, and workflow. Authors share their experience by completing a valuation score card.
A statement by the Open Access Scholarly Publishing Association (OASPA) mentions, "The publishing community needs stronger mechanisms to help identify reliable and rigorous journals and publishers, regardless of access or business model." The OASPA requires transparency regarding author charges. It also cautions that any direct marketing activities publishers engage in must be appropriate and unobtrusive, in contrast to predatory journals, which aggressively campaign for academics to submit articles or serve on editorial boards using phishing or spam E-mails.
The Grand Valley State University Libraries, Allendale, Michigan, USA, have developed a set of Open Access Journal Quality Indicators to help faculty evaluate the quality of open access journals and publishers. Rather than compiling a list of ethical or unethical journals or publishers, the university's Advisory Committee on Scholarly Communications developed a set of indicators that faculty can use to judge the quality of a journal. Such an approach will go a long way toward empowering academics to make their own judgments. However, these criteria work well only for people with a strong knowledge of English and therefore may not help most researchers from the developing world.
The indicators are not to be used in isolation; rather, a number of indicators, both positive and negative, should be triangulated to get the big picture on the bona fides of a journal or publisher. This will sound familiar to physicians, who use a similar strategy to make a diagnosis, combining signs and symptoms rather than basing the diagnosis on an isolated sign, symptom, or laboratory result. Assessing a journal is as precarious as the uncertainties faced in the practice of medicine.
Some of the positive indicators are: the scope of the journal is stated and focused; the target audience is defined; the editorial board is sound; and the journal is affiliated with an institution or university. These are all judgment calls best made by a researcher familiar with the particular field. Other positive indicators are transparency about author fees, a Digital Object Identifier (DOI), and an International Standard Serial Number (ISSN). Also desirable is indexing beyond Google Scholar, particularly, for open access journals, in the Directory of Open Access Journals or other reputed indexing or abstracting agencies.
Negative indicators are: the journal's website is difficult to locate; the scope of the journal is not focused; the target audience is not well defined; "About" information is missing; and information about the peer-review process is not transparent. Other suspicious features are spamming, aggressive advertising, and repeated lead authors in the same issue.
A recent paper identifies more than a dozen characteristics that distinguish predatory journals from legitimate ones. Some major cues to detecting fake journals identified in this paper are: the scope of the journal extends to nonmedical subjects; spelling and grammatical errors on the journal website; distorted images mimicking genuine websites; fake impact factors; a nontransparent description of the publication process; promises of rapid publication; no retraction policy; a nonprofessional contact E-mail address; and so on.
To conclude, Beall, beginning with the publication of his list of predatory journals and publishers in 2008, raised awareness of dishonest publication practices. By the time his list disappeared in January 2017, other people and associations had picked up the task of educating and empowering authors about questionable journals and publishers. Beall will always be remembered as one of the early pioneers who single-handedly took on the job of ensuring that the scientific literature is not contaminated by misleading papers published in predatory journals, which has been one of the unfortunate side effects of the open access movement.
Friedman TL. Moore's law. Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations. London: Allen Lane – An Imprint of Penguin Books, Penguin Random House UK; 2016. p. 36-84.
Beall J. Predatory publishers are corrupting open access. Nature 2012;489:179.
Patwardhan B, Dhavale DD, Bhargava S, Deshpande R, Jaavare A, Ghaskadbi S, et al. Guidelines for Research Publication. Report of a Committee appointed by Hon Vice Chancellor, Savitribai Phule Pune University; 05 May, 2015. Available from: http://www.unipune.ac.in/uop_files/Report-Guidelines_20-5-15.pdf. [Last accessed on 2017 Jan 29].
Burdick A. Paging Dr. Fraud. The Fake Publishers that are Ruining Science. The New Yorker; 22 March, 2017. Available from: http://www.newyorker.com/tech/elements/paging-dr-fraud-the-fake-publishers-that-are-ruining-science [Last accessed on 2017 Mar 28].
Beall J. Dangerous predatory publishers threaten medical research. J Korean Med Sci 2016;31:1511-3.
Reviewers 2016. Med J DY Patil Univ 2017;1:111-3.
Beaubien S, Eckard M. Addressing faculty publishing concerns with open access journal quality indicators. J Libr Sch Commun 2014;2:eP1133. Available from: http://www.doi.org/10.7710/2162-3309.1133. [Last accessed on 2017 Jan 30].