

The “publish or perish” approach has become an integral part of academic life when seeking positions, striving for promotion, or competing for funding. It often hinges on journal-based metrics, which push researchers to seek publication in journals indexed in the Web of Science. Under pressure to produce a certain number of publications in such journals, researchers may settle for journals with a lower impact factor, i.e., journals that are less popular and visible in the scientific community. Even more concerning, researchers might publish their results in predatory journals. This paper analyzes the consequences of introducing journal indicator-based academic evaluation by examining the productivity and publication patterns of researchers. Moreover, it investigates the correlation between journal-based academic evaluation rules and researchers’ ethics. The analysis is based on bibliometric data collected from the Web of Science database. The case study is the Serbian research landscape before and after the introduction of journal metrics-based academic evaluation.
| Funding sponsor | Funding number | Acronym |
|---|---|---|
| Ministarstvo Prosvete, Nauke i Tehnološkog Razvoja (Ministry of Education, Science and Technological Development) | | MPNTR |
The academic adage “publish or perish” describes the pressure exerted on academics by authoritative bodies (institutions, funding agencies, or governments) to publish as many academic papers as possible in order to advance their careers (De Rond and Miller). This “publish or perish” environment has contributed to the replication crisis, in which many scientific studies cannot be reproduced, even though replicability is a fundamental principle of the scientific method (Kiai). Indeed, if governments and institutions pressure researchers to produce a high number of scientific articles and other expected outputs, researcher productivity may improve, but negative consequences can follow as well: some researchers may prioritize the quantity of research over its quality, or even compromise their ethical standards to meet expectations, secure promotion, or increase their chances of obtaining funding. To address this problem, this paper explores how changes in government strategy, such as the introduction of a new evaluation rulebook for promotion to scientific positions and for the assessment of national project proposals based on journal metrics, can influence the scientific landscape of a country. Although the San Francisco Declaration on Research Assessment (SF DORA) recommends eliminating journal-based metrics, such as journal impact factors, from funding, appointment, and promotion considerations, the application of such metrics may nonetheless have some positive effects on a local scientific community (Cagan).

Research assessment reform has recently become a widely discussed international topic. The SCOPE framework for responsible research assessment was developed by the Research Evaluation Group of the International Network of Research Management Societies (INORMS). This structured, step-by-step approach is intended to assist research managers and anyone engaged in research evaluation, and it serves as a valuable tool both for planning new evaluations and for scrutinizing existing ones. The acronym SCOPE reflects its key components: S for START with what you value, C for CONTEXT considerations, O for OPTIONS for evaluating, P for PROBE deeply, and E for EVALUATE your evaluation. In July 2022, the Coalition for Advancing Research Assessment (CoARA) released the Agreement on Reforming Research Assessment, a document outlining principles, commitments, and a timeframe for implementing reforms aimed at fostering responsible research assessment; by the end of 2023, over 600 institutions had endorsed it. Concurrently, ongoing EU-funded projects such as OPUS and GraspOS focus on reshaping research assessment. The Open and Universal Science project (OPUS) is dedicated to establishing coordination and support measures for reforming the assessment of research and researchers in Research Performing Organisations and Research Funding Organisations, with the goal of creating a system that encourages and rewards researchers for adopting Open Science practices. Another initiative, Next Generation Research Assessment to Promote Open Science (GraspOS), aligns with evolving policy reforms and strives to pave the way for a research assessment system that is both Open Science-aware and responsible. Various national initiatives, such as the Finnish example (https://doi.org/10.23847/isbn.9789525995282), are also actively engaged in reforming research assessment.
While the research in this paper focuses solely on the scientific landscape of one country (Serbia) and on a research assessment reform that took place 15 years ago, the findings may have broader applicability to other national contexts. The lessons drawn from this case could prove valuable for countries and institutions contemplating research assessment reforms in the near future. In 2008, the Serbian government and Ministry of Science prescribed new academic evaluation rules for promotion to scientific positions (assistant professor, associate professor, full professor, researcher, senior researcher, etc.). The same rules are used to evaluate the competency of consortium teams when selecting projects funded by the Serbian Ministry of Science. According to the rules (available in Serbian at https://www.mpn.gov.rs/wp-content/uploads/2017/03/Pravilnik-2017-preciscen-tekst.pdf), to be promoted to a higher position, and to serve as a principal investigator or participant in a nationally funded project, a researcher must publish a certain number of articles in journals indexed in the Web of Science.
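To make the before/after comparison underlying this kind of study concrete, the following is a minimal sketch of how yearly publication counts could be aggregated and split around the 2008 rulebook. It assumes a hypothetical CSV export of Web of Science records with an illustrative "year" column; the file name and schema are assumptions for illustration, not the paper's actual data or method.

```python
# Minimal sketch: aggregate publications per year from a hypothetical
# Web of Science CSV export and split totals around the 2008 reform.
import csv
from collections import defaultdict

RULEBOOK_YEAR = 2008  # year the Serbian evaluation rulebook took effect


def yearly_counts(path):
    """Count publications per year from a CSV export (assumed 'year' column)."""
    counts = defaultdict(int)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[int(row["year"])] += 1
    return counts


def before_after(counts, pivot=RULEBOOK_YEAR):
    """Split total publication output into pre- and post-reform periods."""
    before = sum(n for year, n in counts.items() if year < pivot)
    after = sum(n for year, n in counts.items() if year >= pivot)
    return before, after


if __name__ == "__main__":
    counts = yearly_counts("wos_export.csv")  # hypothetical file name
    before, after = before_after(counts)
    print(f"Publications before {RULEBOOK_YEAR}: {before}")
    print(f"Publications from {RULEBOOK_YEAR} on: {after}")
```

The same aggregation could be keyed by researcher or by journal quartile to study the publication-pattern shifts the paper describes; the split around a single pivot year is only the simplest variant of such an analysis.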