Research Integrity Risk Index (RI2): Kita Perlu Introspeksi
The report by Dr. Lokman Meho et al., a faculty member at the American University of Beirut, has sparked lengthy discussion among lecturers and researchers in Indonesia, above all among those at the universities examined by Meho's team.
Amid the protests and objections of officials, rectors, and the wider higher-education community, it is worth pausing to reflect and look inward. Meho tested 1,500 universities worldwide, among them 13 Indonesian universities that were included because they are considered top-tier.
What about the other universities? The issue has been widely covered by online outlets such as Kompas, Tempo, and others. Between 2017 and 2020, Indonesia saw a tsunami of scientific articles: output ballooned to first place in Southeast Asia. The wise counsel of Indonesia's education elders applies here: we should value quality over quantity.
The government should promptly work out the best way to ensure this happens only this once. The regulations on student publications (bachelor's, master's, doctoral), lecturer research publications, promotion requirements (functional academic ranks), semester lecturer-workload (BKD) reports, and so on should be re-examined, because the problematic articles and papers arise from exactly these requirements.
Sound policy, shaped by the experts at the Ministry of Higher Education, Science, and Technology, should restore the dignity of Indonesian higher education. Now is the time, because Meho will release his report regularly, and his aim is positive: to act as a catalyst toward goodness, truth, and responsibility.
As a nation grounded in Pancasila, particularly its first principle, belief in the One Almighty God, we ought to uphold integrity as teachers, researchers, and servants of the community: a Tridharma that is essential, not artificial.
Introspection is more useful than contesting the methodology of Meho et al.
An excerpt on the RI² is quoted below. First, copied here, the names of the universities:
Red Zone (High Risk):
- Bina Nusantara University (BINUS)
- Universitas Airlangga (UNAIR)
- Universitas Sumatera Utara (USU)
- Universitas Hasanuddin (UNHAS)
- Universitas Sebelas Maret (UNS)
Orange Zone (Moderately High Risk):
- Universitas Diponegoro (UNDIP)
- Universitas Brawijaya (UB)
- Universitas Padjadjaran (UNPAD)
Yellow Zone (Moderate Risk):
- Institut Teknologi Sepuluh Nopember (ITS)
- Universitas Indonesia (UI)
- Institut Teknologi Bandung (ITB)
- Institut Pertanian Bogor (IPB)
- Universitas Gadjah Mada (UGM)
The Research Integrity Risk Index (RI²): A Composite Metric for Detecting Risk Profiles
The RI² is the first metric explicitly designed to
profile research integrity risks using empirically grounded, transparent
indicators. Unlike conventional rankings that reward research volume and
citation visibility, RI² shifts focus toward integrity-sensitive metrics that
are resistant to manipulation and bibliometric inflation. In its current form,
RI² comprises two primary components:
- Retraction Risk: Measures the extent to which a university’s research portfolio includes retracted articles, particularly those retracted due to data fabrication, plagiarism, ethical violations, authorship or peer review manipulation, or serious methodological errors (Fang et al., 2012; Ioannidis et al., 2025). It is calculated as the number of retractions per 1,000 articles over the most recent two full calendar years before the last (e.g., 2022-2023 for an analysis conducted in 2025), normalizing for research output and time-lag effects. Elevated retraction rates may reflect weaknesses in research oversight and institutional culture.

The analysis uses retraction data from three databases to evaluate institutional vulnerability to research misconduct or oversight failures. The author extracted and uploaded into SciVal all available original DOIs and PubMed IDs (PMIDs) associated with retracted articles from Retraction Watch, Medline, and Web of Science. As of June 18, 2025, the Retraction Watch Database listed 43,482 entries marked as “Retraction” and classified under the following document types: case reports, clinical studies, guidelines, meta-analyses, research articles, retracted articles, review articles, letters when associated with research articles, and revisions when associated with review articles. These types were selected because, after cross-referencing, they corresponded to articles and reviews in Medline and Web of Science. Following the exclusion criteria used by Ioannidis et al. (2025), 2,238 records were removed due to non-author-related reasons (e.g., “Retract and Replace” and “Error by Journal/Publisher”). Of the remaining Retraction Watch entries, 38,316 successfully matched with SciVal records. The remaining 5,166 could not be matched, either because they were published in journals not indexed by SciVal or lacked identifiable DOIs or PMIDs in the databases.

To supplement the Retraction Watch dataset, an additional 4,416 unique retracted articles were identified from Medline (2,737) and Web of Science (2,850) that were classified as “Retracted” or “Retracted Publication” and tagged as articles or reviews. In total, 42,732 unique retracted articles were matched to SciVal and included in the analysis. Scopus was excluded due to inconsistent classification practices: its “Retracted” label encompasses a broad range of document types, including letters and editorials, making it unsuitable for isolating retracted research articles and reviews.

To account for the time lag between publication and retraction, the analysis focused on articles published in 2022 and 2023, rather than 2023-2024, to better capture recent institutional behaviors while ensuring a broader view of retraction activity (Candal-Pedreira et al., 2024; Feng et al., 2024; Fang et al., 2012; Gedik et al., 2024). By June 18, 2025, the number of retracted articles stood at 10,579 for 2022, 2,897 for 2023, and 1,601 for 2024.

Worldwide, as of June 18, 2025, the retraction rate for 2022-2023 averaged 2.2 per 1,000 articles, with the highest rates observed in mathematics (9.3) and computer science (7.6) and the lowest in arts and humanities (0.2).

According to Retraction Watch, the 15 main reasons for retractions are: Investigation by Journal/Publisher (48%), Unreliable Results and/or Conclusions (42%), Investigation by Third Party (34%), Concerns/Issues About Data (30%), Concerns/Issues about Referencing/Attributions (26%), Paper Mill (25%), Concerns/Issues with Peer Review (23%), Concerns/Issues about Results and/or Conclusions (19%), Fake Peer Review (19%), Computer-Aided Content or Computer-Generated Content (18%), Duplication of/in Image (10%), Duplication of/in Article (8%), Euphemisms for Plagiarism (6%), Investigation by Company/Institution (6%), and Lack of IRB/IACUC Approval and/or Compliance (6%).
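As a rough illustration, the Retraction Risk component described above (retractions per 1,000 articles over a two-year window) can be sketched in Python. This is a minimal sketch of the stated formula only; the function name and the sample figures are hypothetical, not taken from the RI² dataset:

```python
def retraction_risk(retracted: int, total_articles: int) -> float:
    """Retractions per 1,000 articles over the two most recent full
    calendar years before the last (e.g., 2022-2023 for a 2025 analysis).

    Illustrative sketch only; inputs below are invented examples.
    """
    if total_articles == 0:
        return 0.0  # avoid division by zero for institutions with no output
    return 1000 * retracted / total_articles

# Hypothetical example: 18 retractions among 6,000 articles published in
# 2022-2023, which would sit above the worldwide 2022-2023 average of
# 2.2 retractions per 1,000 articles.
print(retraction_risk(18, 6000))  # 3.0
```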
- Delisted Journal Risk: Quantifies the proportion of an institution’s publications that appear in journals removed from Scopus or Web of Science due to violations of publishing, editorial, or peer review standards (Cortegiani et al., 2020). This is measured over the most recent two full calendar years (e.g., 2023-2024 for an analysis conducted in 2025) and reflects structural vulnerabilities in quality control and publishing practices. Such publications continue to influence bibliometric metrics even after delisting, potentially distorting evaluative benchmarks. This component includes all articles published in journals delisted by Scopus in 2023-2024. It also includes articles in journals delisted by Web of Science in 2023-2024 and still actively indexed in Scopus.

Scopus discontinues or delists journals through an ongoing title re-evaluation program. Journals may be flagged for re-evaluation due to (1) underperformance on key bibliometric benchmarks (citation rate, self-citation rate, and CiteScore); (2) formal complaints about publication practices; (3) outlier publishing behaviors detected algorithmically (e.g., sudden spikes in output or geographic concentration); and (4) continuous curation feedback from the Content Selection and Advisory Board. Journals that fail re-evaluation are delisted, with indexing discontinued prospectively while retaining prior content.

Web of Science delists journals following an in-depth editorial re-evaluation conducted by its independent in-house editors, who assess journals against 24 quality and four impact criteria. Titles are re-evaluated when flagged by internal monitoring, community feedback, or observed shifts in editorial quality. If a journal no longer meets quality standards, such as lacking editorial rigor, publishing ethically questionable content, or deviating from peer-review norms, it is removed from coverage, with future content no longer indexed. In serious cases, previously indexed articles may also be withdrawn.

Between 2009 and June 2025, a total of 974 unique journals were delisted: 855 by Scopus and 169 by Web of Science. Of these, 206 were delisted in 2023-2024 and had 124,945 articles in the Scopus database. Institutional affiliations for these articles were tracked globally to evaluate exposure to low-integrity publication channels.
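The Delisted Journal Risk component, a share of an institution's output appearing in delisted venues, admits a similarly small sketch. Again, the function name and figures are illustrative assumptions, not the RI² implementation:

```python
def delisted_journal_risk(delisted_articles: int, total_articles: int) -> float:
    """Proportion of an institution's articles over the window (e.g.,
    2023-2024) that appeared in journals delisted by Scopus, or delisted
    by Web of Science while still actively indexed in Scopus.

    Illustrative sketch only; inputs below are invented examples.
    """
    if total_articles == 0:
        return 0.0  # no output, no measurable exposure
    return delisted_articles / total_articles

# Hypothetical example: 240 of 8,000 articles in delisted journals,
# i.e., a 3% exposure to low-integrity publication channels.
print(delisted_journal_risk(240, 8000))  # 0.03
```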
Data for both components serve as proxies for broader
research integrity concerns, such as paper mills (businesses that sell
authorship), citation cartels (reciprocal citation networks used to inflate
impact), citation farms (organizations or networks that generate or sell
citations), fraudulent authorship practices, and other forms of metric gaming
(Abalkina, 2023; Maisonneuve, 2025; Candal-Pedreira et al., 2024; Feng et al.,
2024; Ioannidis & Maniadis, 2024; Lancho Barrantes et al., 2023; Smagulov
& Teixeira da Silva, 2025; Teixeira da Silva & Nazarovets, 2023;
Wright, 2024). Importantly, both components reflect verifiable outcomes rather
than inferred behaviors, making them robust indicators of institutional-level
risk.
Source: https://sites.aub.edu.lb/lmeho/methodology/
A Catalyst for Change
Although Meho does not consider the RI2 a quick fix for all academic publishing issues, he does believe it highlights the importance of research integrity in institutional assessment and university rankings. Meho hopes that the RI² will gradually be used “not just to detect and flag, but to help universities benchmark their performance, design better policies, and celebrate those who uphold high standards in responsible research practices."
Future releases will continue to rank the top 1,500 most publishing universities, while also assigning RI² ratings to an additional 1,000 universities and the 100 most publishing research centers. This expanded coverage will allow broader regional comparisons and more effective monitoring of global trends over time.
Sources: