An Indonesian scientist at work on Borneo. The country’s new index seeks to capture academics’ performance.


How to shine in Indonesian science? Game the system

JAKARTA—Last July, when Indonesia’s Ministry of Research, Technology and Higher Education (RISTEK) here honored eight researchers, along with institutions and journals, for their exceptional contributions to science, observers noticed something odd. Many of the laureates were relatively unknown academics from second-tier universities; underdogs had apparently become leaders.

It didn’t take curious scientists long to figure out why. The honors went to top scorers in Indonesia’s Science and Technology Index (SINTA), a system introduced in early 2017 to measure research performance. Critics showed that several winners had inflated their SINTA score by publishing large numbers of papers in low-quality journals, citing their own work excessively, or forming networks of scientists who cited each other.

It’s unclear whether formal rules were broken, but SINTA’s architects concede they were outwitted. And the revelations have led to a fierce discussion about SINTA, a unique nationwide attempt to capture the output of every academic in a single formula. Some say it should not be used to produce rankings, or should even be abandoned. But the government is undeterred: After a meeting on 3–4 January, it announced the rollout later this year of an improved version. SINTA “gives recognition to Indonesian scientists, triggers competition among them, and motivates them to be better,” says Sadjuga, RISTEK’s director of intellectual property management. (Like many Indonesians, he goes by only one name.)

Indonesia has introduced several other policies in the past 6 years to boost research output from its more than 250,000 academics, who work at more than 4000 universities. University professors may lose almost half of their salary if they don’t publish in international journals, for instance. As a result, the number of papers published by authors in Indonesia has soared from just under 7000 in 2014 to more than 28,000 last year, according to Scopus, a database operated by Dutch publisher Elsevier. Indonesia seems set to overtake Malaysia as the region’s biggest research producer by 2020.

SINTA—also the name of a goddess in the Sanskrit epic Ramayana—turned the pressure up a notch. It combines data from Scopus and Google Scholar with information submitted by Indonesian academics to track published papers, citations, and researchers’ h-index, a controversial metric reflecting both output quantity and citations. These numbers are used to calculate a personal score that is taken into account when academics apply for research grants; a high score may also help with promotions and salary negotiations.
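SINTA’s exact weighting formula is not public, but the h-index component it tracks is well defined: a researcher has index h if h of their papers have each been cited at least h times. A minimal sketch of that calculation:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the researcher
    has h papers each cited at least h times."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still supports an h of `rank`
            h = rank
        else:
            break
    return h

# A researcher with papers cited [10, 8, 5, 4, 3] times has h = 4:
# four papers with at least 4 citations each, but not five with 5.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The example also makes SINTA’s vulnerability concrete: because the metric counts citations without regard to their source, a handful of reciprocal or self-citations on low-cited papers can raise h directly.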

Many other countries use publication and citation data to evaluate research; some pay hefty cash bonuses for papers in top-tier journals. But, “There is nothing like [SINTA] that I know of,” says Diana Hicks, a research metrics expert at the Georgia Institute of Technology in Atlanta. The extra push was welcome, says Danang Birowosuto, an Indonesian physicist at CINTRA, an international research group in Singapore: “Our international competence in science is still very low.”

But many Indonesian academics worried that SINTA might harm their reputations. Thousands joined groups on social media to help each other navigate the new numbers-driven landscape. “Although the original aim was sincere,” discussions soon turned to gaming the system, says plant biologist Andik Wijayanto of the State University of Malang.

Surging ahead

New incentives have driven a sharp rise in Scopus-indexed papers from Indonesia.


In October 2018, Anis Fuad, a health informatician at Gadjah Mada University in Yogyakarta, presented RISTEK with a detailed analysis of the problems. Indonesia’s most-cited 2018 paper so far wasn’t a major breakthrough, Fuad noted, but a study titled “Analysis of Student Satisfaction Toward Quality of Service Facility,” presented at a workshop co-organized by the Indonesian Publications Collaboration Community (KO2PI) and published in conference proceedings, a type of publication that gets minimal peer review. The study had been cited 42 times, often in papers on unrelated topics—including mosque architecture and cold storage of fish—that were also published in conference series or in low-quality open-access journals no longer indexed in Scopus.

One of the paper’s 10 authors was statistician Ansari Saleh Ahmar of the State University of Makassar, who won SINTA awards in two categories last July; he co-authored more than 100 papers in 2017 and 2018 and has been cited almost 600 times. Ahmar is also president of KO2PI, which has run workshops in an extraordinary range of scientific fields. On a poster produced in early 2017, KO2PI promised participants a paper in a Scopus-indexed proceeding in return for a 1.5 million rupiah ($106) fee. Ahmar says he was “surprised” by his own citation rate, but says statistical papers are often cited in seemingly unrelated fields. He says he is no longer active in KO2PI and, given the controversy, would now like to return his award.

After asking Ahmar and other academics suspected of gaming the system for an explanation, RISTEK has deleted their SINTA accounts, Sadjuga says, but it has not withdrawn the awards because “the public shaming is punishment enough.” Sadjuga says problematic data in Scopus and scientists’ unethical behavior contributed to the problem but does not blame SINTA itself. (An Elsevier spokesperson says Scopus has stopped indexing three journals that many Indonesian scientists have published in and is investigating “concerns” about the conference series used by KO2PI, which is published by the U.K. Institute of Physics.)

Gaming aside, Indonesia’s research evaluation should not rely on a commercial database, says Dasapta Erwin Irawan, a hydrogeologist at Bandung Institute of Technology. He also says the system’s preference for Scopus-indexed international journals is misguided, because research in Indonesian journals may be just as good and sometimes more relevant. RISTEK doesn’t entirely ignore local journals: It has created an online portal, named Garuda, to more than 7000 journals in the Indonesian language, as well as a journal accreditation system. But researchers win far fewer SINTA points when papers in local journals are cited and none at all for publishing in them.

That lack of appreciation for locally relevant research violates the “Leiden Manifesto for Research Metrics,” an influential paper Hicks and three co-authors published in 2015. Hicks says SINTA falls short on several other principles in the manifesto, which stipulates that metrics should “support a qualitative, expert assessment” and “account for variation by field in publication and citation practices.” SINTA currently does neither.

A new version of SINTA, set to be launched this year, will integrate data from several additional sources, including the Web of Science and the Indonesian National Library. It will also give researchers credit for other types of output, such as books, artwork, and patents. A new tool will flag self-citation, and the ministry will disseminate scientific integrity guidelines to Indonesian universities.

But Mikrajuddin Abdullah, a physicist at Bandung Institute of Technology, says RISTEK should still review last year’s awards and retract them if they were based on misconduct: “It will teach us that scientific achievement does not come suddenly, but is the result of a long period of perseverance.”