By Esther Nakkazi
In contemporary academia and research, scientific excellence, which brings promotions and recognition, relies heavily on metrics such as journal impact factor, researchers’ ability to secure grants, and citation counts.
This approach may introduce bias, lacks a holistic perspective, and does not necessarily constitute a proper definition of excellence in research.
Instead, the definition of excellence in research should focus more on the quality of the research itself, participants at the fifth International Network for Government Science Advice (INGSA) meeting, held in Kigali last month (1-2 May 2024), heard.
In other words, excellence in research should extend beyond grant acquisition, publication records, and citation counts to include broader factors such as policy influence, effective science communication, and research impact and significance, the meeting agreed.
“Our current approach to assessing research, primarily through metrics such as impact factor and citation count, tends to yield subjective evaluations. Thus, I’m advocating for a more holistic perspective in our assessments,” said Edmond Sanganyado from the Department of Applied Sciences, Northumbria University, UK, and a Global Young Academy alumnus.
In some academic publishing domains, such as in China, journals exchange citations among themselves, creating a form of intellectual corruption akin to ‘you scratch my back, and I’ll scratch yours.’
“How do you compete with Chinese researchers when they cite each other over 40,000 times and in Africa you do not publish? It is the men citing each other and Chinese citing each other,” said Linley Chiwona-Karltun, from the Swedish University of Agricultural Sciences.
“Now more than ever, I am convinced that having people from different backgrounds is essential,” said Marie-Violaine Dubé Ponte, a member of the Comité intersectoriel étudiant (CIE), Laval University, Canada. “I think it is an effective way to bridge the gap between science and society.”
The meeting heard how some countries are doing things differently. For instance, in 2013 South Africa began a discussion about the importance of engaging the public in what researchers were actually doing.
South Africa then published a document, the Science Innovation Framework, which sets out the policy for the whole country on how researchers, universities, and government departments can participate in science engagement.
“One of the things that I loved about this framework was how people-centered it was because there is a temptation when we are trying to reframe or define research excellence, we tend to focus on the science, not those who are in science,” said Sanganyado.
The framework currently falls under the Department of Science and Innovation. It outlines specific aims, highlighting the importance of popularizing science for a critical public and ensuring informed critique of innovations and scientific advancements. It also emphasizes the importance of science communication and of profiling South African science.
The framework also outlines how science engagement will be funded. Government departments are required to set aside funds to ensure the aims of these science engagements are realized, and they also provide policies to higher education institutions, incentivizing science engagement.
For example, in 2013-2014 the University of Cape Town introduced a scoring system in which participation in research-related social engagement earns points credited towards promotion. Similarly, at the national level, science councils set aside awards recognizing those involved in science engagement.
“When you’re evaluating a grant, also look at aspects of public engagement, that is what they are actually doing right now in South Africa,” said Sanganyado. He compared this with Zimbabwe, his home country, which this year released guidelines on how universities should recruit academic staff; missing from those guidelines was any mention of public or social engagement.
“What was emphasized (in Zimbabwe) was the amount of money that you got from grants, the number of publications that you have, your level of education, but nothing on social engagement,” he said.
The Coalition for Advancing Research Assessment (COARA) is trying to do things differently.
Menico Rizzi, a COARA Steering Committee member, says it is now acknowledged in the research community that the current method of research assessment often impedes the true impact of scientific endeavors and may inadvertently restrict curiosity-driven research.
This is primarily due to the prevalent reliance on quantitative measures in many institutions, he says. COARA is committed to fostering inclusivity, ensuring that participation is not limited to individuals but extends to institutions as well.
“The foundation of our coalition rests upon an agreement document, originating in Europe but with a global scope. While not exclusively a European initiative, we are actively pursuing a worldwide outreach strategy to achieve global representation,” says Rizzi.
The goal is to expand across all continents, striving for inclusivity in the coalition’s governance structures; in the medium term, the aim is to have representatives from each continent on the steering board.
But there is the challenge of moving away from a culture of research assessment that communities have adopted for years, one that relies on metrics.
“We are creating a community. We have to go back to recognize the quality of the content,” he emphasises.
“If you want to change things, it’s very challenging,” says Rémi Quirion, Quebec’s Chief Scientist, Fonds de recherche du Québec, and President of INGSA-Canada.
Quirion believes this may be an opportunity for Africa, especially for the newer universities and forward-thinking organizations, which may be open to exploring new approaches and learning from them. “It will not be 100% proof. Sometimes it will work, sometimes it won’t.”
Ends