Implementing Systems-level Reform: Institutional Change towards Transparency
Formal Metadata
Title: Implementing Systems-level Reform: Institutional Change towards Transparency
Number of Parts: 13
License: CC Attribution 4.0 International: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifier: 10.5446/55218 (DOI)
Transcript: English (auto-generated)
00:12
Hi everyone, welcome to my talk. My name is Sandersan Onie, and I'm giving a talk titled Implementing Systems-level Reform: Institutional Change towards Transparency. I'd like to start
00:22
off and mention that over the past 10 years, the scientific landscape has completely changed. If we compare ourselves to where we were 10 years ago, it would be unrecognizable. We've had so many new ideas, so many new ways of looking at how we do science. We have so many new suggestions on how this is going to work. We're designing a credible
00:42
research system for all of us. The thing is, we have to remember that we cannot stop at ideation. We can talk around the table for as long as we'd like, and we can come up with many different ideas. But if the end goal is to actually implement these systems, and to have research systems produce credible research that will actually benefit society, then we have to
01:05
remember that the end goal is actually implementation, not just ideation. And it's difficult, because determining the vision for a credible research system and implementing it are altogether different challenges. At the same time, research doesn't exist in a vacuum.
01:24
Whenever we start talking about implementation, there are all these factors and issues that we need to consider. For example, there are existing research cultures. Whatever we implement, whatever initiative we have, isn't going to sit on top of our research culture. It's going to interact and integrate with it. And so we have to remember things such as
01:45
centralization of authority, because a lot of research cultures outside of North America, Australia, and Europe have more centralization of resources and authority at the government level, rather than at the institutional or researcher level. And in fact,
02:00
there are larger power gaps, more power scaling as you go up in the ranks, and that will affect who develops and determines policies as well. And on that note, we have to remember that not everyone has the same autonomy. If you're in a research culture that pays very low wages and you're relying on publications to survive or to progress your career, then you may not have as much capacity
02:25
to switch to research practices that may yield more credible research but might take you away from the KPIs. Number three is research integrity, where we have to remember that with any well-meaning research system or initiative,
02:44
if we don't have the underlying research values or integrity, it's not going to turn out well. For example, take the notion of preprints. In Indonesia, if you upload something to a preprint server, it gets indexed as part of your KPI. Now, there have been several cases where researchers and lecturers would have their students upload each and every one
03:02
of their assignments with the lecturer as a co-author in order to bolster their research portfolio. So we also need to ask: can these well-meaning initiatives go wrong if the underlying research integrity isn't there? There's also the lability of research culture, where in a lot of places, they've only recently implemented research policies.
03:24
And a lot of them are updating them quite quickly. China is an example of this, where they're constantly updating their research policies, to quite drastic effect, in fact. And so, how do researchers react to this? If the culture is always labile and never settling,
03:43
how is a new initiative coming in going to behave? Could it be easier to implement, because they're already implementing new things as it is? Or, even if it's easy to take up, is it going to be more difficult to actually settle and be sustainable in the long run, because the research culture never settles due to the lability of the
04:01
policies? We have to consider all of this. For example, there were the transparency audits that happened some time ago. A lot of people, especially from research cultures in North America, Australia, and Europe, said: this is not going to work for us. No, actually, they didn't even say this isn't going to work for us. They said,
04:20
this isn't going to work. This is not the way to do things. But consider a research culture where there's very little space for researchers to move or to have autonomy, where everything is guided by top-down policy and there's more centralization of authority. There, a transparency audit could be critical to developing a research system that's
04:41
credible. And so, policymakers' priorities are another important thing to consider, because there could be severe pushback. A couple of years ago, there was a huge controversy where the Indonesian government, behind closed doors of course, was pushing back against open science. Why? Because churning out journals was part of their KPI at the time. And they saw open
05:03
access as an enemy to this, thinking: well, if you can put things up without peer review, then why do we need all these journals? And so they pushed back against open access, and against open science as a whole, thinking that open science consisted only of open access. Now, I'm not saying that they're right. I'm saying we need to be cognizant of
05:20
what their priorities are and how they're likely to react to new incoming initiatives. There's also a large disconnect between umbrella-level networks and grassroots-level researchers. For example, UNESCO recently released their open science recommendations, and one piece of feedback from different countries was: it's really difficult for us to
05:41
implement this. Very often now, we're seeing lots and lots of networks everywhere working to improve transparency and credibility. But just as often, there's a big disconnect with the typical researcher, who may not have an audience with or access to these networks, and yet is expected to implement the things they're recommending.
06:05
And remember, the goal is that every researcher, every institution, every system can produce credible science, regardless of the journey there. They might take different pathways, but that's the end goal. So, what can we do to implement strategic change? Number one: we have to navigate competing priorities. For example, in 2019,
06:26
some colleagues and I wanted to create a conference, an event that could reform Indonesian science. But we knew that there was pushback from the government. So, what did we do? We held an open science conference without mentioning the words open science even once in any of the marketing materials. Instead, we knew that their priority
06:44
was to move science forward in the region. And so, we framed it in these terms: okay, we're going to create a conference to upskill all the researchers. But we knew the way to get there was through open science. And so, we had all the sessions talk about transparency and research
07:01
integrity. As a result, we developed good networks with parts of the government, and we were even invited to lead science policy briefs for them. So, by understanding what the different priorities are, we'll be able to implement these initiatives well. We also have to understand the parties at each institutional level. For researchers: are they able to dedicate thought to creating good science, or are they
07:24
just trying to survive? That allows us to work within the system a bit better. Number two is to be sensitive to culture. Did you know that in a lot of cultures in Asia, people won't tell you if they disagree with you? That's why a lot of researchers from outside find it frustrating: they're trying to communicate with a completely different culture.
07:42
And so, we have to know the people, because science is a social enterprise. In the end, it's not about the system, it's about the people in the system. By understanding the culture, what the country's culture prioritizes, which aspects of open science, transparency, and integrity resonate, and what language they would use, we'll be able to implement these initiatives a lot better. Number three: we have to co-design,
08:06
not transfer. A lot of the time, initiatives are designed and implemented by one group for a completely different group, whether ideated in the global north and implemented in the global south, or ideated and implemented by UN-level organizations,
08:26
or even at the government or institutional level, while the grassroots researchers are living in a completely different world. It doesn't work that way. And then we wonder why a lot of aspects of open science and research integrity have had slow uptake in areas outside of North America, Australia,
08:44
and Europe: because they weren't co-designed or ideated there. So the way to do this is to bring everyone to the table to design these policies and these ideas together, because very often, we don't realize that the ideas we come up with are perfect for our own research context
09:06
without understanding how different it is out there. There are many different ways we can do this. For example, one of the events I'm leading, called Advancing Science in Southeast Asia, is happening in October this year, and in one of the sessions, we're actually
09:22
bringing together UN-level organizations such as INGSA and the International Science Council, and we're in talks with UNESCO, along with institutions in each country and grassroots researchers, to come together and design policy documents together, because if we can integrate ideation and implementation, then the whole process becomes a lot smoother.
09:45
And so, there are many different things we need to consider, and I'd like to leave everyone with this thought. For every idea we have pertaining to science as a whole, we must consider in which context it will work, because the goal isn't
10:01
ideation, the goal is implementation. Thanks so much for coming to the talk. I'm looking forward to hearing the discussion and the questions. Thanks, everyone.