Poster presentation: How to support early career researchers with identifying trustworthy academic events?
Formal Metadata
Title | Poster presentation: How to support early career researchers with identifying trustworthy academic events?
Number of Parts | 20
Author | Julian Franken, 0000-0003-3999-253X (ORCID)
License | CC Attribution 3.0 Germany: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifiers | 10.5446/58079 (DOI)
Language | English
Transcript: English (auto-generated)
00:00
And with that, let's just see if we're ready for Julian Franken. Julian, hello, how are you? Hi, I'm fine, thank you. Excellent, yes, please remember to share your screen. And once you're ready with that, we can begin with your presentation. Yes, okay, are you seeing my screen now?
00:22
Yes, yes, we do, yes. All right, hello everybody, my name is Julian. I'm from TIB, the Leibniz Information Centre for Science and Technology in Hanover. My poster is titled "How to Support Early Career Researchers with Identifying Trustworthy Academic Events." I'm working on a DFG-funded project called ConfIDent,
00:42
and our goal is to build a platform that supports researchers in finding the right academic events, such as conferences like this one, and in avoiding the wrong ones, that is, predatory or fraudulent events. Recently, there was a report published
01:01
about predatory practices, in particular predatory journals and predatory conferences, by the InterAcademy Partnership. One of the core insights, I would say, is that conference quality, including predatory conferences, is best seen as a spectrum,
01:20
as you can see here at the top; I copied this infographic from the report. Typical markers are also shown to illustrate what, for example, a predatory conference looks like. And some of those typical markers for predatory ones
01:42
are, for example, that in extreme cases they don't take place at all, so they can be a complete scam, or they only pretend to have a peer review process but actually don't. ConfIDent wants to keep those conferences and events out of its database. As you can see here at the top, I marked those types.
02:06
In the infographic below, I tried to illustrate roughly how the process works, that is, how we intend to keep those events out. As you can see, first a new event is entered,
02:21
and then we want to conduct some automated checks. For example, if an organizer is on the blacklist, we want to keep that event out. If it is not on the blacklist, markers that can already be identified by a machine are highlighted and then sent on to the manual check
02:41
so that a professional can look into them. If that check is passed, the event enters the ConfIDent database. After that, we hope to involve the community as best we can and give them the opportunity to flag events that they deem predatory
03:01
or not trustworthy, and those will be reviewed again. But in the end, the most important part of this process is still the researchers' own scrutiny. We can only be as good as our data is, and the researchers themselves still have to rely
03:21
on their own scrutiny and do their own investigation and research. We hope to inform them as best we can. As you can see at the bottom, I tried to indicate that certainty about quality increases over the course of this process, but there are always challenges. For example, the first is: what if the entered data
03:43
is false in the beginning? Somebody could simply lie about anything concerning the event. Or, as another example, an organizer could be named who is not actually involved in organizing the event; that would be a problem too.
04:01
This is one of the major challenges. More generally, this raises the issue of how to codify these markers at all. Most of the markers mentioned in the report can only be checked through a thorough investigation by the researchers themselves, by looking at the website, for example,
04:21
and really digging deep into the conference. Getting this into a database is a challenge for us. Could you please wrap up your thought, Julian? Last sentence, please. Okay, that's basically all of the challenges we will probably encounter, and thanks for your attention.
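The vetting workflow described in the talk (automated blacklist check, highlighting of machine-detectable markers, manual review by a professional, and later community flagging) could be sketched roughly as follows. This is a hypothetical illustration, not ConfIDent's actual implementation; the `Event` fields, the `BLACKLIST` set, and the check functions are all assumed names invented for this sketch.

```python
from dataclasses import dataclass, field

# Illustrative blacklist entry; the real list would be a curated dataset.
BLACKLIST = {"Fake Conference Organizers Ltd"}

@dataclass
class Event:
    name: str
    organizer: str
    claims_peer_review: bool = True
    community_flags: list = field(default_factory=list)  # filled after acceptance

def automated_check(event):
    """Step 1: reject blacklisted organizers outright; otherwise collect
    machine-detectable markers and pass them on for manual review."""
    if event.organizer in BLACKLIST:
        return None  # kept out of the database entirely
    markers = []
    if not event.claims_peer_review:
        markers.append("no peer review process advertised")
    return markers

def manual_check(event, markers):
    """Step 2 (stub): a professional reviews the highlighted markers."""
    return len(markers) == 0  # sketch: accept only if nothing was flagged

def vet(event):
    """Run the pipeline; accepted events can still be flagged by the
    community later and reviewed again."""
    markers = automated_check(event)
    if markers is None:
        return "rejected: blacklisted organizer"
    if not manual_check(event, markers):
        return "rejected: failed manual check"
    return "accepted"

print(vet(Event("ExampleConf 2022", "Honest Society")))  # accepted
```

In this sketch the automated stage never accepts an event on its own: it either rejects outright or hands a (possibly empty) list of markers to a human, matching the talk's point that certainty about quality only increases gradually along the process.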
04:42
Super, thank you very much, Julian. And of course, just a reminder, when we're done here with our short presentations, you'll have a chance to go and actually go into the virtual rooms, meet the speakers, and continue to talk with them one-on-one. So we're looking forward to that as well. Good, thank you very much, Julian. Appreciate it.