Altmetrics in the library
Formal Metadata
Title |
Series title |
Number of parts | 8
Author |
License | CC Attribution 3.0 Germany: You may use, modify, and reproduce the work or its contents in unaltered or altered form for any legal purpose, distribute it, and make it publicly available, provided you credit the author/rights holder in the manner they have specified.
Identifiers | 10.5446/46274 (DOI)
Publisher |
Year of publication |
Language |
Content Metadata
Subject |
Genre |
Abstract |
Transcript: English (automatically generated)
00:39
Okay, yes, this works, good.
00:43
Welcome back everybody. If everybody can take a seat, if you can. So before moving to the first speaker, I just want to quickly remind you, or actually you can remind me, if you want to say something on Twitter
01:00
about this conference, which hashtag do you have to use? No, ugh, we told him 20 times, but his learning curve is a bit slow. Okay, so which hashtag do you have to use for the conference?
01:22
Excellent, which hashtag do you have to use if you have to follow up for next year? Now it's you, excellent, very good. Keep that in mind. Okay, our next speaker is partly not here and partly here. Stacy Konkiel is talking on behalf also of Sarah Sutton.
01:43
And now Stacy is working for Altmetric, but actually she's also a little bit a librarian in disguise. She has a librarian training, and of course she loves Altmetric. I ask all of the speakers what they think of Altmetrics, and obviously she loves them.
02:01
And go on. Awesome, thank you, Martijn. All right, I've got the lapel mic on, so hopefully those in internet land can hear me. Those who are familiar with me and my work, I'm the outreach and engagement manager at Altmetric now, but I've worked previously with the awesome folks at Impact Story and before that with the equally fantastic folks
02:21
at Indiana University Libraries and UMass Libraries. A little bit of PLOS thrown in there too. My point is, I'm a librarian by training, and I've been interested in altmetrics for many years, since my days at PLOS. When I was on the tenure track at Indiana University, I started to wonder: how can I use altmetrics
02:41
to make my own case for tenure and promotion? And that led me to this kind of larger question of how are other people using these metrics? I asked around, not a lot of very sure answers. People were interested in it, but they weren't sure if they were gonna use it for their own case. They're so new, Altmetrics are, generally speaking,
03:01
that the people I know who had tenure in the libraries did not use them when they made their own case for tenure, so it's kind of uncharted territory. So what I did was, I came up with this pet project that I've kind of had in my back pocket for the last three or four years or so, and when I moved to Altmetric, I was given the opportunity to follow up on this work.
03:23
So I partnered with Sarah Sutton at Emporia State University, and now we're also working with Michael Levine-Clark to administer a survey to figure out, okay, so in the US anyway, at research-intensive institutions, what are librarians actually doing with these metrics? One, how aware of altmetrics are they
03:41
in comparison to other research metrics, like journal impact factor, citation counts? So that's one side of the coin, and the other is how are they using them in their own tenure and promotion processes? And then the third side of the coin is are Altmetrics being used in library services at all? There's currently a lot of hand-waving happening right now
04:03
in the altmetrics field, right? So we're saying altmetrics are good for all of these things, but are people actually using altmetrics for all of these things? So I actually authored an article with David Scherer, formerly of Purdue University, talking about, okay, so if you've got a repository,
04:20
you can potentially use Altmetrics to advocate for your repository, help people to market it to your researchers, kind of a value-added service for repositories and so on. We also talk about Altmetrics might be able to help you with collection development. If you want another data point to help you decide what to purchase, what to weed, and so on,
04:42
Altmetrics can be that other data point to add to your existing data points, like usage statistics, journal impact factors, and so on. So there's a lot of questions around and a lot of theories as to how Altmetrics might be used, but again, not a lot of hard data. So we decided to ask every single librarian
05:02
working at a Carnegie-classified R1 research-intensive university in the US, because it's something like 180, 190 universities, total of 13,000 librarians overall. It's an inexact number, to be sure. I worked with a student and we manually scraped the data
05:20
off of many, many library web pages so we could contact people. So we sent out this survey to 13,000 people. Over 400 responded, so we got a fairly representative sample and we wanted to learn about the things that I talked about previously. So how are Altmetrics being used in the context of one's own tenure and promotion process? How are they being used for library services?
05:44
And then in general, what's the level of awareness? So here is what we learned. In terms of the awareness, people who consider themselves very familiar with or experts in Altmetrics, it varied depending upon the self-reported role of librarians.
06:01
So we see here, we've got collection development librarians, instruction librarians, people who primarily do reference services, scholarly communication support and assessment and then we've got this kind of overall baseline. And as you might expect, scholarly communication support librarians, they are among the librarians who are most familiar with Altmetrics
06:20
and other types of research metrics like journal impact factor, citation and usage statistics. And I apologize for the quality of these graphs. This is still very much in its early stages, this study. So we kind of threw these graphs together for this meeting. But so we've got varying levels of familiarity, but overall, in terms of the percentage of people
06:41
who really know about Altmetrics, obviously it's this yellow line, right? So it's much, much lower than our baseline familiarity with other types of research metrics in libraries. And then we asked them, so how often do you discuss these indicators, these different indicators of research impact when you're on the reference desk, when you are doing a reference interview,
07:01
providing reference services, which all sorts of librarians have to do, right? So as a research data management librarian, you might also have to staff the reference desk for X amount of hours per day. So many people provide reference services, and these slides will be available after the meeting, so they're kind of small to read now, but we see several things.
07:23
So librarians across the board are more likely to discuss citation counts and the journal impact factor in reference interviews, but that's not necessarily saying a lot. So overall, we're much more likely to never or rarely mention metrics in reference interviews. It's just not a topic that seems to come up.
07:40
And of all the types of librarians, scholarly communication librarians, again, much more likely to bring up these issues and the idea of research metrics in general in reference interviews. So how are metrics being used in collection development? This is another kind of hand wavy area. We talk a lot about it, but what are the actual practices?
08:02
So collection development librarians, along with most other librarians, much more likely to use usage statistics to inform collection development decisions. But even though you'd expect maybe collection development librarians, if I were one, I'd be like, how can I use new data points to inform the decisions,
08:22
the tough decisions that I have to make? So I actually expected collection development librarians to be using altmetrics much more than they actually are. We see that it's a very, very low percentage of people who are using it, self-reported collection development librarians. And every other type of librarian that we talk to is just as likely to use journal impact factor,
08:42
ooh, or citation counts. That's what I get for trying to scroll in presenter view. There we go. So every type of librarian just as likely to use journal impact factor or citation counts
09:00
for collection development decisions. But again, we see that standouts are usage statistics, very, very popular, altmetrics not popular at all yet. But I will say that I think that this, even though it's not necessarily the study showing like, yay, everybody's using altmetrics, for all these things we had hoped, this will provide a really good benchmark
09:22
for future similar studies. So we can see, okay, are altmetrics being used for all of these different library services, and is that use growing over time? So our study asked a lot of questions regarding altmetrics and other impact metrics as they relate to libraries.
09:41
So stick around for the altmetrics 15 workshop, we'll be talking about some other findings. And we've got a lot of other questions that we're gonna be presenting on and writing on very soon. Like how are librarians using research metrics for their own tenure and promotion processes? How are they using altmetrics to decide what to read
10:00
or what to pay attention to in the LIS literature? Are librarians collecting and compiling reports on behalf of faculty members or department chairs or administrators to help inform hiring decisions, tenure and promotion decisions? And are there disciplinary differences? So if you are a liaison librarian,
10:21
someone who interfaces regularly with particular departments like Department of History or Department of Physics, are you more likely or less likely to use research metrics in a variety of ways? So these are some questions we're soon going to answer. And now I know that part of the thrust of this whole conference is to kind of
10:42
engender discussion, so I actually wanna flip the script a little bit and pose questions to you all in my last few minutes, if that's okay. So first of all, just by show of hands, how many librarians are here? Wow, so a lot. I'd say maybe a full half of the room, that's fantastic.
11:06
Among the librarians here, do you think that altmetrics should be used to make collection development decisions as another data point? Raise your hands if yes, okay. Maybe like 10, 15 people.
11:21
Yeah, we might be a bias group, right? How many of you think that we should not be using altmetrics to make collection development decisions? Anyone? You're right, our sample's totally biased. I really messed that one up. Okay. Well, I guess that's it. I just wanted to start a discussion,
11:41
and I think I've answered that question. So I'm happy to take questions now. Are there any questions? Hi, my name is Maira, and I work at Utrecht University Library.
12:02
The question you just asked about if we should use altmetrics in collection development, how would they look like? I get it on an article level, but for a whole collection, there are deals going on and stuff, so isn't there also a responsibility
12:22
for the publishers in presenting altmetrics? I would say so, and I think it's also a responsibility on the part of altmetrics providers to provide metrics cogently. If we're selling, I mean, I work for an altmetrics provider, right?
12:40
So part of one of the pitches that we use when we go out and we say, this is useful for your institution, is we do talk to librarians about the potential for use in making collection development decisions, and we've actually, I don't want to do a product demo. You can come see me at the break, but we've got a journals tab on our Explorer product where you can see, using a heat map,
13:00
okay, so these are the journals that are being talked about a lot. This is where a lot of the conversation lies, and so I think that's one way that we can add value, that the publishers, maybe they're not there yet, but I know a lot of publishers are using our data and other providers' data potentially, and rolling their own, like PLOS. PLOS is doing an excellent job
13:21
with their article level metrics, but yeah. So it's a, really we're in uncharted territory right now in terms of developing these tools to add value. Okay, thanks, thank you.
13:44
More questions. Hi Stacey, I'm Timon Uffeline from Springer, Marketing Sales, so very interested in the current question.
14:02
I'd like to just get your thoughts on why you think currently, practically, no librarians are using altmetrics for collection development purposes. I mean, clearly this room would like to, but I mean, what are your thoughts? Any reasons why? It just seems the data's been around
14:21
for a couple of years now, most of the larger publishers are providing it, but for some reason it hasn't found its way to collection decision making. That's a really good question. So I think part of the issue is overall awareness of altmetrics. There's still kind of this misunderstanding, as I'm sure many of you are aware of, that altmetrics equal only what's available on Twitter,
14:43
and you talk to your average old school faculty member in ex-discipline and they're like, I don't care what people ate for breakfast, don't use those ridiculous numbers to inform decisions about anything relating to academia. These are common things
15:01
that I think a lot of librarians hear, and people take to heart, unfortunately. I think our job, as altmetrics providers, to really go out and help educate, and it's also the job of librarians like you all who are in the know, people who are fully aware of all that the field has to offer
15:20
in terms of useful data and data beyond just social media and where useful data lies within social media data to kind of educate your fellow librarians and talk about all the ways that this data could be useful, and correct the misconceptions as well,
15:43
since there seem to be a lot of those. One last short question. I think we should clearly distinguish between article-level metrics and journal-level metrics. Yes.
16:00
Yeah, and one of the advantages of altmetrics is that you can have other aggregates than journals. Yes, and so in your paper, I'm not sure that leads to a question. Okay, no, I think that's a really good observation.
16:21
Because you mix them up in a certain way. Perhaps too negative to say it that way. But librarians will make their development decisions in terms of journals, not in terms of articles. That's a really good point. Yeah? Yeah, that's absolutely a good point, and I think that to add to that,
16:42
we have new acquisitions models, or not new. I mean, they've been around for a while now that are supplementing these journal-level purchases that are being made, and also the big-deal purchases that were brought up, these kind of things that librarians, these deals that were made with publishers
17:00
to purchase a suite of journals, which, I mean, it's kind of outside of the scope of this. So for things like patron-driven acquisition, interlibrary loan, where requests are being made on the article basis, I mean, I don't think it's necessarily up to individual librarians to act as a gatekeeper and say, no, the altmetrics aren't showing
17:22
that we should actually purchase this article or what have you. But I do think that there is some value to add there as we look at these alternative ways to make purchasing decisions within libraries. So that's a really good point, thank you. Okay, thank you. Thank you.
17:45
Our next speaker, our next speaker is Kristi Holmes. She's director of the Galter Health Sciences Library at Northwestern University. I also asked her what she thinks of altmetrics in general,
18:02
and she had a very nice quote, which is that if people think that numbers equal impact, then they're kidding themselves. And on that bombshell.
18:39
Okay, great. Thank you and thank you very much
18:42
for the opportunity to attend this terrific event and also to share some of the things we're thinking about at our library. So I'm going to be talking a little bit about what's happening in the United States and the conversation that's happening around medical schools.
19:00
I think that there's actually a highly motivated audience in libraries to plug into this very specific need, and I'll talk about that as I go through. So our library is actually very, very active in the evaluation and assessment space at our institution. I direct the evaluation and continuous improvement program
19:25
for our Translational Sciences Institute, and our library is actually a part of our Translational Sciences Institute. It's a really unique model, and to be honest with you, I wasn't quite sure how it would all work out when I joined the organization,
19:41
but I cannot tell you how great it is to have all of these honorary liaison librarians across all of these major parts of our organization out talking about the library and understanding what's happening in the library. So that's been really unique and really worthwhile for us.
20:00
So we talk a lot about translational medicine, and we've all seen some of these specific statistics. It takes 20 years for a good idea to start entering into the clinical realm. Now things are getting better. It's not taking as long. We're kind of beginning to understand how to optimize that process,
20:20
and we also understand that it's more than just that quip of "from the bench to the bedside." I mean, we really need to implement these changes in a way that creates profound and lasting improvement in the health of our communities, and we're beginning to have a more nuanced discussion of that, certainly in the U.S.
20:42
So translational medicine is supported heavily by the National Institutes of Health. This is the largest grant program at NIH. It's crazy. It creates crazy people on all of the campuses when they're writing grants. You can imagine there's a lot of pressure to get these awards, because what it is
21:01
is it creates an opportunity to begin to accelerate some of these conversations and some of these things happening locally as well as participating on the national level. Translational medicine, like I mentioned, you really are going from basic scientific discovery into improved health,
21:21
but this isn't just a bi-directional discussion. I mean, this is a very interesting cycle which plugs in and out throughout this line. So while I'm particularly fond of this representation, I do want to make sure that it's not just,
21:43
you know, that we all understand it's just not one direction. So at NUCATS, which is the name of our Translational Sciences Institute, we're doing a lot of different things. We're measuring a lot of things. We're measuring the success and efficiency of our cores and other parts of our institute.
22:01
We're looking at how well people are working together. We do a lot of team science at our institution, and we're also measuring a lot of things, measuring, measuring, measuring. So I've put stars by some of the things that can actually be monitored by publication data. So we monitor time to publication or other output.
22:23
We start to use publications as indicators for return on investment, you know, however inefficient they may be at trying to describe the whole process. And then we start to look at influences of research outputs, and here's where things
22:41
get to be pretty interesting, because we're actually trying to understand how things are pushing beyond publications into other interesting research outputs. You know, just like everybody in this room, we're really trying to do this in a meaningful way. We're trying to employ best practices and some guiding principles as we move through,
23:01
you know, whenever we can. We really wanna understand both productivity as well as impact. So it's quantity and quality. What we really want is great data, don't we all? So, you know, I'm not unique there. We wanna be able to understand who's paying attention to our work,
23:22
what kind of conversations are they having around that, and then to be able to tie these in a meaningful way to the things that are happening at our institution. So to take it back to publication, this graph is, these are all of the papers published by Northwestern University affiliated authors
23:41
from 1950 to the present, as indexed in Scopus. Scopus is great. They do a good job of institutional and author identifiers. So this is a very easy thing to do here. Great data, a lot of data. We can tell there's a ton of stuff happening here. There's so much missing information. So we don't know who's citing these works.
24:03
All of the great metadata isn't here in this bar chart. We don't have any kind of understanding about what kind of funding opportunities led to any of these manuscripts. We don't know things about open access status, compliance with open access mandate,
24:21
those really valuable attention metrics, and so on. So, you know, so these publication metrics are really the low-hanging fruit, and I think that we can do a lot, you know, in a more nuanced and interesting way together as we start to begin to identify what makes something impactful. So, you know, I think outside of the United States,
24:43
there have been some really great conversations about impact, and obviously make these slides available, but, you know, what is impact? So it's academic impact, but also there are really meaningful changes that can happen at a societal level.
25:00
But context is everything. So it's not just the paper, and it's not just that it gets tweeted. You know, it's about looking at what it leads to. So, for instance, if you write a paper, and it's cited in the development of new methodologies,
25:20
if it leads to new standards of care or new curricula, if it's cited in curriculum documents, so we're at a med school, you know, if it's changing the way that we're training our doctors, that's very impactful. And I've tried to list a few other things here. In the United States, if you can figure out
25:41
a way to bill for it, so here's the CPT codes, you know, if that's cited, that's meaningful impact. That indicates, those can serve as indicators that there's a change in the way that we're delivering healthcare. And so that's, you know, it's just taking it to that next level. Now, some of these attention level metrics
26:00
are actually really valuable in, you know, trying to trace how that's happening, you know, how that conversation is going on. It's a very messy business, it's very difficult to be able to tease out the details, but it can be really meaningful. So, what we're trying to do is build an ecosystem to bring this back to the library.
26:22
You know, we've all heard about library as place, but we're thinking about it in terms of library as partner. So we're really stepping up, again, to support evaluation and continuous improvement, but also to participate, not just at NUCATS, but also in a number of different activities on campus. Libraries are terrific partners for this kind of activity
26:41
for a number of different reasons. You know, libraries are trusted, neutral parties, they're very knowledgeable about this space. The people in the libraries have a very service minded ethos and really work for the betterment of the organization. So, it certainly makes sense.
27:01
In medical libraries, you see a lot of different things happening. You know, we're doing things like, we're in the clinical environment supporting clinical activities, supporting research, integration with the electronic health record, doing a lot of digital initiatives, like we just had our soft launch of our new Fedora repository yesterday,
27:22
so that's very exciting. You know, as well as integrating alternative metrics into some of the things that are happening on campus and these information systems. One of the things that we did that I think is somewhat unique that you might find interesting is that we launched a metrics and impact core,
27:41
where we help people with a number of different things. We help them with publishing strategies, we help them with managing and tracking publications, with measuring impact. There's a new format for the NIH biosketch, so everybody who submits a grant has to have this biosketch. We've processed hundreds of people.
28:02
You know, this is a really active role that we can play, and I think that it's about libraries stepping up and saying, you know what, we're here to help you, we can do this. I do wanna, so this is always, I always get asked a question, so metrics and impact core,
28:21
we call it a core because that's what resonates. It's not about what we wanna call it, it's about us using words that make sense to the people we're trying to serve. So I thought I'd hit that. So for both the metrics and impact core, as well as our evaluation that we're doing at NUCATS,
28:43
we're doing a lot of different things. You'll see some of our favorite tools here, Altmetric.com, Plum Analytics, to do things like characterize real-time dissemination and public engagement. We're doing a lot of cool bibliometric analyses. Social network analysis plays a very big role
29:01
in a lot of the things that we're doing, and we use that to monitor change over time, usually, but there's a lot of other things you can use it to measure. A lot of surveys, a lot of surveys. And then also, what we've been trying to do recently is something called a micro case study that we're trying to tailor
29:21
for our translational science sort of perspective where we're trying to make the process of collecting that really rich case study data a little more efficient. And I've been really, I know the folks who participated in the REF probably didn't feel like it was all that inspiring. It was probably a lot of work.
29:41
But the case studies that were produced as a result of that are just, I mean, they're just magnificent and really provide a lot of context. And so we wanna try and figure out a way to try and make that work at our own site. And we're implementing instrumented workflows, a lot of dashboards.
30:01
These are real dashboards from our system. You know, we monitor things like patient accrual, diversity of clinical trial participants, use of certain systems, and so on. So there's a lot of stuff going on. But mostly, what we really wanna do is shine a light on what's happening. We wanna be able to find it, we wanna be able to provide context,
30:21
and we wanna be able to take that information and turn it into something actionable. So I put together a couple of things that I think are useful that I try and think about as we implement any kind of program or service in the library.
30:41
You need to know who cares, and you need to know who the people are whose perspectives you need to make sure you're paying attention to. And you also need to know who's gonna be your champion in your organization. You need to know what kind of pressures
31:00
and motivations are happening, what's missing. Like, what kind of things do people need? That's a really great place to step in and help out. And think about this not just as a, oh, well, if we ever get that position funded, then we could do that. It's like, well, what can we do today? A lot of people have LibGuide software.
31:21
You need to do a LibGuide on impact and talk about metrics. There's a lot of different things that you can do here. And so on. So communicate, and then also just to feel empowered. That's a little probably, this is probably the jet lag now that I'm starting to be far too positive at this point.
31:43
So the last thing I wanted to mention is to connect to other people who are thinking about these kinds of topics, or who have very different opinions on them. There's a Google group. It's the Research Impact or Res Impact Google group. A lot of librarians and information scientists
32:01
are doing assessment and visualization work. It's nothing sexy or too exciting. It's really practical conversations about cool tools, approaches, interesting papers, and so on. And so please consider joining. And then also, the MLA, the Medical Library Association, meeting will be a joint meeting held in Toronto.
32:21
And there's going to be programming around this because like I said, there's a lot of stuff happening in the medical library. So I don't know if I have time for questions, but. Yep. I do? Okay, thank you.
32:40
Are there any questions? Yes. My name is Anna, from Copenhagen, Denmark. So can you say a little more about your stakeholders? Because I work in a position similar to yours. And I think about the individual researchers,
33:01
the management at different levels and the administration at different levels. But maybe you have other suggestions on that. Sure. I don't know if I have other suggestions or not. It really is about getting out, having conversations. And we try and approach everybody.
33:23
So it's not just the dean. The dean's the last person generally to know. We work really closely with our students, with our junior faculty. I mean, they are very motivated to help to tell why it is that what they're doing is worthwhile, the junior faculty are. And then they can begin those conversations,
33:41
invite us to department meetings, you know, and so on and so forth. So, you know, I think that we found that to be really helpful. More questions, also? Yeah. I think you have a unique opportunity, because in translational medicine we have PubMed, yeah?
34:05
And everyone is advocating that altmetrics adds to that. And you could perhaps show that, or show that it doesn't matter. Yeah. Thank you. I think that this is really,
34:24
so rarely do you have a problem that everyone, everyone cares about improved health, you know? And these types of things in that space. And it seems like that there really are some good data. We've been,
34:43
absolutely. I mean, and everybody is stepping in. I mean, you know, that's why, so I've been a big fan of Stacy's for a long time. And so, you know, it's like the numbers that she sees when she surveys the academic librarians,
35:01
it's almost a little, like medical libraries are just their own little thing, you know? And because we've got these very motivated, very, you know, people under a lot of pressure, to publish, to get things out, and so on. So we're seeing that, and I'm hoping that that, you know, helps to kind of carry things forward
35:22
with integrating some of these metrics and alternative approaches in medicine as well. Okay, thank you. Sorry, we have to move on, maybe in the coffee break. And with that, we cross the Atlantic and we go back to the Netherlands, or the lowlands.
35:47
And so our next speaker is Wouter Gerritsma. Sorry, let me get my stuff, yes. Is Wouter Gerritsma, who used to work for quite some time in Wageningen, he just told me.
36:02
But now he just started as a manager for the digital services and innovation at the Vrije Universiteit in Amsterdam here. That's the other university, yeah, a little bit further. And he says about altmetrics that he rather prefers to have just a dashboard of metrics,
36:20
and that altmetrics is one of them. And I fully agree with him. Thank you very much. And thank you to the organizers for the invitation to come and talk here. Just seeing that there are so many librarians in this hall, it's like preaching to your own public, but I hope you pick up some useful things
36:42
from this presentation. First, I want to show that librarians have a natural role in the whole discussion around altmetrics. And then I'll put in my little two cents on what we as librarians could be doing,
37:01
should be doing on the whole thing. And by the way, if you can't read the screen, it's already available on SlideShare. I like to share my presentations around and I get the impression that's normally appreciated. If you just have a look at an ordinary article at Nature,
37:20
you see there already the donut from Altmetric.com. Or if you go to a journal I like a lot, JASIST actually, there you see altmetric indicators. If you do a search in a Primo discovery system,
37:43
actually from my previous university, there we have altmetric indicators integrated already. If you look at a DSpace open access repository, there you have possibilities of an altmetric indicators as well.
38:00
If you look at the Web of Science, it's not actually called altmetrics, but there is useful information of usage counts recently introduced. The competitor Scopus went a step further around the same time. They introduced a whole dashboard of metrics.
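As a brief aside on how such indicators end up in these interfaces: they are typically fetched per DOI from a provider's API and then summarized for display. The sketch below is a hedged, offline illustration in Python; the record shape loosely follows the public Altmetric.com v1 details API (GET https://api.altmetric.com/v1/doi/&lt;doi&gt;), but the values and the exact field set here are assumptions, not a definitive client.

```python
# Offline sketch: summarizing one article's altmetric record for a dashboard.
# In practice the JSON would come from a provider, e.g. (assumption about the
# public Altmetric.com v1 API): GET https://api.altmetric.com/v1/doi/<doi>

sample_record = {                      # hypothetical response
    "doi": "10.1234/example",          # hypothetical DOI
    "score": 42.5,
    "cited_by_tweeters_count": 30,     # Twitter mentions
    "cited_by_fbwalls_count": 4,       # Facebook mentions
    "cited_by_msm_count": 2,           # mainstream-media mentions
    "readers_count": 110,              # e.g. Mendeley readers
}

def summarize(record):
    """Pick out the indicator counts a discovery layer might show, defaulting to 0."""
    keys = ("cited_by_tweeters_count", "cited_by_fbwalls_count",
            "cited_by_msm_count", "readers_count")
    return {k: record.get(k, 0) for k in keys}

print(summarize(sample_record))
```

The point of the "dashboard of metrics" framing is exactly this: several counts shown side by side, rather than one number standing in for impact.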
38:20
The point I want to make is that I was rather amazed by Stacy's presentation, that only 30% of librarians are aware of altmetrics, whereas I can't escape altmetrics when I'm in a library environment. If we don't understand what it is, how are we going to explain it to our users?
38:41
We need to polish up the knowledge on this whole thing to get to 100%. I agree with your tweet, but this is amazingly low. The audience here are probably all aware of altmetrics. That's why you came here.
39:00
The first role of librarians is actually library outreach. It's not only library outreach towards your audience in the academy, but also among librarians themselves, I just learned. My experience in this field is: start small, gain experience,
39:20
and really the first message you have to get out there is altmetrics are not only Twitter mentions and Facebook likes. What I normally do in presentations on altmetrics is also make the connection to social networks, but it's not only about social networks.
39:41
And I like to remind researchers then that they sometimes don't realize the importance of their actual publication lists, and of making all the different publication lists that are out there easily findable and connected to each other. But that goes a bit further.
40:01
And then if you do make presentations on these subjects, share them with the audience because all the librarians can learn from it, and perhaps it brings you other invitations. Collecting altmetrics, that's really a task that I see is really important to libraries. And I have here shown a few indicators.
40:23
I should have added book metrics as well, and I probably forgot a few others, but there are many opportunities to collect all kinds of altmetrics. We just discussed that, and there are, of course, all kinds of variations. So that field is developing, and we should follow it and see what happens.
40:42
My point, when I think of the library role in this landscape, the point where I want libraries to really stress their importance, is that they should be the managers
41:00
of the current research information systems at universities. In the Netherlands, all universities have recently decided to acquire a new current research information system. All these universities were using METIS as a current research information system. They're all switching over to either Converis or Pure.
41:21
In the Netherlands, Symplectic has not been part of the whole comparison. Half a year ago I was in the United States, and there Symplectic was really popular. But what I also noticed is a difference: the focus on current research information systems,
41:40
like we have here in Europe, is not present in the United States. And I think that for libraries to take their role and to fill a gap in the whole academic infrastructure is to seize that opportunity of current research information systems and use that as a platform
42:01
to aggregate altmetric indicators, bibliometric indicators, and these kinds of things. And then you have to make another distinction as well. A lot of universities have repositories as well. I think current research information systems and repositories should be connected to each other. A repository is probably the outward looking face
42:20
of what is behind the scenes in the current research information system. So they're closely integrated. And actually with the term repository, we also get into the discussion of whether an open access repository is there only for open access items, whereas you have institutional repositories
42:40
perhaps having more in that. So I want to get rid of that whole thing of repositories. And I like to talk, I love to talk about institutional bibliographies because we leave out, as what Christy was just showing us, the sea of current. We go back as far as possible to the 1950s, Christy did, based on Scopus data.
43:01
At my previous university, we have an ambitious project to collect all the information on what our university has published in the past 100 years. And they have only three more years to go to get there, because by then the university will be 100 years old. So I love the idea, the concept of institutional bibliographies.
43:21
And then of course, we have the challenge of making use of that collection of information because collecting publication information for the sake of collecting only has no purpose. And if you collect all that information only because you have to write a single report
43:44
to the Ministry of Education on your publication output, that's also not a good justification to your researchers to say that they have to fill the repository or the current research information system with their output. So you have to make use of the collection you have,
44:02
the whole collection of information you have. And then you start with the fun part, and that is analytics: bibliometrics, altmetrics, collaboration analysis, collection analysis. In Wageningen, we have built a nice tool that recommends journals on the basis of
44:24
reference analysis of all the publications in the repository. So there are all kinds of new tools at your disposal, new ideas you can start to work on once you have a comprehensive collection,
44:40
and then make visualizations of these kinds of things; you can dream on, really. And that's where the fun part starts, and that's where we should be working. So what is then needed is that librarians develop new skill sets as well. And if you really want to start working on this,
45:01
Chris had a whole list of seven points, but my point is: start by getting your CRIS in good shape. And then, only when you have a good collection of information on your publications, can you start to do all kinds of analysis.
45:21
You can buy bibliometrics off the shelf: Thomson Reuters provides the InCites packages, Elsevier provides SciVal, and here in the Netherlands you have CWTS as an institute; they have developed the CWTS Monitor, which you can integrate
45:42
with your current research information system. And then the challenge, of course, is now altmetrics. And what we haven't mentioned here in the room yet, what is really important to me, and I haven't seen too much research on that subject yet: we collect altmetrics at the single-article level,
46:01
article level metrics, but then going into repositories we want to aggregate, aggregate to a researcher because it's needed for his research evaluation, tenure and promotion type of things. I know it's a sensitive subject, but they're doing it now on the basis of citations only.
46:21
So I rather prefer to have citations backed up with altmetrics in this area as well. But then aggregation to a researcher level, to a department level, to a whole faculty, whole universities of course, all kinds of aggregations. And this is a subject that hasn't been researched too much yet. And that's where the altmetrics15 workshop on Friday
46:44
should be going, or where next year the 3:AM conference should be going. Another point, what you very often see, and that's a trap that's too easy to fall into, is to concentrate only on peer-reviewed articles.
47:01
This is an aggregation from our current research information systems. Academia produces far more than only peer-reviewed articles. Of course, peer-reviewed articles are the mainstay of the output, but don't forget about the non-refereed articles, the books, the book sections, PhD theses,
47:22
conference contributions, et cetera. There's a whole lot more. And for all these kinds of publication output, that's where I see the challenges for altmetrics: to collect information by means of altmetrics for these items as well. So don't concentrate only on peer-reviewed publications. It's all too easy.
47:41
They come at you by the bucketful, but we need to collect them for the other types of output as well. So how do you do that? First, we have to equip researchers with a permanent identifier, and I think it's really, really important
48:01
to implement ORCIDs. In the Netherlands, we have a whole range of identifiers for researchers, and we have largely kept them to ourselves. All Dutch universities have ISNIs, but they don't know about this. They have VIAFs, but they don't know about this.
48:22
And ORCID is the only identifier that is going to play a major role, and researchers are now being asked by publishers to fill in their ORCID as well. So in the repositories, we had better make use of these kinds of identifiers as well. And then the other thing is that I showed you
48:40
there's a lot of additional output, apart from scholarly articles, and we have to make sure that they get permanent identifiers as well. It's important to publish them for theses, working papers, reports, conference contributions, and these kind of things. And in my opinion, the only stable identifier
49:06
that researchers also recognize are DOIs. Of course, we can talk about handles, or about URNs, and they are all stable identifiers, but they are not easily explainable. The DOI is so omnipresent that all researchers know about it,
49:22
so if I could say what kind of identifier I'm going to use in my repository, I would prefer DOIs in this case, because journals are also now saying that in the reference list you should list a DOI. And with URLs, this is always a little bit shaky,
49:43
and so the DOI is the identifier to go for. So I want to conclude my presentation with two warnings, really. One is: don't collect metrics for peer-reviewed articles only. And the other one, which hasn't been mentioned yet: if you read a little bit more
50:02
in the altmetrics literature, it's a lot about which altmetric indicator is the best predictor of citations. Stay away from that, stay away from that. It really is about providing this dashboard of metrics
50:21
and allowing researchers to tell their story about their impact. And whether it is citations, whether it is Mendeley bookmarks, whether it is a lot of Twitter mentions, doesn't really matter. It allows them to tell stories, and those stories are important because we are looking more and more
50:41
for all kinds of evidence beyond citations alone. So this is where I want to finalize my presentation. Thank you. Okay, thanks a lot. I agree on many levels, but I'm sure there are a lot of questions.
51:01
Questions come from people who disagree. In the far back, just a minute. I just wanted to ask about telling stories to showcase impact. That's something also Kristi mentioned.
51:20
In what department have you experienced that this works best, where researchers are really telling stories about their research, going beyond the numbers of altmetrics? What you see nowadays happening, definitely in the Netherlands and also in the REF actually in the UK, is that they are not only looking at scientific impact
51:41
but also societal impact. And for societal impact, telling a story about your research and bringing that story across to a wider audience is part of the storytelling that scientists should be doing. And we're giving them the tools, if we have a proper dashboard of metrics,
52:01
to see where they have been mentioned, see where newspaper articles have already been written somewhere about their journal articles. So we make it easy for them to tell a little bit more around the classical scholarly impact in this way.
52:22
Does that help? No, I only have experience telling these stories over the last year to researchers but then having feedback from them
52:44
like "we have used it in this and this way"; I haven't had that from their viewpoint yet. But when I tell these kinds of stories to researchers, they actually agree with it and they seem to pick it up
53:01
and think it a useful addition to what they should be doing. I haven't come across researchers that say no way, don't do it. But then people who come to these kind of presentations probably are inquisitive and want to know a little bit more about the subject.
53:20
So there's a bias in that as well. Time for one more question. There's one there. Hi, I'm Kaveh from River Valley. I agree that it's not just peer reviewed papers
53:42
we should be looking at but what about things that are not even published? If I'm a researcher and I write a computer program, for example, that has a great impact on my field at the moment, if I don't publish it, I don't get any recognition.
54:02
So we need to count the impact of things that are not even published but that are important, the contribution that I've made in my field. Yes, but then if you write a really good piece of software that's being used by a lot of people, then you have GitHub there,
54:22
and there you get usage data from GitHub that helps you support the claim you make about the use of the software you have developed. I didn't mention and didn't show it, but the same goes for data, of course; that is going to play a major role. And on Twitter there's a little bit of discussion going on
54:41
about aggregating things, but then aggregating altmetrics or usage data from datasets to articles is also an interesting research question to be answered. But we have to go and collect the data, the basic data, in a systematic way, and then I think that's where I get into my repository story again, okay?
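The aggregation step touched on in this answer and earlier in the talk, rolling article-level counts up to a researcher or department, can be sketched as follows. This is a minimal illustration with hypothetical records and metric field names; in practice a CRIS or repository export would supply the data, keyed for instance by ORCID iD.

```python
# Hedged sketch: aggregating article-level metrics to the researcher level.
# Records and field names are hypothetical stand-ins for a CRIS export.
from collections import defaultdict

publications = [
    {"orcid": "0000-0002-1825-0097", "doi": "10.1234/a", "tweets": 12, "readers": 40},
    {"orcid": "0000-0002-1825-0097", "doi": "10.1234/b", "tweets": 3,  "readers": 15},
    {"orcid": "0000-0001-5109-3700", "doi": "10.1234/c", "tweets": 7,  "readers": 22},
]

def aggregate_by_researcher(pubs, metrics=("tweets", "readers")):
    """Sum each metric over all publications sharing the same ORCID iD."""
    totals = defaultdict(lambda: {m: 0 for m in metrics})
    for pub in pubs:
        for m in metrics:
            totals[pub["orcid"]][m] += pub.get(m, 0)
    return dict(totals)

print(aggregate_by_researcher(publications))
```

The same loop aggregates to a department, faculty, or university level once each record carries that key instead of (or alongside) the ORCID.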
55:03
Okay, thanks a lot. Yeah. We move on to the last speaker just before lunch.
55:20
And this is Alenka Pincic, I hope I pronounced it correctly, despite the, for me at least, foreign-sounding name. She is at the university library in Delft, where she is head of research support, and relatively new to the topic of altmetrics. And when I asked her what she thinks of altmetrics,
55:42
she said, well, that still remains to be seen. So I'm very interested in her talk. Thank you very much. Yes, I'm kind of new to the topic of altmetrics. And it made me think of a few months ago, when there was a conference, also in Amsterdam, on the impact of science.
56:01
And there was a lot of discussion about alternative metrics and altmetrics. And I even heard "ban citations", and I thought, whoa, whoa, whoa, hold on, wait a minute. However, I thought I was so smart in figuring out that we should perhaps call this not altmetrics but complementary metrics, you know.
56:22
So I thought, great. And then, because the Dutch like abbreviations, I even thought, okay, let's simply call it COMET in the Dutch libraries. And if I had done that, then I would also have had to change, of course, my starting slide. But anyhow, I see now, just a few months later already,
56:42
I see, oh, I wasn't that smart. Everyone now says and knows that it's actually complementary, and not really alternative, metrics that we are talking about. Right, so why am I here then? I'm new to these subjects; I work at the library, but I'm not a librarian. I'm a microbiologist. So I've been in research and science myself before.
57:04
And I have this role because the libraries and the role of the libraries are changing. That is no new story anymore. In all the mission statements of university libraries we see terms such as knowledge flow and technological innovation. In Delft we say, yeah, freedom to excel,
57:21
because we believe that if we use the knowledge of others and share it, we all benefit, and we give our academics and students the freedom to excel. In this way we also contribute to the mission of our university, which is to bring science to society. So I do see that, although the traditional use
57:45
of bibliometrics was different from what it is now, it is shifting from collection development to research assessment. Indeed, librarians should be, or already are, the key central group, the actors
58:03
who use bibliometric analysis as well as alternative metrics. However, in Delft we actually do not yet have a bibliometric analyst as a profession.
58:20
Many different profiles actually perform this kind of analysis, and they are also spread across the faculties, but a dedicated bibliometric analyst we do not have yet, I could say. So, just a little bit about Delft. I have to hold this microphone, I don't know why; I have nothing to hang it on, so I hope I don't produce
58:42
too much background noise. I'll be really careful now, I shouldn't move. Okay, I'm sorry for this. At the Delft library we operate in four domains,
59:00
and here I would like to draw your attention to the domain publication and impact, which is about being read, being cited, being seen, being visible, increasing your visibility. The organization is somewhat different: in the Delft library we have a department, a unit called research support, and we provide
59:22
our services along the whole research life cycle. You also get a little quick Dutch course here, because I unfortunately and inadvertently used a Dutch figure of the research life cycle, and this is where it all actually happens. Eventually the researcher will come to the last, blue phase here,
59:42
which is the dissemination of the results, and this is also where the bibliometric services, the complementary metric services, would sit. I need to mention that it is not only library services that deal with dissemination; there is also a very important role for our marketing and communication department,
01:00:00
and we have real gurus, as we call them, who advise and help our academics on all social media and Science 2.0 issues. A bit more about that. So what do we do with regard to metrics?
01:00:20
Currently we do some bibliometric analysis, especially for research groups and individual researchers, and it is all somewhat ad hoc. Our strategy department in particular also performs bibliometric analysis, of course for strategic purposes.
01:00:44
Just recently, almost a year ago now, we launched a research support portal, and if I were live now and clicked on this research support portal, this is what I would be talking about. So we raise awareness, we try to help users, academics,
01:01:02
to find their way in getting cited, being visible, and everything about impact. However, this is still in development, and we are going to expand these services in due course. In the same way I could also show another page on this research support portal,
01:01:21
where we advise on how to engage in social media. So this is where we stand when we do the ad hoc bibliometric analysis. When it comes to altmetrics at TU Delft, so far we just engage bottom-up: we are raising awareness via the portal I just showed,
01:01:44
and we are currently also doing a comparative analysis of the major tools, although I don't know if that is really smart, because you shouldn't only look at the major ones; sometimes it is really interesting to look at the minor providers or tools as well.
01:02:02
And we really think, and would like, and hope, also through the discussions here, that we can start experiments or pilots with real experts, such as perhaps CWTS, which we have the luxury of here in the Netherlands. I hope to get some cooperation on that. Right, so that was a little bit of how Delft is engaging
01:02:25
with bibliometrics and alternative metrics, altmetrics. The situation might be somewhat different elsewhere in the Netherlands, so I performed a small survey among the Dutch university libraries. My numbers are much lower than Stacy's.
01:02:42
The Netherlands is small, after all. So I'm talking about 13 university libraries, of which 10 responded. With such low numbers it is of course always difficult to judge who you asked and who responded; it is not exactly a bias, but one should be aware of it, and I need some room here in the interpretation
01:03:03
of the results I got. It is all qualitative, not quantitative. We did of course try our best, and we asked people who have a notable role in research support. We did it through the special interest group on research support, asking people in roles such as programme manager,
01:03:21
head of academic services, or head of research support to contribute data. What did we see? The first question, of course, was: okay, how is it with bibliometrics? Let's check that first. We asked whether bibliometrics is really rolled out as a service.
01:03:45
The answers were actually kind of surprising. Briefly: about one third said, yes, we do that, especially for individual researchers. Another third continued,
01:04:01
so the circle goes around like this, and we heard: yes, we also do that for research groups, for departments, for faculties. And one third said no, or no, but. In fact, Delft is also in the "no, but" group, since you could fall under two answers here,
01:04:22
because we are developing these services at the moment. Okay, so that was bibliometrics. But then I asked: how is it with altmetrics, and does your institution really have altmetrics at the institutional level? That is what I was talking about here.
01:04:41
And it was: whoops, no. I was a bit surprised, because in fact it was no. Altmetrics are used, again, for individual researchers, but not in the sense we are talking about, altmetrics for the institution. I have 10 respondents; perhaps
01:05:00
it is different at the universities that did not respond to our survey. But even there it is not so black and white. I could have concluded here: okay, so that's it. It actually closed down some of my additional questions, because I wanted to ask: okay, what is your workflow? What are the statistics, what are the uses,
01:05:21
what is the experience, and all that? Hmm, I couldn't get answers anymore. However, I did receive some fine-tuning with regard to altmetrics in the Dutch libraries. Going down to basics, we first checked: what is the knowledge, awareness, or even usage of the tools that are currently available
01:05:41
for alternative metrics? The blue bars here are "I don't know the tool", and this also somewhat surprised me. The orange part of the bar is "yes, I know the tool, but I don't really do much with it".
01:06:04
Only the grey says "yes, we actually recommend using some of the tools". The yellow, I think, is something due to the low numbers, difficult to discuss, and there is also a little bit of an error in the question,
01:06:21
because it includes SciVal and InCites, which are of course also used as standard bibliometric tools. Nevertheless, we went on with the question: okay, you don't have altmetrics for the institution, but how are you currently dealing with this subject? Are there any plans for implementation
01:06:41
at an institutional level, and which tools are being considered? The engagement of the Dutch university libraries was really somewhere between "no, not currently" and "we are hoping to do something with it". A lot of it was "we are observing the market",
01:07:02
still waiting a little. Many of the libraries are also really experimenting with the tools and looking at which one would provide the best information, the data they actually need for their purposes. An interesting one here was,
01:07:21
just as Wouter also mentioned before, that half of the university libraries are actually waiting until their new current research information system is implemented, as that is going to be an important factor in deciding how to implement alternative metrics. The other one here, I put it there,
01:07:43
although this is anonymous, because I found it interesting to point out that one university library is really starting a project on these kinds of indicators; perhaps that will be discussed here later. The last question, I think, was: all right,
01:08:02
but do you then have a vision, what do you want to achieve with altmetrics? Share your view with us. I could see that, again, about one third said: no, we don't have a vision. That doesn't mean they don't care; it just means they haven't really made up their minds yet.
01:08:24
They don't have a clear picture yet. And most, or one third, said: it needs maturity, but it does have potential. So I looked a little further at what this potential means. I'm just going to list it, because I literally took over
01:08:41
what was said there. I need to read it a little, okay? That is kind of standard: it has potential as a means of demonstrating impact and engagement beyond academia, we know that; as a means of translating the success of science, sort of based on storytelling; and as a means of self-assessment.
01:09:01
I know this is no rocket science, but anyhow, I have to relay what I heard back. The potential is also especially there for the individual researcher or scientist, because of the speed. It gives fast insight into, okay, it's still called impact, but whether it is impact remains to be seen, I think.
01:09:21
It also gives fast insight for early-career researchers, who then know where they stand in the environment without having to wait for citations. And it can quantify the success of achieving your own goals, if you set them; you can also follow the target group and see whether you actually achieve those goals.
01:09:41
This was seen as potential. It was also mentioned that for an institution it can contribute to institutional ranking and impact. The last slide with, oops, sorry, the last slide on the potential of altmetrics: it can serve for evaluating new pathways,
01:10:02
new forms of publication, where publication does not just mean an article. And it has potential as complementary peer review. Moving towards open science, we can also have open peer review.
01:10:20
At an early stage we can already see how the environment reacts to the scientific output. But I also think we should use it perhaps not just to provide extra data across the whole institution, but actually to demonstrate the exclusiveness and specialty of an institution,
01:10:44
provide data for that, and then use it for strategic purposes. I'm thinking, say, of Delft: if we say, okay, internationalization, Delft has to improve on that, or needs more data on how international we are, where we have to go and how to increase that,
01:11:01
then this kind of data could be used, though I presume we would need some new indicators for internationalization. Or open science: data was mentioned, and I think there is something to gain there with alternative metrics. And the last one is, perhaps, underrepresented research output,
01:11:21
such as in Delft, where we have science, engineering and design faculties, and design research output is underrepresented. Now, this is what I actually had to say. I have summarized it in a few points. Yes, there is a lot of awareness and a lot of experimentation with altmetrics
01:11:41
at Dutch university libraries, but not yet altmetrics for institutions, and there are no national guidelines on the subject either. There is a lot of potential, for individual researchers as well as for institutions, but we are hesitant and waiting for maturity.
01:12:01
Thank you. Okay, thank you very much. Are there any questions from the audience? It's lunch time.
01:12:25
I have a question. Basically, in the end it's all about science communication, right? What I seem to be missing so far is the proactiveness, and you probably indicated that yourself already. Why are Dutch libraries not already doing workshops
01:12:42
and presentations, engaging with early-career researchers and those kinds of scientists, to make them, first of all, more aware than they are now, and second, to let them play with the data? Why don't you already do that proactively?
01:13:04
I don't see it as that black and white, as if we don't do anything at all. I can speak for Delft: our marketing and communication department engages a lot with early-career researchers, and there is a lot of advice on science communication.
01:13:24
And perhaps this is not solely the role of marketing and communication, but also not solely the role of the library, I think. In fact, I hope we can join forces here to improve on this science communication aspect. So initiatives are being taken, for sure.
01:13:46
More questions? Cameron Allen; it's possibly more a comment than a question, I'll see how it comes out. The discussion about what we call this
01:14:02
has, I think, been going on for a long time, let's put it that way, at pretty much every meeting we come to. And I wonder whether there's a sense in which we've actually just got past that, and it's a question not of whether we call it altmetrics or cometrics or metrics or indicators or impact or whatever else it might be,
01:14:21
but whether it's really, as in fact I think Wouter said, part of telling the story of what this is for and what it's valuable for. It's a question of what you call it to the people you're talking to: we might call it altmetrics amongst ourselves, because that's the story we tell ourselves. But I'd be interested in your thoughts on that.
01:14:44
To me, it comes across more as a comment, indeed. But if you ask me what I think about it, I agree with you. Excellent. Since lunch is waiting outside, I want to wrap this up here.
01:15:03
I have three comments. One, you have to do something with cards. Please remember that. Red cards, red room. Blue cards, blue room. Yellow cards. People are falling asleep, it's time for lunch.
01:15:21
Okay, and then you have to do something with these ones. I still have mine, so I hope I find a very nice poster. Put the stickers on the poster, and show your affection for this work. The last point,
01:15:41
What remains for me is that the people who are watching from home, or work, or wherever you are in the world, will be having lunch. I hope you will have that as well. We'll be back in an hour. And the very, very last note is not for Martin. Oh, but it was announced.
01:16:00
But you don't want to say anything. That's all there is.