Day 1: Altmetrics: Now and Next and Altmetrics In Research Evaluation
Formal Metadata
Number of Parts | 8
License | CC Attribution 3.0 Germany: You may use, adapt and reproduce, distribute and make the work or its content publicly accessible in changed or unchanged form for any legal purpose, provided you credit the author/rights holder in the manner they specify.
Identifiers | 10.5446/46257 (DOI)
Transcript: English (automatically generated)
00:00
Can you come and say hey, please? Okay, we have support from Claire, helping us out with slides and things for when I can't work the technology.
00:21
Okay, I hope everyone managed to get coffee and is ready for an exciting day of talks. Welcome to 4:AM. I'm Cat, I work at Altmetric, and I'm one of the organizers of the conference, so I hope you have a great time here. I'm pleased to introduce all of the speakers of our Altmetrics: Now and Next session. What we've really tried to do with the content in this session is to give you a bit of an overview of some of the
00:47
latest thinking in altmetrics, some of the new ideas and the new technologies that are starting to emerge, and to really kind of set the scene for the next couple of days of discussion. And we have a slight program change, because I'm afraid Katie from PLOS couldn't make it in the end.
01:02
But the good news is that gives everyone else a little bit more time to talk, so I won't be cutting them off too quickly. I think we'll have everyone talk and then we'll do questions at the end, so if you do think of anything, please try and hold on to it till then, and we'll get started. So first up we have Euan Adie, and Euan is the founder of Altmetric.
01:23
He was formerly a researcher, and he's going to share with you some of the thinking that we've been doing over the last year and what we're thinking of moving forward with altmetrics. Hi everyone.
01:47
So what I was going to talk about is not so much anything to do with altmetric.com; if you're interested in that, you can talk to the people here. It was more things we've noticed as altmetric.com, so obviously we get a lot of people talking to us about altmetrics,
02:03
and some things, you know, we can help with, some things we can't, but it's to give you an idea of the kind of queries we're getting and things we've noticed. So it's this 10,000-foot view from altmetric.com. The first thing I should say is that, following Antoli's very clever idea,
02:21
I've also put my slides online, because I realize you can't necessarily see the bottom of them; they're at bit.ly/addie4am. So without further ado, jumping straight into it. The first thing is that we're seeing academic Twitter is still growing. So this is a graph of tweets that mention articles (not books or data sets or software or anything, specifically articles), and the interesting thing is it's still growing by
02:44
quite a high percentage: from 2014 to 2015 it was 45 percent growth, 2015 to 2016 it was 32 percent, and it looks like this year it's going to grow again, by between 30 and 40 percent. That by itself, without context, isn't necessarily the interesting part. The interesting part is that Twitter itself is only growing 10 percent a year.
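The year-over-year figures Euan quotes are simple growth arithmetic; a sketch with hypothetical mention counts (the talk gives only the percentages, not the underlying counts, so the numbers below are invented for illustration):

```python
def yoy_growth(prev: int, curr: int) -> float:
    """Year-over-year growth as a percentage."""
    return (curr - prev) / prev * 100

# Hypothetical article-mention counts per year -- illustrative only,
# chosen so the growth rates echo the ones quoted in the talk.
mentions = {2014: 1_000_000, 2015: 1_450_000, 2016: 1_914_000}

print(round(yoy_growth(mentions[2014], mentions[2015])))  # 45 (percent)
print(round(yoy_growth(mentions[2015], mentions[2016])))  # 32 (percent)
```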
03:01
So for whatever reason, we're finding more articles being mentioned online. Why is that? Is it because more scientists are using Twitter, you know, they're over-represented in new users? Is it because journals are getting better at social media? I don't know, but we've definitely seen the trend. The second interesting thing, which I think is a very positive thing, is that people are talking about preprints more.
03:24
So this year has really been the year where we saw this kind of data take off. People have always talked about arXiv online, but they're also talking about bioRxiv. So this is a very loose analysis (I'd be interested if anyone else wants to take it further), but I'm just quickly looking at PLOS ONE.
03:44
I picked PLOS ONE just because, you know, it's a big journal and has a lot of data. I looked at how many articles it published in 2016, then compared it to bioRxiv, and then asked, you know, how many mentions in total did each get for every hundred papers? How many tweets did it get? How many blogs?
04:01
How many policy citations, this kind of thing. So the interesting thing is, with bioRxiv actually (and there could be all sorts of reasons for this), it actually gets more blog posts and tweets per hundred papers than PLOS ONE. And you could say that's because, you know, they're inviting people to submit preprints to bioRxiv, and so you're kind of selecting a certain type of research to go in there.
04:24
Or maybe researchers think, again just like with open access at the beginning, that it's a particularly controversial topic, or something that's quite exciting, so they deliberately make it open as soon as possible. There's all sorts of reasons, but regardless, people are finding articles on preprint servers and talking about them.
04:41
The opposite, as you might expect (maybe it's intuitive), is that you get fewer news stories, policy citations and Wikipedia citations from the preprints on bioRxiv than you do from PLOS ONE. So there's still more news stories written about things in published journals than there are about things on preprint servers, on bioRxiv specifically. But they still exist, so you do still get news stories and you do still get policy documents.
05:06
So there's an example here: it was a preprint about Zika virus on bioRxiv, and it was already picked up by the CDC. So to be clear, I think this is a great thing, that preprints are, you know, kind of taking off.
05:21
Um, I don't think that's a controversial view in this room, probably. What I think is maybe a problem is that there's no difference in the way these things are cited. If you go to a Wikipedia page and it cites a preprint, it cites it in exactly the same way as it cites peer-reviewed research. If you look at a news article, in the vast majority of cases
05:42
the way the news article talks about the research is exactly the same as it would talk about peer-reviewed research, with a few notable exceptions. So there's Spiegel; there was one example I could find where the journalist actually says it's research that hasn't been peer-reviewed yet. Um, you know, I don't necessarily think it's
06:01
an altmetrics community issue or responsibility to fix, but I think it's someone's responsibility to at least think about it and ask, is this what we want? You know, I remember four or five years ago a lot of the crisis in scientific communications was about, you know, people over-hyping results and things in the mainstream media, and that was peer-reviewed research. How much worse is it going to be
06:20
if, you know, it's accepted that as long as it's on a preprint server it's now part of the scholarly record, and we can say whatever we want about it? Um, it's not all negative; some of these things seem negative, but they're not necessarily. But here is one that's unfortunate, for us especially: we've seen publisher metadata get worse.
06:42
Google Scholar, when it first came out, really incentivized a lot of publishers to put meta tags on their pages with good metadata, because if they weren't there, they wouldn't get indexed in Google Scholar, which is important for discovery and, you know, basically ties to your business as a publisher. Nowadays we don't see that so often; if anything, we sometimes hear from publishers that
07:02
they have meta tags, but they're only activated when it's the Google bot that crawls their pages. I cannot conceive of why this would be a good idea, but there you have it. There's a move away from the kind of traditional scholarly platforms, which for all of their strengths and weaknesses, you know, do have a record of putting good metadata on the pages,
07:21
and towards more custom platforms, where often the metadata is an afterthought. So, you know, people say the metadata is the love letter to the future, and people kind of forget that and don't include it in the first pass of their new publishing platform. No meta tags, no way for us and other altmetrics providers to link a paper with,
07:42
or rather to connect an actual research output with, a link. Another thing: books versus articles. So I feel now pretty confident that we've got good data on article sharing on social media. So,
08:01
you know, it's never 100 percent, there's always problems (we talked about this a little bit in the workshop yesterday), but in general, if you tweet a link to a paper, I'm pretty confident that at least somebody in the altmetrics community is going to be picking it up properly. But books, and I'm not going to talk a lot about this because I know Gene has got something coming up on it, but books are very different, and again, it's to do with
08:23
not just technological issues but also kind of social ones, the way people cite books. Um, so there are some example tweets there (you can't really see them because they're down the bottom), but for example, people don't link to a book. If you think about it, you know, where is the canonical place for a book to live? Is it Amazon?
08:40
Is it a publisher site? You know, is it your institutional repository? There's not a clear, kind of accepted way to do it. Um, there's other connected problems around books not necessarily having the same level of quality metadata as articles, and certainly not good unique identifiers like DOIs. That is changing, but it feels like it's a little bit
09:01
behind. So, you know, we talk about data sets and software being kind of second-class research outputs, but actually, from an altmetrics perspective, books are still out there as well; they're not quite at the same level as articles. So, moving more onto good things.
09:21
I think other people working in altmetrics will have experienced this as well, but I always really like it when people actually know what altmetrics are. Occasionally I meet a researcher through friends or whatever, and they'll be like, oh yeah, the donut guy, I know you. And that always makes me feel really good. And most recently someone said to me, oh, now I finally know what you do, because you're in the Economist. That means that altmetrics has really made it.
09:42
Although, okay, maybe I don't know what that standard is. But there you go, that did happen this year, and I think it is a sign, amongst many other things, that awareness and uptake is still growing. I really wish we had hard data here, and I know, is Michael Habib around? Like, I know Scopus used to do that, you know, what percentage of people have heard of altmetrics.
10:04
And I remember it was 0.5 percent the very first year. I wish I knew what the percentage was now, if we had the same kind of longitudinal study. We don't, but we do know that, you know, increasingly we have more researchers who know about altmetrics,
10:20
more publishers, more funders and governments. Especially from an altmetric.com perspective, we have many more publisher, sorry, many more funder and government customers than we had before, and in general their approach to altmetrics is also quite sensible. So the worry was always, you know, people are going to
10:41
have their tenure packages judged on how many tweets they have, this kind of thing. And it never really seemed like a real worry, and actually it isn't; it's not like we've worked with anyone who's really come to us and said, I really want this one number to judge people on. They're generally not like that, which is very good. Um, one thing that we've realized
11:02
as a company is that training and awareness and kind of pragmatic best practices are as important an activity as the actual development of the altmetrics. Which is easy to say, and actually quite hard to put into practice, but we spend a lot of time on it; I think as a company we have more people on the non-technical side than we do on the development side.
11:25
It's still pretty close, but a lot of the work we do now is in training and awareness. The flip side of this is, and I think maybe, Rebecca, I don't know if this is the paper that Rebecca meant, um, but things like this: there was this blog post recently on the LSE Impact Blog, the "clickbait and impact" one,
11:46
talking about a paper that had a high Altmetric score. But the content, it was, you know, the case for colonialism, in a Taylor & Francis journal. And, um, the authors say, well, you know, this article got published and it got a lot of attention,
12:02
and the highlighted bit here is: by the rules of modern academia, this is a triumph. And then, following on from that, separately, earlier on we've got two tweets from people in this conversation going, you know, we're talking about this colonialism paper, stop tweeting about it, it's adding to the Altmetric score; you're just giving the guy extra Altmetric points.
12:23
And I don't know quite how to feel about this. On the one hand, people assume that altmetrics are accepted; it's obviously not a niche thing if people are worrying about it to this extent. On the other hand, obviously it's completely wrong, and the, you know, awareness and the training and stuff that we're trying to do, saying it's not about the counts,
12:42
you always have to look at the mentions in context and so on; maybe it hasn't filtered all the way through. So it's an ongoing challenge. Okay, I'm almost done. So then, because we're at an altmetrics conference and, well, you know, we talk about indicators most of the time, um, I thought, what better place to make up a new
13:01
metric, which I'll call the e-index. It's precise to one decimal point, and it's calculated by looking at the number of hits on Google News for any search term. Um, I thought that's a good exercise: let's see if we can gauge how popular altmetrics is in the public consciousness.
13:20
Um, so for altmetrics, 375.0 hits on Google News, and for the impact factor there's only 388.0 hits on Google News. So yeah, that's pretty good going. And then, just to bring everyone down to earth, I looked for CRISPR, and it was 193,000 hits on Google News. So, you know, not sure about the public consciousness,
13:46
but at least maybe in the places that matter, you know, we're getting somewhere. I'll leave it there, thanks a lot. I'm looking forward to chatting in the breaks. Great, thank you, Euan. Um, we might have to have words about your new metric.
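As a side note on the publisher meta-tags point from Euan's talk: the tags Google Scholar indexes are the `citation_*` meta tags, and an altmetrics provider can recover a paper's identifiers from a landing page with a few lines of parsing. A minimal sketch using only Python's standard library; the sample HTML and its values are invented for illustration:

```python
from html.parser import HTMLParser

class CitationMetaParser(HTMLParser):
    """Collects <meta name="citation_*" content="..."> tags from a landing page."""
    def __init__(self):
        super().__init__()
        self.citation = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = a.get("name", "")
        if name.startswith("citation_") and "content" in a:
            self.citation[name] = a["content"]

# Invented example of the kind of markup a well-behaved publisher serves.
page = """
<html><head>
<meta name="citation_title" content="An Example Article">
<meta name="citation_doi" content="10.1234/example.5678">
<meta name="citation_pdf_url" content="https://example.org/article.pdf">
</head><body>...</body></html>
"""

parser = CitationMetaParser()
parser.feed(page)
print(parser.citation["citation_doi"])  # 10.1234/example.5678
```

When these tags are missing, as Euan describes, a crawler is left guessing at the DOI from link patterns on the page, which is exactly the fragility he complains about.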
14:03
We'll see. Okay, next up we have Heather and Jason, who are co-founders of Impactstory, and they are going to talk to us about the design and implementation considerations for an open altmetrics UI and API, which sounds pretty exciting. Let me just find the slides
15:12
It is exciting, you're right. Okay. So I'm going to give you a sneak peek of a thing called Paperbuzz
15:21
And it's brought to you by Impactstory and PKP; I think someone from PKP is in the audience, wave! There we go, right there, so you can answer questions later as well. So all of us, I think the vast majority of us, use data from Altmetric, and it's fantastic and it's really good. It's fantastic for lots of reasons: it's comprehensive data, it's easy to use, it's one of my favorite APIs to call. It just makes me happy, it's so easy
15:44
It's free for research use, which a lot of us really value, and that makes it really different from lots of APIs in the scholarly space before now. And on its detail page, the page you get to from the bookmarklet, many of those details are free for everybody. There's a lot there that's really great
16:03
but it's a company that has, as a business model, keeping their code closed source and their data closed. And it also costs money for journals and startups and others to build things on top of this. So because of that, Crossref has decided that they will build something with open data
16:25
and make that available openly for people to use: open source, open data, and free for everyone. And Joe Wass is in the audience right here, and he's going to be talking in more detail late on Thursday about the Crossref Event Data API
16:42
on Thursday, in one of the last sessions, so stay tuned for that. So, unfortunately, because they're building a system which, philosophically, makes no judgments: they're not rolling things up, they're keeping it at a low level of abstraction, which is really useful as ground infrastructure
17:03
But it means it's not that easy to use, actually, for lots of people and lots of purposes, in some ways. So there's real value in having something at a higher level of abstraction there. And one of the reasons there's value in building this higher level of abstraction is because open is so important
17:22
We need adoption here, for all the transparency issues that we've been talking about. So, on to the altmetrics providers... sorry, just a sec. So all the altmetrics providers right now are closed source, the main four that are used: PlumX,
17:45
Altmetric, and even Impactstory, because we're mostly getting Altmetric data right now; it's closed source. And the methods we use for getting data these days are so complicated, you can't summarize them in brief sentences about how to call Twitter data. You really need the source to be able to document how it's actually done
18:04
which requires open source. So the Crossref API is all open source, you can see what's actually there, and the stuff we're building on top is open source. It's the only way we're going to get to transparency. So it's a real game changer for altmetrics as a
18:21
platform, and really important. The Crossref Event Data API has 35 million events so far, though 30 million of them are Wikipedia. That's one of the things that makes it difficult to use: they're putting out a lot of low-level events, specifically right now a lot of low-level events about Wikipedia, which makes it hard to get to some of the
18:44
stuff that a lot of people want about Twitter and so on. And again, they're doing this for philosophical reasons, but it does make it harder to use. So PKP and Impactstory are building a higher level of abstraction on top of this, which we're really excited to show you. So it's open source, open data, free for everyone, built on the Crossref Event Data
19:04
The audience includes everyone. So PKP is interested because they want to include this data in OJS, for philosophical reasons again, to embrace the open. So it's going to be great for journals, researchers, tool builders, and then others that use tools built on those. The API simplifies events; it includes counts; it includes summary statistics by day
19:26
It adds that sort of value on top of the Crossref API. So if you haven't seen the Crossref Event Data API, here is a snapshot of what it looks like. There's a lot of detail here, as you can tell; again, Joe will go into it in more detail. Here's what we're proposing instead
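For readers who want to try the Crossref Event Data API themselves, a query might be sketched like this (the endpoint and parameter names follow Crossref's public Event Data documentation; the DOI and contact email are placeholders):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

EVENT_DATA_BASE = "https://api.eventdata.crossref.org/v1/events"

def build_events_url(doi: str, email: str, rows: int = 100) -> str:
    """Build a query URL for events about one DOI.

    The mailto parameter identifies the caller, as Crossref asks."""
    params = {"obj-id": doi, "mailto": email, "rows": rows}
    return f"{EVENT_DATA_BASE}?{urlencode(params)}"

def count_events_by_source(response: dict) -> dict:
    """Tally events by source (twitter, wikipedia, reddit, ...)
    from a decoded Event Data JSON response."""
    counts: dict = {}
    for event in response.get("message", {}).get("events", []):
        source = event.get("source_id", "unknown")
        counts[source] = counts.get(source, 0) + 1
    return counts

# Example (network call left commented out; the DOI is a placeholder):
# with urlopen(build_events_url("10.1234/example", "you@example.org")) as resp:
#     print(count_events_by_source(json.load(resp)))
```

Because the events are low-level, a consumer has to do this kind of rolling up itself, which is exactly the gap Paperbuzz is meant to fill.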
19:41
Or excuse me, here's what we're proposing in addition. So this is for people who don't want that level of detail. We deduplicate events, summarize them, and include more detail there; Jason will talk about it in a little bit more time. We also include metadata there, so the
20:02
journal title, the authors, things that you might want when you get the DOI. You could go into Crossref and get that in a separate call, but lots of times people want it at the same time, so we're again enriching that data to make an easy-to-use API. We're also building a website, a bit like the details page you get to when you click on the
20:22
Altmetric bookmarklet, that's free for everyone, includes all the events, and is indexed by DOI, so one of those pages exists for every paper. As a bonus, it includes a new source which we are really excited about: Unpaywall views. Let me tell you a little bit about Unpaywall real fast. It is a free browser extension; if you don't have it, you can go get it right now
20:43
at unpaywall.org. It's a browser extension that pops up a little tab in the corner of your paper when you are on a publisher site or somewhere with a DOI, and that tab is gray when the paper is closed access, but green when we have been able to find an open access copy of it anywhere on the web
21:03
So we look in institutional repositories and PubMed Central and arXiv and so on, turn that tab green, and you can click on it and go straight to an open access copy. More than a hundred thousand people have installed it so far in the six months it's been available, with more than 4.6 million uses. So it's a really rich data set, growing fast
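Unpaywall also exposes this open-access lookup as a simple REST API, which a script could use like so (a sketch; the `api.unpaywall.org/v2/{doi}` path and the `email` parameter follow Unpaywall's public documentation, and the DOI shown is a placeholder):

```python
import json
from urllib.request import urlopen

def unpaywall_url(doi: str, email: str) -> str:
    """URL for Unpaywall's per-DOI lookup; email identifies the caller."""
    return f"https://api.unpaywall.org/v2/{doi}?email={email}"

def best_oa_copy(record: dict):
    """Return the URL of the best open-access copy from a decoded
    Unpaywall JSON record, or None if the article is closed access."""
    if not record.get("is_oa"):
        return None
    location = record.get("best_oa_location") or {}
    return location.get("url")

# Example (network call left commented out; the DOI is a placeholder):
# with urlopen(unpaywall_url("10.1234/example", "you@example.org")) as resp:
#     print(best_oa_copy(json.load(resp)))
```

This is the same gray-or-green decision the browser extension makes, just driven from code instead of a tab.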
21:21
Really exciting: the people come from 185 countries, and we're able to see that about 30 percent of them are from university IP addresses. We're releasing that view data with privacy issues taken into account: no IP addresses, obviously, but we're releasing the country and a rounded timestamp, so it's not very
21:41
identifiable, and whether it's a college location. So this information is also in our new API, with one more bonus feature on that website: what's buzzing this week. So we're using the Crossref Event Data and this Unpaywall data to see what people are talking about and reading this week. And on that note, Jason's going to give a demo
22:08
Thank you, Heather. So, I actually want to show this in Google Chrome. It's not super well tested; this is a very early beta. We wanted to show you because we're just really excited about it and really interested in hearing your comments
22:21
because it is still in very active development; we're going to launch this officially at the end of this year. So as Heather said, we think one of the really exciting strengths of Paperbuzz is that it's all open data, and it's open code behind it. So we've been talking a lot yesterday, at the other workshop, the altmetrics workshop, and also some today, about the provenance chain behind the data
22:42
How do we know exactly where every single one of these events has come from? So far that's been really difficult to do. Because we have this open platform now, it's not so difficult: not only can we trace the provenance, but for practical purposes we can use this data anywhere we want, any way we want. We think that's a really powerful
23:01
and a lot of people think that's a really powerful attribute of open data. So lots of people can build tools on this, and we thought, well, so will we. So we built this thing called Paperbuzz: what's hot and buzzing this week. So I'll just show you this real quick. You can go there, it's live on the web, and you can play around with it a little bit. We just asked: what are the things that people are talking about most online, using Crossref Event Data and also using Unpaywall data?
23:24
So we're able to see what people are reading, the hundred thousand people who have this extension installed, and we're also able to see what people are talking about. So we combine these in a little algorithm; again, it's all open source code, so you can track exactly how that algorithm is formed. And so we can say, okay, what are the top ones? There's this thing about the case for colonialism; there's another thing about type 2 diabetes. That's pretty fun
23:45
Because we have open access data from the oaDOI API, we can click through to read some of these; some we can't, but we're able to see which ones we're able to read. And then we can also slice and dice the data, because again we have these rich data sources behind it. So we can pick a certain topic and say, well, I just want to read the top environment stories
24:04
And so then I can see these ones are obviously about the environment. Another thing that's fun: because we do have data about the audience, from the IP addresses that we have from Unpaywall, we can use that to help us figure out, okay, are mostly scholars reading this, or mostly people who aren't scholars, where scholars are sort of proxied by
24:24
whether you're coming from a university IP, obviously. It's not perfect, but by using this you can sort of see... I'll just do it on all topics. If you look and see what the mostly non-academic articles are, the first one is about tattoos, which you can maybe understand: maybe a little more interesting to the wider population
24:42
Maybe, specifically, the set of scientists who are studying tattoos is probably pretty small versus quite a lot of the general public; there are a lot of them. We found it interesting that there's a lot about people's diseases, right? Because a lot of times, if someone has a disease, they're staying very up on the research for it. So there's a story about this therapy, there's a couple more... right, so type 1 diabetes
25:01
So people who have type 1 diabetes, of course, are very interested in following this research, perhaps even more than the researchers who are doing the science about it. And then of course, because open access is really important to us, we can filter by open access, and then, as you go through and read, everything is something that a person even without university library access is able to read. So you can go through them and you see all these blue read buttons
25:22
It means that you're able to just go ahead and read that open access copy, either open access on the publisher site or, like I said, via the oaDOI database, which has 90 million articles in it; we can look and see, hey, is it somewhere else free to read online? And so we can let people click through there. So I'll just click on one of the score details, and this is the part that I think is
25:41
a little more relevant for a lot of the journal users: we can actually get this breakdown of the altmetrics that we're getting from Crossref Event Data by source, as well as the Unpaywall data. So as you can see, there are 131 events for this; I just picked this one at random. So quite a lot of tweets. And you can filter and just see the Unpaywall views, so you can see where they're from
26:03
There's one from Romania, that's pretty cool; maybe they were at our recent conference. And then we've got a Reddit link there, and of course we have the other types of links as well, like Wikipedia and stuff like that. Eventually people will be able to subscribe to get alerts for this, but it quite prominently places the view-in-API link, because again
26:22
we feel like the open data behind this is a real big part of the advantage. So you can look through here, and without any trouble you can build your own apps on top of our data. The hotness API is also free for anybody to use. So in a lot of ways, the stuff that I'm showing you here is not actually as big a deal
26:41
as what Heather was talking about, because anyone can build this, right? Like, we built it; it was fun; it took us maybe a week or something like that. A lot of people here in the room have the ability to build an app like this on this data, and that's what we think is really powerful. I've been interested in altmetrics for a long time, and this is definitely one of the things I'm most excited about, because now we have an open data set for altmetrics that anyone can build stuff on top of, and anybody can do research
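As one illustration of what "anyone can build on this" means in practice, fetching the per-DOI event breakdown from the Paperbuzz API might look like the following (a sketch: the `api.paperbuzz.org/v0/doi/{doi}` path is based on the service's public examples, the response field names here are an assumed shape rather than a documented contract, and the DOI is a placeholder):

```python
import json
from urllib.request import urlopen

def paperbuzz_url(doi: str) -> str:
    """Per-DOI lookup on the Paperbuzz API (path assumed from the
    service's public examples)."""
    return f"https://api.paperbuzz.org/v0/doi/{doi}"

def summarize_sources(record: dict) -> dict:
    """Map source name to event count from a decoded response.
    The 'sources' / 'events_count' field names are an assumption
    for illustration, not a documented schema."""
    return {s.get("source_id", "unknown"): s.get("events_count", 0)
            for s in record.get("sources", [])}

# Example (network call left commented out; the DOI is a placeholder):
# with urlopen(paperbuzz_url("10.1234/example")) as resp:
#     print(summarize_sources(json.load(resp)))
```

The point of the open data is exactly that a consumer can write something this small and still trace every count back to the underlying events.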
27:05
Certainly one of the weaknesses, which needs to be acknowledged, is that there is not nearly as much data, and the quality of the data is not nearly at the level that you would get from altmetric.com or from Plum right now. They've got a lot more years behind it; they've been collecting it for a lot more years. So we even have a little warning on this: if it's pre-2017, you know
27:23
the data is not as complete. So we're looking forward to that improving; that's definitely something I would consider a weakness of what we're doing. But again, I would really consider the strength to be this incredible source of open data. We're super excited about it, really happy to talk to any of y'all at the break, and thanks for your attention
27:46
Thanks very much, Jason and Heather. I think that was an amazing amount of information in 10 minutes, so I'm sure there will be lots of good questions at the end for you
28:02
So next up we have Daniela Lowenberg from the California Digital Library. Daniela is a research data specialist and product manager, and she is going to talk about Make Data Count
28:26
So today I'm representing Make Data Count, which is a project to develop data-level metrics and elevate data to a first-class research output. I'm here representing our team, which is a Sloan-funded grant collaboration between
28:43
DataCite, DataONE, and the California Digital Library. And you may recognize some Altmetric faces included on this project. So before I go into what we're working on right now, I wanted to give a little bit of the history of how this project has come along since its beginning in 2014
29:04
So in 2014, NSF gave a grant to PLOS, the California Digital Library, and DataONE to do a pilot into data-level metrics. And this was really done by John Kratz and Carly Strasser, who did a survey out to the research community, looking both at what it would take to develop data-level metrics
29:26
and at what researchers value in data-level metrics. And what we found, on what we would need to do to actually make data-level metrics, is that data are much different from articles, and it's a much more complex process
29:42
Data have multiple granules; they're more complex than a PDF; they can be versioned; and data can be derived from other data. So there were a lot more components to look into when trying to develop data-level metrics, even learning from when we built article-level metrics
30:00
And what they found from the research community is that people valued citations, downloads, and views, similar to articles, and also touched on social media mentions being important to them. So in 2016, Moore funded a meeting grant for the California Digital Library, DataCite, and DataONE to come together and look at
30:24
this survey and what steps would need to be taken to take this project to the next level of actually developing data-level metrics. And what was acknowledged is that journal articles already have an infrastructure for this, but data really need an infrastructure to elevate them to a first-class scholarly output and give credit to researchers
30:46
for data like they get for their articles. Even though data play a larger role in the research process, there just hasn't been an infrastructure for this so far. Which takes us to now, in 2017
31:01
DataCite, DataONE, and the California Digital Library were funded by Sloan for a two-year project that we're now working on, to develop these data-level metrics. So what are we working on? The first is a formal recommendation for measuring data usage. We're working with COUNTER, because we know that we need a standardized way of actually counting data metrics
31:24
So right now we've come up with a draft, and that's in a Google Doc that we're hoping to get community input and feedback on. And then we'll be incorporating that, putting it up as a preprint, and iterating off of the feedback that we get from the community. But we'll also be starting to build out the rest of this from this first
31:43
initial draft of the COUNTER recommendations we've come up with. Working from that, we're going to develop a hub for these data-level metrics, similar to article-level metrics, using Lagotto. We'll be using a similar framework for DLMs. We want to make usage tracking easier; we want to drive adoption and show how it can be done easily, through mass adoption across repositories and publishers
32:07
And we want to engage researchers across all communities, and iterate as we learn from this project within the next two years. What will this look like? Well, it'll mostly be
32:20
in the back end. What we will be doing with the hub is taking citation and usage information in, and then repositories can be feeding information in, with log crunching similar to article-level metrics, which we can then display out on any repository that wants to use our consistent way of doing data metrics
32:42
This will all be open source; everything will be documented on GitHub. And anyone who wants to be involved in working on this from the repository side: we'd love to get early adopters. We'll be beginning with the DataONE and the California Digital Library repositories first, to show how log crunching can be done
33:01
But we're also looking for other early adopters who are interested in getting a start on displaying this. And by the end, we hope to have a standardized widget that could also sit on repositories, showing citations, downloads, views, and social media mentions for every data set, just like articles
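The log-crunching step described here, where repositories feed raw usage events in and the hub rolls them up per dataset, could be sketched in miniature like this (purely illustrative; the actual counting and deduplication rules are what the COUNTER draft recommendation defines, and the dataset DOIs below are placeholders):

```python
from collections import Counter

def crunch_usage_logs(events):
    """Roll raw usage events, given as (dataset DOI, event kind)
    pairs, up into per-dataset totals of views and downloads."""
    totals = {}
    for doi, kind in events:
        totals.setdefault(doi, Counter())[kind] += 1
    return {doi: dict(counts) for doi, counts in totals.items()}

# Placeholder log entries for two hypothetical datasets
logs = [
    ("10.1234/dataset-a", "view"),
    ("10.1234/dataset-a", "download"),
    ("10.1234/dataset-a", "view"),
    ("10.1234/dataset-b", "view"),
]
```

A real pipeline would first filter the logs (double-click detection, robot exclusion, and so on) before counting, which is exactly why a shared code of practice matters.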
33:21
So when we entered this space, we knew that we couldn't be the only people working on data metrics, so we wanted to do a survey to see what else is going on in this space. So we talked with anyone who got in touch with us when we released a survey about what we're working on: a lot of publishers, societies, journals,
33:40
repositories, maybe a lot of you that are in this room or watching the live stream. And what we learned when talking with them is that we need a better system to monitor and track usage and impact metrics for data. Right now there's not a standardized way; people want to know what's going on with their data. And it led to more community input into what we can do to make Make Data Count and data-level metrics
34:05
really useful. So, ways to get involved, and where we're headed with this: as I mentioned earlier, the COUNTER Code of Practice for Research Data is really where we're looking for community input. I know a lot of you in the room now
34:20
could give really valuable feedback on this, and we're hoping that as many of you can contribute as possible. We also have a link that I can send out for the survey. We're looking for people who want to be early adopters in this; we have a list that we're trying to put together now, but the more the better, the more community input we can get into this
34:44
And also, help with community outreach once we do have this up and running for data-level metrics, for researchers, publishers, and altmetrics providers alike, to make sure that people are citing their data and talking about data like they do about articles. And also, on our website we have a roadmap for where we'll be headed and
35:04
where you can meet with us. We really need the community to be a part of this to make it a success, and we're really excited about it. So please get in touch with us, follow us, and find me at the break to talk more about it
35:27
Thanks very much, Daniela, and some really nice suggestions at the end there, I think, for how people can get involved. Next up we have Polly Allen. Polly is director of product management at Plum Analytics, and she is going to talk about altmetrics and societal impact
35:42
Hi everyone, how's everyone doing? Getting all stretched out here. So my name is Polly Allen, and I was formerly director of product management for Plum Analytics. We were acquired by Elsevier in February 2017, so I'm now director of product management for research metrics at Elsevier, and we're talking today about altmetrics and societal impact
36:06
So, thinking back to the early days of the altmetrics conference and how the movement was born: certainly one of the bright and shining hopes for altmetrics has always been that it could provide some way to measure social impact, or at least proxies for social impact, some way to know, in an area where we have not a lot of data,
36:25
whether we can drive better data-based decisions in the area as a whole. And we've made a lot of really great strides so far, and there's of course so much more to do. It's exciting to see that we're moving from an era of
36:40
just looking for the signals in the online exhaust that exists out there, in things like social media and mass media, and looking now more in specific places, where people are saying: can we measure social impact in a particular field, and find out where my research is being read and referenced in areas like policy documents and clinical citations?
37:01
And there are some really exciting developments that I think provide a lot more usefulness, while at the same time we're improving the robustness of the underlying data and the different ways it can be gathered and counted. And I wanted to go through an example today of societal impact. What are we talking about? What do we mean by that? And how are we measuring it, compared to how it goes on in the real world?
37:23
So let's consider the example of clinical guidelines and clinical citations. So for a piece of research to gain clinical impact, what has to happen? Of course, it has to be published, and then it has to be read, consumed in some way by someone, and largely,
37:41
you know, those activities are moving online to a greater degree than they were in the past. Dissemination of that research is still largely at conferences and through people making sure others in their field are aware of it, again more in person, but there are ways to detect signals for this online, with platforms like SlideShare, etc.
38:01
So once things have begun to be disseminated, there's the amplification of those messages, and that's a really interesting place to see where people are choosing to amplify their message. Does this differ across fields? And does the newsworthiness of an article really indicate this is an amplification-worthy message, or is this something that just happens to be
38:24
buzzy, and something of interest to the lowest common denominator? These signals are harder and harder to tease out, but we've started to look at the work of what these signals mean. And the goal, ultimately, for a lot of researchers is for their research to become part of the consideration set
38:42
for the people who make decisions around clinical reviews and clinical guidelines: to say, you know, please read this work, become aware of it, and then actually reference it in your work, so that I can know, and include in my funding application, that my work actually had an impact
39:04
tracker or part of physician education even And made its way to the clinic ultimately So what we do today Is that we're able to put little It's hard to see on the screen. There's little gauges on a bunch of different spots So we take our little gauge thermometers and we can stick them in at various points in the pipeline to get these proxies
39:26
of impact, because actually measuring impact quantitatively is very hard, and that's what people are asking for in order to make decisions. So we have this ability to gather events and count them. And what I'm excited to see now is, even though there are so many other areas, this is one example
39:44
where we're looking at one pipeline we can stick things into, gathering data from APIs and crawling. There's a lot more; sometimes it feels like a whole world to discover, in terms of what the other areas are. Can we start looking at patents? Can we start mining legal documents for
40:02
economists and social scientists, and really try to see where, in all of this online chaos, we can actually try to make some order and pull some numbers out in a robust way that people agree is counting something useful?
40:20
But when we talk about getting to the next stage, it really is about getting beyond counting, and how we count things, and about designing systems that help people make decisions and provide feedback loops, the way your speedometer does in a car, so that you can make decisions in more real time, and so that you can assess yourself
40:44
and do all the things that people are eager to do to improve their own performance in various ways. This, of course, is a double-edged sword. There's great power in having systems that use data, especially data at scale, underlying them to make decisions, but there's also great danger
41:05
And in Cathy O'Neil's book Weapons of Math Destruction, she describes a lot of really interesting scenarios in place today where systems that have big data underlying them have been designed, ostensibly, for efficiency, but also to make sure that we're reducing the human bias and human error in a system
41:23
And she shows how so many of them have actually done the reverse: they're actually encoding the errors that exist in the system today and reproducing them at a massive scale, creating these rich-get-richer situations. And I think it's important that, as we're designing systems and starting to move forward in that arena, we're
41:44
making sure we recognize the underlying biases in a lot of the data. When we're looking at Wikipedia metrics, for example, I think it's important to remember that less than 15 percent of Wikipedia editors are female. What kind of biases are we introducing by counting these metrics and not counting others?
42:02
I don't think U.S. News and World Report could have anticipated, in 1983, when they first, as a struggling publication, decided to take on an ambitious project to rank 1,800 US universities in terms of excellence (their definition), that they would have such a huge impact on academia as a whole
42:23
And when they originally started doing the rankings, they were doing qualitative surveys, and the complaints arose: this is too biased. Who are you phoning? Why are you phoning those people? This is just opinion-based, and it's actually affecting our bottom line, said the universities. You need to get more data, you need to get more quantitative. And it's another example of: how do you measure excellence?
42:46
How do you put a number on that and rank people based on it? So from a system where people might have heard by word of mouth that such-and-such a school has the following positives and negatives, to find a good fit for them, we moved to a system shaped by the attributes they chose to use when they went quantitative in 1988
43:05
It's very interesting to see what they chose. They wanted people to believe their ranking: these are the schools that are excellent. When people think of excellence, what schools do they think of? They think of Harvard, they think of Stanford. And what attributes do those share? They share very low acceptance rates; their
43:23
alumni go on to prestigious jobs with high pay. What they didn't include, or rank highly, in those early days were things like admission fees or the percentage of minorities going to the school. And what we saw between 1985 and today is that tuition rates have risen 500 percent, four times the rate of inflation,
43:44
partially driven by schools' need to improve their rankings, in a system that was designed so that winners continue to win, and losers, unless they get a huge injection of cash, unless they start making sure they have more applicants than they can take... how do we drive that? A lot of them did this by
44:02
raising money to provide great sports teams and get a lot of attention in the media. So I think it's a cautionary tale about how the metrics we use today are being used, and how, especially when you have a system that's designed using proxies, they're more open than ever to gamification. We have to be wary, but also think ahead
44:23
so it's not a reactionary thing afterwards, once we've designed the system and it's in motion, but really thinking through the implications of the underlying biases in a lot of the systems we're designing. So it's not all doom and gloom, I want to reassure you. I really am excited about the area, and I'm incredibly optimistic. I think a lot of the problems we see in underlying big data systems that have these problems today
44:47
They're born of laziness; they're born of the human tendency to use what's available. And if people have a quick number they can grab to use for whatever, even if it's not fit for purpose, there's a danger that they will use it lazily, and not in appropriate ways
45:03
So it's an ongoing effort of education, working closely with the people who are making decisions based on our metrics, and making sure that we're conscious going forward not just of what we use to measure social impact, but of what the social impact of using those measures is
45:21
Thanks very much. Thanks very much, Polly, some really pertinent points in there, I think, that we all need to consider as we go through the next couple of days. Next up we have Jean Liu. Jean is product development manager at Altmetric, and she is going to talk about altmetrics for books
45:46
Hello, everybody. My name is Jean, and I lead new product development at Altmetric. So we've been talking a lot about metrics this morning, and I think what we should also remember is that the second aspect is the narrative
46:01
So the mentions, for example, of research outputs make up those metrics that we are talking about. So I'm going to tell you a story about altmetrics for books. And if you were in the workshop yesterday, that was one of the types of things that might perhaps be coming up in next year's workshop: you know, how do we start to talk about altmetrics data for more than just journal articles?
46:22
And what does that even look like right now? Well, bear with me. So, In the Donut's Quest, a book that I personally authored, is a thrilling tale of altmetrics for books that begins with a little donut. Once upon a time there was a little sentient donut who wondered what people were saying about scholarly books online
46:46
The little donut embarked on an epic quest across the scholarly landscape, hoping to map the online locations of all the world's books. He crossed the Taylor and Francis mountains
47:02
ambled through the Springer Nature woodlands, traversed the Elsevier rapids, canoed around the venerable Wiley islands, and surfed on the bustling Amazon river. I'm going to pause here for a minute, because these are all obviously
47:21
publisher names. When we start looking for altmetrics for books, one of the things we need to do is know where to look, so we need to know what kinds of records we actually want to track. That is a surprisingly difficult task, because every single publisher will have lots and lots of different domains on which they host book records.
47:40
When you think about a book, do you think about the physical version of the book? Do you think about the e-book? Do you think of the e-commerce page, or an institutional repository page? Those are all valid places that books live, and to actually find them and make sure we're tracking them properly is quite difficult. Every publisher we've ever worked with has at least
48:05
two to three domains, if not up to eight, and we have to make sure that our tracking software can recognize every single one of those web page formats, so we can recognize that it's a book and not just a random page on their website. So let's keep that in mind.
48:24
He even followed the mysterious nomadic books, who somehow managed to live in many places all at the same time. This is exactly what I mean when I say they live in different places but are actually the same book, because the concept of the book is what we're tracking. We're not necessarily tracking the physical copy; we could be, but it's not online.
48:44
We want to, for example, look at the e-book version, or just the e-commerce landing page, but all of those represent the book, and the authors themselves may not use one particular version to refer to their book when sharing the link. So we need to be able to get that book wherever it lives.
49:01
And you might have heard recently that Altmetric.com started tracking Amazon. That was actually quite an undertaking, because it turns out there are tons of mentions of Amazon, and when we're working with data providers such as Twitter, it gets expensive very fast to track all that information and parse it down to the exact books that you want, because we're not necessarily wanting to go and track Harry Potter.
49:23
We want academic books, so that separation is quite difficult. And these nomadic books live all over the place, so we need to identify where they are. But we have a little secret weapon, you see: the little donut searched tirelessly for the magical ISBNs,
49:42
special numbers that identified each unique book. ISBNs are really interesting because they come in two forms, a 10-digit form and a 13-digit form, and you can convert between the two. However, 10 digits is also a phone number, so telling an ISBN apart from a phone number, or any other random string of numbers, is quite difficult as well.
50:04
You need to know the context; you need to be able to pick that ISBN out of a sea of numbers, from a policy document for example, and know that it is an ISBN in that case. Also, sometimes he had to try and avoid his foe, the Bad Metadata Snowman.
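The two ISBN forms Jean mentions each carry a checksum, which is one way a tracker can distinguish a genuine ISBN from a phone number or other random ten-digit string. A minimal sketch in Python (an illustration of the standard checksum rules, not Altmetric's actual code):

```python
def isbn10_is_valid(isbn):
    """ISBN-10 check: weighted sum (weights 10 down to 1) must be divisible by 11."""
    digits = isbn.replace("-", "")
    if len(digits) != 10:
        return False
    total = 0
    for i, ch in enumerate(digits):
        if ch in "Xx" and i == 9:        # 'X' means the value 10, last position only
            value = 10
        elif ch.isdigit():
            value = int(ch)
        else:
            return False
        total += (10 - i) * value
    return total % 11 == 0

def isbn10_to_isbn13(isbn10):
    """Convert ISBN-10 to ISBN-13: prefix 978, recompute the EAN-13 check digit."""
    core = "978" + isbn10.replace("-", "")[:9]
    weighted = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(core))
    return core + str((10 - weighted % 10) % 10)
```

Most random ten-digit strings fail the mod-11 check, so a checksum pass is a useful first filter, although, as Jean says, you still need context to confirm the match.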
50:23
The Bad Metadata Snowman liked to mess with the little donut's metadata-reading goggles, making it hard to track books and chapters. Euan already talked about this in his presentation: publisher metadata seems to be getting worse. That's definitely true of journal articles; for whatever reason, some publisher metadata seems to be disappearing.
50:43
With books, oftentimes meta tags are not available at all, and when you don't have meta tags to rely on, what you have to do instead is scrape numbers like ISBNs, and titles and authors and everything, out of the page directly. Which sounds perfectly valid, and it is fine: you get that information anyway and you can build a book record.
51:04
But as soon as that web page layout changes, that breaks the scraper. And you can imagine, in the reasonably small team that we still have at Altmetric, with few developers and some non-technical staff, it's pretty unsustainable to go and fix every single page that breaks to try and recapture that data.
51:22
So what we really want is for book publishers to add meta tags to all of the different places their books live, and one of those meta tags has to include the ISBN. This is where metadata can really mess with things, because as soon as we don't know what the book is, it's hard to match the attention to a specific record.
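For illustration, the kind of machine-readable tagging being asked for might look like the Highwire Press-style `citation_*` meta tags that Google Scholar reads; the page and values below are made up, and individual publishers may follow other metadata conventions:

```html
<!-- Hypothetical book landing page: embedded bibliographic meta tags -->
<head>
  <meta name="citation_title" content="In the Donut's Quest">
  <meta name="citation_author" content="Liu, Jean">
  <meta name="citation_publisher" content="Example Scholarly Press">
  <meta name="citation_isbn" content="978-0-306-40615-7">
</head>
```

Tags like these let a tracker read the ISBN directly instead of scraping it out of the page layout, which is exactly what breaks when the layout changes.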
51:44
But the little donut remained determined, knowing that ISBNs were the keys to identifying the books, especially the ones that lived in many places. Like I said, if you put a book record on a repository and you put an ISBN there, that's great: as long as we see the ISBN somewhere else, we might be able to say, oh, that is the same book.
52:03
So we're building that book concept: we know that all these different records all over the web refer to one book. ISBNs are a little bit different from DOIs; they don't resolve to a specific URL, so we don't have a canonical place to say this is the one place all the books actually came from.
52:23
That makes our lives difficult. Some books do get DOIs, but that is certainly not universal. And with ISBNs, you can have multiple for a specific book, because different ISBNs refer to different versions of a book. So as you can see already, this is very complicated compared to tracking journal articles, which have a very established
52:46
infrastructure: a very established way of putting meta tags on a page, assigning DOIs, registering them, and having them resolve to real URLs. So it's quite a complicated mess that you end up in when you're trying to juggle different ISBNs and find them correctly.
53:05
But in this story we're not going to focus too much on the challenges; you see, it is a nice story. So finally, with a map of online book locations, the little donut returned home and logged into the fantastical altmetrics database. Obviously, if you've ever followed Altmetric.com, you know that we don't actually go looking for the output records first.
53:26
The signal to start looking actually comes from our attention data. So we see a tweet come in and it links to a domain that we recognize; if we recognize that it's a book publisher domain, we go and get that book. But we need to know that domain first.
53:42
We need to have that map of all the different domains where the content we care about lives in order to make that connection, and once we do, we have a huge database of great stuff. But you can see how fragile it can be if the books don't have the proper metadata. So suddenly he was flung into a rainbow of online attention,
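The attention-first flow Jean describes starts from the mention, matches its domain against the map of known book-hosting domains, then pulls an identifier from the page and attaches the mention to one book concept. A toy sketch in Python; the domain names and the `scrape_isbn` helper are hypothetical:

```python
from urllib.parse import urlparse

# Hypothetical domain map: domains known to host scholarly book records.
KNOWN_BOOK_DOMAINS = {"link.springer.com", "www.taylorfrancis.com"}

def handle_mention(mention_url, scrape_isbn, books_by_isbn):
    """Resolve a mention URL to a single book concept, or None if untrackable."""
    domain = urlparse(mention_url).netloc
    if domain not in KNOWN_BOOK_DOMAINS:
        return None                    # not a domain we recognize: skip it
    isbn = scrape_isbn(mention_url)    # meta tags if present, page scraping otherwise
    if isbn is None:
        return None                    # bad metadata: attention can't be matched
    # Many records (e-book, e-commerce page, repository) share one concept via ISBN.
    return books_by_isbn.setdefault(isbn, "book:" + isbn)
```

The sketch shows why both pieces matter: an unknown domain or a page with no recoverable ISBN means the attention is lost, however many people are talking about the book.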
54:01
flying past over 1 million mentions from mainstream news, policy, social media, syllabi, and more. And already, in the short time we've been tracking altmetrics for books, about two years now (we officially launched Altmetric for Books last year, but for about two years we've been collecting data), we've at least been following a lot of the domains that we know host book content.
54:25
And it's quite astonishing, I think, because we didn't even start by tracking attention sources that were specific to books. These are all sources that we know mention journal articles, and even within those there were already, as you can see, over a million
54:41
mentions of these different books, and occasionally chapters too. It's quite astonishing, and well, the donut agrees. "This is wonderful stuff!" exclaimed the little donut. "I must tell the authors." One of the things that strikes me, from when we were building Altmetric for Books, is that there is a really strong need in the books area, compared to journal articles, to actually have some kind of meaningful data.
55:06
There isn't the same saturation of citation metrics with books and chapters. If anything, the researchers are spending months or years on this content and then get really no feedback afterwards.
55:20
Can you imagine publishing a blog post and not checking Google Analytics? It's kind of an odd feeling to have no feedback, no way to know that the book has been read, utilized, and so on. So this is something we think authors really must see, and what we're doing, and this is our little donut's mission too, is to make that visible.
55:40
So he baked batches of rainbow donuts and set off again, this time with a new mission: to show hard-working authors all the online attention paid to their scholarly books. Thank you very much. Thank you, Jean, for that very different presentation.
56:02
We're a little bit tight on time, but are there any questions for any of the panelists? Yeah, would you mind coming to the mic? We good? Okay.
56:31
Just to make things more complicated, and to add especially to Jean's task: there are quite a few fields where people publish anthologies, and so books by themselves are not enough.
56:42
We need to be able to track the chapter in the same way that we track a journal article. Can you speak to that? Okay, so chapters, whenever they do get their own web page, are very often assigned a DOI.
57:00
Usually chapters are just listed in a table of contents on a book website, but some publishers go further. Taylor and Francis was one publisher we piloted Altmetric for Books with, and they had assigned DOIs to all of their chapters, so in that case it was possible for us to get chapter-level attention data. What we opted to do was roll that into a big Altmetric details page where you could actually
57:24
toggle through the table of contents. I won't speak too much to the way we scored it, but we did make sure, when we gave an Altmetric Attention Score to the book, that we pooled all the mentions for the chapters rather than summing up per-chapter scores, because that would be
57:42
double counting, and you could, for example, just mention loads of different chapters to game your score. So we didn't do that; we didn't let people do that. We pooled the mentions and counted them as if they were one entity to calculate the score, so it was not as biased. But for publishers that have the DOIs, it is definitely possible.
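The pooling idea can be sketched as follows; the per-source weights and the deduplication rule here are illustrative assumptions, not Altmetric's actual scoring algorithm:

```python
# Hypothetical per-source weights, for illustration only.
SOURCE_WEIGHTS = {"news": 8, "blog": 5, "tweet": 1}

def pooled_book_score(chapter_mentions):
    """chapter_mentions: dict mapping chapter id -> list of (mention_id, source).

    Pool mentions across all chapters and dedupe by mention_id, so one tweet
    citing three chapters counts once; then score the pool as a single entity.
    """
    pool = {}  # mention_id -> source
    for mentions in chapter_mentions.values():
        for mention_id, source in mentions:
            pool[mention_id] = source
    return sum(SOURCE_WEIGHTS.get(source, 0) for source in pool.values())
```

Summing per-chapter scores instead would count a shared mention once per chapter, which is exactly the double counting, and the gaming incentive, the panelists are warning about.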
58:02
But that's where publisher awareness of adding DOIs to their different bits of content really needs to start happening, because the technology is there. It's almost like we're waiting for the publishers to pick up the pace and upgrade their systems. And I want to mention, on the Plum Analytics side,
58:20
we really love the roll-up you guys are doing on the Altmetric side. But there are still open questions for a lot of these metrics: if someone is just tweeting about a chapter and you roll those up, is that double counting between chapters or not? So again, the devil's in the details for a lot of these metrics and for that roll-up task,
58:40
and how do you properly attribute things to the right places? Again, I do think it takes work from the publishers to get on board and make sure that the metadata is there. Thanks. Thank you for your presentations.
59:03
My question involves the following: we've heard about three or more platforms and systems and all these types of things. I wonder about the relationships between these platforms. Are we partners? Are we adversaries? Are we working together?
59:20
And then the relationship for me, for instance, as a researcher: how should I choose between one or another to do my research on? And, for instance, as a university, how do I make a decision about which one to buy? Do I need to buy all of them? I mean, I'd like to see some answers.
59:42
Who would like to speak first? I think, coming as I do from the bibliometrics world, that's a great opportunity for us as researchers. There has been quite a lot of bibliometrics research, as new bibliographic databases have come online,
01:00:00
looking at the properties of these various databases against one another. So, originally just Web of Science, and then Scopus, and then Google Scholar, and then additional ones as well. There's quite a lot of literature now from the bibliometrics community asking: what are the strengths and weaknesses of these different databases for different kinds of research? I think that's a terrific job for the research community; to the extent that me and Heather are occasional publishers,
01:00:21
that's something we could be publishing on, and I think the different services can be publishing on it too. It's a great research question, and one whose work is probably never going to be finished, because, especially in the early days of this particular world, there are a lot of changes going on really quickly. I always try to contrast this with the early days of
01:00:41
bibliometrics, which started out quite similarly to altmetrics, in that this was a set of metrics a lot of people had a lot of concerns about. When Eugene Garfield proposed this idea of mining citations to try and track the progress of ideas through the literature, he was met with a tremendous amount of skepticism and couldn't get anybody to listen to his ideas for years.
01:01:02
And then, when he finally did, the research world was in many cases extremely antagonistic to the idea: how dare you reduce what we're doing to a number? That antagonism has certainly not gone away entirely, but in many ways the research community has reached a sort of peace with the idea of citation metrics. That took decades, and I think in many ways we in the altmetrics community are way ahead of that, because we can
01:01:24
look at that example and say, okay, here are some of the questions people have, and here's the kind of research that needs to be done. One thing I'm really excited about is that there are multiple organizations doing this work, so at least we can start doing that compare and contrast; again, Web of Science was the only game in town for quite a long time. So that's my two-part answer: trying to draw lessons from the early
01:01:44
progress of the bibliometrics world, and then, like I said, relying on the research community to look at the pros and cons, and the comparative merits, of one another's systems; I think that's probably best. If we could keep comments short: would anyone like to add anything?
01:02:02
And I mentioned before about the awareness and the training and so on. So first of all, I'd say it's early enough, certainly, for all of us to benefit: frankly, the more people involved in altmetrics, the more people learn about it, and the more scope there is. When it comes to the commercial side, I mean, it would be incorrect to
01:02:22
say we're not competing; we're definitely competing with Plum commercially. That doesn't mean we would never work together, or that we hate each other or anything like that; it's not that kind of relationship. But in general, yeah, on the non-commercial side, I'd say it's much friendlier, certainly. I think the main point, like I said earlier, is that
01:02:42
it just feels like there's this open world that is ready for exploration, and that we've only started to explore. I was really excited to see Reddit highlighted in the keynote speech: this is an area that we've been gathering metrics on for ages, and there's very little research on it yet. So these platforms are available for research.
01:03:02
On our side, we're very happy to work with researchers who approach us. I know Altmetric has their API available publicly, and I think it's going to take collaboration. It's going to take open data systems, and it's going to take commercial partners as well, to get where we need to go in the field.
01:03:21
Okay, for the sake of keeping us on time, do we have one more question at all? Yeah, in the back. Sorry about the walk; we'll try and get some more mics. Hi, my question fits in the debate on societal impacts, and I think it's mostly directed at Polly and Jean, and
01:03:46
it's about country-level metrics. Have you attempted to grasp the issue of country-level metrics, and if so, could you please talk about some initiatives or things that you've done in this work, at the regional or country level?
01:04:03
Yeah, any research that has been produced: if you try to aggregate the work of the researchers of a country and try to draw some comparisons, for example. Oh, we don't have specific research that we're doing yet, but what we're trying to do is surface some of the country-level information we do store in our database.
01:04:23
For example, policy sources, news sources, Facebook, I think, and Twitter are the four sources for which country-level data is currently available in Altmetric. What we want to do is go a bit further, so we started a project early in the summer to actually make maps,
01:04:40
so that for any given search query you run in Altmetric, you'd be able to see which countries had mentioned you the most in those sources. And it's quite interesting, too, because it also reveals the coverage gaps to us. When we look at the whole database, for example, and then we look at the maps of news, we can see there are some parts of the world we're not covering very well and some parts we cover very well.
01:05:04
I mean, I'm Canadian, so we've added a lot of Canadian news sources, let's just say that. But there are definitely some regions of the world that we're not able to cover yet, and now we can visualize that. So the first step for us is to visualize; the next step is to delve deeper. But all that data is available in our API, so we would want to surface that and get some really interesting research coming out about it.
01:05:24
Exactly; this is something I was actually going to talk about a little bit, these maps and so on. And now that we're part of Elsevier: Elsevier produces SciVal, which has been used in the bibliometric field for a long time, and we're definitely speaking with them long-term about how we can apply the same kinds of analyses to altmetric data where it's appropriate
01:05:45
and where it makes sense. The second point is that what's really interesting to me at the country and region level is that it becomes important to track completely different sources when you're looking at different regions of the world, and specifically China. It came up in the workshop yesterday that a lot of these social media sources are blocked there.
01:06:01
What can you use as proxies there? So we've definitely been working with partners worldwide to try to find usage data and citation data from regions that are typically underserved and don't have metrics in this area at all, because you need to aggregate the data from the article or research-output level, and so you need good coverage.
01:06:20
Of course. Okay, thank you very much, and a big thank you to all of our panelists. I'm sure if anyone has further points they'd like to pick up, everyone will be around during the lunch break, so please do go and have a chat with them.
01:06:42
Okay, I'm now going to hand over to our next chair for the Altmetrics in Research Evaluation session. Hello, everybody.
01:07:47
My name is Donna Okubo and I am pinch-hitting for Katie Hinkley, who ran into some travel issues. So, as one of the
01:08:00
organizing members, I know that she really wished she could have been here; I know she's done a lot of work in this space. But instead you have me for a little bit. So this session is going to be on altmetrics in research evaluation, and
01:08:22
I'd like to introduce the first speaker: Kate Williams from the University of Cambridge, who will lead us through an understanding of emerging cultures of evaluation. I'm a research fellow at the University of Cambridge
01:08:48
and also at Harvard, and I have an ESRC-funded grant on altmetric indicators of research impact in development research organizations. This project involves mixed methods, including
01:09:02
ethnography, interviews, and quantitative data analysis. But I want to talk to you today about some preliminary development of a theoretical framework arising from this project. It's quite a departure from what we've been talking about today, although it links to Mike's observations about whether we need a term for altmetricians, and to Polly's question about the social impact of
01:09:23
using societal impact metrics. So, research is obviously crucial in addressing complex environmental, social, and economic challenges. As such, research institutions are eager to develop efficient ways to measure wider impact in order to inform strategies around research focus, funding,
01:09:40
communication, and reporting. Yet the measurement of impact is highly contested, and expertise in the area is under-theorized. Informed by a sociological perspective, today I hope to intervene in this setting by reframing impact assessment as a space between fields, which shapes cultures of evaluation, and I hope this contribution will provide new directions for the study of impact and altmetrics.
01:10:03
So I have five aims. First, I begin very briefly by situating the recent focus on research impact in its wider context, and then I outline three core ideas. The first is that impact assessment occurs within a space between fields; the second, that altmetrics are making available new types of capital, or resources;
01:10:24
and the third, that this framework can be used to understand emerging cultures of evaluation in research organizations. I conclude by outlining an example from a case study of the World Bank. This work diverges from previous work because of its field-theoretic approach,
01:10:40
which is a distinctly sociological perspective, used to elaborate the hybrid nature of the space between fields. Recent years have seen the reformulation of science systems towards more applied, socially accountable inquiry, produced by an increasingly diverse set of research institutions, with greater emphasis on the cost, output, and quality of research. Recently,
01:11:02
this focus has extended to attention on wider societal impact, and accordingly new techniques of impact assessment are emerging, key amongst these, of course, being altmetrics. The available academic literature on altmetrics largely focuses on the statistical comparison of altmetric data across sources,
01:11:21
the validation of measurements, and the identification of biases. Thus there's a current need for greater understanding of how these metrics take on meaning in practice. Given the significant recent uptake of altmetric technologies, research is required that considers the specific cultural and institutional effects of these new tools, set in the context of a nuanced understanding of cultures of evaluation around research impact.
01:11:46
Using a sociological framework, this paper seeks to address this gap in two specific ways. First, although there's a body of sociological research on traditional metrics of research quality or scientific impact, work specifically on altmetrics occurs primarily within the fields of scientometrics and library and information science.
01:12:06
Against this backdrop, however, two recent relevant sociological publications argue that altmetrics are of growing importance in contemporary research systems, and these studies indicate a growing sociological interest in altmetrics and provide potential avenues for theoretically informed research.
01:12:23
Second, relatively few studies have focused on the sociological effects of altmetrics, rather than the application of altmetric methodologies to questions of sociological interest. Therefore there's a need for distinct related work, including ethnographic studies of the institutional effects of research evaluation, that pushes novel insights on impact
01:12:42
assessment and altmetrics out to different fields, while also contributing new perspectives to information science and to research evaluation and policy. So, on to my first core idea. In the study of expertise, the production of knowledge has been dealt with as a function of relatively
01:13:01
institutionalized conditions. There's an assumption that areas of expertise will grow into more defined forms as they evolve, whereby they coalesce into formal structures such as academic disciplines and professional organizations, and thereby establish control over the production and certification of expertise; so, for example, in law.
01:13:20
However, having only featured in Britain and Australia since around 2004, the area of impact assessment does not conform to this ideal, stereotypical notion of a bounded profession, discipline, or intellectual field of inquiry. One explanation is that impact assessment is an emergent field which will coalesce into formal structures, which perhaps underpins a lot of current altmetric research.
01:13:42
The assumption here is that it will mature into an institutionalized field: a discipline with strong, highly regulated boundaries that offer autonomy from other fields, containing a specific logic and fixed capitals or resources. However, an alternative explanation is that impact assessment occurs in a space between fields,
01:14:01
because it comprises interactions of different arenas, not simply different criteria within a self-contained field. The available resources or capitals that make up this space are drawn from various fields, each with their own implicit and explicit rules, activities, and norms. Thus impact assessment can be conceived as a non-standard field of expertise, and these types of fields
01:14:23
these types of fields Located within a multi-dimensional social space that contains a number of actors stakeholders and organization types Rather than tied to a particular community for example in assessing impact the borders are blurred between university departments research centers publishers libraries
01:14:41
academic networks, professional societies, the media, government, and the private sector; each group can make a valid claim to expertise in this area. The key feature here is shifting, permeable borders that permit people, ideas, and techniques to travel between multiple fields. For the range of actors who converge in these spaces, this poorly articulated,
01:15:01
hybrid space permits freedoms not available within established fields. So actors are able to draw on expertise, ideas, and language from outside their own field while remaining credible and authoritative within their distinct arena. For example, impact assessment requires technical skills (so bibliometrics, altmetrics) or academic expertise:
01:15:23
expert reviews, narrative case studies; and to be useful it must be communicated or implemented in particular ways, so performance management, institutional evaluations, public accountability documents, or national evaluations. Thus it requires an ongoing balancing of technical requirements, communication skills, orientation to value for money, and an orientation to policy, practice, or management.
01:15:47
Accordingly, this is also true of the demonstration of impact by research organizations: to be impactful, research must contain a definitive academic identity, yet produce a range of products for receptive audiences and positive outcomes for diverse groups,
01:16:03
which can be assessed by specialist evaluators through a predefined set of practices. So, as shown here on the slide, in the measurement of impact the capitals or resources of several fields are at play: specifically, this might include the field of knowledge production, the rigor and methodologies of academia,
01:16:21
politics, the media, and economics. So this idea offers an alternative perspective, whereby to assess research impact, evaluators and audiences draw upon various types of capital from different fields, and accordingly, to demonstrate impact, research organizations now must do the same.
01:16:40
The second idea here is that, in addition to the fields I showed before, we're also observing an emerging field of metrics, which is the skill to create and interpret rankings and indicators. In contemporary research systems, metrics are imbued with significant power. In addition to influencing research generation and allocation, and the recruitment, assessment, and promotion of staff,
01:17:06
they also change the way we do politics, assess value, allocate scarce resources, and justify worth. So this emergent field of metrics links various actors to users or clients using the language of numbers. Yet the field of metrics itself is quickly evolving: scholarship and academic communication are rapidly changing,
01:17:24
with many scholarly processes occurring online, which creates a wealth of data that can be used to inform scholarly and impact metrics. Especially for applied, policy- and practice-focused fields of study, alternative metrics are attractive, ostensibly helping scholars and institutions define and observe what impact looks like.
01:17:45
And, as evidenced by the fact that we're all here today, we're seeing large changes in the field of metrics. As Haustein, Bornmann, and Costas put it: "What is considerably different, however, is that altmetrics capture events on platforms that are in a constant state of change, and whose use and user communities are new, diverse, and not entirely understood."
01:18:04
Thus the increasing interest in, and importance of, altmetrics makes new types of capital available. For example, on the research production side, it's at least theoretically possible for a researcher to gain legitimacy by demonstrating a huge social media following, or to bolster
01:18:20
credentials by combining citations and altmetrics on a CV or in a performance evaluation. And on the evaluation side, funders, publishers, and research institutions can also gain legitimacy, demonstrating value for money and policy orientation by investing heavily in altmetric technologies. So in this context, altmetrics are now a significant factor in the space between fields.
01:18:43
Given these recent changes, it's important to consider the specific cultural and institutional effects of new metrics and technologies. This requires an examination of how actors are involved in the process of gaining and maintaining legitimacy from various fields, and highly relevant here is the work of Michèle Lamont, who shows that
01:19:03
sociological research in particular can facilitate unveiling the evaluation criteria and bringing to light the devices, institutions, or cultural and social structures that support or enable them. So the theoretical framework developed here could potentially allow us to examine the intersection of the cultures and logics of different fields.
01:19:25
So here I move to an example Of research production and evaluation in an international development institution namely the World Bank The bank was selected because of its dominance in global policy research and because of the tension between its research and policy functions
01:19:41
Through a three-month ethnographic period, I examined the context and structures of impact at the institutional and individual level, examining the acquisition of various capitals through existing and emerging cultures of evaluation, and I'll briefly provide some early insights in order to demonstrate the potential of
01:20:01
this theoretical framework for understanding how cultures of evaluation are changing. At the Bank, as elsewhere, the increasing focus on research impact is a structuring force in the construction of valued and legitimate knowledge. The Bank is an interesting case because it's engaged in a balancing act, seeking to dominate the language and resources of the metrics, media, academic,
01:20:22
political and economic fields, and as a result the cultures of evaluation reflect this diverse portfolio of capitals, held in constant tension with one another, as shown by some examples here. What I've seen in particular is that altmetrics are starting to change contests for capital or resources, but they're not in full force yet. There's great interest and sizable investment from leaders, managers,
01:20:46
communications and publishing teams in these new technologies, and researchers are aware that they need to be developing the corresponding new skills. As one researcher stated: in this day and age, with this technology, it's no longer acceptable for us just to be speaking to a small community,
01:21:01
whether that's just an operational community or just an academic community. There's always a sense that there needs to be some kind of above and beyond. So my tentative impression is that we're seeing changing contests for capital whereby legitimacy can be gained in new ways, which will require research with a strong theoretical framework to unpack.
01:21:21
The implications of the framework I've outlined for the altmetrics community might be around specific types of metrics: for example, that policy and media citations are particularly valued in policy research contexts like at the bank, because they speak specifically to political and media fields where a great deal of legitimacy is sought and amassed. In addition, the field of metrics is still emerging, so we can expect to see shifts as the techniques and theories develop,
01:21:46
which will be reflected in the cultures of evaluation that rely on legitimacy gained from various fields. Thank you.
01:22:07
Thank you, Kate, for a very insightful presentation. The next speaker I'd like to introduce is actually a former colleague of mine, Rebecca Kennison, and she is from K&N Consultants, and she'll be discussing
01:22:24
the HuMetricsHSS initiative: rethinking humane indicators of excellence in the humanities and social sciences.
01:23:07
Kate, I see your sociology and I raise you philosophy. We're clearly... the theoretical framework? I don't know what you're talking about.
01:23:21
We are a Mellon-funded team that is being led by the Dean of Arts and Letters at Michigan State, who is a philosopher, and so that's why I say that you'll see a lot of that in what we're talking about
01:23:40
here. Like Kate's piece about theoretical frameworks, we're also proposing a framework, and we're starting with this problem: it's all good to do all these great things, but you're hardly making any impact on social media. And this is the world in which, increasingly,
01:24:02
particularly administrators (and again, our team is led in part by a Dean), so we're really focused on what we can do to help him. So what's the goal of the HuMetricsHSS initiative (which is humane metrics)? It's to create and support
01:24:21
this framework for understanding and evaluating all aspects of the scholarly life well lived (you guys have your work cut out for you now), for promoting and nurturing these values in scholarly practice, and for empowering scholars to tell more textured stories about the impact of their work. So we're taking
01:24:45
altmetrics and kind of flipping it around, and really taking a look at: what values do we want to look at? What do we want to encourage? And then how can we find a way to do that, rather than: what can we measure?
01:25:01
And now we hope something's going to come out of that. So we're flipping it on its head, in a reverse-engineering kind of way. So our preliminary values-based framework is this, which is what we've thought a great deal about: what we would like to see in the Academy.
01:25:21
Equity and openness and collegiality and quality and community. And we've started to test that in conversation with people, and next week we'll be having our very first workshop, where we'll bring together 21 academics from humanities and social science. We're also just concentrating right now on humanities and social science,
01:25:42
with the argument that if we can get it right there, we can get it right in STEM. So we'll see. So what does this mean in terms of what we're looking at? So equity: we're defining that as the willingness to undertake study with social justice and access to research and the public good in mind, with the variety of different
01:26:03
components of that that you can see here: fairness, inclusivity and diversity and so on. We're also interested in openness of all kinds, and by that we don't just mean open access and open source, but also transparency and candor and accountability, and again
01:26:23
willingness to change and adapt. These are all kinds of openness, so really thinking about what the quality of openness might be. What does collegiality mean? We're rapidly beginning to understand that without collegiality, none of these other values will actually matter. There's no equity without collegiality. There's no
01:26:44
openness without collegiality, there's no collaboration without collegiality, and so on. And collegiality, the practice of kindness and generosity and empathy towards yourself and others, is really hard in a competitive environment like the Academy. What about quality? Quality is very important, not just in replication and
01:27:06
reproducibility and soundness, but also in intentionality: really thinking about why you do what you do. These are all quality markers. And finally, we're looking at community: engagement with and leadership of
01:27:25
communities of practice, and also with the public, and again a variety of elements that fall under that kind of value, including solidarity. Okay, but what does that have to do with altmetrics? So that's all very great, wonderful, we should all
01:27:43
strive towards those values. Well, here's where we think that we have shared goals with the altmetrics community and with the work that many people are doing and thinking about. Because what we're hoping to do is expose and highlight, or recognize and reward, the scholarship that goes into all research activities, and not just
01:28:04
publications in all their diversity, but really looking at things like peer review and mentoring and public scholarship of all kinds, in order to be able to reward all of those things. And so again,
01:28:22
some things are easier to measure, and in fact this morning we found out: not so easy to measure, right? Not so easy to measure. And now what we're suggesting is we need to expand this beyond that, to the work that all the scholars do entirely. So I'm going to use an example of the syllabus
01:28:40
as one of those scholarly activities, because creating a syllabus is a scholarly activity. It's not just about teaching, but teaching is also a scholarly activity, and we don't want to lose sight of any of those things. So what might we look at in a syllabus? I know that Altmetric is working with the
01:29:02
Open Syllabus Project. And one of our team members is Stacy from Altmetric, so we're very keen on looking in the first instance at what a syllabus might... why we look at... So here's some elements that we think we can, you know, pull out in a machine-readable kind of way, but we'll find that out as we get into that:
01:29:24
the course description and the discipline, and titles and authors (and hopefully people are using DOIs and ISBNs, but as we just heard, that's a challenge), and classes, but also taking a look at things like class engagement and student
01:29:42
activities and assignments: how much time are you spending within a course on a particular text, for example? That should tell us something. What are you doing within your class about that? That should tell you something about engagement, and tell us something different than when you're citing it in a published work that you're doing, because
01:30:04
what are some of the things that we can learn by looking at how a syllabus is created? It also tells us something about the class, the person, and so on. So what can we discover about the circulation of ideas? Is there a difference between
01:30:25
what's being cited in a syllabus and what's being cited in a published work? We have theories about that, but we haven't really looked to see. I mean, our theory is that what you're presenting within a syllabus is the best work possible out there, and that's not necessarily the reason why you're citing something within an article or a book.
01:30:48
Let's be honest about that. So that's one thing that we want to take a look at. Are there differing ways to take a look at that? What are the forms of scholarship that are included, and by whom? What roles do the students play in the scholarly
01:31:03
conversation? And as I already mentioned earlier, we're really interested in engagement indicators, and how much time is being spent on something, and whether we can learn something about context within that. So how might this then encourage those values that I started out with?
01:31:23
I'm just going to use some examples. As you think about syllabi and how they're created, a question like: am I including scholars and works from all backgrounds? That would be equity, so really thinking about that, and that would start to encourage the value of equity. Have I made this syllabus open to other instructors? That'd be a kind of openness.
01:31:44
Do I have a code of conduct that encourages kindness? Is this the best work possible? And how can I encourage engagement? And there are many, many questions that I'm sure all of you can think about as well, that might fit into things that we could look at, once we can uncover them and unpack them and really think about how you expose them.
01:32:08
That might then encourage people to be more intentional about these kinds of values as they're creating their syllabus. So this is the work that started all that off. We were at the Triangle SCI last October,
01:32:23
cooking this all up; Lillian was there. And we've gone on from there to continue to think about this. So we'd love to hear from you. We'd love to engage with you at all times. We're only just beginning; as I said, next week is our very first workshop to take a look at this.
01:32:42
So please, you know, follow us and engage with us. We have a blog where we're blogging as much as possible. Please, you know, come and comment. We welcome emails, and we really, really want to engage with the community. Thank you. Thank you, Rebecca,
01:33:14
for your philosophical side of things. Our next speaker will be
01:33:20
Martin Kirk from the University of British Columbia, and he'll be telling us about research evaluation in Canada.
01:33:45
So I'll say I'm the chemist, raising from the sociology and the philosophy, or maybe lowering, I'm not sure. But I am from a very different cohort from most of the rest of us: I'm part of university administration at a very large research university,
01:34:01
number two in Canada, and I have a very different perspective on metrics, and particularly altmetrics. So I thought I'd talk a little bit about, you know, what's happening around us in Canada, a sense of what we're up against here. So it is all about impact; we worry about that a lot at universities. Our stakeholders demand nowadays, more than ever, to know, you know,
01:34:23
what have we done lately with all the taxpayer funding that we use in the process of our research. Somebody said earlier that even in the competitive process of securing research funding, it has become a lot more important to show impact and potential knowledge mobilization.
01:34:42
So it's a competitive aspect as much as anything. But there are obstacles to using altmetrics, and metrics in general, and I'm part of the administrative decision-making body that decides what metrics we worry about at UBC and what systems and products we implement to help us.
01:35:04
One of our, I would say, main obstacles is that in Canada we do not have a REF or an ERA, so we don't have any source of block funding, which in other jurisdictions forces universities to be very mindful of things like
01:35:21
metrics and altmetrics. So we don't have the same economic drivers towards using metrics and systems as other jurisdictions. Now, the good news is there are some funders that are looking specifically at metrics. CFI, our Canada Foundation for Innovation, is a big infrastructure funder, and
01:35:41
they are quite big on metrics, but many of the other funders not so much. Our health research funder, CIHR, the Canadian Institutes of Health Research, is looking at things like Researchfish, but it's very early days for us, and we have not really got our heads around
01:36:04
how the funders are going to hold us accountable in terms of research metrics. So that's the basic context: lack of block funding; it's all project-based, everything is project-based here; and the funders are not really pushing us too hard yet on the metrics side of things. The provincial governments
01:36:26
do push us fairly hard, and so we do produce reports on a yearly basis that involve quite a bit of bibliometrics to this point, but I think we're starting to think about altmetrics. This is a look at the funding landscape,
01:36:41
basically a list of funders in this country: CFI is research infrastructure, NSERC science and engineering, CIHR health research, SSHRC arts, social sciences and humanities, and we've got various other niche programs and chair programs. So I think we used to be a very well-funded jurisdiction;
01:37:02
we've lost quite a bit of ground over the last ten years. We used to be the top funded per capita in university research; I think we're sort of middle-of-the-pack in the OECD now. So instead of talking about cool metrics and cool systems, I think what we as administrators look at more is, you know, what's the compelling business case? Why do we need systems and tools?
01:37:31
So I'll put my policy hat on instead of my technology-person hat, and these are the kind of things that we worry about as university leaders. You know, we need to convince people, our stakeholders, of the research ROI.
01:37:44
That's still very difficult to do, and I think the tools that Altmetric and Plum and other groups are developing are really helping us get to that, but it's still early days. Research as an economic development activity is huge for us, especially at the provincial level, and I'm sure that's true for most.
01:38:04
Collaboration: what are the ideal impacts? How do we build interdisciplinary teams? Who's missing from the collaboration? Who should we be talking with? How do we convince funders that we're the best in the business and that the other group
01:38:20
in some other part of the world is not quite as good? Social sciences and humanities are a big aspect for us, because UBC is very good at that, but again, because of the lack of indicators, it's difficult to show just how good UBC really is. So that's a big issue for us. And then there's all sorts of other things, like growing internationalization, which is
01:38:45
considered to be a good thing. We really do worry about the Times Higher Ed and ARWU rankings. We pretend we don't, but in the university administration, you know, we worry about that a great deal. And of course, if you look at the KPIs involved in things like
01:39:01
THE and ARWU, it's highly quantitative. There's a bit of reputational stuff, but there certainly isn't too much in the way of altmetrics. And those are the things that drive us; you know, people ask us why we are going up or down in the various THE and ARWU rankings. The other thing in Canada that really is driving us lately, in terms of competitiveness,
01:39:23
and I know this is true for many other parts of the world too, is low success rates at the agencies. So we're looking at success rates as low as, I think in the latest CIHR project competition, less than 10 percent. So this really has become a very dire
01:39:44
competitive issue, being able to outmaneuver and outcompete. Okay, so these are the big questions then, and these are the questions that we need tools to answer. Other uses are really not a compelling business case: just because something's cool,
01:40:02
or it tells us something we don't already know, those aren't the compelling business cases for institutions. So these are kind of some of the big questions that we're trying to ask. You know, how do we move up the global ranking scale in the future? You know,
01:40:20
Who do we reward for great research, and how do we define research excellence, especially in the SSHRC social sciences and humanities areas? Who do we hire? We've got a new program that was just launched, I would say very opportunistically, the Canada 150 chairs: with Brexit and with
01:40:40
Mr. Trump, we're trying to recruit some of the world's top researchers and bring them to Canada. And so that was launched for the 150th anniversary of Canada. And then we all belong to clubs, international clubs; UBC belongs to U21, amongst others, and so we want to know, you know, how do we align with our
01:41:02
international university partners, and strategically, what are the best themes and best opportunities to work together? You know, what does this look like? Well, this is a traditional look at the world map from SciVal. We use it quite a bit; it's quite a useful tool for telling us what we're good at,
01:41:21
who else is good at these different competencies. And I guess one of the things I'll mention is that what attracts us to this particular tool is that it does tend to predict new trends and new research areas, and that's of great value to us. So this is the map of UBC, and it gives us a sense of what we're good at, who our collaborators are, who our
01:41:44
competition is, and who we should hire, etc., etc. Altmetric: great tool. We've had a look at it; it does help us answer a couple of big questions. One of those questions is, you know, what is the competition up to in strategic areas of interest to us? So, for instance,
01:42:02
we can do searches on things like Stanford and leukemia, and it gives us an idea of what the big stories are and what sort of impact our competition has in strategic areas of interest to us. And here's a kind of temporal look at that study.
01:42:24
Who's looking at the articles? What sort of reach do the articles have? That's a big question for us as well. Now looking at UBC: you know, who are the big players? What sort of social media attention do they get? Marco Marra, one of our biggest;
01:42:42
he's big in cancer and personalized medicine. So altmetrics gives us a sense of what the big guys are up to, how competitive they are with the other big guys, and what sort of attention we're getting. I think we're running into lunchtime here, so I'm just going to speed things up a little bit.
01:43:10
I think the other thing, when you delve into the metrics reports, is that one of the things that's really important to us too is quality. So how do we define research excellence?
01:43:22
We can look at papers and how they compare to other articles, in things like Nature, of a similar age. So there are some really important tools that tell us about quality and research excellence, which we think are important alongside tools like SciVal.
01:43:44
And again, you know, who's using our research, and who should we be thinking about collaborating with, are important aspects of altmetrics. So just a couple of quick conclusions. I think evaluation of research in Canada is fairly spotty, for some of the reasons I mentioned: the lack of block funding and
01:44:05
the lack of, sort of, drive from our funders. The current tools are good, but need to be better. I think we're using them at UBC, some from kind of an interest aspect, others because we really need them badly, but there are certainly
01:44:25
improvements needed: more flexibility, better integration, more scope in metrics, better social sciences and humanities uptake. I think there are aspects of cleansing data; the data is not always all that clean. There are big issues around
01:44:44
the need for comprehensive data sets. What drives researchers bonkers is when they look at data sets, when they look at metrics, and their work doesn't appear in them. And we need more nuanced impact metrics: Snowball, plus there are other
01:45:05
sets of research metrics out there, but we need to do more work and develop scorecards and metrics that really make sense for all of us. I don't think anyone has any real good ideas on how to get that right, and certainly Times Higher Ed and ARWU do not have the answers. I don't think any of the systems do; Leiden,
01:45:24
there's a whole list of different global ranking systems, and none of them seem to really provide a very nuanced ranking of how excellent we all are. And just finally, before lunch, I would say, you know, that what we really need to do is kind of transform
01:45:41
institutional use of the tools from kind of interesting, novel, cool to mission-critical, and that just hasn't happened. At the tables I sit at, we just haven't got to that point at all. So that's a bit of a roundup on the Canadian context.
01:46:05
Thank you, Martin, very insightful. So we have time for questions, or a question. Anybody? Okay.
01:46:23
Quickly: I find that within our institution, I'm still seeing a cultural perspective whereby people want the most basic metrics reported, from a sort of fiscal perspective, quarterly, just to meet accountability. There's a lot of that there, because,
01:46:42
as you pointed out, a lot of people are dabblers, right? They're almost weekend warriors when it comes to the use of metrics: people in leadership positions, or individuals reporting their own data. What would you suggest, or what's worked for you, particularly Martin? I think you use a lot of this data more strategically, for planning, to move
01:47:04
those various agents, funders like internal foundations, medical leadership, that sort of thing, to be able to use it more strategically, more for planning. So I think one of the biggest obstacles we have is
01:47:21
integration at the moment. You know, we've got pockets of data and pockets of metrics all over the place, and so at UBC we literally have a senior committee that has representatives from all different parts of the university, voting literally on what metrics we're going to utilize, and it's so painful trying to pull it together and actually deliver
01:47:43
an annual report, say, of metrics. So that seems to be the biggest obstacle. And I think the other big obstacle, once you leave the university, is that we all know our researchers have a lot of administrative tasks and administrative burden, right? I know the FDP and COGR in the US
01:48:01
estimate that researchers spend 42% of their day doing administration. So for us that's a big issue, and the funders realize that. So the funder comes to UBC and says: we think you should fill in Researchfish profiles for all your researchers. And we say: well, you know, that data already exists in different places, and they want us to duplicate the input.
01:48:26
So until we get better integration of data sets and we can minimize the administrative burden, I don't think anyone's going to sign up for it. I think those are the biggest key issues for us.