
Day 1: Altmetrics: Now and Next and Altmetrics In Research Evaluation

Formal Metadata

Title
Day 1: Altmetrics: Now and Next and Altmetrics In Research Evaluation
Number of Parts
8
License
CC Attribution 3.0 Germany:
You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.

Content Metadata

Abstract
Now and Next session:

- First up was Euan Adie (Altmetric), who gave us the “10,000-foot view” from Altmetric of the past year. Trends he highlighted included academic Twitter’s year-over-year growth and the rise of preprints. Another trend, perhaps alarming to this audience, is that publisher metadata is getting worse. Without the meta tags, it is difficult for Altmetric and others to make connections and track mentions.
- Following Euan, Heather Piwowar and Jason Priem (ImpactStory) discussed the importance of transparency and open-source code, and provided a sneak peek at their new project PaperBuzz, based on Crossref Event Data and Unpaywall usage data. The aim is to help tell you “what’s buzzing this week” by topic or by scholarly/non-scholarly audience, based on open data and code: what people are talking about online and what they are reading, drawn from open access data. While these sources do not yet provide as much data, or data of as high quality, the expectation is that this will continue to change and improve.
- Next up was Daniella Lowenberg (California Digital Library), discussing the “Make Data Count” project. Begun in 2014, the project aims to bring metrics to the data level. Researchers were surveyed about which metrics were most important to them, and based on the survey the project is now tackling how best to structure data presentation to enable tracking similar to that for journal articles. The project is preparing a draft for posting as a preprint to get feedback from the community. It aims to make usage tracking easier and to engage researchers across communities, iterating on its recommendations as it learns from the process.
- After Daniella, Polly Allen (Plum Analytics/Elsevier) spoke about altmetrics and societal impact, raising attention to the issues involved in tracking actual impact (using clinical citations as an example).
Polly also made important points about the use of big data in general and the biases it still contains, despite the presumption by many that data removes such biases. She encouraged us to think through fully what success means and the societal implications of using any metric, while celebrating the wealth of information that remains to be explored.
- Jean Liu (Altmetric) then provided an absolutely adorable narrative about Altmetric for books, entitled The Donut’s Quest, detailing the challenges of tracking altmetrics for books. Tracking is surprisingly difficult because there are multiple domains where books live online. Nevertheless, Altmetric has seen an enormous amount of attention paid to books and continues to work to track it.

Altmetrics in Research Evaluation session:

- First, Kate Williams (University of Cambridge) spoke about “Altmetrics in practice: understanding emerging cultures of evaluation” and her research exploring the social impact of using societal impact metrics. Kate employs sociological and ethnographic analysis to better understand the “space between fields that shapes the culture of evaluation” and to assess impact in a wider context. She discussed how altmetrics and other new forms of evaluative capital can inform the development of this framework, but more research is needed on how metrics take on meaning in practice. Kate further explained that this “space between” involves all stakeholders in the research ecosystem, from scholars to policy makers, publishers, and the public, and “shifting permeable borders to allow techniques to transfer and exist in this hybrid space.”
- Next, Rebecca Kennison (K|N Consultants) gave an update on the Mellon-funded HuMetricsHSS project, which explores the potential of altmetrics as values-based indicators. The goal is to create a framework addressing “all aspects of scholarly life well-lived,” nurturing values in practice, and empowering scholars to enrich their impact narratives.
- The final presentation featured Martin Kirk (University of British Columbia) on the current landscape of research evaluation metrics in Canada. Martin reiterated a common theme: current research evaluation conversations are “ALL about impact.” In Canada there is growing demand to show definitive results from taxpayer funding, and a highly competitive climate for securing those funds.