Metrics: An Elsevier Perspective, September 2018
Sources for Plum Metrics
ACI, Amazon, Airiti, bepress, bit.ly, CABI, CrossRef, Delicious, Dryad, dSpace, DynaMed Plus, EBSCO, ePrints, Facebook, figshare, GitHub, Goodreads, Google+, Mendeley, NICE (UK), OJS Journals, PLOS, PubMed, PubMed Central, Reddit, RePEc, SciElo, Scopus, SlideShare, SourceForge, SSRN, Stack Exchange, Twitter, USPTO, Vimeo, Wikipedia, WorldCat (OCLC), YouTube, DMP
PlumX Metrics
• Usage (clicks, downloads, views, library holdings, video plays)
• Captures (bookmarks, code forks, favorites, readers, watchers)
• Mentions (blog posts, comments, reviews, Wikipedia links)
• Social Media (+1s, likes, shares, tweets)
• Citations (citation indexes, patents, clinical, policy)
Usage – A signal that people are reading the article or otherwise using the research. After citations, usage is the statistic researchers most want to know.
Captures – Indicate that someone wants to return to the work. Captures can be a leading indicator of future citations.
Mentions – A measure of activity such as news articles or blog posts about the research. Mentions show that people are genuinely engaging with the research.
Social Media – The tweets, Facebook likes, etc. that reference the research. Social media can help measure "buzz" and attention, and can also indicate how well a particular piece of research has been promoted.
Citations – This category contains traditional citation indexes such as Scopus, as well as citations that help indicate societal impact, such as clinical or policy citations.
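The five categories above can be thought of as buckets that raw engagement events roll up into. A minimal sketch of that roll-up is shown below; the event names, the category mapping, and the `aggregate` helper are hypothetical illustrations, not part of any real PlumX API.

```python
# Illustrative sketch only: event names and mapping are hypothetical,
# not drawn from any real PlumX data feed.
from collections import defaultdict

# Map raw event types to the five PlumX Metrics categories.
CATEGORY_OF = {
    "clicks": "Usage", "downloads": "Usage", "views": "Usage",
    "bookmarks": "Captures", "readers": "Captures", "code_forks": "Captures",
    "blog_posts": "Mentions", "reviews": "Mentions",
    "tweets": "Social Media", "likes": "Social Media",
    "citation_indexes": "Citations", "policy_citations": "Citations",
}

def aggregate(events):
    """Sum per-event counts into per-category totals (the data a
    'Plum Print' visualizes)."""
    totals = defaultdict(int)
    for name, count in events.items():
        totals[CATEGORY_OF[name]] += count
    return dict(totals)

article_events = {"downloads": 120, "readers": 45,
                  "tweets": 8, "citation_indexes": 12}
print(aggregate(article_events))
# {'Usage': 120, 'Captures': 45, 'Social Media': 8, 'Citations': 12}
```

Keeping the category totals separate, rather than collapsing them into a single score, is what lets a Plum Print show the shape of an article's engagement at a glance.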
An example of a Plum Print for an article with metrics balanced across all categories.
An example of a Plum Print with many Citations and Captures, a small amount of Usage, and no Mentions or Social Media.
An example of a Plum Print with an outsized amount of Social Media.
References
Colledge, L. (2015). A 'basket of metrics'—the best support for understanding journal merit. European Science Editing, 41(3), 61. Copyright 2015 by the European Association of Science Editors. http://www.ease.org.uk/sites/default/files/origarticle_1.pdf
Elsevier and research metrics: Our Metrics Manifesto
• Need to use different metrics and common sense
• Decisions should be based on both quantitative and qualitative input
• Should always use at least two metrics (more than one way to 'excellence')
• The methodologies should be open, transparent, valid and replicable
• Definitions should be owned by the community
• Need trust between the parties using metrics to evaluate
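The "at least two metrics" principle above lends itself to a simple sanity check on any evaluation. The sketch below is illustrative only; the function name and metric labels are hypothetical examples, not an Elsevier tool.

```python
# Hedged sketch of the manifesto's "at least two metrics" rule:
# an evaluation should draw on two or more distinct metrics.
def uses_basket_of_metrics(metrics_used):
    """Return True only when the evaluation relies on at least two
    distinct metrics (duplicates do not count)."""
    return len(set(metrics_used)) >= 2

print(uses_basket_of_metrics(["CiteScore"]))                  # False
print(uses_basket_of_metrics(["CiteScore", "Usage counts"]))  # True
```

Deduplicating with `set` before counting reflects the spirit of the rule: citing the same metric twice is still a single view of 'excellence'.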