Altmetrics don't equal social media
One huge misconception about altmetrics seems to be that they are based solely on social media metrics. By social media I'm referring to social networks such as Twitter, Facebook, Google+, Sina Weibo, Pinterest, and Tumblr. And while the metrics derived from these networks are definitely a large part of what makes up an altmetric score, they are by no means the only part, nor necessarily the most informative.
Let's start by reiterating that altmetrics do not replace traditional bibliometrics; they supplement them. They do this by measuring the attention paid to the publication in question. Social media are all about attention, but it is often hard to make out the context of that attention beyond a positive or negative reaction. If someone recommends a paper in a tweet, they won't be able to explain exactly why in the 140 characters available to them. This means you'll have to go digging in their timeline to see whether there were accompanying tweets or whether the tweet was part of a conversation. And while people have more room for comments on Facebook, the chance of someone actually writing out a long, nuanced explanation of why a paper is of interest to their network isn't that great.
Non-social media sources
You can discover more in-depth attention, and certainly more context, in other metrics. So which non-social-media altmetrics are measured? Often you can find some basic numbers on the article's page at the journal, such as page views, downloads, the number of times an article has been shared by email, and of course journal article comments. But there is a lot more that can be monitored. Here is some of what altmetric.com shows us on their sources page:
- Public policy papers
- Reference managers (Mendeley)
- Mainstream media
- Post peer review (Publons, PubPeer)
Hard numbers and context...
What can we learn from the information gathered in these non-social-media altmetrics? There are both quantitative and qualitative insights to be gained. First, there are the hard numbers that come without any context: page views, downloads, saves to Mendeley and CiteULike, and shares via email. These don't tell us anything beyond interest and attention, as we can't see why people accessed or saved the article. It is great that an article was shared via email, yet the text accompanying the link may range from "This is a must-read for what we are currently researching." to "This methodology is faulty, let's not do this in our project." and anything in between. Still, high numbers at least indicate that something is up with this article; you'll just need to find out what.
For this you can turn to more qualitative metrics that do provide context and are more than just a number, for example what is said in mainstream media and journal article comments. This article in the New England Journal of Medicine has great coverage if we look at its altmetric score, but what I find more interesting are the comments on the article. Commenters are engaging with the article and each other, discussing different trials and desirable outcomes. The comments also place the article in the context of other trials researching the same topic. In some cases, comments can function as a form of post-publication peer review, as in the case of this PLOS ONE article, alerting not only the author but also potential readers to possible problems with the research reported.
Discovering that an article has been used to underpin an article on Wikipedia can also provide some explanation for the quantitative numbers. As would discovering that an article has been shared, and more importantly discussed, on Reddit. Most people only know Reddit as a rather unwholesome place filled with misogyny and other hatefulness, but there are a lot of great subreddits with interesting, respectful exchanges and discussions, such as this discussion of a breakthrough in graph theory, or this discussion about global carbon dioxide emissions declining for the second year in a row. When a paper has been recommended on services such as F1000 or Research Highlights, the recommender gives a short summary and an explanation of why they find the article of worth.
Form a complete impact narrative
All of this engagement and discussion adds to the way an article's reception and impact can be judged. Not all attention is equal, and some of the sources feeding into the altmetric score carry more weight than others. And while many altmetric scores are composed mostly of social media mentions, the non-social-media metrics included often count for more in the calculation. They add more context and can play an important role in creating the narrative of the success and impact of your research.
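As a back-of-the-envelope illustration of how such source weighting might work, here is a minimal sketch of a weighted attention score. The weights and source names below are invented for the example; they are not Altmetric.com's actual (proprietary) weighting, only a way to show why two news mentions can outweigh dozens of tweets.

```python
# Hypothetical per-source weights -- illustrative only, not
# Altmetric.com's real weighting scheme.
SOURCE_WEIGHTS = {
    "news": 8.0,        # mainstream media coverage
    "policy": 3.0,      # public policy papers
    "wikipedia": 3.0,   # cited in a Wikipedia article
    "peer_review": 1.0, # e.g. Publons, PubPeer
    "tweet": 0.5,
    "facebook": 0.25,
}

def attention_score(mentions):
    """Sum mention counts weighted by source type.

    `mentions` maps a source name to its mention count;
    sources without a known weight contribute nothing.
    """
    return sum(SOURCE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# 40 tweets score less than 2 news stories under these weights:
mentions = {"tweet": 40, "news": 2, "wikipedia": 1}
print(attention_score(mentions))  # 40*0.5 + 2*8.0 + 1*3.0 = 39.0
```

The point of the sketch is only that a handful of high-weight, context-rich mentions can dominate a large pile of low-weight social media mentions, which is exactly why the non-social-media sources matter for the overall score.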