Webometrics and altmetrics: digital world measurements
Writing about web page http://altmetrics.org/manifesto/
Research performance measurement often includes an element of output (or publication) counting and assessment, possibly including citation counts, and I've written a lot here about such bibliometrics and assessment.
The digital, web 2.0 world allows for many other, different kinds of metrics to be recorded and reported, and these could one day become part of researchers' performance assessment, either informally for themselves or through more formal processes at institutional level, or through an exercise like the Research Excellence Framework (REF).
I've linked to the altmetrics manifesto above, which offers some very interesting contributions to the exploration of other kinds of metrics and measurements.
Note that PLoS One are running a special “collection” on altmetrics, with a submission deadline just passed in January. And if you’re an author with an article published by PLoS One, then the number of views for your article is displayed along with the metadata for your article. Warwick’s repository, WRAP, also shows download stats for articles these days, in the metadata records… e.g.: http://wrap.warwick.ac.uk/933/
The problem with web stats and altmetrics is that there are potentially a lot of sources, all measuring stats for different versions of the same item, or different elements of the same output, in different ways. This sort of thing is a driver for publication in an open access (OA) journal with one canonical copy of an article in just one place online: the so-called "gold" route to OA.
Authors of the future will want all web visitors to go to the publisher’s site, in order to boost the number of viewers stated there. Well, some already do! But that rather assumes that the publisher will provide all the functionality for commenting, reviewing and interaction with the research that authors might like to see, that the publisher will provide suitable measures to the author, and that the formal publication route is the only route necessary for publicising your work and making it discoverable...
The other route to OA is known as the "green" route, and it involves putting an earlier version into an OA repository (or more than one!) in addition to the canonical published version. Ideally, all such versions should be clearly described and should point to the canonical one. This would allow your work to be made available and promoted by all those repositories where you have deposited a copy or allowed a copy to be harvested, e.g. your institution's repository and a subject-specific repository.
The green route follows the "lots of copies keep stuff safe" mentality and contributes to ensuring the longevity of your research's availability and discoverability. And it could also enable new research techniques such as text mining to be employed on your outputs and thus build on your contribution to the discipline, if you've given suitable permissions at the deposit stage.
So, when it comes to altmetrics, what we ideally need is some way of recording visitor stats and other metrics for all versions of one article, and collating these into one report.
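To make that idea concrete, here's a minimal sketch in Python of what such a collation might look like. Everything in it is hypothetical: the identifier, the source names and the view counts are invented for illustration, and a real tool would of course fetch these figures from the publisher, repositories and other services rather than from a hard-coded list.

```python
from collections import defaultdict

# Hypothetical stats for the different online versions of one article,
# each recorded against the same canonical identifier (here, a made-up DOI).
version_stats = [
    {"id": "doi:10.1000/xyz123", "source": "publisher site", "views": 150},
    {"id": "doi:10.1000/xyz123", "source": "institutional repository", "views": 85},
    {"id": "doi:10.1000/xyz123", "source": "subject repository", "views": 40},
]

def collate(stats):
    """Sum view counts across all versions of each article, keyed by identifier."""
    totals = defaultdict(int)
    for record in stats:
        totals[record["id"]] += record["views"]
    return dict(totals)

report = collate(version_stats)
print(report)  # {'doi:10.1000/xyz123': 275}
```

The hard part in practice isn't the summing, of course: it's knowing that those three versions are the same article at all, which is exactly why clear version descriptions pointing back to one canonical copy matter.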
The altmetrics site I've linked to has a page of tools which I had a play with recently: http://altmetrics.org/tools/ Here is the story of my "playing"!
I gave Total Impact my (rather scrappy) Mendeley profile. I have 3 articles to my name on Mendeley, and Total Impact picked up on 2 papers: in the event, only one of those was actually mine (something wrong in the metadata, I think), and that has had only 2 readers on Mendeley. Which is entirely believable, but not likely to be the “total impact” of my article!
Actually, I know it’s not the "total impact" because the same article is in WRAP and I can see additional visitors to the paper there, without even considering accesses on the journal's own site, but I guess that Total Impact doesn’t know about the other versions of that object.
I tried giving Total Impact a DOI instead… None of my articles have DOIs (I'm not an academic author: practitioner stuff only!), so I gave it the DOI for a different article (the record linked to above), and you can see the report: http://total-impact.org/collection/UMpoWa
Not much more impressive a report than my own article's, and yet the WRAP stats for that article are more impressive! So it could be that the problem is the size of the Mendeley community, and the fact that Total Impact is not picking up on visitors from elsewhere for articles.
I thought I’d give Total Impact another shot with my Slideshare profile. I’ve not been especially active on Slideshare either, but I have seen healthy stats for my handful of presentations last year. And Slideshare has a relatively large community of users. I like the structure of Total Impact's Slideshare report: http://total-impact.org/collection/McWgLs It gives info on tweets, Facebook likes and other sources of data about the Slideshare items. That’s what I thought altmetrics ought to be!
Some of the other sites that Total Impact can work with are probably worth investigating, too: I don’t know about GitHub or Dryad. I looked GitHub up (https://github.com/), and it seems that’s what I need to try next: somewhere to collate all versions of my articles!
There are other tools on the Altmetrics site that I wish I had time to try out, too!
This week, discussion on UKCoRR's mailing list brought the following altmetrics tool to my attention: http://altmetric.com/bookmarklet.php I installed it on Chrome but couldn't get it to work with the articles I tried on Web of Science and on Cambridge Journals Online. The UKCoRR community are reporting that it doesn't pick up on the DOIs from their repositories either, so I guess it's just another tool that's still in development.