All 9 entries tagged Altmetrics


April 25, 2013

A highlight from UKSG: Altmetrics session with Paul Groth

Writing about web page http://www.slideshare.net/UKSG/2-groth-uksg2013-altmetricsstory

These are my notes from a very well presented session at UKSG. I've linked to the slideshare presentation, too.

You can't rank researchers or research using altmetrics. (Yet, I'm thinking!)

Presence on social media does seem to correlate with other measures of performance, however: Birkholz et al.'s research indicates that researchers with a presence on LinkedIn have a higher h-index. (I googled for this. Closest I could find was: http://altmetrics.org/workshop2011/birkholz-v0/)

1 in 40 scholars are active on Twitter. (This comes from an article by Heather Piwowar in Nature: http://www.nature.com/nature/journal/v493/n7431/full/493159a.html )

Like me, Paul thinks that academics should use all kinds of metrics to tell a good story about their work. He spoke about citations, Mendeley scores, html accesses and F1000 recommendations as an example of what a researcher could include on a CV. The measures that matter most will vary according to discipline.

The challenge seems to be to tie together activity around a published paper, such as the author's blog post announcing it and the pre-print on arxiv, and then to present all the measures surrounding the whole activity for that output of the research.

Media stories often fail to cite or link to the original research.

The Journal of Ecology apparently now asks for a 140-character, tweetable abstract for each article it publishes. (Nice idea!)

In short, there are some really interesting things to keep an eye on, when it comes to altmetrics.


April 10, 2013

Am I really interested in Naymz?

Follow-up to Which tools monitor your social media influence & impact? from Library Research Support

The quick answer, now I've investigated a bit, is "no", but I would like to explain:

1) I don't want to maintain another professional profile there when I've already invested in my LinkedIn profile, and all the people I could ever want to network with online are on LinkedIn but not on Naymz.

2) I had a look at my repscore on Naymz: the Leaderboard picks up on people who declare themselves as Jedi knights, so I can't take the network or the score too seriously.

3) The repscore dashboard would be interesting if I did use Naymz for networking, and if I connected all my relevant social network profiles to it. I can't actually do that, because I use some networking sites that Naymz doesn't monitor.

I could use Naymz to watch just Twitter and LinkedIn, and I could make some more effort to use and link up Facebook or other social networking sites that it does measure. What does it tell me about other networking sites? It gives me numbers for:

- Contacts: "The number of contacts/followers on this network". This is interesting, to see where I have the maximum potential reach. But it's not actual reach if the people I'm linked with on this network aren't active users of it and will never see my posts or activity there.

- Posts: "Your recent posts on this network". I'm not sure how recent: I don't recall ever posting on LinkedIn, yet it can find 9 posts. I post a lot on Twitter, but it can only find 35 posts.

- Replies: "Replies/Comments on your posts on this network". Since I don't post on LinkedIn, I can't compare the level of interaction of my network members on these two networks, but it would be a potentially useful measure if I did want to compare.

- Likes: "Likes/Shares of your posts on this network". As with replies, this could indicate the actual reach of my presence on a network better than merely the number of contacts I have. I'm not sure how it counts them, though. By putting scores for all of my network profiles into one place, I could compare the networks and decide which one represented best value for my efforts.
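The dashboard doesn't make that comparison for me, but the arithmetic is simple enough. Here's a minimal sketch, with invented network names and numbers, of ranking networks by interactions per contact rather than by raw contact counts:

```python
# Illustrative only: compare networks by engagement rather than raw reach.
# The numbers below are made up for the example.

def engagement_rate(contacts, replies, likes):
    """Interactions per contact: a rough proxy for 'actual reach'."""
    if contacts == 0:
        return 0.0
    return (replies + likes) / contacts

networks = {
    "Twitter":  {"contacts": 500, "replies": 40, "likes": 60},
    "LinkedIn": {"contacts": 200, "replies": 2,  "likes": 3},
}

for name, stats in networks.items():
    rate = engagement_rate(stats["contacts"], stats["replies"], stats["likes"])
    print(f"{name}: {rate:.3f} interactions per contact")
```

On these made-up figures, the network with fewer contacts could still be the better value for effort if its engagement rate is higher.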

The Naymz dashboard gives a percentage rank too, but as this is only in comparison with other Naymz members, who I am not interested in, it's not so useful for me.

There are other measuring tools than Naymz (I mentioned a couple of others in my blog post that this follows on from), and they might count the interactions in a way that you prefer, if you're looking for a tool to do this.

However, I think that my main reason for not wanting to use Naymz or one of the similar tools is that I'm already convinced that Twitter is a good route for me to reach the people who I want to reach. If I wasn't sure that the people I wanted to reach were active on Twitter, or I wanted to reach more people who might prefer one of the other networking sites, then I'd be glad to use Naymz and its like to make comparisons.


April 02, 2013

Which tools monitor your social media influence & impact?

Twitter is the main social media tool that I would recommend to researchers, when it comes to influence and impact. Ideally, I think it should be used alongside blogging in some way: either your own blog if you want to build an online identity and audience for yourself, or as a guest on others' blogs. Guest blogging is a great way of benefitting from others' hard work in gaining audience!

If Twitter is my main social media tool, then any tool for measuring online/social media influence and impact will need access to my Twitter account. A quick look at the "Apps" section of my settings on Twitter reminds me of tools that I once thought might be of value to researchers for the purpose of increasing, measuring and demonstrating the impact of their research. I've not had time to investigate these properly, but I thought it might be worth sharing which ones I'm interested in:

Naymz - "Manage and measure your reputation. Get rewarded!"

Klout - "Klout finds the most influential people on every topic"

Crowdbooster - "Measure and optimize your social media marketing."

I had a quick look back at these three and found that Crowdbooster now charges a fee: this might be worthwhile, if it covers the social media channels that you use, though it has different pricing mechanisms for different numbers of social media channels.

Naymz - wants to co-ordinate my Google account, Yahoo account, email, Twitter, Facebook and LinkedIn. These are big hitters but not specific to academia.

Klout - lots more options here than there were for Naymz, but none specifically academic.

There are actually lots of tools for measuring social media influence out there, but to find the right tool for you, you need to know what you want to measure. I'm interested in Twitter, website visitors and my blog, but not necessarily combining the scores for them, since they serve different purposes. I do need to investigate more...

For those interested in reading more, this piece from Imperial College has a great summary and table comparing the tools available for measuring and monitoring, in terms of the social media sources they monitor:

http://research20atimperial.wordpress.com/optional-content/evaluation-tools/

There is no substitute for trying things out for yourself, though, and finding out not only which aspects of your social media activity can be monitored by which tools, but also how they produce their scores and what this means for your own work.


February 04, 2013

Measures of journal article quality?

Writing about web page http://techcrunch.com/2013/02/03/the-future-of-the-scientific-journal-industry/

The TechCrunch blog post linked to is by the founder of Academia.edu and it discusses the possible contribution that journal article metrics could make to academic publishing.

In order to interpret readership metrics provided by sites/services like the three mentioned in that post, Academia, ResearchGate and Mendeley, researchers should ask, "what is the level and quality of activity on these sites?" My experience is that there are a lot of students amongst those "researcher" numbers advertised. Students can be readers too, of course, but we need to be clear about what the metrics are actually telling us. Activity and membership varies from one site to another and from one discipline to another, of course, so researchers would need to investigate for themselves. If you're investigating and interpreting for yourself then you're not going to be entirely comfortable with others using such metrics to make some judgement about the quality of your work!

My previous blog post was about publishers who display reader metrics. I wish I had time to investigate them some more!

Mendeley's metrics used to be available for others to use through an API, as ImpactStory (formerly TotalImpact) was doing. That seems to me to be the most useful model for researchers: they can then follow readership metrics for their papers from all locations. In my opinion, collated stats are great for researchers to track which activity affects their readership numbers most: their paper featuring on their mate's Twitter feed, or professor x's blog, or being delivered at a conference.
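The API-based model is easy to picture. The sketch below parses a catalogue-style response for a paper's reader counts; the field names and numbers are invented for illustration, not Mendeley's actual schema:

```python
import json

# Illustrative catalogue-style API response for one paper. The schema
# and figures are made up for this sketch, not a real Mendeley response.
sample_response = """
{
  "title": "An example paper",
  "doi": "10.0000/example",
  "reader_count": 42,
  "reader_count_by_discipline": {"Library Science": 30, "Computer Science": 12}
}
"""

record = json.loads(sample_response)
by_discipline = record["reader_count_by_discipline"]
top = max(by_discipline, key=by_discipline.get)
print(record["reader_count"], "readers; most are in", top)
```

A tool like ImpactStory could, in principle, run this kind of lookup across every service that exposes such counts and present them together.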

But are reader numbers going to lead to a new way of assessing a journal article's quality? They would need to be available from all sources where the article is displayed: publishers, repositories and networking sites would all need to count reader accesses in the same way, and share their data publicly, so that they can be collated and displayed in a reliable and consistent way. They would need to become trusted and used by the researchers themselves. That is going to take a lot of time and effort, I believe, if all the discussion about citation metrics and altmetrics that I've seen is anything to go by.

August 20, 2012

Is the LinkedIn “appearances in search” metric of interest to an academic author?

Follow-up to Who is interested in my online profile? from Library Research Support

LinkedIn recently emailed me details of who is looking at my profile. It reminds me of a previous blog post that I wrote, about who’s looking at my profile online: I often wonder if academic authors might find it valuable to track who is interested in their work.

LinkedIn told me how many profile views there have been in the last three months, how many “appearances in search” there have been, and who has been looking at my profile. I can see why it would be relevant for academic authors to see the details of others who have been looking at their profile: these might be other academics in the same field, so watching this measure is a bit like seeing who wants to listen to you at a conference. If, indeed, LinkedIn is a conference that you are attending!

I wondered what “appearances in search” meant, and found an explanation in some LinkedIn Q&As, that it is about my profile matching others’ search terms when they were not searching for my name specifically. Should academic authors be interested in this metric? I think probably not, and here is why!

I’m not 100% sure, but it seems to me that the “search” referred to must be the LinkedIn search box, on their own site. So these stats are also reflective of the amount of activity happening in LinkedIn. Since it’s not a dedicated, academic forum, our academics might not be too worried about LinkedIn activity.

If your discipline has some really active discussion groups on LinkedIn, or you wanted to generate interest in your work beyond the academic community and within the LinkedIn one (which is pretty large), then you might want to watch LinkedIn metrics more closely. You might want to see more of those search appearances being converted into profile views, as evidence that your work is relevant to that community, and as a channel to direct readers to your scholarly articles and other outputs. In order to do this, you would need to ensure that your profile describes your work accurately. But this is a good idea anyway, so I see no reason to pay attention to the number of “appearances in search”!

I blogged last time about Google Adwords, but I must have had a free preview or something because I can’t find the same feature for free now. I often pop in to Google Analytics and Feedburner to see who is looking at my blog, and I regularly look at the stats for the Library’s Support for Research pages; using these tools I can see who is looking at my site(s) and what keywords are bringing them there. These are far richer and more valuable to me than the LinkedIn stats, so I guess that they will be to academic authors, too.

But how nice of LinkedIn to send me the stats from time to time: it works for me as a reminder to update my profile!


July 09, 2012

Mendeley and ResearchGate: profile sites and repositories used in tandem to raise research profiles.

Writing about web page http://opus.bath.ac.uk/30227/

There are so many places for authors to put their papers and information about their papers online, so what is the best way to make use of them? I don't have the answer exactly, but I have plenty of ideas!

Drive traffic to the repository by creating links to your papers

Brian Kelly of UKOLN (see Brian's UK Web Focus blog) and I have co-authored a paper for the international repositories conference, OR2012. The full reference is:

Kelly, B. and Delasalle, J., 2012. Can LinkedIn and Academia.edu Enhance Access to Open Repositories? Submitted to: OR2012: the 7th International Conference on Open Repositories, 9-13 July 2012, Edinburgh, Scotland.

and naturally, it is in an open access repository and linked to from this post.

The article title mentions LinkedIn and Academia.edu, and this blog post title mentions Mendeley and ResearchGate, but the concept that the article explores and that this blog post is about is that these kinds of external, profile-hosting sites could be useful to researchers in raising the profile of their work, especially when used in conjunction with repositories.

I have blogged in the past about these kinds of profile hosting sites and listed a few other such sites in a piece about Academia.edu, and I have written on this blog about the number of Warwick researchers I could find on such profile sites.

One point explored in the paper is that the profile sites offer a way for authors to create inbound links to their papers in a repository, and such links might help to optimise those papers' search engine rankings, since the number of links to a page or site is a factor in search engine rankings.

I don't quite understand how search engine rankings work (that's their business, and it's getting ever more complex... SEOmoz have a useful article), but inbound links have long been a factor, one way or another. And as a former repository manager and a long-time information professional, I'm very, very aware of the important and sizeable role that Google has to play in bringing visitors to papers in a repository. Some of my early blog posts on the WRAP blog attest to that.

So profile sites are useful to researchers in offering a quick and easy way to generate inbound links to your repository papers: it's a simple concept, but as the example of Brian's work that is given in our paper demonstrates, there are probably a lot of other factors as well that might raise the profile of a researcher's papers.

Maintaining profile details on these sites

Naturally, Brian Kelly and I have profiles on these sites, and our paper is appearing on our publication lists on these sites... thanks Brian, for uploading it and making it easy for me! I confess that I have left partial profiles on most of these sites: it takes a lot of time to create and update profiles properly. Brian is really good at doing this, but I'm not a great example to other authors of how to use these sites.

The two sites I have been looking at most recently are Mendeley and ResearchGate:

I like ResearchGate for making it easy for me to "claim" articles that it has found, as ones that I am an author of. In particular, I like that it harvests records from my institutional repository, so if I kept that up to date with all my papers, then it would be relatively little effort to also keep my profile on ResearchGate up to date. Bravo, ResearchGate! (I have blogged about ResearchGate recently, in greater detail).

However, the thing that I find most irritating about ResearchGate when it comes to using it in tandem with an open access repository, is that it invites me to upload the full text of my paper in a huge box on the top right hand side, and it displays my paper to others with a "Request Full-text" button. Meanwhile, the link to the repository where the full text is available is almost invisible and it is not recognisable as a potential full text source. It simply says "Source:OAI" and the "OAI" part is a link to the WRAP repository record from where the full text can be retrieved.

This gives me considerable sympathy for the authors whose papers I requested copies of when I was a repository manager: it is irritating, when your article is already available on open access to all, to be asked to put it in another place as well!

Mendeley has similar features and issues: I can import records from all sorts of sources using its "web importer", including Google Scholar, which does index a lot of repository content... but it's not as simple to use as ResearchGate when it comes to updating my profile with my own papers from the institutional repository. When I carry out a search on Mendeley itself, I find a sophisticated advanced search form, which I like, although I don't like that I can't edit my search string in the search box after running the search. I tried to do that after my first advanced search and got no results, but when I went back to the advanced search form and put my revised criteria into the form, I got results. I think that's clunky, and there is work to be done on it as a publications discovery tool.

On Mendeley, I am able to refine the results of my search further by selecting a tick box on the right hand side "Open access articles only". I tried this and was disappointed. It finds papers that I have written, but it doesn't know that the ones in WRAP are available on open access.

How do I tell Mendeley that the paper is already available on OA? Why doesn't it already know?

Both Mendeley and ResearchGate have got it wrong

Or at least, from an open access point of view, they have got it wrong. It ought not to be up to the author to upload their content into several places online. And they should be making it easy for people searching within their environments to get through to the existing open access versions of papers: after all, it's hardly in the spirit of OA to make it difficult for people to access the open access version!

Repository managers' perspectives

One of the points that Brian and I made in our poster for OR2012 was to ask 'why don't repository managers recommend use of external researcher profile sites?' Well, it would help if the profile sites worked nicely with repositories, I think.

And of course another answer to our question is that repository managers have enough of a struggle getting papers for the repository itself, never mind encouraging authors to put their papers elsewhere as well.

Beyond that, it is likely that others at the University are advising on the use of social media, so it might be something that repository managers don't see as their role.

Recently, I posted to a repository managers e-mail list to double check if any of them were recommending such sites:

One replied to say that she had noticed some researchers from her institution who were putting their documents onto sites like these, in full text, but not into the institutional repository. So perhaps repositories should be harvesting from the likes of Mendeley and ResearchGate, too.

At the University of Glasgow, they are sometimes using the "Related URL" field to link to a version of the article on Mendeley (see this example record), which is a step towards integrating these two approaches.

Social Media more generally

One repository manager responded that she did encourage authors to use social media "like LinkedIn, Twitter and a blog". And I was sent a very useful link to a blog post by Melissa Terras at UCL, entitled "Is blogging and tweeting about research papers worth it?" (Short Answer: yes, if you want to attract visitors!)

I think that the use of "social media" is a much bigger topic than the use of profile sites as such. I know that most of the places where researchers can put their profile information are also social media tools in some sense. But this blog post is not intended to cover the social aspects of these tools: that is perhaps for a future blog post!

One more relevant aspect is that publisher websites do often encourage authors to use such profile sites and social media in general, to raise the profiles of their papers. I have blogged about publishers' instructions for authors already.

And finally, I must say that Brian Kelly is an excellent example of an author who uses profile sites and social media. He has uploaded details of his papers onto these sites, but he has also deposited OA copies into his institutional repository and blogged and tweeted about his papers before the conference itself, to raise interest in them. I'm not at all surprised that Brian is the author of the 15 most downloaded papers in the Bath repository, from his department!


July 03, 2012

Mendeley's number of readers

Follow-up to Webometrics and altmetrics: digital world measurements from Library Research Support

I once blogged about Altmetrics and the tool Total Impact, which seemed to use the Mendeley API for tracking papers’ popularity.

I had another look at Total Impact lately and it has been worked on: I can’t give it my Mendeley profile any more, and in fact it didn’t do anything at all for me, but it is in beta and so I sent them some feedback explaining that I got nowhere with the tool, and we shall see.

So, I went directly to Mendeley, and you can see how many “readers” there are for a paper in the results of a search there, but that information is not displayed with the paper’s information once you have added it to your own library or to your list of “my publications” for display on your profile. I was disappointed that apparently only one of my papers is “open access” according to Mendeley’s search filter, even though they are in WRAP and so they are open access... I'm not sure what Mendeley's criterion is for a paper being "open access" according to its search filter.

From what I can tell in the FAQs on the Mendeley site, number of “readers” in Mendeley is the number of distinct users who have added the paper to their Library on Mendeley. It doesn’t actually mean that they’ve read the paper: I added a handful of papers that look interesting to my own Library that I have never read. It’s more of a wish list!
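In other words, a "reader" here is a distinct-bookmarker count, not a count of reading events. A toy sketch of the difference, with invented user IDs:

```python
# Mendeley-style "readers": distinct users who have added the paper to a
# library, not the number of add (let alone reading) events.
# The user IDs are invented for the example.
library_adds = ["alice", "bob", "alice", "carol", "bob"]  # two users re-added

add_events = len(library_adds)
reader_count = len(set(library_adds))
print(add_events, "add events, but only", reader_count, "readers")
```

So my wish-list papers count me as a "reader" the moment I bookmark them, read or not.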

And then I played around with Google some more, to see if there were other tools that were accessing Mendeley’s “reader” numbers API, and I came across Readermeter which looks really interesting because you can give it the author’s name and get all sorts of stats back in a pretty format!


June 28, 2012

Scholarly social media use

A couple of articles have come to my attention lately, documenting researchers' use of social media. One is about early career Victorianists:

Amber K. Regis (2012) Early Career Victorianists and Social Media: Impact, Audience and Online Identities, Journal of Victorian Culture. Online at: http://dx.doi.org/10.1080/13555502.2012.689504

This article compares a tweet to a postcard!

Regis says that social media are important because "they are able to create and sustain inclusive communities", i.e. communities with reach beyond academia. I like this because it relates very much to the work we are doing with the Wolfson Research Exchange and the PG Hub with their digital presences and emphasis on peer support. We use blogs, Facebook and Twitter and websites for both facilities and their communities. And of course it relates to the research impact agenda, as Regis goes on to discuss.

Regis picks out some particular researchers and their blogs.

And Regis describes the changing academic landscape, where job adverts ask for candidates to demonstrate "imagination in terms of the dissemination of research findings", and for a "modern portfolio of research skills". Employers will be thinking of the REF exercise and the priorities of research funders, and googling the names of candidates.

According to Regis, the REF panel criteria only mention social media as a general term once, and blogging gets a mention as a potential citation source beyond academia, but in the matter of public engagement and impact of research, Regis says that "social media haunt the spaces between the lines." What a lovely turn of phrase!

Regis explains that "comments, replies, tweets and retweets are an immediate source of 'third party engagement' and 'user feedback or testimony' as required under the REF" and she quotes Warwick's own Charlotte Mathieson, who says "...public engagement is something that occurs while research is taking place and not simply after the fact." Charlotte has written some good blog posts and guides on the topic of impact, whilst working for us.

I find the Regis article important because of the disciplinary focus it has. It discusses the role of social media with examples from those researching a specific field, that of Victorian culture. However, the points it makes could be widely applicable to other fields of research. A few years ago I was writing an internal report for our library and looking for examples of researchers' blogs, and I found it difficult to identify research blogs by individuals. But perhaps if I had been a researcher within a particular discipline I would have been more likely to find the kind of examples I was looking for, as the author of this article was able to do. Finding good blogs and engaging with social media relevant to your field requires an immersion in and awareness of your field, just as with keeping up to date with research papers and articles.

The other article on the theme of researchers' use of social media that came to my attention lately is on the LSE Impact of Social Science blog, which is one also mentioned by Regis, but which I've been following for some time, formerly in my RSS feed reader and lately via their Twitter feed. It's a blog which covers lots of the themes I'm interested in. In particular, the blog post of interest is: Scholars are quickly moving toward a universe of web-native communication

This blog post has multiple authors and a very academic style: it is a taster for a conference paper soon to be delivered. It deals with the theme of altmetrics, which might become important in the online, social media research era, just as bibliometrics have become important in measurement of research through the formal publication channels.

The authors state: "But before we can start to seriously examine scholars’ personal altmetrics, we need to get a sense of how wide and established their presence on the social Web is..." and they go on to describe how they measured the work of a sample of 57 authors who presented at a Science and Technology Indicators conference.

Of their sample, 84% had homepages, 70% were on LinkedIn, 23% had Google Scholar profiles and 16% were on Twitter. I don't know if they also looked for the authors on other profile sites like Academia.edu or ResearchGate, but I do like their methodology and perhaps other researcher samples could be taken and assessed in this way. I think that their sample might not be representative across the fields.

Another aspect of the work the LSE blog authors carried out was to source activity relating to the researchers' papers on Mendeley and on CiteULike, and to correlate this activity with the number of citations for the papers on Scopus; they found some significant correlations. It interests me that these researchers may or may not have had their own profiles on Mendeley and CiteULike, but that's not the point, because their work can be bookmarked on these sites in any case. They conclude their blog post by saying "It’ll take work to understand and use these new metrics – but they’re not going away."

Having read these two articles in quick succession, I am minded to believe that researchers' use of social media is growing and that these two articles describe two different ways to survey that growth and the significance of it. Regis has investigated blogging within a particular speciality, whilst the LSE blog's authors investigated online presence more broadly.

My next interest is in how researchers keep track of the social media relating to their field, and indeed share that current awareness tracking with others. There were once RSS feed readers but nowadays there are tools and sites like paper.li, storify, pinterest and pinboard and the stacks feature on Delicious, Bundles on Google Reader, Bundlr, and Mendeley and Zotero and CiteULike no doubt offer similar features, etc, etc, etc! These allow you to not only keep track for yourself but to also share your tracking with others: there have always been tools that did this, but there is an abundance these days and I wonder which ones researchers use and why...


February 21, 2012

Webometrics and altmetrics: digital world measurements

Writing about web page http://altmetrics.org/manifesto/

Research performance measurement often includes an element of output (or publication) counting and assessment, possibly including citation counts, and I've written a lot here about such bibliometrics and assessment.

The digital, web 2.0 world allows many other, different kinds of metrics to be recorded and reported on, and these could one day become part of researchers' performance assessment, either just for themselves or indeed through more formal processes at institutional level or through an exercise like the Research Excellence Framework (REF).

I've linked to the altmetrics manifesto, and that has some very interesting contributions to the exploration of other kinds of metrics and measurements.

Note that PLoS One are running a special “collection” on altmetrics with a submission deadline just passed in January. And if you’re an author with an article published by PLoS One, then the number of views for your article is displayed along with the metadata for your article. Warwick’s repository, WRAP, also shows download stats for articles these days, in the metadata records… eg: http://wrap.warwick.ac.uk/933/

The problem with web stats and altmetrics is that there are potentially a lot of sources which will all measure the stats for different versions of the same item, or different elements of the same output, in different ways. This sort of thing is a driver for publication in an open access (OA) journal with one canonical copy of an article in just one place online: the so called "gold" route to OA.

Authors of the future will want all web visitors to go to the publisher’s site, in order to boost the number of viewers stated there. Well, some already do! But that rather assumes that the publisher will also provide all the functionality for commenting and reviewing and interaction with the research that the authors might like to see, that the publisher will provide suitable measures to the author, and that the formal publication route is the only route needed for publicising and making your work discoverable...

The other route to OA is known as the "green" route, and it involves putting an earlier version into an OA repository (or more than one!) in addition to the canonical published version. All such versions should be clearly described and should point to the canonical one, ideally. This would allow for your work to be made available and promoted by all those repositories where you have deposited a copy or allowed a copy to be harvested, eg your institution and a subject specific repository.

The green route follows the "lots of copies keep stuff safe" (LOCKSS) mentality and helps to ensure the longevity of your research's availability and discoverability. It could also enable new research techniques, such as text mining, to be applied to your outputs and thus build on your contribution to the discipline, if you've given suitable permissions at the deposit stage.

So, when it comes to altmetrics, what we ideally need is some way of recording visitor stats and other metrics for all versions of one article, and of collating these into one report.
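As a rough illustration of what such a collation might look like, here is a minimal sketch in Python. The sources and figures are entirely invented; the point is just that per-version counts from several sites get rolled up into one report:

```python
# Hypothetical per-source view counts for the different versions of
# one article (the sources and numbers are invented for illustration).
counts = {
    "publisher site": 120,
    "WRAP (institutional repository)": 85,
    "Mendeley": 2,
}

def collate(counts):
    """Combine per-source stats for one article into a single report."""
    lines = [f"{source}: {n} views" for source, n in counts.items()]
    lines.append(f"total: {sum(counts.values())} views")
    return "\n".join(lines)

print(collate(counts))
```

The real difficulty, of course, is not the adding-up but knowing which records on which sites are versions of the same article in the first place.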

The altmetrics site I've linked to has a page of tools which I had a play with recently: http://altmetrics.org/tools/ Here is the story of my "playing"!

I gave Total Impact my (rather scrappy) Mendeley profile. I have 3 articles to my name on Mendeley, and Total Impact picked up on 2 papers: in the event, only one of those was actually mine (something wrong in the metadata, I think), and that has had only 2 readers on Mendeley. Which is entirely believable, but not likely to be the “total impact” of my article!

Actually, I know it’s not the "total impact" because the same article is in WRAP and I can see additional visitors to the paper there, without even considering accesses on the journal's own site, but I guess that Total Impact doesn’t know about the other versions of that object.

I tried giving Total Impact a DOI instead… None of my articles have DOIs (I'm not an academic author: practitioner stuff only!), so I gave it the DOI for a different article (the record linked to above), and you can see the report: http://total-impact.org/collection/UMpoWa

Not much more impressive than the report for my own article, yet the WRAP stats are healthier! So it could be that the problem is the size of the Mendeley community, and the fact that Total Impact isn't picking up on visitors from elsewhere for articles.

I thought I’d give Total Impact another shot with my Slideshare profile. I’ve not been especially active on Slideshare either, but I did see healthy stats for my handful of presentations last year, and Slideshare has a relatively large community of users. I like the structure of the Total Impact report for Slideshare: http://total-impact.org/collection/McWgLs It gives info on tweets, Facebook likes and other sources of data about the Slideshare items. That’s what I thought altmetrics ought to be!

Some of the other sites that Total Impact can work with are probably worth investigating too: I don’t know about GitHub or Dryad. I looked GitHub up (https://github.com/), and it seems that’s what I should try next: visiting there to collate all versions of my articles!

There are other tools on the Altmetrics site that I wish I had time to try out, too!

This week, discussion on UKCoRR's mailing list brought the following altmetrics tool to my attention: http://altmetric.com/bookmarklet.php I installed it on Chrome but couldn't get it to work with the articles I tried on Web of Science and on Cambridge Journals Online. The UKCoRR community are reporting that it doesn't pick up on the DOIs from their repositories either, so I guess it's just another thing that's still in development.
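As I understand it, the bookmarklet works by finding a DOI on the page and then looking that DOI up against Altmetric's own service, so a page that doesn't expose a DOI in its metadata gives it nothing to query. A minimal sketch of what that lookup URL looks like (the DOI below is invented, and I'm assuming Altmetric's public per-DOI endpoint):

```python
import urllib.parse

def altmetric_doi_url(doi):
    """Build the Altmetric API lookup URL for a DOI.

    The bookmarklet can only show attention data when it finds a DOI
    on the page; repository records that don't expose one give it
    nothing to look up, which may explain the UKCoRR reports.
    """
    return "http://api.altmetric.com/v1/doi/" + urllib.parse.quote(doi)

# An invented DOI, purely for illustration:
print(altmetric_doi_url("10.1371/journal.pone.0000000"))
```

If your repository records don't carry a DOI in a form the bookmarklet can find, there's simply nothing for it to send.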

