All 28 entries tagged Bibliometrics


February 10, 2013

Is my book the most highly cited in its field?

To answer this, you need data on how many citations there are to your book and to others in your field. There are two sources of citation data for books that I know of:

  1. Thomson Reuters' Book Citation Index. Not everyone will have access to this, of course, as it's a subscription product.
  2. Google Scholar: this is available to everyone and is the source I've investigated.

A simple search for your book on Google Scholar will tell you how many citations there are. Note that Google Scholar does try to collate records for all versions of your book, but for books available in many editions and reprints it might not be entirely successful at this!

Next, how do you know if your book is the MOST highly cited in your field? It's impossible to tell for certain, but a good clue is to investigate the "related articles" link in the search results for your book. This will find items that are similar to yours and are therefore likely to be in your field.

Within that list, there will be journal articles as well as books. You can look through the results and spot the books quite easily: look at how many times those books have been cited. If any are more highly cited than yours, then you know that your book can't be the most highly cited in your field, at least as far as Google Scholar is concerned. Whether or not you choose to trust its citation data is a separate matter!

If none of those books comes anywhere near your citation count, then there is a good chance that your book is one of the most highly cited in its field. You probably know some of the competitor books to yours: try searching for them on Google Scholar too, to check.
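
If you want to automate that comparison for a handful of titles, the unofficial scholarly Python package scrapes Google Scholar search results, including the "Cited by" counts. A minimal sketch, assuming the package's current interface and that the top hit for each title is the right edition (both assumptions worth checking by hand); note that Google Scholar rate-limits scrapers heavily, so this is a convenience for a few titles, not a bulk tool:

```python
# pip install scholarly  (an unofficial Google Scholar scraper)
from scholarly import scholarly

# Hypothetical titles: your own book plus the competitors you know of.
titles = [
    "My Book Title",
    "A Competitor's Book Title",
]

for title in titles:
    results = scholarly.search_pubs(title)  # search Google Scholar for the title
    hit = next(results, None)               # take the top result, if any
    if hit is None:
        print(f"{title}: no results")
    else:
        # 'num_citations' is the 'Cited by' figure Google Scholar displays
        print(f"{title}: cited {hit['num_citations']} times")
```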

If you don't already know competitor books in your field then I recommend looking on the COPAC union catalogue at the record for your book, and clicking on the subject heading links from within that record to find books in the same subject category.

Best of luck!


February 04, 2013

Measures of journal article quality?

Writing about web page http://techcrunch.com/2013/02/03/the-future-of-the-scientific-journal-industry/

The TechCrunch blog post linked to is by the founder of Academia.edu, and it discusses the possible contribution that journal article metrics could make to academic publishing.

To interpret readership metrics provided by sites and services like the three mentioned in that post (Academia.edu, ResearchGate and Mendeley), researchers should ask: what is the level and quality of activity on these sites? My experience is that there are a lot of students amongst the "researcher" numbers advertised. Students can be readers too, of course, but we need to be clear about what the metrics are actually telling us. Activity and membership vary from one site to another and from one discipline to another, so researchers need to investigate for themselves. And if you have to investigate and interpret such metrics for yourself, you're not going to be entirely comfortable with others using them to make judgements about the quality of your work!

My previous blog post was about publishers who display reader metrics. I wish I had time to investigate them some more!

Mendeley's metrics used to be available for others to use through an API, as ImpactStory (formerly Total Impact) was doing. That seems to me to be the most useful model for researchers: they can then follow readership metrics for their papers across all locations. In my opinion, collated stats are great for researchers to track which activity affects their readership numbers most: their paper featuring on a mate's Twitter feed, or on professor x's blog, or being delivered at a conference.
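
To illustrate the kind of lookup such tools made, here is a rough sketch against Mendeley's catalogue API. The endpoint, parameters and response field are my assumptions from memory of the API documentation, and a registered OAuth access token is required, so treat this as the shape of the call rather than guaranteed working code:

```python
import requests

TOKEN = "YOUR_MENDELEY_OAUTH_TOKEN"  # placeholder: obtained via Mendeley's OAuth flow

def mendeley_readers(doi):
    """Look up a paper by DOI in the Mendeley catalogue; return its reader count."""
    resp = requests.get(
        "https://api.mendeley.com/catalog",          # assumed catalogue endpoint
        params={"doi": doi, "view": "stats"},        # 'stats' view: readership fields (assumed)
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    docs = resp.json()                               # a list of matching catalogue documents
    return docs[0].get("reader_count") if docs else None

print(mendeley_readers("10.1234/example.doi"))       # hypothetical DOI
```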

But are reader numbers going to lead to a new way of assessing a journal article's quality? They would need to be available from all sources where the article is displayed: publishers, repositories and networking sites would all need to count reader accesses in the same way, and share their data publicly, so that they can be collated and displayed in a reliable and consistent way. They would need to become trusted and used by the researchers themselves. That is going to take a lot of time and effort, I believe, if all the discussion about citation metrics and altmetrics that I've seen is anything to go by.

January 25, 2013

Sharing metrics relating to articles

Writing about web page http://altmetrics.org/manifesto/

This time last year, PLoS started to display even more article level metrics. I was intrigued by the openness about article downloads on PLoS, and it led me to wonder whether other publishers were sharing such information so publicly.

Repository tools are available to display access counts for articles in the repository, as at the University of Bath repository in this record of a journal article. Some repositories have chosen not to make such statistics publicly visible, however.

I don't see many publishers publicly displaying article level metrics like this, but publishers do sometimes showcase their most downloaded content publicly on their websites. For example, Springer journal home pages display their most downloaded articles: see the journal Artificial Intelligence.

Other publishers share download statistics with the authors, although not publicly displaying them on article records. For example, Nature describe this as a benefit to their authors.

I'd be glad to hear of other examples of publishers displaying these download statistics. I think that authors should be able to monitor activity around their papers for themselves, and I wonder if there is a role for such statistics in helping our readers to ascertain the highest quality papers.

However, I am slightly cautious about download statistics being publicly visible, because there are often many versions of a journal article available: on the author's website, on the publisher's website and in a repository, for example. I think that all of these versions should be available, because they provide insurance and archival options for authors as well as additional discovery and access routes for readers; my concern is that a focus on download statistics could make authors wary of sharing their articles in so many places.

The Internet allows us to create lots of metrics about a researcher or a work, as evidenced by all the altmetrics activity (see my link, above). But just because a measure exists, should it be publicly visible? How should we use these new measures? My own answer is "with caution" and indeed that the author is best placed to make use of the download statistics, because the author will be best able to understand what they mean.


October 09, 2012

Which index measures your research?

I recently compiled a little list of measures like the h-index, intended to measure the performance of individual authors. Have I missed out your favourite(s)? Which one(s) do you like and why? How should they be used? (Or not used!)

Personally, I like the h-index best because it is well established and relatively simple to understand. However, it needs to be expressed along with the date and the data source used. Any of these measures ought to be presented along with some explanation and/or examples of well known researchers' scores, to give it context. Sometimes, researchers are asked for their h-index, but if their g-index or m-index score is more impressive, then why not give that too?

h-index – an author with an index of h has published h papers each of which has been cited in other papers at least h times: there’s a great Wikipedia article about it. http://en.wikipedia.org/wiki/H-index

g-index – while the top h papers can have many more citations than the h-index would suggest, the g-index is the highest number g of papers that together received g squared or more citations. This means that the g-index score will be at least as high as the h-index. (I found this explanation at: http://www.researchtrends.com/issue1-september-2007/from-h-to-g/ )

m-index (aka m-quotient) – h/n, where n is the number of years since the scientist's first published paper. (Supposedly handy for differentiating between authors of different vintages in the same discipline.)
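
To make those first three concrete, here is a minimal sketch computing them from a list of per-paper citation counts (the counts and first-publication year are invented for illustration):

```python
from datetime import date

def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

def g_index(citations):
    """Largest g such that the top g papers together have at least g*g citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, cites in enumerate(ranked, start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

citations = [42, 17, 11, 9, 6, 3, 1]  # hypothetical per-paper citation counts
first_paper_year = 2005               # hypothetical year of the author's first paper

h = h_index(citations)                           # -> 5
g = g_index(citations)                           # -> 6 (always at least h)
m = h / (date.today().year - first_paper_year)   # m-quotient: h over career length in years

print(h, g, round(m, 2))
```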

contemporary h-index – where younger papers accrue higher weightings for each citation, as calculated (and documented) on Publish or Perish (http://www.harzing.com/pophelp/metrics.htm#hcindex).

hI-index – takes account of co-authorship; also documented on Publish or Perish.

hI, norm index – see Publish or Perish

hm-index – see Publish or Perish

AWCR – see Publish or Perish

AWCRpA – see Publish or Perish

AW-index – see Publish or Perish

i10-index – Google Scholar "My Citations" gives me this score: "i10-index is the number of publications with at least 10 citations. The second column has the "recent" version of this metric which is the number of publications that have received at least 10 new citations in the last 5 years."

n-index – a researcher's h-index divided by the highest h-index of the journals in his/her major field of study (n is the first letter of Namazi). Proposed in an article at: http://dx.doi.org/10.4103/0378-6323.62960

A-index – the average no. of citations in the article set that makes up the Hirsch core. (The Hirsch core is the set of articles whose citation scores count towards the h-index score.) The muddled explanation is mine, but you can read about it properly at: http://eprints.rclis.org/bitstream/10760/13282/1/hIndexReviewAlonsoCabrerizoHerrera-Viedma.pdf – an excellent article reviewing these measures, published in 2009.

R-index – the square root of the sum of citations in the Hirsch core (explained in the same article linked above!)

m-index – (yes, it looks like another type of m-index entirely, which probably explains why the measure listed above is also known as the m-quotient) the median number of citations received by papers in the Hirsch core. See the article linked from my A-index explanation.
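
And a quick, self-contained sketch of the Hirsch-core measures just described, plus the i10-index, using the same invented citation counts as the earlier snippet:

```python
import math
from statistics import median

citations = [42, 17, 11, 9, 6, 3, 1]  # same hypothetical per-paper counts as before

ranked = sorted(citations, reverse=True)
h = sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)
core = ranked[:h]                            # Hirsch core: the h most-cited papers

a_index = sum(core) / h                      # A-index: mean citations within the core
r_index = math.sqrt(sum(core))               # R-index: square root of total core citations
m_median = median(core)                      # median-citations m-index
i10 = sum(1 for c in citations if c >= 10)   # i10-index: papers with at least 10 citations

print(a_index, round(r_index, 2), m_median, i10)  # 17.0 9.22 11 3
```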

NB there are many other measures explained in that article, but by this point I gave up trying to understand them! I quote from the conclusion of that paper instead:

"…that many h-index variations, although being designed to overcome some of its supposed limitations, do indeed correlate quite heavily. This fact has made some researchers think that there is probably no need to introduce more h-index variations if it is not possible to prove that they are not redundant in real examples."

The article also concludes: “h-index is quite dependant on the database that it is used and that, in general, it is much more difficult to compute those indices using Google Scholar than ISI Web of Science or Scopus.”

Google Scholar Metrics uses different metrics again, which sound like the h-index but are really aimed at the publication level rather than the author: see http://scholar.google.com/intl/en/scholar/metrics.html#metrics. And indeed SCImago will give you an h-index for a publication, and others have calculated h-indexes and variations of h-indexes for departments and groups of authors.

So, which index has your vote?


September 12, 2012

UK research sector, publishing trends and facts from the Finch report

Writing about web page http://www.researchinfonet.org/publish/finch/

I have been tweeting all the "Finch report facts" that I found in this recent report on accessibility to research publications. This blog entry gathers some of those "facts" into a discussion of what the Finch report seems to be saying about the UK research sector and publishing trends, looking at the "facts" plainly and bringing in other contexts and ideas.

I'm not commenting here on the recommendations of the Finch report, nor the debate about routes to open access, although I did pull together a Storify collection of reactions to the Finch report, in case you want to read more about those topics.

UK Researchers' productivity

The UK research sector has some particular characteristics. I tweeted:

Finch report fact p37: There are 250,000 researchers in the UK & p38 'their rate of productivity is more than 50% above world average'

This rather depends on how you measure productivity!

I also tweeted:

Finch report fact p37: UK is successful at research publications but 'relatively weak in producing other kinds of outputs such as patents'

So perhaps the productivity referred to is really all about publication activity. I went back to the report to check where the productivity fact came from: it's in a paragraph all about the number of articles written by researchers, so it's most likely, although not entirely clear, that the productivity referred to is about numbers of articles. A footnote against this particular fact also states that:

"It should be be noted that it is sometimes argued that high rates of research productivity in the leading research countries are achieved in part by establishing dependency cultures in other countries."

Have UK researchers achieved high publication rates due to multiple author collaborations? Possibly.

Why are UK researchers achieving high publication rates? Is it driven by RAE and REF processes?

The UK's measures of research performance have centred on research outputs, which might encourage UK researchers' productivity against this measure. Looking at the RAE 2008 data (Merit project), we can see that of the 222,177 outputs that were measured, 167,831 were journal articles. I'm rubbish at maths, but even I can tell that's about 75%. I expect that for the sciences, the percentage of journal articles amongst the outputs submitted for measurement is even higher.

Another couple of tweets, then:

Finch report fact p71: 120,000 articles by UK authors are published each year. According to p62, this is 6% of articles published worldwide

Finch report fact p62 'researchers in the UK comprise just over 4% of the global research community'...

So, UK researchers are publishing plenty of articles and contributing to scholarly knowledge worldwide on a larger scale than their numbers represent.

REF 2014 will be looking at impact as well as outputs, which brings a different dimension into the measurement compared with RAE 2008, and that might also affect UK researchers' activity in the future.

The potential effect of performance measurement mechanisms on actual performance is addressed in a RIN report on Communicating Knowledge from 2009, describing a bibliometric analysis of the outputs produced in 2003 and 2008 by a sample of authors who were included in those two RAEs. Amongst many other interesting findings, they reported a slight increase in the no. of publications per author in 2008 compared to 2003, but a significant increase in the no. of multiple-author works, which were multi-institutional and international. They did not find an apparent difference in citation behaviours between the two time periods. All very interesting!

In REF2014 the assessment panels for the science, technology and medicine subjects will have citation data provided to them. On UK researchers' citation scores, I tweeted:

Finch report fact p38: citations to UK articles increased between 2006 and 2010 by 7.2% a year, faster than the world average of 6.3%

and:

Finch report fact p38: UK’s “share of the top 1% of most-highly-cited papers was second only to the US, at 13.8% in 2010.”

Not only are our researchers producing lots of articles, they are also producing highly cited articles. There have been numerous studies and debates about the value of citations as a measure of the quality and influence of research papers (my own main reservation is the difference in disciplinary practices around citation), but at any rate there is plenty of citation activity and evident attention for UK authored articles, according to citation measures.

In agreement with the findings of that 2009 RIN report and the footnote on the earlier fact about UK researchers' productivity in terms of numbers of research articles, I also found in the Finch report:

Finch report fact p71 Nearly half (46%) of the peer reviewed articles with a UK author published in 2010 also listed an author from overseas

I believe that multiple authorship and involvement of overseas authors could be significant in achieving those high citation rates. The more collaborations and network contacts or reach that a researcher has, the more people will be aware of that author's work in terms of its findings but also its quality, and so the more likely the work is to be cited by those contacts or indeed their contacts in turn.

An international scale

UK researchers are operating on a world stage, of course. There are other facts in the Finch report that give some context to the UK researchers' performance. I didn't tweet this quote from page 38 because it was too long(!), but I find it very significant:

...part of the explanation for the UK’s success is that it attracts internationally-mobile researchers. UK researchers are also more likely than those in almost any other major research nation to collaborate with colleagues overseas...

Even though the UK researchers are publishing a lot, researchers from other countries are also publishing a lot:

Finch report fact p37 Rise in the no. of UK-authored articles has not been as fast as in very high growth countries such as India and Brazil

So I think that those collaborations and multi-authored articles are very significant, and the international scale of research is one that favours the UK because it's known for its high quality research already. I really think that this is key to UK "success" in the context of citations, because those collaborations and networks occur due to the migration of internationally mobile researchers to the UK. It seems to me that international reach is a very important element of impact that UK research assessors should be interested in.

Meanwhile, according to the Finch report, the UK doesn't spend a great deal on research. Apparently, the UK ranked 16th for "research intensity" amongst OECD countries in an Elsevier report that is cited on page 38, in a footnote. In actual figures:

Finch report fact, p37: 28% of UK R&D is in HE Sector. UK is 'strongly dependent' on gov.t, charity & overseas funds ow.ly/c50pU...

Finch report fact p38: 09-10 UK total expenditure on R&D: £25.9bn of which £10.4bn from gov, £5.5bn of which from Research Councils & HEFCs

Perhaps the relatively high reliance on government and the HE sector to pay for our research is also part of the reason why the UK has been more successful at getting articles published than at producing patents and other kinds of research outputs.

Perhaps another reason why UK researchers are so much involved in publishing activity is that the UK is also a key player in the worldwide publishing industry:

Finch report fact, p15: UK publishers are responsible for 5000+ journal titles & 1/5 of articles published each year

The UK also seems to be playing an important role in the development of the online open access repositories landscape:

Finch report fact: US, Germany, & UK account for over 1/3 of repositories worldwide. There are 200+ UK repositories: 150 are institutional

And the UK publishes about 7% of open access journals:


Finch report fact, p32: Currently 7600+ open access journals listed in the DOAJ, from 117 countries: 533 in UK ow.ly/c4ZLa #oa


UK researchers do seem to have good access to published articles:

Finch report fact p47 93% of UK researchers had “easy or fairly easy access" to papers. Those without most often find a different item.

Finch report fact p48: Researchers are more likely to have problems accessing conference proceedings and monographs, than journal articles.

Although library expenditure in the UK is falling:

Finch report fact, p23: library expenditure in UK Unis fell from 3.3% to 2.7% as a proportion of total expenditure ow.ly/c4ZfC #oa

The Finch report also says on page 51 that "Access on its own does not necessarily make for effective communication" and although I know that the report is really referring to the role that publishers play in enhancing discoverability through their search platforms and other work, I also interpret it to mean that all those networks and collaborations of our authors are helping to ensure that they are building on the best research that is out there.


Publishing trends

Open access is one of the changes to publishing that has taken place in recent years, as the worldwide web has enabled online access to scholarly content. It's the main focus of the Finch report, so there are lots of facts relating to it! There are at least two routes to making content available on open access: the gold route, where authors pay a fee or "article processing charge" (APC) for the publisher to make the final version available to readers for free, and the green route, where authors' own copies are deposited into open access repositories for readers to find.

My first publishing trend "fact" is:

Finch report fact p39 in '09 OA journals accounted for 14% of articles published worldwide in medicine & biosciences, and 5% of engineering.

The report goes on to say that only 6-7% of articles published in 2009 were available in repositories. This makes it look as though repositories are not as successful a route to open access as OA journals. But the data is only for 2009, and only for limited subject areas. The report itself highlights that science, technology and medicine account for 2/3 of OA journals:


Finch report fact, p33: 2/3 of OA articles are published by 10% of publishers: STM account for 2/3 of journals ow.ly/c4ZV8 #oa


At this point it is worth referring to Stevan Harnad's blog post "Finch Fiasco in figures", because he has looked into all this in a much more scholarly way and has a great graph (figure 6) showing the relative balance of green and gold open access availability of articles. It looks like he has very different data, but even in his graph the balance looks worst for green OA in the biomedical sciences, so the Finch report should also present data across all the subjects, in the interest of objectivity.

On page 69, the Finch report suggests some reasons for the "low take-up of OA" in humanities and social sciences, and it seems clear to me from the reasons given that the report means low take-up by publishers, i.e. that gold OA routes are not so readily available in these disciplines. The reasons suggested are: rate of publication and rate of rejection, length of articles, and the larger amount of material in a journal that is not an article and therefore would not bring in an article processing charge as income. Further, on p71 the Finch report refers to the tradition of the independent scholar remaining strong in the humanities: these researchers would have no mechanism through which to pay an APC.

Another trend that the Finch report refers to is the decline of the monograph:

Finch report fact: p44 refers to decline of the monograph as print runs have shrunk, prices have risen & UK libraries spend less on books.

I've already included the fact about the relative decline in expenditure on libraries in UK universities, and the Finch report also points out another difference that electronic format makes: libraries must pay VAT on electronic versions, whilst printed versions don't attract VAT. I know that many of the libraries I have worked at have had their book budgets squeezed by rising journal subscription costs over the years, so I don't doubt that the monograph is not what it was. But I believe that the research monograph carries as much research credibility as it ever did, even if it is not attracting the same revenues for publishers.

A conclusion?

After meandering through these "facts", I'm pleased to see that the UK research sector is publishing so much and attracting so much attention worldwide, in relation to the amount of investment. I believe that we should keep up our international and collaborative efforts in order to sustain this, and we should also keep up our involvement in publishing activities, perhaps by investing in OA routes as this makes access fairer to all. The Finch report recommends that the UK support gold OA publication: perhaps it will as the RCUK policy seems to have followed this route.

Most of all, though, I'm interested in what researchers will do. They are making decisions on where to publish what, with whom to co-author, whether to deposit in a repository or not, and all such things. The rest of us (publishers and librarians) are trying to respond to their need to communicate with each other, and to find out what each other is working on.


July 09, 2012

Mendeley and ResearchGate: profile sites and repositories used in tandem to raise research profiles.

Writing about web page http://opus.bath.ac.uk/30227/

There are so many places for authors to put their papers and information about their papers online, so what is the best way to make use of them? I don't have the answer exactly, but I have plenty of ideas!

Drive traffic to the repository by creating links to your papers

Brian Kelly of UKOLN (see Brian's UK Web Focus blog) and I have co-authored a paper for the international repositories conference, OR2012. The full reference is:

Kelly, B. and Delasalle, J., 2012. Can LinkedIn and Academia.edu Enhance Access to Open Repositories? Submitted to: OR2012: the 7th International Conference on Open Repositories, 9-13 July 2012, Edinburgh, Scotland.

and naturally, it is in an open access repository and linked to from this post.

The article title mentions LinkedIn and Academia.edu, and this blog post title mentions Mendeley and ResearchGate, but the concept that the article explores, and that this blog post is about, is that these kinds of external profile-hosting sites could be useful to researchers in raising the profile of their work, especially when used in conjunction with repositories.

I have blogged in the past about these kinds of profile hosting sites and listed a few other such sites in a piece about Academia.edu, and I have written on this blog about the number of Warwick researchers I could find on such profile sites.

One point explored in the paper is that the profile sites offer a way for authors to create inbound links to their papers in a repository, and such links might help to optimise those papers' search engine rankings, since the number of links to a page or site is a factor in search engine rankings.

I don't quite understand how search engine rankings work (that's their business, and it's getting ever more complex... SEOmoz have a useful article), but inbound links have long been a factor, one way or another. And as a former repository manager and a long-time information professional, I'm very, very aware of the important and sizeable role that Google has to play in bringing visitors to papers in a repository. Some of my early blog posts on the WRAP blog attest to that.

So profile sites are useful to researchers in offering a quick and easy way to generate inbound links to your repository papers: it's a simple concept, but as the example of Brian's work that is given in our paper demonstrates, there are probably a lot of other factors as well that might raise the profile of a researcher's papers.

Maintaining profile details on these sites

Naturally, Brian Kelly and I have profiles on these sites, and our paper is appearing on our publication lists there... thanks, Brian, for uploading it and making it easy for me! I confess that I have left partial profiles on most of these sites: it takes a lot of time to create and update profiles properly. Brian is really good at doing this, but I'm not a great example to other authors of how to use these sites.

The two sites I have been looking at most recently are Mendeley and ResearchGate:

I like ResearchGate for making it easy for me to "claim" articles that it has found, as ones that I am an author of. In particular, I like that it harvests records from my institutional repository, so if I kept that up to date with all my papers, then it would be relatively little effort to also keep my profile on ResearchGate up to date. Bravo, ResearchGate! (I have blogged about ResearchGate recently, in greater detail).

However, the thing that I find most irritating about ResearchGate, when it comes to using it in tandem with an open access repository, is that it invites me to upload the full text of my paper in a huge box on the top right-hand side, and it displays my paper to others with a "Request Full-text" button. Meanwhile, the link to the repository where the full text is available is almost invisible, and it is not recognisable as a potential full text source. It simply says "Source:OAI", and the "OAI" part is a link to the WRAP repository record from where the full text can be retrieved.

This gives me considerable sympathy with the authors whose papers I requested copies of when I was a repository manager: when your article is already available on open access to all, it is irritating to be asked to put it in yet another place!

Mendeley has similar features and issues, in that I can import records from all sorts of sources using its "web importer", including Google Scholar, which does index a lot of repository content... but it's not as simple to use as ResearchGate when it comes to updating my profile with my own papers from the institutional repository. When I carry out a search on Mendeley itself, I find a sophisticated advanced search form, which I like, although I don't like that I can't edit my search string in the search box after running the search. I tried to do that after my first advanced search and got no results, but when I went back to the advanced search form and put my revised criteria into the form, I got results. That's clunky: there is work to be done on Mendeley as a publications discovery tool.

On Mendeley, I am able to refine my search results further by ticking a box on the right-hand side, "Open access articles only". I tried this and was disappointed. It finds papers that I have written, but it doesn't know that the ones in WRAP are available on open access.

How do I tell Mendeley that the paper is already available on OA? Why doesn't it already know?

Both Mendeley and ResearchGate have got it wrong

Or at least, from an open access point of view, they have got it wrong. It ought not to be up to the author to upload their content into several places online. And they should be making it easy for people searching within their environments to get through to the existing open access versions of papers: after all, it's hardly in the spirit of OA to make it difficult for people to access the open access version!

Repository managers' perspectives

One of the points that Brian and I made in our poster for OR2012 was to ask 'why don't repository managers recommend use of external researcher profile sites?' Well, it would help if the profile sites worked nicely with repositories, I think.

And of course another answer to our question is that repository managers have enough of a struggle getting papers for the repository itself, never mind encouraging authors to put their papers elsewhere as well.

Beyond that, it is likely that others at the University are advising on the use of social media, so it might be something that repository managers don't see as their role.

Recently, I posted to a repository managers' e-mail list to double-check whether any of them were recommending such sites:

One replied to say that she had noticed some researchers from her institution who were putting their documents onto sites like these, in full text, but not into the institutional repository. So perhaps repositories should be harvesting from the likes of Mendeley and ResearchGate, too.

At the University of Glasgow, they are sometimes using the "Related URL" field to link to a version of the article on Mendeley (see this example record), which is a step towards integrating these two approaches.

Social Media more generally

One repository manager responded that she did encourage authors to use social media "like LinkedIn, Twitter and a blog". And I was sent a very useful link to a blog post by Melissa Terras at UCL, entitled "Is blogging and tweeting about research papers worth it?" (Short Answer: yes, if you want to attract visitors!)

I think that the use of "social media" is a much bigger topic than the use of profile sites as such. I know that most of the places where researchers can put their profile information are also social media tools in some sense. But this blog post is not intended to cover the social aspects of these tools: that is perhaps for a future blog post!

One more relevant aspect is that publisher websites do often encourage authors to use such profile sites and social media in general, to raise the profiles of their papers. I have blogged about publishers' instructions for authors already.

And finally, I must say that Brian Kelly is an excellent example of an author who uses profile sites and social media. He has uploaded details of his papers onto these sites, but he has also deposited OA copies into his institutional repository and blogged and tweeted about his papers before the conference itself, to raise interest in them. I'm not at all surprised that Brian is the author of the 15 most downloaded papers in the Bath repository, from his department!


July 03, 2012

Mendeley's number of readers

Follow-up to Webometrics and altmetrics: digital world measurements from Library Research Support

I once blogged about Altmetrics and the tool Total Impact, which seemed to use the Mendeley API for tracking papers’ popularity.

I had another look at Total Impact lately, and it has been worked on: I can't give it my Mendeley profile any more, and in fact it didn't do anything at all for me. But it is in beta, so I sent them some feedback explaining that I got nowhere with the tool, and we shall see.

So I went directly to Mendeley. You can see how many "readers" there are for a paper in the results of a search there, but that information is not displayed with the paper's details once you have added it to your own library, or to your list of "my publications" for display on your profile. I was disappointed that apparently only one of my papers is "open access" according to Mendeley's search filter, even though they are all in WRAP and so they are all open access... I'm not sure what Mendeley's criterion is for a paper being "open access" for the purposes of its search filter.

From what I can tell from the FAQs on the Mendeley site, the number of "readers" in Mendeley is the number of distinct users who have added the paper to their library on Mendeley. It doesn't actually mean that they've read the paper: I have added a handful of interesting-looking papers to my own library that I have never read. It's more of a wish list!

And then I played around with Google some more, to see if there were other tools that were accessing Mendeley’s “reader” numbers API, and I came across Readermeter which looks really interesting because you can give it the author’s name and get all sorts of stats back in a pretty format!


April 20, 2012

Guest post: Hocus Scopus

With 46 million records of peer-reviewed literature, Elsevier’s SciVerse Scopus is the largest abstract and citation database, with citation data for papers published from 1996 onwards. It’s a valuable resource for finding scholarly literature, but also offers tools for analyzing journal performance, finding out where and how frequently authors and articles are being cited, and tracking research trends. Some of the REF panels will be using citation data as one of several indicators of academic significance during the assessment process, and Scopus has been selected as the provider of this data. In this post I’m going to share some tips on how to get started with Scopus.

There is a useful set of short online tutorials on how to use Scopus here, which you can play, pause, or just click through. The SciVerse Training Desk provides a range of training videos with more details on how to make the most of this database. Also, on the top right of every page of Scopus there is a 'help' link to more information about particular tools and services.

Document Search
  • This tutorial explains how to carry out a document search, and also shows how citation analysis is built into the search results.
  • It is worth registering for a personal account with Scopus if you want to save searches to run in future browsing sessions, or to set up alerts for every time documents matching your search terms are uploaded to Scopus, or every time a document or author is cited.
  • One useful feature is Scopus' Document Download Manager. If the Library subscribes to journal content listed in the search results, Scopus allows you to save time by downloading multiple article PDFs at once.
Journal analyser
  • Another tutorial shows how you can select up to ten journals to analyse using SJR and SNIP, as well as simpler metrics like the total number of citations received in a year, and the total number of documents published in a year. You can view the data as a line chart or table. The line chart has data points which you can mouse over to get a snapshot of journal performance at a moment in time.
Author and affiliation searching
  • This tutorial shows how to search by author and affiliation, and how you can track research by setting up alerts to be notified when a given author is cited or publishes a new document.
  • The author details page provides information about an author’s publishing history and research interests, and is a starting point for finding co-authors, tracking citations, and using the author evaluator tool. This displays an author’s publishing output, the number of citations received, and the h-index in the form of a graph and document list.
  • Errors of attribution and affiliation do occur. To correct records, click on ‘give feedback’ on the author details page, then ‘request author detail corrections’ and use the ‘wizard’ to input and review the information. The correction goes back to the Scopus feedback team.
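
Beyond the web interface, the same citation counts can be pulled programmatically from Elsevier's Scopus Search API. A rough sketch, assuming the endpoint and response fields from Elsevier's developer documentation, an API key from their developer portal, and a hypothetical author query:

```python
import requests

API_KEY = "YOUR_ELSEVIER_API_KEY"  # placeholder: register at Elsevier's developer portal

resp = requests.get(
    "https://api.elsevier.com/content/search/scopus",
    params={"query": "AUTHLASTNAME(Smith)"},  # hypothetical author surname query
    headers={"X-ELS-APIKey": API_KEY, "Accept": "application/json"},
)
resp.raise_for_status()

# Each entry carries a 'citedby-count' alongside the usual bibliographic fields.
for entry in resp.json()["search-results"].get("entry", []):
    print(entry.get("citedby-count", "0"), entry.get("dc:title", "(untitled)"))
```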

Citation data is only one kind of indicator for evaluating research. For a good general introduction to the use of bibliometrics in research assessment, see the Measuring Your Research Impact toolkit.


February 21, 2012

Webometrics and altmetrics: digital world measurements

Writing about web page http://altmetrics.org/manifesto/

Research performance measurement often includes an element of output (or publication) counting and assessment, possibly including citation counts, and I've written a lot here about such bibliometrics and assessment.

The digital, web 2.0 world allows many other, different kinds of metrics to be recorded and reported on, and these could one day become a part of researchers' performance assessment, either just for themselves or indeed through more formal processes at institutional level or through an exercise like the Research Excellence Framework (REF).

I've linked to the altmetrics manifesto, and that has some very interesting contributions to the exploration of other kinds of metrics and measurements.

Note that PLoS One is running a special "collection" on altmetrics, with a submission deadline that passed in January. And if you're an author with an article published by PLoS One, then the number of views for your article is displayed along with the metadata for your article. Warwick's repository, WRAP, also shows download stats for articles these days, in the metadata records… e.g.: http://wrap.warwick.ac.uk/933/

The problem with web stats and altmetrics is that there are potentially a lot of sources which will all measure the stats for different versions of the same item, or different elements of the same output, in different ways. This sort of thing is a driver for publication in an open access (OA) journal with one canonical copy of an article in just one place online: the so called "gold" route to OA.

Authors of the future will want all web visitors to go to the publisher's site, in order to boost the number of viewers stated there. Well, some already do! But that rather assumes that the publisher will also provide all the functionality for commenting, reviewing and interacting with the research that authors might like to see, that the publisher will provide suitable measures to the author, and that the formal publication route is the only route needed for publicising your work and making it discoverable...

The other route to OA is known as the "green" route, and it involves putting an earlier version into an OA repository (or more than one!) in addition to the canonical published version. All such versions should be clearly described and should point to the canonical one, ideally. This would allow for your work to be made available and promoted by all those repositories where you have deposited a copy or allowed a copy to be harvested, eg your institution and a subject specific repository.

The green route follows the "lots of copies keep stuff safe" mentality and contributes to ensuring the longevity of your research's availability and discoverability. And it could also enable new research techniques such as text mining to be employed on your outputs and thus build on your contribution to the discipline, if you've given suitable permissions at the deposit stage.

So, when it comes to altmetrics, what we ideally need is some way of recording visitor stats and other metrics for all versions of one article, and collating these into one report.
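
As a sketch of the collation I mean, with every source name and figure invented for illustration: key each observed count by the article's DOI, then sum the per-source counts into one report:

```python
from collections import defaultdict

# Invented access counts for the versions of two articles, keyed by DOI.
observations = [
    ("10.9999/example.123", "publisher site", 412),
    ("10.9999/example.123", "institutional repository", 187),
    ("10.9999/example.123", "subject repository", 95),
    ("10.9999/example.456", "publisher site", 58),
]

report = defaultdict(dict)
for doi, source, count in observations:
    report[doi][source] = report[doi].get(source, 0) + count

for doi, sources in report.items():
    total = sum(sources.values())
    print(f"{doi}: {total} accesses across {len(sources)} sources -> {sources}")
```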

The altmetrics site I've linked to has a page of tools which I had a play with recently: http://altmetrics.org/tools/ Here is the story of my "playing"!

I gave Total Impact my (rather scrappy) Mendeley profile. I have 3 articles to my name on Mendeley, and Total Impact picked up on 2 papers: in the event, only one of those was actually mine (something wrong in the metadata, I think), and that has had only 2 readers on Mendeley. Which is entirely believable, but not likely to be the “total impact” of my article!

Actually, I know it’s not the "total impact" because the same article is in WRAP and I can see additional visitors to the paper there, without even considering accesses on the journal's own site, but I guess that Total Impact doesn’t know about the other versions of that object.

I tried giving Total Impact a DOI instead… None of my articles have DOIs (I'm not an academic author: practitioner stuff only!), so I gave it the DOI for a different article (the record linked to above), and you can see the report: http://total-impact.org/collection/UMpoWa

Not much more impressive than my article, yet the WRAP stats are more impressive! So it could be that the problem is the size of the Mendeley community, and the fact that Total Impact is not picking up on visitors from elsewhere for articles.

I thought I’d give Total Impact another shot with my Slideshare profile. I’ve not been especially active on Slideshare either, but I have seen healthy stats for my handful of presentations last year. And Slideshare has a relatively large community of users. I like the Total Impact report structure for the Slideshare report: http://total-impact.org/collection/McWgLs It gives info on tweets, Facebook likes and other sources of data about the Slideshare items. That’s what I thought altmetrics ought to be!

Some of the other sites that Total Impact can work with are probably worth investigating, too: I don’t know about GitHub or Dryad. I looked GitHub up: https://github.com/ and it seems that’s what I need to try next, to visit there to collate all versions of my articles!

There are other tools on the Altmetrics site that I wish I had time to try out, too!

This week, discussion on UKCoRR's mailing list brought the following altmetrics tool to my attention: http://altmetric.com/bookmarklet.php I installed it on Chrome but couldn't get it to work with the articles I tried on Web of Science and on Cambridge Journals Online. The UKCoRR community are reporting that it doesn't pick up on the DOIs from their repositories either, so I guess it's just another thing that is in development.


January 17, 2012

Open Access briefing paper from SCONUL

Writing about web page http://www.sconul.ac.uk/news/OAbriefing/OA_impact_briefing.pdf

Alma Swan's latest briefing paper for Research Libraries UK and SCONUL is available online. It has some great little graphs showing the citation advantage of open access publication for those in Engineering, Clinical Medicine and the Social Sciences, plus a case study of the effect on citations when an author's works are deposited in an institutional, open access repository.

The paper also explains the value of an open access repository in supporting the impact of research work: making scientific findings and resources available to the public, and helping to engage lay people in "citizen science" projects like Galaxy Zoo.

The briefing also discusses the value of OA to a knowledge-based economy, and it is a great, brief overview of all these topics.

