September 12, 2012
Writing about web page http://www.researchinfonet.org/publish/finch/
I have been tweeting the "Finch report facts" that I found in this recent report on accessibility to research publications. This blog entry draws some of those "facts" together into a discussion of what the Finch report seems to be saying about the UK research sector and publishing trends, looking at the "facts" plainly and bringing in other contexts and ideas.
I'm not commenting here on the recommendations of the Finch report, nor the debate about routes to open access, although I did pull together a Storify collection of reactions to the Finch report, in case you want to read more about those topics.
UK Researchers' productivity
The UK research sector has some particular characteristics. I tweeted:
Finch report fact p37: There are 250,000 researchers in the UK & p38 'their rate of productivity is more than 50% above world average'
This rather depends on how you measure productivity!
I also tweeted:
Finch report fact p37: UK is successful at research publications but 'relatively weak in producing other kinds of outputs such as patents'
So perhaps the productivity referred to is really all about publication activity, and I went back to the report to check where the productivity fact came from: it's a paragraph all about the number of articles written by researchers, so it's most likely, although not entirely clear, that the productivity referred to is about numbers of articles. A footnote against this particular fact also states that:
"It should be noted that it is sometimes argued that high rates of research productivity in the leading research countries are achieved in part by establishing dependency cultures in other countries."
Have UK researchers achieved high publication rates due to multiple author collaborations? Possibly.
Why are UK researchers achieving high publication rates? Is it driven by RAE and REF processes?
The UK's measures of research performance have centred around research outputs which might encourage UK researchers' productivity against this measure. Looking at the RAE 2008 data (Merit project) we can see that of the 222,177 outputs that were measured, 167,831 were journal articles. I'm rubbish at maths but even I can tell that's about 75%. I expect that for the sciences, the percentage of journal articles that make up their outputs for measurement is even higher.
Another couple of tweets, then:
Finch report fact p71: 120,000 articles by UK authors are published each year. According to p62, this is 6% of articles published worldwide
Finch report fact p62 'researchers in the UK comprise just over 4% of the global research community'...
So, UK researchers are publishing plenty of articles and contributing to scholarly knowledge worldwide on a larger scale than their numbers represent.
REF 2014 will be looking at impact as well as outputs, which brings a different dimension into the measurement since RAE 2008 and that might also affect UK researchers' activity in the future.
The potential effect of performance measurement mechanisms on actual performance is addressed in a RIN report on Communicating Knowledge from 2009, describing a bibliometric analysis of the outputs produced in 2003 and 2008 by a sample of authors who were included in those two RAEs. Amongst many other interesting findings, they reported a slight increase in the number of publications per author in 2008 compared to 2003, but a significant increase in the number of multiple-author works, which are multi-institutional and international. They did not find an apparent difference in citation behaviours between the two time periods. All very interesting!
In REF2014 the assessment panels for the science, technology and medicine subjects will have citation data provided to them. On UK researchers' citation scores, I tweeted:
Finch report fact p38: citations to UK articles increased between 2006 and 2010 by 7.2% a year, faster than the world average of 6.3%
Finch report fact p38: UK’s “share of the top 1% of most-highly-cited papers was second only to the US, at 13.8% in 2010.”
Not only are our researchers producing lots of articles, they are also producing highly cited articles. There have been numerous studies and debates about the value of citations as a measure of the quality and influence of research papers (my own main reservation is the difference in disciplinary practices around citation), but at any rate there is plenty of citation activity and evident attention for UK authored articles, according to citation measures.
In agreement with the findings of that 2009 RIN report and the footnote on the earlier fact about UK researchers' productivity in terms of numbers of research articles, I also found in the Finch report:
Finch report fact p71 Nearly half (46%) of the peer reviewed articles with a UK author published in 2010 also listed an author from overseas
I believe that multiple authorship and involvement of overseas authors could be significant in achieving those high citation rates. The more collaborations and network contacts or reach that a researcher has, the more people will be aware of that author's work in terms of its findings but also its quality, and so the more likely the work is to be cited by those contacts or indeed their contacts in turn.
An international scale
UK researchers are operating on a world stage, of course. There are other facts in the Finch report that give some context to the UK researchers' performance. I didn't tweet this quote from page 38 because it was too long(!), but I find it very significant:
...part of the explanation for the UK’s success is that it attracts internationally-mobile researchers. UK researchers are also more likely than those in almost any other major research nation to collaborate with colleagues overseas...
Even though the UK researchers are publishing a lot, researchers from other countries are also publishing a lot:
Finch report fact p37 Rise in the no. of UK-authored articles has not been as fast as in very high growth countries such as India and Brazil
So I think that those collaborations and multi-authored articles are very significant, and the international scale of research is one that favours the UK because it's known for its high quality research already. I really think that this is key to UK "success" in the context of citations, because those collaborations and networks occur due to the migration of internationally mobile researchers to the UK. It seems to me that international reach is a very important element of impact that UK research assessors should be interested in.
Meanwhile, according to the Finch report, the UK doesn't spend a great deal on research. Apparently, the UK ranked 16th for "research intensity" amongst OECD countries in an Elsevier report that is cited on page 38, in a footnote. In actual figures:
Finch report fact, p37: 28% of UK R&D is in HE Sector. UK is 'strongly dependent' on gov.t, charity & overseas funds ow.ly/c50pU...
Finch report fact p38: 09-10 UK total expenditure on R&D: £25.9bn of which £10.4bn from gov, £5.5bn of which from Research Councils & HEFCs
Perhaps the relatively high reliance on government and the HE sector to pay for our research is also part of the reason why the UK has been more successful at getting articles published than at producing patents and other kinds of research outputs.
Perhaps another reason why UK researchers are so much involved in publishing activity is that the UK is also a key player in the worldwide publishing industry:
Finch report fact, p15: UK publishers are responsible for 5000+ journal titles & 1/5 of articles published each year
The UK also seems to be playing an important role in the development of the online open access repositories landscape:
Finch report fact: US, Germany, & UK account for over 1/3 of repositories worldwide. There are 200+ UK repositories: 150 are institutional
And the UK publishes about 7% of open access journals:
Finch report fact, p32: Currently 7600+ open access journals listed in the DOAJ, from 117 countries: 533 in UK ow.ly/c4ZLa #oa
UK researchers do seem to have good access to published articles:
Finch report fact p47 93% of UK researchers had “easy or fairly easy access” to papers. Those without most often find a different item.
Finch report fact p48: Researchers are more likely to have problems accessing conference proceedings and monographs, than journal articles.
Although library expenditure in the UK is falling:
Finch report fact, p23: library expenditure in UK Unis fell from 3.3% to 2.7% as a proportion of total expenditure ow.ly/c4ZfC #oa
The Finch report also says on page 51 that "Access on its own does not necessarily make for effective communication" and although I know that the report is really referring to the role that publishers play in enhancing discoverability through their search platforms and other work, I also interpret it to mean that all those networks and collaborations of our authors are helping to ensure that they are building on the best research that is out there.
Open access is one of the changes to publishing that has taken place in recent years, as the worldwide web has enabled online access to scholarly content. It's the main focus of the Finch report, so there are lots of facts relating to it! There are at least two routes to making content available on open access: the gold route, where authors pay a fee or "article processing charge" (APC) for the publisher to make the final version available to readers for free, and the green route, where authors' own copies are deposited into open access repositories, where readers can find them.
My first publishing trend "fact" is:
Finch report fact p39 in '09 OA journals accounted for 14% of articles published worldwide in medicine & biosciences, and 5% of engineering.
The report goes on to say that only 6-7% of articles published in 2009 were available in repositories. This makes it look as though repositories are not as successful a route to open access as the OA journals. But the data is only for 2009, and only for limited subject areas. The report itself highlights that science, technology and medicine account for 2/3 of OA journals:
Finch report fact, p33: 2/3 of OA articles are published by 10% of publishers: STM account for 2/3 of journals ow.ly/c4ZV8 #oa
At this point it is worth referring to Stevan Harnad's blog post "Finch Fiasco in figures", because he has looked into all this in a much more scholarly way, and has a great graph (figure 6) showing the relative balance of green and gold open access availability of articles. It looks like he has very different data, but even in his graph the balance looks worst for green OA in the biomedical sciences, so the Finch report should also present data across all the subjects, in the interest of objectivity.
On page 69, the Finch report suggests some reasons for the "low take-up of OA" in humanities and social sciences, and it seems clear to me from the reasons given that the report means the low take-up by publishers, i.e. that gold OA routes are not so readily available in these disciplines. The reasons suggested are: rate of publication and rate of rejection, length of articles, and the larger amount of material in a journal that is not an article and therefore would not bring in an article processing charge as income. Further, on p71 the Finch report refers to the tradition of the independent scholar remaining strong in the humanities: these researchers would have no mechanism through which to pay an APC.
Another trend that the Finch report refers to is the decline of the monograph:
Finch report fact: p44 refers to decline of the monograph as print runs have shrunk, prices have risen & UK libraries spend less on books.
I've already included the fact about the relative decline in expenditure on libraries in UK universities, and the Finch report also points out another difference that electronic format makes: libraries must pay VAT on electronic versions, whilst printed versions don't attract VAT. I know that many of the libraries I have worked at have had their book budgets squeezed by rising journal subscription costs over the years, so I don't doubt that the monograph is not what it was. But I believe that the research monograph carries as much research credibility as it ever did, even if it is not attracting the same revenues for publishers.
After meandering through these "facts", I'm pleased to see that the UK research sector is publishing so much and attracting so much attention worldwide, in relation to the amount of investment. I believe that we should keep up our international and collaborative efforts in order to sustain this, and we should also keep up our involvement in publishing activities, perhaps by investing in OA routes as this makes access fairer to all. The Finch report recommends that the UK support gold OA publication: perhaps it will as the RCUK policy seems to have followed this route.
Most of all though, I'm interested in what researchers will do. They are making decisions on where to publish what, with whom they will co-author and whether to deposit in a repository or not and all such things. The rest of us (publishers and librarians) are trying to respond to their need to communicate with each other, and to find out what each other are working on.
July 19, 2012
Yvonne Budden is the University of Warwick's E-Repositories Manager, responsible for the Publications service and WRAP; her specialisms include open access, digital repositories and copyright. She also has ten years' experience creating and managing metadata.
Metadata is a key tool to aid the dissemination of research. It's not the most exciting of topics, but it can make all the difference when trying to locate electronic resources. Good metadata can help elevate the ranking of an item in search tools and guide specific audiences to a resource; conversely, bad metadata can mean an item is never found. This post will look at some key concepts of metadata and end with some things to consider if you're looking to publish a journal yourself.
Metadata is the commonly used term for information that describes other things: for example, the metadata of an mp3 will include the track title, artist, running time, encoding used and so on. Any contextual information provided about something can be considered metadata. Most researchers have a profile page with information about their educational background, department and institution affiliation, research interests, grants, publications and so on; this information can be considered metadata about a researcher. Looking specifically at metadata for outputs, there are three main types:
- Descriptive metadata - which describes the output for discovery and identification and can include: title, creators, abstract, keywords, journal title, DOI and many more.
- Structural metadata - indicates how compound objects inter-relate, for example how pages should be ordered in a book.
- Administrative metadata - provides information on how the output should be managed, includes date of creation, file type and other technical information. It also describes the intellectual property rights of the item, such as who owns the copyright, and any metadata required for the long term preservation of the item.
Most publishers produce metadata for the items they publish, and this metadata is then passed to an array of services. For journal items the metadata is harvested by Web of Science, Scopus and other indexing services, as well as by Google and Library catalogues like Warwick's Encore service. Book metadata is harvested by bookseller services, libraries and data aggregators. Open Access repositories like the Warwick Research Archive (WRAP) create, harvest and disseminate metadata as widely as possible as part of their role in showcasing Warwick research. The software used for WRAP is specifically optimised to allow its metadata to be easily discovered and indexed by Google, and the team undertake work to enhance and expand the metadata supplied by publishers and researchers for better rankings and discoverability.
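As a concrete illustration of the descriptive metadata that repositories pass around, here is a minimal Python sketch that parses a Dublin Core record of the kind repositories typically expose for harvesting over OAI-PMH. The sample record is invented for illustration; it is not a real WRAP record.

```python
# Parse a Dublin Core (oai_dc) record of the kind exposed by
# open access repositories for metadata harvesting via OAI-PMH.
import xml.etree.ElementTree as ET

# An invented sample record, for illustration only.
SAMPLE_RECORD = """<oai_dc:dc
    xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
  <dc:title>An Example Article Title</dc:title>
  <dc:creator>Smith, A.</dc:creator>
  <dc:creator>Jones, B.</dc:creator>
  <dc:subject>open access</dc:subject>
  <dc:identifier>http://wrap.warwick.ac.uk/99999/</dc:identifier>
</oai_dc:dc>"""

DC = "{http://purl.org/dc/elements/1.1/}"

def parse_dc(xml_text):
    """Return a dict mapping Dublin Core element names to lists of values."""
    root = ET.fromstring(xml_text)
    record = {}
    for element in root:
        if element.tag.startswith(DC):
            name = element.tag[len(DC):]  # strip the namespace prefix
            record.setdefault(name, []).append(element.text)
    return record

record = parse_dc(SAMPLE_RECORD)
print(record["title"])    # ['An Example Article Title']
print(record["creator"])  # ['Smith, A.', 'Jones, B.']
```

Note how repeatable elements such as dc:creator collect into lists: Dublin Core allows any element to appear more than once, which is why a harvester cannot assume a single value per field.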
Metadata is what drives most of the search engines and discovery platforms for research. All of those who create metadata, including researchers, need to be aware of what the metadata says. As Emerald Publishing's guide for authors puts it:
"The online environment presents researchers with a huge amount of choice in their search for relevant articles. As an author, it is important to remember that your article is competing for attention alongside other articles and online resources." 
Search engines pick up on the metadata in the HTML headers of web pages, online resources and blog posts and use it to rank those pages in search results. Other services, like the OpenURL system that drives link resolvers such as SFX and WebBridge, use the metadata on articles to match them against Library holdings, helping researchers access articles and e-books subscribed to by the University with little effort. Metadata is also used as a way of telling people and machines what they have permission to do with your research once they have found it, and to allow readers to assess the quality of the item.
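To make the "metadata in the HTML headers" idea concrete, here is a minimal sketch of the citation_* meta tags (the so-called Highwire Press tags) that Google Scholar's inclusion guidelines describe for article pages; all of the values here are invented for illustration.

```html
<head>
  <meta name="citation_title" content="An Example Article Title" />
  <meta name="citation_author" content="Smith, A." />
  <meta name="citation_author" content="Jones, B." />
  <meta name="citation_publication_date" content="2012/07/19" />
  <meta name="citation_journal_title" content="Journal of Examples" />
  <meta name="citation_pdf_url" content="http://example.org/paper.pdf" />
</head>
```

A visitor never sees these tags, but a crawler reads them to build the search result entry, which is one reason repository software puts effort into getting them right.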
So what do researchers need to consider when creating metadata for their journal articles, blog posts, websites or journals? Below are a few things to bear in mind:
- Short titles - the more words in the title, the less likely it is to be downloaded; odd, but true.
- Keywords - use tags and keywords that your audience will understand, but make sure to write out any acronyms at least once. Repeating keywords in the title and abstract (but not in the same place) will increase visibility to a search engine.
- Consistency - when using keywords or tags, try to be consistent as well as descriptive in the way you use them. Most blogging software and tools like Evernote will help you by presenting a list of existing tags to choose from. This is especially important for keeping things organised in blogs that have a number of contributors.
- Synonyms - when writing an abstract, if you have used your key term once, consider using a synonym in later sentences, partly to avoid repetition and partly to allow users who might have chosen a different term to find your work.
- Identifiers - these are vital as they give people an easy way to share your work! Publishers do this for articles with Digital Object Identifiers (DOIs). Open access databases create a unique permanent URL for each article (e.g. http://wrap.warwick.ac.uk/43230/) and some, like WRAP for Warwick researchers, create one for each member of staff (http://wrap.warwick.ac.uk/view/author_id/). Most blog software generates a unique URL for each post as well.
- Be comprehensive - when people are adding metadata to objects the temptation can be to add only the 'required' fields, but everything (and anything) you put into the metadata can be used as a way for search engines to find your research so consider spending a little more time on it and giving your audience as many chances as possible to find your research.
- External services - if you are publishing your own journal, consider submitting the metadata to other indexing services. Some services, like Web of Science and Scopus, have tight criteria on what they index, but these services are great at disseminating journal content as they are places people use to find information. If your journal is open access, listing it in the Directory of Open Access Journals (DOAJ) is useful, as the service now holds records from just under 8,000 journals and is growing every day. If you have a working paper series, listing it, or even hosting it, in an open access archive such as SSRN, RePEc or WRAP (for Warwick-based series) is a quick way to benefit from wider dissemination and, in WRAP's case, enhanced metadata.
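The OpenURL mechanism mentioned above works by packing an item's metadata into a URL query string, which the link resolver then matches against library holdings. A minimal Python sketch of building such a link, using the key/value form of OpenURL 1.0 (Z39.88-2004); the resolver base URL and article details are invented for illustration.

```python
# Build an OpenURL 1.0 query string (KEV format) describing a journal
# article, of the kind an SFX-style link resolver consumes. The resolver
# base URL and article details here are invented.
from urllib.parse import urlencode

def make_openurl(base_url, **metadata):
    """Pack article metadata into an OpenURL 1.0 key/value query string."""
    params = {
        "url_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
    }
    # rft.* keys carry the metadata of the referenced item.
    params.update({"rft." + key: value for key, value in metadata.items()})
    return base_url + "?" + urlencode(params)

link = make_openurl(
    "http://resolver.example.ac.uk/sfx",
    genre="article",
    atitle="An Example Article Title",
    jtitle="Journal of Examples",
    volume="12",
    spage="34",
    issn="1234-5678",
)
print(link)
```

Because the whole description travels in the URL, any gap in the source metadata (a missing ISSN, say) directly limits what the resolver can match, which is another argument for being comprehensive.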
- Ruffilo, Nick (2011) "Five Degrees Of Metadata: Small Changes Can Mean Big Sales" Publishers Weekly Soapbox.
- Emerald Publishing "How to... increase online readership of your article"
- Wiley Blackwell "Optimizing Your Article for Search Engines"
- Getting your Journal Indexed (A SPARC Guide)
July 09, 2012
Mendeley and ResearchGate: profile sites and repositories used in tandem to raise research profiles.
Writing about web page http://opus.bath.ac.uk/30227/
There are so many places for authors to put their papers and information about their papers online, so what is the best way to make use of them? I don't have the answer exactly, but I have plenty of ideas!
Drive traffic to the repository by creating links to your papers
Brian Kelly of UKOLN (see Brian's UK Web Focus blog) and I have co-authored a paper for the international repositories conference, OR2012. The full reference is:
Kelly, B. and Delasalle, J., 2012. Can LinkedIn and Academia.edu Enhance Access to Open Repositories? Submitted to: OR2012: the 7th International Conference on Open Repositories, 9-13 July 2012, Edinburgh, Scotland.
and naturally, it is in an open access repository and linked to from this post.
The article title mentions LinkedIn and Academia.edu, and this blog post title mentions Mendeley and ResearchGate, but the concept that the article explores and that this blog post is about, is that these kind of external, profile hosting sites could be useful to researchers in raising the profile of their work, especially when used in conjunction with repositories.
I have blogged in the past about these kinds of profile hosting sites and listed a few other such sites in a piece about Academia.edu, and I have written on this blog about the number of Warwick researchers I could find on such profile sites.
One point explored in the paper is that the profile sites offer a way for authors to create inbound links to their papers in a repository, and such links might help to optimise those papers' search engine rankings, since the number of links to a page or site are a factor in search engine rankings.
I don't quite understand how search engine rankings work (that's their business, and it's getting ever more complex... SEOmoz have a useful article), but inbound links have long been a factor, one way or another. And as a former repository manager and a long-time information professional, I'm very, very aware of the important and sizeable role that Google has to play in bringing visitors to papers in a repository. Some of my early blog posts on the WRAP blog attest to that.
So profile sites are useful to researchers in offering a quick and easy way to generate inbound links to your repository papers: it's a simple concept, but as the example of Brian's work that is given in our paper demonstrates, there are probably a lot of other factors as well that might raise the profile of a researcher's papers.
Maintaining profile details on these sites
Naturally, Brian Kelly and I have profiles on these sites, and our paper is appearing on our publication lists there... thanks Brian, for uploading it and making it easy for me! I confess that I have left partial profiles on most of these sites: it takes a lot of time to create and update profiles properly. Brian is really good at doing this, but I'm not a great example to other authors of how to use these sites.
The two sites I have been looking at most recently are Mendeley and ResearchGate:
I like ResearchGate for making it easy for me to "claim" articles that it has found, as ones that I am an author of. In particular, I like that it harvests records from my institutional repository, so if I kept that up to date with all my papers, then it would be relatively little effort to also keep my profile on ResearchGate up to date. Bravo, ResearchGate! (I have blogged about ResearchGate recently, in greater detail).
However, the thing that I find most irritating about ResearchGate, when it comes to using it in tandem with an open access repository, is that it invites me to upload the full text of my paper in a huge box on the top right hand side, and it displays my paper to others with a "Request Full-text" button. Meanwhile, the link to the repository where the full text is available is almost invisible and not recognisable as a potential full text source. It simply says "Source:OAI", and the "OAI" part is a link to the WRAP repository record from where the full text can be retrieved.
This gives me considerable sympathy with the authors whose papers I requested copies of when I was a repository manager: when your article is already available on open access to all, it is irritating to be asked to put it in yet another place!
Mendeley has similar features and issues: I can import records from all sorts of sources using its "web importer", including Google Scholar, which does index a lot of repository content... but it's not as simple to use as ResearchGate when it comes to updating my profile with my own papers from the institutional repository. When I carry out a search on Mendeley itself, I find a sophisticated advanced search form, which I like, although I don't like that I can't edit my search string in the search box after running the search. I tried to do that after my first advanced search and got no results, but when I went back to the advanced search form and put my revised criteria into the form, I got results. I think that's clunky, and there is work to be done on it as a publications discovery tool.
On Mendeley, I am able to refine the results of my search further by selecting a tick box on the right hand side "Open access articles only". I tried this and was disappointed. It finds papers that I have written, but it doesn't know that the ones in WRAP are available on open access.
How do I tell Mendeley that the paper is already available on OA? Why doesn't it already know?
Both Mendeley and ResearchGate have got it wrong
Or at least, from an open access point of view, they have got it wrong. It ought not to be up to the author to upload their content into several places online. And they should be making it easy for people searching within their environments to get through to the existing open access versions of papers: after all, it's hardly in the spirit of OA to make it difficult for people to access the open access version!
Repository managers' perspectives
One of the points that Brian and I made in our poster for OR2012 was to ask 'why don't repository managers recommend use of external researcher profile sites?' Well, it would help if the profile sites worked nicely with repositories, I think.
And of course another answer to our question is that repository managers have enough of a struggle getting papers for the repository itself, never mind encouraging authors to put their papers elsewhere as well.
Beyond that, it is likely that others at the University are advising on the use of social media, so it might be something that repository managers don't see as their role.
Recently, I posted to a repository managers e-mail list to double check if any of them were recommending such sites:
One replied to say that she had noticed some researchers from her institution who were putting their documents onto sites like these, in full text, but not into the institutional repository. So perhaps repositories should be harvesting from the likes of Mendeley and ResearchGate, too.
At the University of Glasgow, they are sometimes using the "Related URL" field to link to a version of the article on Mendeley (see this example record), which is a step towards integrating these two approaches.
Social Media more generally
One repository manager responded that she did encourage authors to use social media "like LinkedIn, Twitter and a blog". And I was sent a very useful link to a blog post by Melissa Terras at UCL, entitled "Is blogging and tweeting about research papers worth it?" (Short Answer: yes, if you want to attract visitors!)
I think that the use of "social media" is a much bigger topic than the use of profile sites as such. I know that most of the places where researchers can put their profile information are also social media tools in some sense. But this blog post is not intended to cover the social aspects of these tools: that is perhaps for a future blog post!
One more relevant aspect is that publisher websites do often encourage authors to use such profile sites and social media in general, to raise the profiles of their papers. I have blogged about publishers' instructions for authors already.
And finally, I must say that Brian Kelly is an excellent example of an author who uses profile sites and social media. He has uploaded details of his papers onto these sites, but he has also deposited OA copies into his institutional repository and blogged and tweeted about his papers before the conference itself, to raise interest in them. I'm not at all surprised that Brian is the author of the 15 most downloaded papers in the Bath repository, from his department!
July 03, 2012
I once blogged about Altmetrics and the tool Total Impact, which seemed to use the Mendeley API for tracking papers’ popularity.
I had another look at Total Impact lately and it has been worked on: I can’t give it my Mendeley profile any more, and in fact it didn’t do anything at all for me, but it is in beta and so I sent them some feedback explaining that I got nowhere with the tool, and we shall see.
So, I went directly to Mendeley, and you can see how many “readers” there are for a paper in the results of a search there, but that information is not displayed with the paper’s information once you have added it to your own library or to your list of “my publications” for display on your profile. I was disappointed that apparently only one of my papers is “open access” according to Mendeley’s search filter, even though they are in WRAP and so they are open access... I'm not sure what Mendeley's criteria are for a paper being "open access" according to its search filter.
From what I can tell in the FAQs on the Mendeley site, number of “readers” in Mendeley is the number of distinct users who have added the paper to their Library on Mendeley. It doesn’t actually mean that they’ve read the paper: I added a handful of papers that look interesting to my own Library that I have never read. It’s more of a wish list!
And then I played around with Google some more, to see if there were other tools that were accessing Mendeley’s “reader” numbers API, and I came across Readermeter which looks really interesting because you can give it the author’s name and get all sorts of stats back in a pretty format!
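Tools like Total Impact and Readermeter presumably work by combining per-paper "reader" counts fetched from such an API into author-level statistics. A minimal sketch of that aggregation step, over an entirely hypothetical JSON payload (this is NOT the real Mendeley API response format, just an illustration of the idea):

```python
import json

# Hypothetical payload: a list of an author's papers with "reader"
# counts, loosely imitating what an altmetrics tool might fetch from
# an API. This is NOT the real Mendeley response format.
SAMPLE_RESPONSE = json.dumps([
    {"title": "Paper A", "readers": 12},
    {"title": "Paper B", "readers": 3},
    {"title": "Paper C", "readers": 0},
])

def reader_stats(response_text):
    """Return (total readers, per-paper readers) for an author's papers."""
    papers = json.loads(response_text)
    per_paper = {p["title"]: p["readers"] for p in papers}
    return sum(per_paper.values()), per_paper

total, per_paper = reader_stats(SAMPLE_RESPONSE)
print(total)                 # 15
print(per_paper["Paper A"])  # 12
```

Remembering the caveat above: a "reader" here only means someone added the paper to their Mendeley library, so these totals are closer to a wish-list count than a measure of actual reading.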
May 02, 2012
Guest post by Sam Johnson: Funded by the Wellcome Trust? Get funding for open access publishing fees
Today’s post is from Sam Johnson, Academic Support Librarian for Life Sciences, Medicine & Psychology, and the University of Warwick’s main contact for information about funding to cover the publication fees of Wellcome Trust-funded research.
Are you working on or have you recently completed research funded by the Wellcome Trust?
You will be aware that you are obliged to publish your research in an open access (OA) publication, so that anyone can access and read your findings in full text. OA costs can be quite considerable and, in recognition of this, the Wellcome Trust has given the University a sum of money to support the open access publication of its research.
Why Open Access Publishing?
The aim of open access publishing is to disseminate research as widely as possible and to make the full text of current research easily available to anyone in the world. Open access publishing also helps to raise the impact of your research and your own research profile by making it more visible and easily accessible. Whilst OA is great news in terms of generating visibility for your research, the publication costs are the responsibility of the author(s) and they can be quite significant.
The Wellcome Trust, along with most research funders, has an OA policy that mandates that its funded research is disseminated as widely as possible to maximise the impact and value of the findings. For more information see their Author’s FAQs.
Apply for OA Wellcome funding
Please email Samantha.A.Johnson@warwick.ac.uk for an application form. This will then be forwarded to Research Support Services for processing. Applications will need to come in soon, as the funding needs to be spent by the end of September.
April 03, 2012
On 29th March 2012 we invited publisher Emerald to present their ‘Guide to Getting Published’ at the Research Exchange. Many thanks to Sharon Parkinson for her very informative presentation; I wanted to share some of the best tips and advice to come out of the session…
Advice on getting published in journals:
1) Pick the right journal: This might seem obvious, but it was interesting to hear that the majority of rejections made by journal publishers were still due to the article being submitted to an inappropriate journal. You will need to:
- Consider who your research audience is, what they want to know, and what they are reading.
- Read at least one issue of a journal before you choose to submit work for it.
- Make sure you consider usage rates as well as journal rankings (which you are more interested in will depend on your motivations for publishing and what you hope to achieve with your work). Emerald suggested most editors would be happy to provide you with usage/download rates for a journal.
2) Send the editor an abstract: This is a great way to avoid the problem in point 1. If you have done your research but are still unsure whether your paper is right for the journal, send an abstract to the editor asking for their opinion on its suitability. Check the author’s guidelines for the publisher you’re contacting to make sure your abstract fits their specifications. (Emerald’s can be found here: http://www.emeraldinsight.com/authors/guides/write/abstracts.htm)
3) Treat it like a job application: I’m not a fan of analogies, but this one seemed too apt to ignore. Much like you would tailor your CV to each position, Emerald emphasised the importance of tailoring your submission carefully to suit the journal/publisher you are approaching. You can also include a cover letter which, like a job application, should focus very clearly on what your paper has to offer to the journal and its readership, rather than on the benefits for yourself.
4) Get your own peer review: Don’t underestimate the value of getting an objective view; someone who isn’t close to your work will find it much easier to critically appraise it. From a personal perspective, I’ve always thought it useful to have someone outside of your field read your work; they tend to be able to spot jumps in your logic very easily.
5) Don’t give up: Getting a paper rejected is very common and shouldn’t deter you. Get feedback from the editor, work on their points and resubmit elsewhere. Also, requests for revisions can be seen as a very positive step – if a publisher has taken the time to do this, then they have obviously seen potential in your work, so don’t give up at this stage.
Advice on getting books published:
1) Make it travel: Obviously the key difference from publishing in journals is that a book must have considerable commercial appeal. Therefore, it needs to be of interest to and accessible by a wide audience: know your market and make sure your work has reach.
2) Attend a publishers’ conference: Emerald were clear that if you want your book commissioned, conferences are the place to be. You can contact a publisher in advance to book an appointment with a commissioning editor at the conference. Arrive prepared – you should complete a detailed proposal form and be ready to answer the publisher’s queries.
3) Keep track of time: You need to be aware of the time constraints that apply to book publishing. Since the publishers will need to promote the book and publicise its release date, you can’t afford to fall behind. Make sure you discuss targets and timescales carefully with the editor and any other involved authors at an early stage.
February 22, 2012
If you know of any sources, please do share them! I have only been able to collate the following:
1) Cabell’s directory that you have to pay for: http://www.cabells.com/index.aspx
2) The American Psychological Association produce a nice table of their journals’ stats: http://www.apa.org/pubs/journals/features/2010-statistics.pdf
3) MLA Directory of Periodicals: look out for the “Updated” date in the journal’s record, though: if there isn’t a date then the information might not be that current. They try to update records every two years, apparently. And they get their submitted and published numbers from editors of journals.
Obviously, there are journal home pages to explore, too, but it takes a long time for authors to navigate through all of those. There are also some scholarly articles to be found on this topic, for some disciplines: I have come across one or two in the biomedical sciences in the past.
Of course, such information may not be so very useful to an author, whatever the source: acceptance rates are likely to be adjusted in the way they are reported, to make high quality journals look accessible enough to be worth authors' while submitting, and to give a prestigious enough impression of journals which might in fact have high acceptance rates!
And then there is the turnaround time: how quickly the article is accepted or rejected might be more important than the actual chances of acceptance.
I found this page which covers much of the same topic: http://guides.lib.umich.edu/content.php?pid=98218&sid=814212
February 21, 2012
Writing about web page http://altmetrics.org/manifesto/
Research performance measurement often includes an element of output (or publication) counting and assessment, possibly including citation counts, and I've written a lot here about such bibliometrics and assessment.
The digital, web 2.0 world allows for many other, different kinds of metrics to be recorded and reported on, and these could one day become a part of researchers' performance assessment, whether informally for the researchers themselves, through more formal processes at institutional level, or through an exercise like the Research Excellence Framework (REF).
I've linked to the altmetrics manifesto, and that has some very interesting contributions to the exploration of other kinds of metrics and measurements.
Note that PLoS One are running a special “collection” on altmetrics with a submission deadline just passed in January. And that if you’re an author with an article published by PLoS One, then the number of views for your article is displayed along with the metadata for your article. Warwick’s repository, WRAP, also shows download stats for articles these days, in the metadata records… eg: http://wrap.warwick.ac.uk/933/
The problem with web stats and altmetrics is that there are potentially a lot of sources which will all measure the stats for different versions of the same item, or different elements of the same output, in different ways. This sort of thing is a driver for publication in an open access (OA) journal with one canonical copy of an article in just one place online: the so called "gold" route to OA.
Authors of the future will want all web visitors to go to the publisher’s site, in order to boost the number of views recorded there. Well, some already do! But that rather assumes that the publisher will provide all the functionality for commenting, reviewing and interaction with the research that authors might like to see; that the publisher will provide suitable measures to the author; and that the formal publication route is the only route needed for publicising your work and making it discoverable...
The other route to OA is known as the "green" route, and it involves putting an earlier version into an OA repository (or more than one!) in addition to the canonical published version. All such versions should be clearly described and should point to the canonical one, ideally. This would allow for your work to be made available and promoted by all those repositories where you have deposited a copy or allowed a copy to be harvested, eg your institution and a subject specific repository.
The green route follows the "lots of copies keep stuff safe" mentality and contributes to ensuring the longevity of your research's availability and discoverability. And it could also enable new research techniques such as text mining to be employed on your outputs and thus build on your contribution to the discipline, if you've given suitable permissions at the deposit stage.
So, when it comes to altmetrics what we ideally need is some way of recording visitor stats and other metrics for all versions of one article, and collating these into one report.
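A minimal sketch of that kind of collation, with entirely made-up figures for one article held in three places (the source names and metric labels are my own illustration, not any real tool's output):

```python
# Each "source" reports stats for one version of the same article;
# merge them into a single report keyed by metric name.
from collections import Counter

def collate(reports: list) -> Counter:
    """Sum each metric across all versions/sources of one article."""
    total = Counter()
    for report in reports:
        total.update(report)  # Counter.update adds values for shared keys
    return total

# Hypothetical figures for the publisher's copy, the WRAP copy and Mendeley
publisher = {"views": 120, "downloads": 40}
wrap = {"views": 310, "downloads": 95}
mendeley = {"readers": 2}

print(dict(collate([publisher, wrap, mendeley])))
# {'views': 430, 'downloads': 135, 'readers': 2}
```

The hard part in reality is not the summing but the matching: recognising that a publisher page, a repository record and a Mendeley entry are all versions of the same output, which is exactly where identifiers like DOIs come in.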
The altmetrics site I've linked to has a page of tools which I had a play with recently: http://altmetrics.org/tools/ Here is the story of my "playing"!
I gave Total Impact my (rather scrappy) Mendeley profile. I have 3 articles to my name on Mendeley, and Total Impact picked up on 2 papers: in the event, only one of those was actually mine (something wrong in the metadata, I think), and that has had only 2 readers on Mendeley. Which is entirely believable, but not likely to be the “total impact” of my article!
Actually, I know it’s not the "total impact" because the same article is in WRAP and I can see additional visitors to the paper there, without even considering accesses on the journal's own site, but I guess that Total Impact doesn’t know about the other versions of that object.
I tried giving Total Impact a DOI instead… None of my articles have DOIs (I'm not an academic author: practitioner stuff only!), so I gave it the DOI for a different article (the record linked to above), and you can see the report: http://total-impact.org/collection/UMpoWa
Not much more impressive than my article, yet the WRAP stats are more impressive! So it could be that the problem is the size of the Mendeley community, and the fact that Total Impact is not picking up on visitors from elsewhere for articles.
I thought I’d give Total Impact another shot with my Slideshare profile. I’ve not been especially active in Slideshare either, but I have seen healthy stats for my handful of presentations last year. And Slideshare has a relatively large community of users. I like the Total Impact report structure for the Slideshare report: http://total-impact.org/collection/McWgLs It gives info on tweets, facebook likes and other sources of data about the Slideshare items. That’s what I thought altmetrics ought to be!
Some of the other sites that Total Impact can work with are probably worth investigating, too: I don’t know about GitHub or Dryad. I looked GitHub up: https://github.com/ and it seems that’s what I need to try next, to visit there to collate all versions of my articles!
There are other tools on the Altmetrics site that I wish I had time to try out, too!
This week, discussion on UKCoRR's mailing list raised the following altmetrics tool to my attention: http://altmetric.com/bookmarklet.php I installed it on Chrome but couldn't get it to work with the articles I tried on Web of Science and on Cambridge Journals Online. The UKCoRR community are reporting that it doesn't pick up on the DOIs from their repositories either, so I guess it's just another thing that is in development.
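Behind that bookmarklet sits Altmetric's public API, which takes a DOI and returns a JSON record of attention data; a 404 response simply means no attention has been recorded for that DOI. A rough sketch, where the response field names are my assumptions rather than a documented guarantee:

```python
import json
import urllib.request
from urllib.error import HTTPError

API = "https://api.altmetric.com/v1/doi/"  # public DOI lookup endpoint

def summarise(record: dict) -> str:
    """Condense an Altmetric record into one line; field names are assumed."""
    score = record.get("score", 0)
    tweets = record.get("cited_by_tweeters_count", 0)
    return f"score={score}, tweets={tweets}"

def lookup(doi: str) -> str:
    """Fetch attention data for a DOI; unknown DOIs come back as a 404."""
    try:
        with urllib.request.urlopen(API + doi) as resp:
            return summarise(json.load(resp))
    except HTTPError:
        return "no attention data recorded"

# lookup("10.1234/example") would query the live service; the parsing alone:
print(summarise({"score": 4, "cited_by_tweeters_count": 3}))  # score=4, tweets=3
```

If a repository's pages don't expose the DOI in a way the bookmarklet can find, this lookup never happens, which would be consistent with what the UKCoRR community is reporting.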
February 10, 2012
My thoughts on a new breed of online journals...
SAGE Open and Springer Plus look to me like journals following the PLoS One model (both were launched relatively recently). PLoS One was launched in 2006, six years ago, and it has an impact factor of around 4. Not bad, but is the bulk publishing model something that has worked for the science and medicine community yet might not work in other disciplines?
PLoS One are said to have a 70% acceptance rate, and Springer Plus are currently tweeting about how they publish on any topic, but they are also publishing in the sciences. SAGE Open are for the social sciences and humanities in general, and all three of these journals charge authors fees of around $1000.
PLoS claim to discount or waive the fee if the researcher cannot pay, and Springer Plus also offer discounts for those from low income countries. SAGE Open are the cheapest of the three, at $700, and are offering an introductory author discount rate at the moment of $395… but who is going to pay, with what?!
We don't have a central authors' fund at Warwick. I'm not sure what the latest news is from institutions that have tried them, but I did hear a while ago that they were under-used by authors.
BioMed Central also publish journals on open access with fees set at journal title level: most in the region of $2000, but the library’s subscription entitles authors to a 50% discount. This is an area for me to investigate in future, because we do get stats from BMC about authors' take up of that discount... and that might be an indication of authors' willingness to use a central fund in some way. That and the handful of enquiries that reach me each year.
But are the biomedical sciences different? The Wellcome Trust not only mandate that research outputs should be made available on OA, but that the publisher should be paid to do this. They pay for it: we have a Wellcome Trust fund at Warwick, for authors whose work is funded by them. Another possible indicator of authors' interest in central funds to investigate... So one major funder could be forcing the publishing environment, and publishers like BioMed Central (incidentally, part of the Springer stable) and PLoS are able to charge fees and get them paid.
So, I also need to watch what funders in the other disciplines are mandating.
What is different about PLoS One, SAGE Open and Springer Plus, though, is not that they are open access journals which charge authors a fee; it is their lack of subject specificity which interests me… bulk publishing is a model that would not work if you’re charging traditional subscription fees, but it potentially works in the digital, OA environment.
Lots more space to watch!
January 17, 2012
Writing about web page http://www.sconul.ac.uk/news/OAbriefing/OA_impact_briefing.pdf
Alma Swan's latest briefing paper for Research Libraries UK and SCONUL is available online. It has some great little graphs showing the citation advantage of open access publication for those in Engineering, Clinical Medicine and the Social Sciences, along with a case study of the effect on citations of depositing an author's works in an institutional, open access repository.
The paper also explains the value of an open access repository in supporting the impact of research work: making scientific findings and resources available to the public, and helping to engage lay people in "citizen science" projects like Galaxy Zoo.
The briefing also discusses the value of OA to a knowledge-based economy, and it is a great, brief overview of all these topics.