All 4 entries tagged Interdisciplinary
January 27, 2011
Writing about web page http://www.warwick.ac.uk/go/researchexchange/wor/
Working on food for the future: multidisciplinary experience from Warwick & beyond!
The first Window on Research session of 2011 was held in the Wolfson Research Exchange today. It had a particularly cross-disciplinary theme, and our two presenters, Professor Elizabeth Dowler and Professor Rosemary Collier, were from the Department of Sociology and the School of Life Sciences respectively. They have been establishing a Food Security group at the University of Warwick and are looking for researchers and contacts from other disciplines who might have an angle and expertise to contribute to their research.
Professor Dowler began with a definition of Food Security and spoke about various government reports and research. Professor Collier presented a particularly complex map of all the agents involved in food, from its production to its consumption, which I thought gave a very good clue as to how different academic disciplines might interact with the theme of food security. Themes which came up in discussion at the end included the concept of a country's self-sufficiency and what that actually means, cultural issues of food consumption and their effects on food security, and the applicability of research concerns and expertise to the way the University campus and its outlets are run.
Those who attended were from a wide range of disciplines, so the multi-disciplinary theme of this event came through very strongly. It was a great way to get a view on some of the research that is happening at the University of Warwick.
November 04, 2010
The slides from the session I attended last week are now available online at: http://wok.mimas.ac.uk/support/documentation/#presentations
November 02, 2010
I attended a training course held at Oxford University last Friday: it was a session delivered through Mimas, and provided by Thomson Reuters (TR) who publish citation data.
The session began with reference to two University rankings which I have blogged about in the past: the ARWU from ShanghaiRanking and the Times Higher Education (THE) World University Ranking, both of which use Thomson Reuters' citation data. There are other University rankings, of course: QS used to provide THE's ranking and now has its own World University Ranking, and there is the Webometrics Ranking Web of World Universities, which does look at citation data but uses data from TR's competitor Elsevier, available in its Scopus product. And there are other rankings too, which are not at all interested in citations... but the two mentioned on Friday were ARWU and THE.
ARWU's approach is interesting: they are interested in whether any researchers have published in two particular high-profile, cross-disciplinary journals: "Nature" and "Science". Our trainer also mentioned that ARWU seem to use other citation data, possibly from TR's Essential Science Indicators product. THE's ranking methodology shows that about a third of their ranking score comes from TR's citations data. For more reading on University rankings, "International ranking systems for universities and institutions: a critical appraisal" http://dx.doi.org/10.1186/1741-7015-5-30 looks particularly interesting, and it cites some other important-looking articles on the topic, although all are too old to shed light on THE's latest methodology; the way in which THE have normalised for discipline seems to me to be particularly significant.
Our trainer did suggest that we can create our own normalisation by searching for articles in a given journal or subject set, in a particular year, and then creating a report in Web of Science, which would tell us the expected citation rate for that set of articles. (This is the small "Create citation report" link towards the top right-hand side of the screen, which you can use for any set of results in Web of Science.) I think it's unlikely that THE did this: they would probably have bought the raw data to manipulate, or at least have purchased it through InCites, where you can get reports on expected citation rates.
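The normalisation idea can be sketched in a few lines: take a baseline set of articles (one journal or subject, one year), compute its mean citation rate, and compare an individual paper against that expectation. The citation counts below are entirely invented for illustration.

```python
def expected_citation_rate(citation_counts):
    """Mean citations per paper for a baseline set of articles
    (e.g. everything one journal published in one year)."""
    return sum(citation_counts) / len(citation_counts)

# Hypothetical citation counts for a baseline set of eight articles:
baseline = [0, 2, 3, 5, 1, 14, 0, 7]
expected = expected_citation_rate(baseline)  # 4.0

# A paper from that set with 8 citations is cited twice as often as expected:
relative_impact = 8 / expected  # 2.0
```

This is only the crudest form of the calculation, of course; the real products also account for document type and other factors.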
Criticising the measurements
When using these kinds of citation metrics, or indeed any bibliometrics, you need to bear in mind the source of your data, and our presenter did show us some slides indicating that 40% of the journals in Web of Science carry the vast majority of all citations. TR do add new journal titles to their collection (and drop some), and they evaluate about 2,500 new titles each year for suitability. They have records for all citations from the journals they index, including citations to journals which they do not index. This means they have data to show when a journal they have not indexed is in fact attracting lots of citations and therefore ought to be covered...
But we're still only talking about journals and conference proceedings, in the main. TR have mentioned a couple of times recently that they are planning some kind of citations index for books to be launched next year, but they are playing their cards very close to their chests about their source of data for any such index!
We spoke about self-citations and whether these ought to be included in citation measuring sets. I would recommend self-citing from a "bibliometrics optimisation" perspective, although of course there are reasons other than citation measurements to self-cite or not.
A colleague from Warwick who was also at the session, Professor Robert Lindley, who heads our Institute for Employment Research, also suggested that TR stop referring to the number of articles an author (or unit) has published as a measure of "Productivity". It is a volume of output, perhaps, but even then only of particular outputs, so it would be best to label it as just what it is: the number of journal items published. TR also suggest an "Efficiency" rating, which is the percentage of papers with citations as opposed to those without any, and an "Impact" rating of the average number of citations per paper (as used for Journal Impact Factors). Pitfall to avoid: this citation impact is not at all the same as impact in the context of the REF. REF impact is about effects outside the scholarly community, whilst TR's measurement of citations is an activity that clearly happens within the scholarly community.
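The three measures described above are simple arithmetic over an author's (or unit's) citation counts. A minimal sketch, using invented data, one entry per published paper:

```python
# Hypothetical citation counts for one author's six journal items:
citations = [0, 3, 0, 7, 1, 4]

# "Productivity" (better labelled: number of journal items published)
output_count = len(citations)  # 6

# "Efficiency": percentage of papers that have been cited at all
efficiency = sum(1 for c in citations if c > 0) / output_count * 100  # ~66.7%

# "Impact": average number of citations per paper
impact = sum(citations) / output_count  # 2.5
```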
Journal Impact Factors
The calculation of the Journal Impact Factor was explained, and the purpose of the 5-year Journal Impact Factor as well, which was useful for me to pick up on: I had wondered why there were two measures! The original one is measured over two years, and a graph showing the average time, by discipline, for citations to an article to appear clearly showed that for some disciplines, the peak number of citations comes more than two years after publication. In other words, measuring a journal's impact over a two-year period was advantaging journals from the disciplines which are quickest to cite. These are primarily science, technology and medicine journals, so the 5-year Journal Impact Factor could be really useful for those involved in cross-disciplinary and interdisciplinary research who are looking to target journals for their articles which will get them the best reach within the scholarly community as a whole. The 5-year JIF is a better measure if you are trying to compare journals from different disciplines... although of course, it can never take into account the relative value of a citation from each discipline, nor indeed the fact that citations from some disciplines will happen in books or other kinds of publication which WoS doesn't (currently) index...
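The two calculations differ only in the size of the citation window: citations received this year to items the journal published in the previous two (or five) years, divided by the number of citable items it published in those years. A sketch with made-up figures, which also shows why a slow-citing discipline can fare better over five years:

```python
# cites[y] = citations received in 2010 to items the journal published in year y
# items[y] = citable items the journal published in year y
# (all figures invented for illustration)
cites = {2005: 30, 2006: 40, 2007: 60, 2008: 90, 2009: 80}
items = {2005: 50, 2006: 50, 2007: 60, 2008: 70, 2009: 70}

def impact_factor(cites, items, year, window):
    """Citations in `year` to the previous `window` years' items,
    per citable item published in those years."""
    years = range(year - window, year)  # e.g. 2008-2009 for window=2
    return sum(cites[y] for y in years) / sum(items[y] for y in years)

jif_2yr = impact_factor(cites, items, 2010, 2)  # 170/140, about 1.21
jif_5yr = impact_factor(cites, items, 2010, 5)  # 300/300 = 1.0
```

With these particular numbers the 2-year figure happens to be the higher one; for a journal whose citations peak later, the 5-year window would be the flattering measure.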
There was more about the H-index and some useful slides that I hope to get a link to in the near future.
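The H-index mentioned above is simple enough to sketch in a few lines: an author has index h if h of their papers have at least h citations each. The citation counts below are invented.

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

h_index([10, 8, 5, 4, 3])  # 4: four papers each have at least 4 citations
h_index([25, 8, 5, 3, 3])  # 3: one very highly cited paper doesn't raise it
```

The second example illustrates a known property of the measure: a single runaway hit contributes no more to h than any other sufficiently cited paper.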
July 21, 2010
Writing about web page http://www.researchersoftomorrow.net
This report, funded by the JISC, doesn't seem to have identified any particularly unique or different behaviours amongst the Generation Y researchers as compared with older generations of researchers: that, at least, is the overall, generalising interpretation of the findings!
However, the detail from page 32 onwards gets very interesting. Figure 6 has a very interesting breakdown, by discipline, of the sources of information that the Generation Y researchers (all doctoral students) reported using in their "last critical incident of information-seeking activity". It's easy to see that Google/Google Scholar is the most used source overall, but look closely and you'll see how important the citation database appears to be for the Physical sciences and the Medical, dental and other health sciences. For these disciplines, the citation database is reportedly used more than Google/Google Scholar. The Biological sciences seem almost equal in their use of both, and the Biomedical and veterinary sciences show only a slight preference for the citation database. Meanwhile, the Arts and humanities and the Social sciences much prefer Google/Google Scholar. Not that it's an either/or situation, of course. I'm sure that researchers from all disciplines would use all sources, but it is interesting to see this disciplinary difference in what researchers report was their main source of information for their most recent "critical" information need.
Other information sources reportedly used include the e-journal's search interface, abstracts/indexes (print or online), the researcher's own institution's library catalogue, a subject-specific information gateway, a cross-institutional library catalogue, publishers' websites and the website of an organisation or person. The last one on this list was very little used at all, and it backs up the evidence I found as WRAP manager about the importance of Google in bringing visitors to institutional repositories.
I do wonder at the breakdown of the resources, and how the students were asked to describe the resources they used, because I find it particularly difficult to differentiate between a citation database, a bibliographic database, an e-journal's search interface and an online index as types of information resource. WoS is definitely a kind of citation database, but it is also a bibliographic database, and in fact many searches performed on it are simple bibliographic searches, not searches for citation data at all. Likewise, it's sometimes difficult to tell whether you're on a publisher's website or a journal's home page as provided by a hosting service. And all librarians can tell of the enquiries they've had about using the "Athens" database - this legendary database has entered the library world's mythology!
I'm not sure that our researchers would be able to report accurately on the kind of information resource they were using. I'm also not sure what the researchers were asked: perhaps someone else categorised the resources they reported using. Even so, I'm an information professional and I'd find it difficult to report that consistently and accurately! Perhaps information skills trainers in the Medical, dental and other health sciences consistently refer to "citation databases", and that's why they appear so important?
I do wish that they'd specified whether it was Google or Google Scholar that was of use to the researchers. I spent ages preferring Google Scholar to Google, but I now believe that, even for the academic scholar, each one has a different purpose. Google Scholar is great if you're looking for published content on a particular (not very specific) topic. Google is much better if you are trying to find a paper online when you already know the details of that paper: Google Scholar seems a little partial towards publishers' subscription resources, you see, and sometimes can't find open access versions, whilst Google has no such elitist restrictions! The various kinds of bibliographic database and online journal hosting services support much more refined and specific enquiries, so they would be my preferred source for academic information on a very specific topic, except in fields where so little has been published that a simple Google Scholar search would turn everything up anyway.
I have lots of questions but it's certainly clear that there are disciplinary differences and that Google/Google Scholar is dominant, even whilst the other sources have a place for our Generation Y research students.
Looking at Figure 7 on the next page raises more thoughts: it describes the disciplinary differences in the kinds of information sought by our researchers at the time they were using those resources (i.e. in their "last critical incident of information-seeking activity"). This graph differentiates whether the researchers' need was for anything published on a topic or for a particular bibliographic reference, or other kinds of information including data. Mapping the responses to these two questions would indicate whether the researcher had found the information they wanted in the resource that I would have recommended to them!
Figure 8 on page 35 goes on to describe the type of information resource used by the researchers, again broken down by discipline. Most seem to have been looking for a full text journal article, but others were looking for a reference to a journal article or for a printed book. As you might expect, it is the Arts and humanities who were particularly looking for printed books.
There's a lot more of interest to be found out about disciplinary differences in the results described in subsequent pages, including preferences over publishing in open access journals, use of libraries and library staff and so on.
The report also notes that: "Low levels of use of specialist and web 2.0 technologies are confirmed in the Gen Y survey sample and there are virtually no differences at all in responses when compared to the wider survey sample." (p45) Figure 15 lists the specific tools asked about and the responses indicating whether researchers have used them, including social bookmarking and alerting services/RSS. Librarians have a lot to do in terms of advocating the usefulness of these tools to researchers. Either that, or we're focussing on the wrong things, and researchers are not using them because they are not useful or relevant!
Researchers also indicate a wish for training to be provided "on demand" (p48), and on p49 the report becomes very interesting again as researchers describe the library training they have received and whether or not they found it useful. Training themes that over 20% of the researchers received and identified as useful are listed below; those with a score higher than 40% are in bold. (Although some also reported the same themes as not useful, in every instance listed below that figure sits well below the proportion who found the training useful.)
- managing references and using tools (e.g. EndNote)
- copyright/intellectual property rights and research
- finding/using data and datasets online
- finding/using manuscript and archival sources
- finding resources beyond own institution (e.g. interlibrary loans)
- specific information skills (e.g. finding 'grey literature')
- finding/using bibliographic, abstract and journal research resources (both print and …)
- keeping up to date in research (e.g. use of alerting services, RSS feeds)
- using own institution's portal to access electronic resources
- generic computer skills (e.g. Word, Excel, Access)
Training themes which were not so popular (but still rated as useful by more people than rated them as not useful) were:
- creating digital media, podcasts, wikis, Second Life (NB hardly anyone had attended such a course!)
- e-research methods and tools (data mining, …)
- e-research infrastructure services (virtual research environments, campus grid, National Grid Service)
- generic online skills (e.g. using Google services, using Web 2.0 tools to support research)
- information on the Research Excellence Framework and how to publish
- open access publishing/archiving
These lists will be very useful to me indeed, in directing our library's support for researchers.
Figure 18 on page 50 reports on how valuable researchers have found various research support options available to them. Quite a lot of library offerings are reported as "Never used" and amongst these, most are also quite highly reported as "Unaware of availability". The highest value and highest awareness is the supervisor's support, as you might expect. The library services scored by most as "valuable" ("Value was assigned by the respondents whether or not they regularly made use of the support") were inter-library lending services and assistance specifically from library staff, which came joint second.
The report also contains interesting results about where the research students carry out their research work, whether at home or in an office or lab or library: very few of any age or discipline seem to be working in libraries.
Of course, this report on the survey responses and research done so far is a great indicator of researchers' behaviour & preferences, but it is only an indicator. Annex 1 describes the sample of survey respondents. Although there were 5408 survey respondents in total, and nearly 40% of these were from Russell Group Universities, some of the results I've been describing by discipline breakdown report only on the Generation Y respondents, and there were 2061 of these - how many of those were Russell Group I'm not sure. The annex also describes that only 2% of respondents' research is described as interdisciplinary, and it gives a breakdown of the funding sources of respondents, which careers advisors might find interesting!
All in all, it's a pretty rich source of information about researchers today.