Altmetrics: maybe not so “alt” any more?

The recent release of a National Information Standards Organization (NISO) white paper on altmetrics marks a turning point in the development of these research impact measurement tools. The fact that consideration is being given to developing standards for altmetrics means that they are moving closer to being an accepted part of the academic and publishing landscape. The press release announcing the white paper is available here, and the paper itself is here.

Some of the recommendations in the white paper include developing a definition of alternative metrics, identifying the types of research outputs that are most suitable for metrics to be applied to, and identifying the role of alternative metrics in research evaluation. Although I can see the merit in developing standards around the use of altmetrics, I'm a little concerned that the standards may become too prescriptive and limit the usefulness of altmetrics. Part of the appeal of altmetrics is that they can describe a wide range of research outputs, so introducing definitions of what is and isn't an altmetric may limit their growth.

The white paper also notes that awareness of altmetrics is still low amongst researchers, and I think this is something that librarians can help to address. If we are approached by a researcher who has questions about measuring research impact, we should mention altmetrics tools such as ImpactStory alongside the traditional measures of impact, such as citation counts. This is probably most relevant for librarians who work with researchers in the social sciences and humanities, who are not well served by traditional metrics but may find that suitable altmetrics are available for their research outputs.

It will be interesting to see the final version of this report, and the standards that come out of it.


Research impact data training for academics

Yesterday I conducted a workshop on research impact data for academic staff who are applying for promotion. As part of the application for promotion, staff need to compile a CV which lists their research outputs along with a measure of their impact. This can take the form of the number of citations to a work in Scopus, Web of Science or Google Scholar, as well as the Impact Factor of the journals in which they have published. I also briefly discussed tools for creating a researcher profile, such as ResearcherID and ORCID, and some of the altmetrics tools, such as the Altmetric bookmarklet and ImpactStory. The workshop only went for an hour, so it was a quick overview of these three types of tools and how to extract the required information from them. We'd also prepared a "Tracking your research" LibGuide for the workshop, so the attendees had something to refer to afterwards.

We had 10 people at the workshop, a 100% turnout of the staff who had RSVP'd. They asked some good questions, which I was able to answer, and they seemed genuinely interested in the information we were providing.

It will be interesting to see how many follow-up consultations the Research Librarians have with academic staff about the promotions process. We’ll be running two more of these workshops later in the year, so hopefully the academics will find them useful.