All academic researchers know they are never far away from a citation count. And while citations undoubtedly do count, there is much debate about what a count of citations actually counts for.
At Loughborough, we’ve been working hard to increase the influence of our research on other researchers and the field more generally, and citation is an important indicator of this influence. Many factors make a difference. The quality – significance, originality and rigour – of the research itself and our ability to craft the message matter, of course. So does the visibility of our research to the right audiences, through choices about with whom and where to publish. Open access is a game-changer in this respect; our Institutional Repository delivers a citation benefit from enhanced visibility, which is why our Open Research Position Statement commits to depositing the full text of all our primary research outputs from 2020.
However, improving citation performance is like turning a supertanker round. For example, the QS 2022 rankings, to which international stakeholders like students and funders pay keen attention, count citations of research published as far back as 2015 (and up to 2019). However, the supertanker is most definitely turning; our hard work has seen Loughborough’s citation count and citations per faculty score almost double since QS 2018.
A number of universities have recently highlighted the Stanford University list of the world’s “top 2% of scientists”. Amongst the many excellent researchers on the list are over 120 Loughborough colleagues, past and present. We congratulate them.
Media coverage has served up multiple cuts through the Stanford data. For example, would you like to know who is in the top 10 of European researchers? Meet Georg, Karl, Peter, Douglas, Charles, Avelino, Guido, Stefan, George and Michael. Spot anything? (To save you the trouble, Avelino is male too). While the ranking metric is carefully crafted, ultimately the list is based on total citation counts and so it’s no surprise to see it perpetuate a view that the world’s “top scientists” are all older white men. We’re in no doubt that these guys all are (or have been) fabulous scientists, but total citation counts (including variations on the theme like h-index) favour certain research fields, long service and output quantity. Consequently, they particularly diminish the profiles of those whose research careers have been interrupted by career breaks and disrupted by balancing a research career with significant caring responsibilities.
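To see why count-based indicators favour long service and output quantity, consider the h-index mentioned above: the largest number h such that a researcher has h papers each cited at least h times. The sketch below (with purely illustrative citation figures, not data from the Stanford list) shows how a long career of modestly cited papers can out-score a shorter career of highly cited work, even when the total citation counts are similar.

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Illustrative careers with near-identical citation totals:
veteran = [12] * 30                   # 30 papers, 360 citations in total
early_career = [90, 80, 70, 60, 50]   # 5 papers, 350 citations in total

print(h_index(veteran))       # 12
print(h_index(early_career))  # 5
```

The longer publication record wins comfortably on h-index despite the near-identical totals, which is exactly the structural bias against interrupted or shorter careers described above.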
This is why, guided by our sector-leading Responsible Use of Research Metrics statement, we make use of field-weighted citation metrics, which accommodate significant differences between disciplines and are normalised to avoid emphasis on quantity. We have consistently emphasised output quality (which can only be judged via expert peer review, not citation) and visibility, guided by outlet metrics such as Source-Normalized Impact per Paper (SNIP) and the SCImago Journal Rank (SJR). Output quality and outlet visibility are much more under our direct control, and they have proved to be the essential ingredients that have nearly doubled our citations in 4 years.
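The intuition behind field weighting can be sketched in a few lines. A field-weighted score divides an output's actual citations by the average citations of comparable outputs (same field, publication year and output type), so a paper is judged against its own discipline's citing behaviour rather than a raw count. The baseline figures below are invented for illustration; the real calculation behind metrics like Scopus's field-weighted indicators is more involved.

```python
def field_weighted_score(citations, field_baseline):
    """Actual citations divided by the average citations of
    comparable outputs (same field, year, output type).
    A score of 1.0 means 'cited exactly as expected for the field'."""
    return citations / field_baseline

# Illustrative baselines: average citations per paper vary widely by discipline
baselines = {"cell biology": 25.0, "pure mathematics": 4.0}

# The same raw count of 20 citations is below par in a high-citing field...
print(field_weighted_score(20, baselines["cell biology"]))     # 0.8
# ...but well above par in a low-citing one
print(field_weighted_score(20, baselines["pure mathematics"])) # 5.0
```

The same raw count yields very different field-weighted scores, which is why raw totals systematically flatter some disciplines and diminish others.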
That’s why our PDR process (outside of the Arts) uses SNIP, SJR and Field-Weighted Outputs in Top Citation Percentiles to promote reflection on citation levels. Field-weighted metrics also support comparison of ‘topic clusters’ at the whole-institution level, adding weight from a research perspective to our choice of strategy themes – net zero and climate change, sport and health, and vibrant and inclusive societies.
In case you are wondering, the Stanford list top 10 for Loughborough is also all-male, while a ranking we produced by Field-Weighted Citation Impact (FWCI, solely for the purpose of this blog) has 4 women in the top 10 (in a University where just over 30% of academics are women). Nobody features in the top 10 of both rankings, or indeed the top 20, at which point we stopped checking.
But this is not a quest for a better ranking metric; of the myriad citation indicators to choose from, each with different meanings and merits, none is a plausible basis to rank individual research performance, nor have we ever felt the urge to do so. At the same time, citations do count. Well-cited work should be celebrated and best practice lessons learned, while less well-cited work should prompt frank reflection. Citations tell us about the academic impact of our research, but that is difficult to influence directly, which is why we concentrate our efforts to increase citation on output quality and visibility.
As we look now to take our research and innovation to the next level, quality, visibility, and impact will all be front of mind. Our new University strategy also foregrounds equity at a time when the need for fair and responsible assessment of research performance has never been greater. Our ‘Same storm, different boats’ study was one of the first to highlight how the pandemic had amplified gendered differences in caring responsibilities with clear consequences for career progression. The more recent study by the ‘500 Women Scientists’ organisation shows nothing has changed, suggesting that “women in science have experienced career disruptions that will take years, or even decades, to undo”. We commit here that our assessments of research performance will repay the commitment of colleagues through the pandemic with the nuance required to acknowledge these extraordinary lived experiences.
Professor Steve Rothberg
Pro Vice-Chancellor (Research)