Open Research across disciplines

By Camilla Gilmore, Chair of Loughborough University’s Open Research Group and Professor of Mathematical Cognition

One of the challenges of institutional change around open research practices is the diversity of disciplines involved. Open research covers a range of activities that promote the openness, transparency, rigour, and reproducibility of research. These values are relevant to all disciplines, but the way these activities are applied and the (perceived) barriers to using them can look very different in different disciplines. 

The challenges of promoting open data provide a clear example of this. In behavioural sciences, where quantitative and qualitative research data comes from human participants, one of the major challenges is how to share data ethically and anonymously. In contrast, in STEM subjects, particularly where industrial partnerships are common, the challenges are around confidentiality, commercial sensitivity and IP protection instead. Consequently, promoting open data at an institutional level must be informed by these different concerns and challenges and provide appropriate disciplinary-specific training and support.

This was a problem that I became immediately aware of when I took over the role of chair of Loughborough’s Open Research Working Group (ORWG) in early 2020. As individual researchers, our perceptions of the “state of the art” of open research are informed by our own disciplinary experience. But to make institutional change, we need to ensure that the systems supporting open research and the opportunities and incentives we promote apply to researchers in all disciplines. I felt that I didn’t know enough about what open research looks like in other disciplines.

Fortunately, I was not alone in feeling like this. Professor Emily Farran (Academic Lead, Research Culture and Integrity, University of Surrey) had similar concerns, and so we decided that it would be beneficial to draw together examples of open research practices and resources across as wide a range of disciplines as possible. This project quickly became a substantial task and benefitted from many authors and contributors. The resulting document, Open Research: Examples of good practice, and resources across disciplines, was initially launched in December 2020. The document is updated annually in response to suggestions and feedback from readers (if you think good practice in your discipline is missing, why not suggest it here?).

This work has now been incorporated into the UKRN (UK Reproducibility Network) webpages, where 28 separate disciplinary pages provide case studies, examples of open research practices and disciplinary-specific resources. These highlight that, while open research practices may look different in different disciplines, there is much to learn by looking beyond our own discipline and seeing commonalities in approaches.

At Loughborough, we are ensuring that our institutional activities are sensitive to disciplinary differences by creating Open Research Leads in each school who sit on the ORWG. But we are building on the commonality of challenges by working across schools to provide training and opportunities. Look out for more opportunities in the coming academic year.

The views and opinions of this article are the author’s and do not reflect those of the University…although hopefully they do reflect Loughborough University values.

A different kind of diversity

By Lara Skelly, Open Research Manager for Data and Methods

A few years ago, I submitted a methodological paper to a discipline-specific journal. The reviewers were not kind; one of them said, “There is no narrative of the findings.” Well, naturally not, as the findings were the methodology I was describing. While it is entirely likely that I presented the purpose of the paper poorly, being a freshly minted PhD with limited publication experience, I remember the confusion I felt at the reviewers’ limited expectations.

Methodological papers are still a rarity, despite the slight increase in popularity that I saw during the COVID lockdowns. Most researchers that I encounter still see the typical paper format of introduction, literature review, methods, results and discussion as the only one worth putting out into the world. And, as with any one-size-fits-all approach, much is lost to this homogeneity.

Research and the people who work in research are anything but homogeneous. I have seen all manner of opinions on what counts as science, what data are, and how researchers engage with their craft. I’ve known researchers who are interested in the broad and the narrow, the individual and the collective, the future and the past. Boxing this variety into one homogeneous form of communication is, in this day and age, downright daft.

We are in a wonderful age that strives to celebrate diversity. The time has come to celebrate the diversity in our research as well: to recognise that the typical paper format is perfectly fine, but that researchers are not restricted to it. Sharing code, protocols, data, or any of the other ingredients of our research is one way that we can live our diversity, upholding a value that has become global.

Thanks to Katie Appleton and Gareth Cole for insightful comments on early drafts.

The views and opinions of this article are my own and do not reflect those of the University…although hopefully they do reflect Loughborough University values.


Last week we attended the Association of Research Managers and Administrators’ (ARMA) annual conference in Brighton. We were presenting on our Research Data Repository, which was launched at the end of April.

Although our session was part of the last panel of the conference, it was still well attended, with representatives from both universities and funders. As part of our talk we had decided to hold 30–45-minute breakout/discussion sessions. Not only were these sessions an opportunity for attendees to ask us additional questions about our repository, but they also gave us a chance to discover the lay of the land at other institutions.

As someone with a research background who has worked in libraries for the past 10 years, I found it interesting to hear some of the comments from the research managers and administrators. It is a view I have heard before: until the growth of open access and research data management at universities, many research office staff were not aware that the ‘Library did research’.

One of the many advantages of the current “open landscape” at universities is that many departments that previously had limited or no contact with each other (or contact only in very specific areas), e.g. the Research Office, IT and the Library, now have regular and meaningful contact across a number of areas. (For example, within a week of starting at Loughborough I had met colleagues from IT and the Research Office as part of my induction, and I now work with them on a weekly (if not daily) basis.) Not only does this regular contact help to reduce any duplication of effort, but it also means that staff working in those departments have the opportunity to develop a more holistic view of how research is conducted and supported at their organisation. As such, they are able to do their jobs with an understanding of how their decisions and work may affect others at the institution. Most importantly, it means that we are better placed to provide the support that researchers and academics may require.

This holistic view is particularly important at the moment when one considers the demands on academic staff in both research and teaching.

Updated ESRC policy on Research Data

The Economic and Social Research Council (ESRC) has recently released an updated version of its research data policy. This can be found on the ESRC website (link to the PDF of the policy at the bottom of the page).

The ESRC policy now maps more clearly to the RCUK Common Principles on Data Policy. In addition, the updated ESRC policy explains in far greater detail than before the responsibilities of: ESRC grant applicants, ESRC grant holders, grant holders’ institutions, the ESRC itself, and ESRC data service providers.

If you are ESRC funded, based at Loughborough University, and wondering how this policy may affect you please do contact me (Gareth Cole – Research Data Manager) on g.j.cole(at)


Over 1,000 research data repositories available in re3data.org

In August 2012, re3data.org – the Registry of Research Data Repositories – went online with 23 entries. Two years later, the registry provides researchers, funding organisations, libraries and publishers with over 1,000 listed research data repositories from all over the world, making it the largest and most comprehensive online catalogue of research data repositories on the web. re3data.org provides detailed information about the research data repositories, and its distinctive icons help researchers easily identify relevant repositories for accessing and depositing data sets.

To more than 5,000 unique visitors per month, re3data.org offers reliable orientation in the heterogeneous landscape of research data repositories. An average of 10 repositories is added to the registry every week. The latest indexed data infrastructure is the new CERN Open Data Portal.

[Taken from a Re3data update in November 2014]

RDM case study

Dr Erika Whiteford has kindly written a research data management case study for the Lakes and the Arctic Carbon cycle project. Erika’s case study highlights the value historic research data brings to our understanding of the future. It also illustrates the importance of managing research data and sharing it for possible future studies.

Erika’s case study is available via the university Library’s Research Support web pages.

RCUK Data Policies

Research Councils UK have devised Common Principles on Data Policy. These establish their position on the management of research data produced by the projects they fund, as well as the steps being taken to make data publicly available.

Each Council has slightly different requirements in relation to proposals for funding and the research data arising from the projects they fund. The Digital Curation Centre has pulled together a useful summary table of funders’ data policies. See their Overview of funders’ data policies web page for further details.

UK HEI RDM survey and the DCC

This post makes an interesting addition to earlier posts regarding our UK HEI Research Data Management survey and the benefits of being part of the JISCMRD programme.

We mapped the survey results against the institutions the DCC has engaged with, to see whether institutions receiving DCC support were more likely to have RDM policies and services in place.

An initial mapping of data from the survey against data on institutions receiving DCC support revealed that similar proportions had a research data management policy: 38% (9 out of 15) of those receiving DCC support had a research data policy, compared with 39% (15 out of 38) of all respondents. However, unsupported institutions, i.e. those neither supported by the DCC nor part of the JISCMRD programme, were less likely to have a research data management policy (20%, 3 out of 15).

Research Data Policy


Similarly, 15% (2 out of 13) of those receiving DCC support had a research data service in place, but this proportion falls to 7% (1 out of 15) for unsupported institutions.

Research Data Service


Thus support, whether through funded projects, DCC support or a mixture of both, has a significant impact on policy development. This applies even when such development was not a condition of funding; only MRD-funded projects had such a requirement.
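The proportions quoted in these comparisons are simple percentages of small survey counts. As a sanity check, they can be recomputed in a few lines of Python (a minimal sketch; the `pct` helper is our own illustration, not part of the survey analysis):

```python
def pct(part: int, whole: int) -> int:
    """Express a survey count as a whole-number percentage."""
    return round(100 * part / whole)

# Figures quoted in the comparisons above
print(pct(15, 38))  # 39 - all respondents with an RDM policy
print(pct(3, 15))   # 20 - unsupported institutions with an RDM policy
print(pct(2, 13))   # 15 - DCC-supported institutions with an RDM service
print(pct(1, 15))   # 7  - unsupported institutions with an RDM service
```

With denominators this small, a one-institution change moves the percentage by several points, which is worth bearing in mind when comparing the supported and unsupported groups.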

It’s worth noting some overlaps between JISCMRD and DCC support: a few of the institutions in our survey received both. More generally, the DCC helped facilitate JISCMRD workshops and has a continuing brief to promote the programme’s lessons across other institutions.

Assessing the impact of the JISCMRD programme

In this third post focusing on our recent RDM survey of UK HEIs, we consider the impact of the JISCMRD programme on the work UK HEIs are undertaking.

Jisc recently funded projects to undertake a range of initiatives investigating the management of research data. Specific programme strands were:

  1. Research Data Management Infrastructure (RDMI)
  2. Research Data Management Planning (RDMP)
  3. Support and Tools
  4. Citing, Linking, Integrating and Publishing Research Data (CLIP)
  5. Research Data Management Training Materials

Although the programme concluded in 2013, the influence of these projects on the broader UK HEI sector has yet to be established. While the projects were underway, evidence of their research and development activities was recorded in a blog by Laura Molloy (JISC MRD: Evidence Gathering – encouraging discussion across the JISC Managing Research Data programme). Laura is the Curation and Preservation Officer at the Humanities Advanced Technology and Information Institute (HATII), University of Glasgow, and a Preservation Researcher with the Digital Curation Centre. Her posts indicate the effectiveness of the programme in providing shared experience, expertise and materials for reuse. The last blog post (May 2013) describes the role of research funders in ensuring effective RDM (not least to allow for the implementation of their own policies), and in particular the need for them to consider the place and nature of peer review when assessing the RDM component of proposals.

So, what has been happening since these projects ended and why is it important to consider the influence the work of these projects may be having on current RDM related activities in UK universities?

It is apparent from analysis of survey data from UK HEIs that JISCMRD funding does seem to put institutions in a better position to develop policy and services to support their research data management initiatives. For further information, please see our previous discussion (Institutional readiness for managing research data). However, when considering the ‘ripple effect’ of these projects, data are more difficult to locate: personal communications across the sector about the work of a project, or requests to reuse materials, are not centrally recorded.

We believe that impact assessment is inherently valuable for evaluating the success of a programme over time. The information gathered can be used to inform and shape future work, and to lobby for funding for such activities.

We therefore welcome responses from those involved in the JISCMRD programme, and from institutions grappling with RDM issues, describing the benefits they derived from involvement in JISCMRD projects or from having access to information from these projects. Comments can be added to this blog or sent to Stéphane Goldstein. We also hope that, in due course, Jisc itself will show an interest in following up on our findings.

Previous posts in this series:

Staffing research data management in UK HEIs

Following on from our earlier post about Institutional readiness for managing research data, Stéphane and I have been looking at staffing levels in UK HEIs for establishing and supporting RDM. The data come from our recent survey of UK HEIs’ RDM practices and a mapping to JISCMRD programme institutions.

A further set of indicators can serve to substantiate the argument that HEIs which took part in the JISCMRD programme are more advanced in the development of institutional RDM practice. These are the figures that relate to staffing levels for all the different categories of RDM-related personnel identified in the Loughborough survey of research data management activities at UK HEIs. The following tables describe this:

Number of institutions receiving Jisc funding with any fixed-term staff in post: 7 out of 15 (47%)
Number of institutions not receiving Jisc funding with any fixed-term staff in post: 6 out of 23 (26%)


Number of institutions receiving Jisc funding with any permanent staff in post: 8 out of 15 (53%)
Number of institutions not receiving Jisc funding with any permanent staff in post: 5 out of 23 (22%)


Average fixed-term staffing level (FTE) for institutions receiving Jisc funding: mean 1.07 (*), median 0
Average fixed-term staffing level (FTE) for institutions not receiving Jisc funding: mean 0.54, median 0

(*) It is reasonable to say that the mean, in this instance, is somewhat skewed by one responding university having 5.5 FTE employed in RDM-related roles.


Average permanent staffing level (FTE) for institutions receiving Jisc funding: mean 1.13, median 1
Average permanent staffing level (FTE) for institutions not receiving Jisc funding: mean 0.24, median 0

These figures are perhaps hardly surprising, particularly for the fixed-term staff, who might have been recruited specifically for the duration of the Jisc projects; it is noteworthy, though, that institutions receiving Jisc funding have also invested much more than others in permanent staff.
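The skewed mean noted in the footnote above is a standard mean-versus-median effect: one large outlier pulls the mean well above the median. A minimal Python sketch (the individual FTE values below are invented for illustration; only the 5.5 FTE outlier and the reported mean of 1.07 and median of 0 come from the survey):

```python
from statistics import mean, median

# Hypothetical fixed-term FTE figures for 15 Jisc-funded institutions,
# constructed so that the summary statistics match those reported:
# seven institutions with staff (47%), one with an outlying 5.5 FTE.
fixed_term_fte = [5.5, 3.0, 2.5, 2.0, 1.5, 1.0, 0.55] + [0] * 8

print(round(mean(fixed_term_fte), 2))  # 1.07 - dragged up by the outlier
print(median(fixed_term_fte))          # 0    - most institutions have none
```

This is why the tables report the median alongside the mean: with more than half of respondents at zero FTE, the median is the more representative figure.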

Full survey results and discussion are available in a blog post by Martin Hamilton.