Implementing FAIR Workflows: A Proof of Concept Study in the Field of Consciousness is a 3-year project funded by the Templeton World Charity Foundation. In this project, DataCite works with a number of partners on providing an exemplar workflow that researchers can use to implement FAIR practices throughout their research lifecycle. In this monthly blog series, the different project participants will share perspectives on FAIR practices and recommendations.
In this post, Xiaoli Chen, project lead at DataCite, reflects on the gap between acknowledging FAIR and practicing FAIR.
The first thing I learned when I started working with (other) researchers on Open research some 10 years ago is that they are all for it – the equity, accountability, transparency, reusability, all of it. Very little effort was needed to make that case; everybody knows. The second thing I learned is that knowing the destination and setting off on the journey are two very different notions, staring each other down from opposite banks of the great chasm named uncertainty.
Over the years, stakeholders in the Open and FAIR research community have poured themselves into bridging that gap – from infrastructure (Ficarra et al., 2020; Armeni et al., 2021; Cousijn et al., 2021), to organizational policy (Levels, 2014) and guidelines (Giofrè et al., 2017), to tool/service frameworks (EOSC, OpenAIRE), and institutional, national (Lasthiotakis et al., 2015), regional (European Commission, 2016), and global (CODATA et al., 2019; RDA) initiatives. And still, we face the great mystery: what does it take to get researchers on board, not just conceptually, but in practice? A great number of surveys, interviews, and focus groups have been conducted to identify motivations for and barriers to data and code sharing. The most recent evidence reports that, aside from our old friends lack-of-time, fear-of-scoop, and concerns-over-misuse, the value, process, and workflows are still somewhat unclear to researchers (Gomes et al., 2022). Uncertainty, we meet again.
What better way to disperse uncertainty than to take the matter into our own hands: implement the various FAIR practices and gather empirical data points on the costs, benefits, challenges, and ways of navigating them? This is what we are doing in the Implementing FAIR Workflows project – laying down a concrete, real-life example of a research project carried out with FAIRness in mind from the start. Now that we are 400 days into the project, here are some of my observations.
What happens when a researcher is expected to practice FAIR
Practicing FAIR is like healthy living – although skipping sugary beverages helps, it is hardly the right, full, or, most importantly, suitable answer to achieving your goal. There are dozens of practices that can be adopted in endless possible configurations, each a small building block that, over the long term, contributes to an incremental change that can only be appreciated by looking back at where we were before we started. When researchers come in excited about FAIRness and Openness, eager to be agents of change, they will probably experience the following, in no particular order:
- motivation fueled by the discontent with the status quo
- confusion and disorientation about alternative practice
- not knowing when or where to find support
- falling back into old habits
- underestimating the time needed
- overestimating the time needed
- starting to connect the dots between FAIR concepts and disciplinary reality
- being uncomfortable with novel tools
- being too comfortable with novel tools
- peer-pressured back into old habits
- starting to see some tangible benefits
We have seen all of the above a year into the project – many small setbacks, but light peeking through at the end of the tunnel. Going forward, we expect to see the following:
- becoming better at selecting and using tools
- adopting a set of practices that makes the most sense
- building efficient and easy-to-use FAIR workflows
Some heads-ups
Steep learning curve
We found that despite the FAIRly straightforward principles, how the infrastructure and protocols make FAIR a reality is not obvious, to say the least. Understanding the layers and aspects of FAIR practices in a disciplinary context, incorporating FAIRness into research design, evaluating available tools, and building workflows around those tools and practices are all mini research tasks in their own right. The good thing is that one only needs to learn this once, but it is important to set realistic expectations.
Decision fatigue
We found that practicing FAIR is decision-making: being aware and being intentional. What to share, when to share, which tools to adopt, and how to coordinate with teammates. When sharing, what should be the level of accessibility, the extent of findability, the scope of interoperability, the conditions of reuse… endless questions, each demanding a decision, none of them easy to make. The key to mitigating decision fatigue is to draw on the experience of the wider community – in making research FAIR, one never walks alone.
The power of habits
We found that maintaining a FAIR research workflow requires reminders, reminders, and more reminders. Being reminded that alternative research outputs can also be of immense value; that a community of support exists and is easy to reach; that no process should be taken for granted and everything can be improved. It is breaking old habits and forming new ones – both hard, but incredibly rewarding.
Will FAIR research become easier with practice? We will keep thorough notes and report back later in the project.
References
Armeni, K., Brinkman, L., Carlsson, R., Eerland, A., Fijten, R., Fondberg, R., Heininga, V. E., Heunis, S., Koh, W. Q., Masselink, M., Moran, N., Baoill, A. Ó., Sarafoglou, A., Schettino, A., Schwamm, H., Sjoerds, Z., Teperek, M., van den Akker, O. R., van’t Veer, A., & Zurita-Milla, R. (2021). Towards wide-scale adoption of open science practices: The role of open science communities. Science and Public Policy, 48(5), 605–611. https://doi.org/10.1093/scipol/scab039
CODATA Committee on Data of the International Science Council, CODATA International Data Policy Committee, CODATA And CODATA China High-Level International Meeting On Open Research Data Policy And Practice, Hodson, S., Mons, B., Uhlir, P., & Zhang, L. (2019). The Beijing Declaration on Research Data. https://doi.org/10.5281/ZENODO.3552330
European Commission. (2016). New policy initiative: The establishment of an Open Science Policy Platform. https://ec.europa.eu/research/swafs/pdf/pub_open_science/new_policy_initiative.pdf
Ficarra, V., Fosci, M., Chiarelli, A., Kramer, B., & Proudman, V. (2020). Scoping the Open Science Infrastructure Landscape in Europe. Zenodo. https://doi.org/10.5281/ZENODO.4159838
Giofrè, D., Cumming, G., Fresc, L., Boedker, I., & Tressoldi, P. (2017). The influence of journal submission guidelines on authors’ reporting of statistics and use of open research practices. PLOS ONE, 12(4), e0175583. https://doi.org/10.1371/journal.pone.0175583
Gomes, D. G. E., Pottier, P., Crystal-Ornelas, R., Hudgins, E. J., Foroughirad, V., Sánchez-Reyes, L. L., Turba, R., Martinez, P. A., Moreau, D., Bertram, M. G., Smout, C. A., & Gaynor, K. M. (2022). Why don’t we share data and code? Perceived barriers and benefits to public archiving practices. Proceedings of the Royal Society B: Biological Sciences, 289(1987), 20221113. https://doi.org/10.1098/rspb.2022.1113
Lasthiotakis, H., Kretz, A., & Sá, C. (2015). Open science strategies in research policies: A comparative exploration of Canada, the US and the UK. Policy Futures in Education, 13(8), 968–989. https://doi.org/10.1177/1478210315579983
Levels, P. (2014). ATLAS Data Access Policy May 21st 2014.
This project was made possible through the support of a grant from Templeton World Charity Foundation, Inc. The opinions expressed in this publication are those of the author(s) and do not necessarily reflect the views of Templeton World Charity Foundation, Inc.
