Background
Open Science is a term used to encompass making the methodologies, datasets, analyses, and results of research publicly accessible for anyone to use freely [1, 2]. The term gained popularity in the early 2010s when researchers noticed that they were unable to replicate or reproduce prior work within a discipline [3]. There was also considerable ambiguity about what processes had actually been used in a given study. In response, open science gained traction as a way to provide greater context, robustness, and reproducibility. From there, many research disciplines created their own formal definitions and recommended practices. Adoption of open science accelerated when large-scale studies conducted in the mid-2010s found that numerous works in psychology [4] and other disciplines [5] were difficult or impossible to reproduce or replicate.
Open science can be broken down into numerous processes, such as open data, open materials, open methodology, and preregistration. Open Data specifically targets datasets and their documentation, released for public use without restriction, typically under a permissive license or in the public domain [6]. Not all data can be openly released (for example, data containing personally identifiable information), but there are protected-access specifications that allow an anonymized dataset to be released or provide a method for obtaining the raw dataset itself. Open Materials is similar in scope but concerns tools, source code, and their documentation [7]. In software development this is often treated as synonymous with Open Source, but "materials" encompasses not only the source code but also the freely available technologies used alongside it. Open Methodology defines the full workflow and processes used to conduct the research, including how participants were recruited and the procedure they went through [1]. The methodology typically expands on the original paper, covering technical details that would not fit the paper format or the survey items and test questions administered to participants. Finally, Preregistration is a time-stamped copy of the initial methodology and analysis plan, created before the start of a study so that the research process is defined without knowledge of the outcomes [8, 9]. Preregistrations can be updated as a study develops and changes become necessary, but the initial plan is preserved and all changes are documented.
Open science practices and reproducibility metrics are becoming more commonplace across scientific disciplines. Within the subfields of educational technology, however, the adoption and review of these practices and metrics are neglected or only sparsely considered [10]. Some subfields have taken the initiative to introduce open science practices (special education [11]; gamification [12]; education research [13]), while others have seen little to no adoption. Concerns about what can be made publicly available, along with inexperience in reproducing another's work, are among the reasons education researchers may avoid or postpone discussion of open science and reproducibility. Yet the lack of such discussion can lead to tedious, repetitive correspondence over datasets and materials, or contribute to a reproducibility crisis [5] within the field of study. As such, there is a need for accessible resources that explain open science, how it can be used, and how to mitigate potential issues that may arise within one's work at a later date.
References

1. Kraker, P., Leony, D., Reinhardt, W., & Beham, G. (2011). The case for an open science in technology enhanced learning. International Journal of Technology Enhanced Learning, 3(6), 643–654. https://doi.org/10.1504/IJTEL.2011.045454
2. Vicente-Saez, R., & Martinez-Fuentes, C. (2018). Open Science now: A systematic literature review for an integrated definition. Journal of Business Research, 88, 428–436. https://doi.org/10.1016/j.jbusres.2017.12.043
3. Spellman, B. (2015). A short (personal) future history of Revolution 2.0. Perspectives on Psychological Science, 10(6), 886–899. https://doi.org/10.1177/1745691615609918
4. Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
5. Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533(7604), 452–454. https://doi.org/10.1038/533452a
6. Murray-Rust, P. (2008). Open data in science. Nature Precedings, 1(1), 1. https://doi.org/10.1038/npre.2008.1526.1
7. Johnson-Eilola, J. (2002). Open source basics: Definitions, models, and questions. In Proceedings of the 20th Annual International Conference on Computer Documentation (SIGDOC '02), 79–83. Association for Computing Machinery. https://doi.org/10.1145/584955.584967
8. Nosek, B., Beck, E., Campbell, L., Flake, J., Hardwicke, T., Mellor, D., van 't Veer, A., & Vazire, S. (2019). Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10), 815–818. https://doi.org/10.1016/j.tics.2019.07.009
9. Nosek, B., Ebersole, C., DeHaven, A., & Mellor, D. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, 115(11), 2600–2606. https://doi.org/10.1073/pnas.1708274114
10. Nosek, B. (2022). Making the most of the unconference. Presented at the Unconference on Open Scholarship Practices in Education Research. https://osf.io/9k6pd/
11. Cook, B. G., Lloyd, J. W., Mellor, D., Nosek, B. A., & Therrien, W. J. (2018). Promoting open science to increase the trustworthiness of evidence in special education. Exceptional Children, 85(1), 104–118. https://doi.org/10.1177/0741932516637198
12. García-Holgado, A., García-Peñalvo, F. J., de la Higuera, C., Teixeira, A., Ehlers, U. D., Bruton, J., ... & Burgos, D. (2020). Promoting open education through gamification in higher education: The OpenGame project. In Eighth International Conference on Technological Ecosystems for Enhancing Multiculturality, 399–404. https://doi.org/10.1145/3434780.3436688
13. Makel, M. C., Smith, K. N., McBee, M. T., Peters, S. J., & Miller, E. M. (2019). A path to greater credibility: Large-scale collaborative education research. AERA Open, 5(4). https://doi.org/10.1177/2332858419891963