Progress and future directions in open science
Ground floor, The Meeting House, University of Sussex
1-3pm, 4th July 2023
Admission free. Register here.

Overview
This symposium brings together five researchers from four countries,
highlighting new ideas for making science more open. We consider
changes that could be made by individual scientists in their own
workflow, and by groups of scientists working informally together. We
consider reforming the current publication model, which serves us
poorly, and reforming governance at the institutional level. In all
cases the reforms, many already under way, challenge existing practice
to become more open and collaborative.

Reny Baykova focuses on problems of computational reproducibility in
psychology. She introduces a pilot project in which researchers can
submit their papers to a reproducibility check before publication.

Rob McIntosh considers the challenges facing neuropsychologists in
embracing Registered Reports, which often require high power even
though cases can be hard to come by. He presents solutions focusing on
the wider adoption of more flexible, field-sensitive criteria for the
Registered Reports format.

Zoltan Dienes argues that to address the credibility crisis, change is
needed at the institutional level. Science will only function
optimally if the culture by which it is governed becomes aligned with
the way of thinking required in science itself. He suggests a series
of graduated reforms to university governance, based on established
open democratic practices.

Balazs Aczel considers the problem of analytic flexibility and how it
may be addressed by multiverse and multi-analyst approaches. He
introduces consensus-based guidance for conducting and reporting
multi-analyst studies.

Finally, Corina Logan presents the Peer Community In Registered
Reports (PCI RR) platform, in which Registered Report preprints are
taken through editorial review to final acceptance of the Stage 2
Registered Report by a community of recommenders (editors), in a way
that is free for authors and free for readers. Quality is assured by
the team of recommenders and the openness of the process. About two
dozen journals have committed to publishing submissions recommended by
PCI RR (subject to journal remit, power/Bayes factor requirements, and
payment of APCs).
1. Ensuring the computational reproducibility of psychology research
Reny Baykova, University of Sussex, England
The findings of a research paper are relevant only if they are
computationally reproducible – that is, if rerunning the same analysis
on the same dataset yields the same numerical results, figures, and
inferential conclusions. Studies have found that in the field of
psychology, only
about a quarter to a third of published papers are
computationally reproducible. Such statistics erode confidence
in science and require that we establish new processes to ensure
research integrity. A project I am currently working on explores a
potential solution to this problem – certifying
the computational reproducibility of papers before they are
submitted for publication. As part of the pilot, researchers at
the School of Psychology at the University of Sussex can volunteer to
submit their papers to a reproducibility check conducted by an
independent statistician. The statistician helps improve the
reproducibility of the paper and compiles a report that the
researchers can
reference in their manuscript, setting them
apart in terms of transparency and rigour in the eyes of editors,
reviewers, and readers. By helping researchers gain the skills to
conduct reproducible research and showing them the benefits of
doing so, the project aims to create a push towards better
quality control in academia from within the research community.
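The core of such a check can be illustrated in miniature: rerun the analysis and compare the key reported statistics against freshly recomputed values. The sketch below is purely illustrative – the dataset, reported values, function names, and tolerance are all hypothetical, and the actual pilot relies on an independent statistician rather than a script:

```python
# Minimal sketch of a reproducibility check: rerun the analysis and
# compare key reported statistics with recomputed values. All names,
# data, and tolerances here are hypothetical, not the pilot's tooling.
import math

def analysis(dataset):
    """Stand-in for a paper's analysis pipeline; returns key statistics."""
    n = len(dataset)
    mean = sum(dataset) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in dataset) / (n - 1))
    return {"n": n, "mean": mean, "sd": sd}

def check_reproducibility(dataset, reported, tol=1e-3):
    """Recompute each reported statistic and flag whether it matches."""
    recomputed = analysis(dataset)
    return {k: abs(recomputed[k] - v) <= tol for k, v in reported.items()}

data = [2.0, 3.0, 5.0, 7.0]                      # hypothetical shared dataset
reported = {"n": 4, "mean": 4.25, "sd": 2.2174}  # values claimed in the paper
print(check_reproducibility(data, reported))
```

A real check would additionally cover figures and inferential conclusions, and any mismatch would be investigated rather than merely flagged.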
2. Challenges (and solutions) for Registered Reports in Neuropsychology (by Zoom)
Rob McIntosh, Open Research Office, British Neuropsychological Society & University of Edinburgh, Scotland
The Registered
Reports article format is a uniquely valuable route for the
generation and publication of unbiased evidence on scientific
questions of interest. The format is designed to eliminate reporting
and publication biases, but it also offers collateral benefits for the
quality of evidence. For instance, the common requirement for very
high statistical power (≥ .9) can enhance the informativeness of
both positive and null results. But a rigid application of an
idealised model of Registered Reports can also create barriers to
wider engagement, particularly for studies of hard-to-access samples,
such as people with neuropsychological symptoms consequent on brain
damage or degeneration. There is then a danger that an inability to
meet high power requirements will prevent certain fields from
unlocking the core benefit of unbiased publication: we risk throwing
the lack-of-bias baby out with the bathwater of an ideal study
design. As Open Science Officer for the British Neuropsychological
Society, I will summarise the special challenges that our scientific
and clinical membership perceive for conducting Registered Reports. I
will also suggest some solutions, focusing on the wider adoption of
more flexible, field-sensitive criteria for the format, which
consider the practical constraints affecting the research alongside
the value of the scientific question.
3. Does open science depend on open democratic governance?
Zoltan Dienes, University of Sussex, England
One reason for the
credibility crisis may be the larger context in which scientists are
embedded, including the culture produced by the governance structures
of research institutions. Many universities have become almost
entirely top down in governance, with a management class that
measures and rewards a researcher's worth by the Key Performance
Indicators (KPIs) they promote. These KPIs often impede open science.
Relatedly, the way decisions are made at a university mismatches the
way science itself functions, which is at core a democratic process
involving the selection of ideas according to critical arguments
(i.e., science exemplifies an open society). Science may flourish best
when embedded in an open society, rather than a closed authoritarian
one. I propose a graduated series of radical reforms to university
governance drawing on the open democratic movement, for example the
use of deliberative polls, Citizens' Assemblies, and selection by lot,
to enable universities to function more similarly to how science
itself functions. I conjecture that there will be a synergy when
governance and science match in their core processes, enabling science
to flourish more fully.
4. The Multi-Analyst Approach
Balazs Aczel, Eotvos Lorand University, Hungary
When analyzing a
research question on a dataset, the analyst often has the freedom to
choose between different – but equally justifiable – analytical
options. When the final result and conclusion are the outputs of one
analytical path taken by one analyst, an important source of
uncertainty remains unexplored: we don’t know whether other
analysts taking similarly justifiable choices would have arrived at
the same or different results. The multiverse and multi-analyst
approaches try to compensate for this weakness of traditional data
analyses by exploring the relevant analytical space. In this talk,
I’ll discuss the challenges that this analytical multiplicity
presents, and I’ll introduce consensus-based guidance for conducting
and reporting multi-analyst studies.
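As a toy illustration of the multiverse idea (not taken from the talk itself), the sketch below crosses two hypothetical analytic decisions – an outlier rule and a summary statistic – and records the group difference each combination yields. The spread of results across the four "universes" is exactly the uncertainty that a single analytic path leaves unexplored:

```python
# Toy multiverse analysis: cross every justifiable analytic choice and
# record the result of each combination (all data and choices hypothetical).
import itertools
import statistics

# Hypothetical scores for two groups.
group_a = [5.1, 4.8, 6.0, 5.5, 4.9, 5.7, 6.2, 5.0]
group_b = [4.2, 4.6, 5.1, 4.0, 4.4, 4.9, 3.9, 4.5]

# Two example analytic decisions, each with equally justifiable options.
outlier_rules = {
    "keep_all": lambda xs: xs,
    "trim_extremes": lambda xs: sorted(xs)[1:-1],  # drop min and max
}
summaries = {
    "mean": statistics.mean,
    "median": statistics.median,
}

# Each combination of choices is one "universe"; record its estimate.
results = {}
for (o_name, o_rule), (s_name, s_fn) in itertools.product(
        outlier_rules.items(), summaries.items()):
    diff = s_fn(o_rule(group_a)) - s_fn(o_rule(group_b))
    results[(o_name, s_name)] = round(diff, 2)

for spec, diff in sorted(results.items()):
    print(spec, diff)  # the spread across universes is the hidden uncertainty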
5. How Peer Community in Registered Reports lets researchers take back control of the publishing process (by Zoom)
Corina Logan, Max Planck Institute for Evolutionary Anthropology, Germany
There is a desperate need to reform the production and dissemination of scholarly outputs to increase transparency, reproducibility, timeliness, academic rigour, and equity. I will discuss what researchers are doing to address these issues, sharing ways to tackle biases and facilitate higher-quality research that puts researchers back in control, using Peer Community in Registered Reports – a free, supra-journal platform that reviews and recommends Registered Reports across all research fields.