
Journal of Academic Ethics - Call for Papers - Trust in Science and Responsible Research

Guest Editors
Hub ZWART, Erasmus University Rotterdam, The Netherlands, zwart@esphil.eur.nl
Loreta TAUGINIENĖ, Kazimieras Simonavicius University, Lithuania, loreta.tauginiene@ksu.lt

According to the Rathenau Institute, trust in science is quite considerable, at least in the Netherlands. Dutch citizens place more trust in science than in other institutions (e.g., government, courts of law, newspapers, television, trade unions, corporations). Moreover, during the corona crisis, trust in science increased. Still, paradoxically, public distrust in science increased as well and became more vocal in the public sphere. Many voices in contemporary societies question whether scientific information is sufficiently valid, disinterested and objective. At the same time, in order to effectively address the global disruptions facing Europe and the rest of the world, the evidence-based insights and potential solutions provided by science are indispensable. Therefore, the credibility, reliability and trustworthiness of science is an issue of crucial importance (Kitcher, 2001; Oreskes, 2019).

Many participants in public debate experience an increase in polarisation, and social media in particular offer space for distrustful, polarised and accusatory forms of communication. The public sphere – as an arena where citizens come together, exchange opinions and deliberate – has, so it seems, fragmented into multiple bubbles (Nguyen, 2020).

The IANUS project aims to foster trust in science but recognises that trust in science is never a given, nor should it be. In fact, “healthy scepticism” is crucial, not only as an intrinsic dimension of research methodologies, but also in the context of public debate. One of the purposes of public debate, one could argue, is to determine whether and when public trust in science is warranted. We build on the conviction that fostering trust in science requires science to be open and responsive to societal values. This includes issues such as the ethics of research production and of university strategic directions, as well as responsible research and epistemic justice or epistemic inclusion (Koch, 2020).

Besides the question of how to foster trustworthiness in science, another question emerged in the aftermath of the COVID-19 experience: can scientists still trust public debate? We notice scepticism among scientists as to whether a safe and respectful exchange of views is still possible in the public sphere and whether scientific expertise and validated knowledge are still sufficiently valued. Moreover, the term science may refer to a broad spectrum of perspectives, where many disciplines and paradigms are involved, at times endorsing diverging perspectives and resulting in diverging views concerning policy and decision-making. Involving multiple disciplinary perspectives is an important requirement for developing a comprehensive approach, for instance in the case of the COVID-19 crisis (Sulik et al., 2021), which was not only about viruses and vaccines, but also about cultures, values, governance and behaviour. Yet the spectacle of a plethora of contradictory positions may either challenge public trust in academic knowledge (e.g., the claim that, for any possible position, an expert can be found who supports it), or may raise the question among scientists whether becoming involved in such debates is a meaningful exercise.

This special edition will cover several possible topics, such as:

What is trust in science? Public trust in science can mean several things: trust in what scientists say (epistemic trust) and in what they do – trust in scientific methods (reproducibility and replication), in research findings, in individual scientists, in research institutions, in products of research and innovation, and in science as a system.

Science and society: proximity or distance? Societal trust in science hinges on many factors, including the cohesion of scientific consensus on a given topic, the role science is assigned by government and policymakers, the over-extension (or sobriety) of media reports on preliminary research results, and the dilution of scientific objectivity and political neutrality through industry funding and conflicts of interest. While collaboration with societal stakeholders and industry is part of interactive and participatory research, allowing science to broaden its knowledge base, transparency and academic independence remain important dimensions of trustworthy interactive research that must be safeguarded.

Scientific misconduct and trust in science. The most lasting solution is to examine the forces that made such misconduct possible. This allows institutions to recalibrate the modus operandi of science, including the role of ‘perverse incentives’. What is decisive is not the occurrence of misconduct per se, but the response of research-performing organisations to misconduct: can they move away from a defensive focus on minimising reputational damage towards a proactive approach, opting for prevention by strengthening the resilience of the research ecosystem (Zwart & Ter Meulen, 2019)?

Trust in science and university governance. How can university governance, as one of the internal factors affecting scientists, shape trust in science (e.g., through research security)? How are academic institutions fostering open, responsive, responsible, impact-driven, and inclusive research; how are they reconsidering their reward systems, e.g., the acknowledgment and reward of impact-driven research; and how may this affect public trust in science?

Trust in science, conspiracy theories and the COVID-19 experience. The COVID-19 experience has been a watershed event, also where trust in science is concerned. On the one hand, scientific expertise was seen as decisive in addressing the global challenge and fostering preparedness. On the other hand, the COVID-19 experience fuelled conspiracy theories (e.g., the recently published document entitled The Conspiracist Manifesto). How can we analyse and assess this experience, and what can we learn in terms of pathways for change?

In line with the JAET’s aims and scope, we are delighted to invite you to submit a full paper for consideration for inclusion in this special edition by 30 June 2024.

All submitted papers will be subject to double-blind peer review and approval by the editorial team and the JAET editor-in-chief. At least two peer reviewers will be selected from the JAET reviewers’ database. Please follow this link for more information about the peer review process.

Please check JAET’s Instructions for Authors before you start writing the paper.

To be considered for publication, papers will usually concern empirical research, with either a quantitative or qualitative approach. In addition, the JAET publishes papers about conceptual (theoretical) research, systematic reviews and opinion papers. Research and review papers can range from 8,000 to 10,000 words in length (inclusive of all content).

Please submit your manuscript via Editorial Manager. The corresponding author will be asked whether the paper is submitted to a Special Issue; please tick “Yes” and select the option “Trust in Science and Responsible Research”.

Additional Information
Peer Review Policy, Process and Guidance
Peer Reviewer Selection

Please contact Hub Zwart, zwart@esphil.eur.nl, if you have any questions.

Timeline:
Deadline for full paper submission: 30 June 2024
First review round completed: September 2024
Revised manuscripts due: November 2024
Review process completed: January 2025
Publication date: February 2025

References

Biddle, J. (2018). “Antiscience Zealotry”? Values, Epistemic Risk and the GMO Debate. Philosophy of Science, 85(3), 360–379. https://doi.org/10.1086/697749

Kitcher, P. (2001). Science, Truth, and Democracy. Oxford University Press.

Koch, S. (2020). Responsible research, inequality in science and epistemic injustice: an attempt to open up thinking about inclusiveness in the context of RI/RRI. Journal of Responsible Innovation, 7(3), 672–679. https://doi.org/10.1080/23299460.2020.1780094

Nguyen, C. T. (2020). Echo chambers and epistemic bubbles. Episteme, 17(2), 141–161. https://doi.org/10.1017/epi.2018.32

Oreskes, N. (2019). Why trust science? Princeton University Press.

Sulik, J., Deroy, O., Dezecache, G., Newson, M., Zhao, Y., El Zein, M., & Tunçgenç, B. (2021). Facing the pandemic with trust in science. Humanities and Social Sciences Communications, 8, 301. https://doi.org/10.1057/s41599-021-00982-9

Zwart, H., & Ter Meulen, R. (2019). Editorial: Addressing Research Integrity Challenges: From penalising individual perpetrators to fostering research ecosystem quality care. Life Sciences, Society and Policy, 15, 5. https://doi.org/10.1186/s40504-019-0093-6
