Call for Papers: Special Issue on Online Information Disorder: Fake News, Bots, and Trolls

Important Dates
Submission Deadline: extended to 31 May 2021
First Notification: 15 July 2021
Revisions Due: 31 August 2021
Final Notification: 30 September 2021
Final Manuscript: 31 October 2021

Aims and Scope
Recent years have seen a tremendous increase in the propagation of different types of misinformation and disinformation, including fake news, rumors, clickbait, and conspiracy theories. Misinformation, such as that found in satire and clickbait, differs from disinformation in intent: whereas the former lacks the intent to mislead, disinformation has the deliberate deception of public opinion as its main aim.

Neither misinformation nor disinformation is a new phenomenon. However, although both have existed for a long time, the popularity of social media and the ease of publishing content online have exacerbated the problem and its consequences for society. The propagation of fake news, rumors, and conspiracy theories has severe consequences in many domains, including the economy, politics, and health. For example, the circulation of fake news and conspiracy theories about the Coronavirus disease (COVID-19) has spread distorted information regarding its origin, prevention, and treatment that often lacks scientific grounding and has great potential to harm society. In the political domain, fake-news stories have been widely recognized as having a significant impact on the results of various elections and referendums.

Despite many attempts by the research community, developing the technology needed to assist experts in detecting mis/disinformation remains an open problem due to a number of challenges. Fake news is intentionally written to confuse readers and often contains a mixture of false and real information.
In addition, the different types of misinformation and disinformation (e.g., hoaxes, rumors) may be written with different motivations. Detection approaches should therefore take motivation into account and focus on the content that intends, and is most likely, to do harm. Another challenge is that different types of users are involved in the creation and circulation of disinformation. On the one hand, some users spread fake news and/or conspiracy theories intentionally. On the other hand, other users share the same material believing it to be true and therefore worth spreading. Fact-checking and evidence retrieval are also important aspects of detecting fabricated content, since they can be used to verify claims.

In this special issue we invite researchers and practitioners, from both academia and industry, across disciplines and fields such as machine learning, deep learning, natural language processing, data mining, computational linguistics, social network analysis, and other related areas, to submit novel (or significantly extended) qualitative and quantitative research papers focusing on the automatic prevention and detection of online misinformation and disinformation, including satire, clickbait, fake news, rumors, conspiracy theories, hoaxes, bots, and trolls.

Topics of interest
We solicit original, unpublished, and innovative research work on all aspects of, but not limited to, the following themes:

• Computational approaches for the detection of online misinformation and disinformation (e.g., satire, clickbait, fake news, rumors, conspiracy theories, hoaxes, bots, and trolls)
• Computational approaches to identify and analyze the use of propaganda in misinformation and disinformation campaigns
• Computational approaches for spotting users who purposefully spread different types of mis/disinformation (e.g., detection of fake news spreaders, conspiracy spreaders, and water armies)
• Identification of information spread by bots and trolls
• Network analysis in fake news propagation and circulation in social media and social networks
• Automatic identification and verification of claims
• Intention detection for the different types of disinformation
• Credibility assessment of online information sources
• Detection of polarization in online communities
• Computational approaches for multimodal disinformation detection
• Computational approaches for early detection of fake news
• Fake news prevention, filtering and containment
• Analysis/detection of multi-platform fake news spreading
• Measurements and analysis of fake news impact
• Resources for journalists for fake news detection
• Datasets and evaluation methodologies for disinformation detection in social media

Guest Editors
• Anastasia Giachanou (lead guest editor)
Utrecht University, Utrecht, the Netherlands

• Xiuzhen Jenny Zhang
RMIT University, Australia

• Alberto Barrón-Cedeño
Università di Bologna, Forlì, Italy

• Olessia (Elena) Koltsova
National Research University Higher School of Economics, St. Petersburg, Russia

• Paolo Rosso
Universitat Politècnica de València, Valencia, Spain

Submission guidelines

Submitted papers should present original, unpublished work relevant to one of the topics of the Special Issue. It is the policy of the journal that no submission, or substantially overlapping submission, be published or be under review at another journal or conference at any time during the review process. Manuscripts will be subject to a peer-reviewing process and must conform to the author guidelines available on the JDSA website at:

Author Resources


Springer provides a host of information about publishing in a Springer Journal on our Journal Author Resources page, including FAQs, Tutorials, and Help and Support.
