Call for Papers: Explainable Artificial Intelligence Solutions for In-the-wild Human Behavior Analysis [1236]

Understanding human behavior is an active and challenging research domain in computer vision. One of the main reasons for this growing interest is the breadth of applications spanning fields such as Human-Computer and Human-Robot Interaction (keystroke dynamics analysis, facial expression recognition, and body gesture recognition), video surveillance (crowd analysis, action recognition), healthcare (detection of emotional and cognitive disorders), and biometrics (gait recognition).

An intensive emphasis on enhancing prediction performance frequently comes at the expense of explainability. Due to their increasing complexity and opaque nature, Artificial Intelligence (AI) solutions are difficult to comprehend and examine. In most cases, AI agents are regarded as black boxes that take millions of data points as inputs and correlate specific data features to produce an output. Explainable AI, by contrast, highlights the strengths and limitations of a decision strategy and clarifies the logic of the decision support system, bringing transparency to the decisions taken and accountability for their effects.

The purpose of this Special Issue is to investigate new challenges, application fields, and modalities in human behavior analysis in in-the-wild contexts through the lens of AI explainability. We invite research oriented toward the development of novel methodologies, database collections, and benchmarks, as well as algorithms and systems for the machine analysis of human behavior, with a focus on facial expressions, gestures, and body movements.

The Special Issue aims to collect contributions from academics or industry professionals on advanced techniques that enhance the state of the art in Explainable AI for human behavior analysis, including (but not limited to) the following topics of interest:

● Explainable AI for human behavior analysis in the context of social networks (social behavioral biometrics);
● Behavior recognition based on bodily and facial expressions in wild contexts by measuring the explainability of decisions;
● Decision model visualization for human behavior analysis in human-robot interaction;
● Explainable frameworks for multi-sensor data fusion for action/activity/gesture/emotion recognition in uncontrolled environments;
● Crowd behavior analysis and prediction from video sequences with explainable AI techniques;
● Integrating explainability into existing AI systems for action/activity/gesture/emotion recognition;
● Datasets, benchmarks and evaluations, robustness, novel metrics, and bias in datasets for explainability/interpretability purposes;
● Action/activity/gesture recognition from skeleton data or depth maps by measuring explainability in decisions.

Guest Editors

Chiara Pero - University of Salerno, Italy

Lucia Cascone - University of Salerno, Italy

Hugo Proença - University of Beira Interior, Portugal

Important Dates

Submission dates:
Open: March 19, 2023
Close: September 19, 2023

Submission Guidelines:
Authors should prepare their manuscripts according to the Instructions for Authors available on the Multimedia Tools and Applications website. Authors should submit through the journal's online submission site and select “SI 1236 - Explainable Artificial Intelligence Solutions for In-the-wild Human Behavior Analysis” when they reach the “Article Type” step in the submission process. Submitted papers should present original, unpublished work relevant to the topics of the special issue. All submissions will be evaluated by at least three independent reviewers on the basis of relevance, significance of contribution, technical quality, scholarship, and quality of presentation. It is the policy of the journal that no submission, or substantially overlapping submission, be published or under review at another journal or conference at any time during the review process.