Call for Papers on "Information Geometry for Deep Learning"
Editor: Frank Nielsen
Publication of the Special Issue: 2022
Deep learning is a subfield of machine learning that has been developing rapidly in recent years, both in methodology and in practical applications.
Deep neural networks (DNNs) are artificial neural network architectures whose parameter spaces can be interpreted geometrically as neuromanifolds (with singularities), and whose learning algorithms can be visualized as trajectories or flows on these neuromanifolds.
The aim of this Special Issue is to gather original theoretical and experimental research articles addressing recent developments and research efforts on information-geometric methods in deep learning.
The topics include but are not limited to:
- Properties and complexity of neural networks/neuromanifolds
- Geometric dynamic learning with singularities
- Optimization with natural gradient methods, proximal methods, and other alternative methods
- Information geometry of generative models (f-GANs, VAEs, etc.)
- Spectral properties of Wasserstein and Fisher-Rao metrics
- Information bottleneck of neural networks
- Neural network simplification and quantization
- Geometric characterization of robustness and adversarial attacks
Please select 'S.I.: Information Geometry for Deep Learning' in the submission form of INGE (Information Geometry).
For any further queries, please contact Frank Nielsen.