PDE Methods for Machine Learning

Submission Deadline: 31st August 2021

Guest Editors: Jeff Calder (University of Minnesota), Xiuyuan Cheng (Duke University), Adam Oberman (McGill University), Lars Ruthotto (Emory University)

This special issue will feature recent developments in the application of partial differential equations (PDEs) to problems in machine learning. In machine learning, PDEs arise as continuum limits of discrete learning algorithms, such as deep neural networks and graph-based learning. Recent work has begun to harness tools and analysis from the theory of PDEs to understand the big-data limit of machine learning and to develop new algorithms with better stability properties and rigorous performance guarantees. This connection provides a rich supply of interesting and important research problems in many directions, including mean-field limits for neural networks, continuum limits for deep neural networks, Hamilton-Jacobi control-theoretic approaches to training neural networks, and PDE continuum limits for graph-based learning. We solicit high-quality original research papers on the analysis and applications of PDEs to problems in machine learning and data science.

For more information on submitting to this issue, please see the Call for Papers below:

PDE Methods for Machine Learning