Domain Adaptation and Representation Transfer, and Distributed and Collaborative Learning

Second MICCAI Workshop, DART 2020, and First MICCAI Workshop, DCL 2020, Held in Conjunction with MICCAI 2020, Lima, Peru, October 4–8, 2020, Proceedings

  • Conference proceedings
  • © 2020

Overview

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 12444)

Conference proceedings info: DART 2020, DCL 2020.

Table of contents (20 papers)

  1. DART 2020

  2. DCL 2020

About this book

This book constitutes the refereed proceedings of the Second MICCAI Workshop on Domain Adaptation and Representation Transfer, DART 2020, and the First MICCAI Workshop on Distributed and Collaborative Learning, DCL 2020, held in conjunction with MICCAI 2020 in October 2020. The conference was planned to take place in Lima, Peru, but changed to an online format due to the Coronavirus pandemic. 

For DART 2020, 12 full papers were accepted from 18 submissions. They deal with methodological advancements and ideas that can improve the applicability of machine learning (ML)/deep learning (DL) approaches to clinical settings by making them robust and consistent across different domains.

For DCL 2020, the 8 papers included in this book were accepted from a total of 12 submissions. They focus on the comparison, evaluation, and discussion of methodological advancements and practical ideas on machine learning applied to problems where data cannot be stored in centralized databases; where information privacy is a priority; where strong guarantees must be delivered on the amount and nature of private information that a model may reveal as a result of training; and where clusters of nodes participating in the same learning task must be orchestrated, managed, and directed.
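
The distributed setting described above can be made concrete with a minimal federated-averaging sketch (a generic illustration, not the method of any paper in this volume): each site trains on data that never leaves it, and only model parameters are shared and aggregated. All names and the synthetic data below are hypothetical.

# Minimal federated-averaging (FedAvg-style) illustration: each "client" holds
# private data locally; only model weights are exchanged and aggregated.
import numpy as np

rng = np.random.default_rng(0)

def local_step(weights, X, y, lr=0.5):
    """One gradient-descent step for linear regression on a site's private data."""
    preds = X @ weights
    grad = X.T @ (preds - y) / len(y)
    return weights - lr * grad

def federated_average(client_weights, client_sizes):
    """Aggregate local models, weighting each site by its number of samples."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Synthetic "site" datasets that never leave their client.
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 80, 30):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(30):  # communication rounds
    local_ws = [local_step(global_w, X, y) for X, y in clients]
    global_w = federated_average(local_ws, [len(y) for _, y in clients])

print("estimated weights:", global_w)  # approaches [2, -1] without pooling data

Weighting each site's update by its sample count mirrors the common FedAvg aggregation rule; the privacy guarantees mentioned above are typically addressed on top of such a scheme, for example with secure aggregation or differential privacy.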

Editors and Affiliations

  • Technical University Munich, Munich, Germany

    Shadi Albarqouni

  • University of Pennsylvania, Philadelphia, USA

    Spyridon Bakas

  • Imperial College London, London, UK

    Konstantinos Kamnitsas

  • King's College London, London, UK

    M. Jorge Cardoso

  • Vanderbilt University, Nashville, USA

    Bennett Landman

  • NVIDIA Ltd., Cambridge, UK

    Wenqi Li

  • NVIDIA GmbH and Johnson & Johnson, Munich, Germany

    Fausto Milletari

  • NVIDIA GmbH, Munich, Germany

    Nicola Rieke

  • NVIDIA Corporation, Bethesda, USA

    Holger Roth, Daguang Xu

  • NVIDIA Corporation, Santa Clara, USA

    Ziyue Xu
