Read While You Wait - Get immediate ebook access, if available*, when you order a print book

Linear Algebra and Optimization for Machine Learning

A Textbook

Authors: Aggarwal, Charu

Free Preview
  • First textbook to provide an integrated treatment of linear algebra and optimization with a special focus on machine learning issues
  • Includes many examples to simplify the exposition and facilitate learning
  • Complemented by a solution manual for the numerous exercises in the book
see more benefits

Buy this book

eBook $54.99
price for USA in USD (gross)
  • ISBN 978-3-030-40344-7
  • Digitally watermarked, DRM-free
  • Included format: EPUB, PDF
  • eBooks can be used on all reading devices
  • Immediate eBook download after purchase
Hardcover $69.99
price for USA in USD
  • ISBN 978-3-030-40343-0
  • Free shipping for individuals worldwide
  • Immediate ebook access, if available*, with your print order
  • Usually dispatched within 3 to 5 business days.
About this Textbook

This textbook introduces linear algebra and optimization in the context of machine learning. Examples and exercises are provided throughout the text, together with access to a solutions manual. The book targets graduate-level students and professors in computer science, mathematics, and data science; advanced undergraduate students can also use it. The chapters are organized as follows:

1. Linear algebra and its applications: These chapters cover the basics of linear algebra together with common applications to singular value decomposition, matrix factorization, similarity matrices (kernel methods), and graph analysis. Numerous machine learning applications are used as examples, such as spectral clustering, kernel-based classification, and outlier detection. The tight integration of linear algebra methods with machine learning examples differentiates this book from generic volumes on linear algebra. The focus is clearly on the aspects of linear algebra most relevant to machine learning, and on teaching readers how to apply these concepts.

2. Optimization and its applications: Much of machine learning is posed as an optimization problem in which we try to maximize the accuracy of regression and classification models. The “parent problem” of optimization-centric machine learning is least-squares regression. Interestingly, this problem arises in both linear algebra and optimization, and is one of the key problems connecting the two fields. Least-squares regression is also the starting point for support vector machines, logistic regression, and recommender systems. Furthermore, the methods for dimensionality reduction and matrix factorization also require the development of optimization methods. A general view of optimization in computational graphs is discussed together with its applications to backpropagation in neural networks.
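As a small illustration of the linear algebra material described in the first part, the sketch below computes a low-rank approximation of a matrix via the singular value decomposition using NumPy. The matrix and the choice of rank are arbitrary examples, not taken from the book.

```python
import numpy as np

# A small example matrix (illustrative values only).
A = np.array([[4.0, 0.0, 2.0],
              [0.0, 3.0, 1.0],
              [2.0, 1.0, 3.0],
              [1.0, 2.0, 0.0]])

# Singular value decomposition: A = U diag(s) V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # number of singular values to keep
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By the Eckart-Young theorem, A_k is the best rank-k approximation of A
# in the Frobenius norm, and the squared error equals the sum of the
# squares of the discarded singular values.
err = np.linalg.norm(A - A_k, "fro") ** 2
print(err, np.sum(s[k:] ** 2))  # the two quantities match
```

This rank-truncation step is the computational core of techniques the book lists, such as dimensionality reduction and matrix factorization for recommender systems.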
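The least-squares connection between the two fields described above can be sketched in a few lines: linear algebra gives a closed-form answer via the normal equations, while optimization treats the same problem with gradient descent, and both arrive at the same solution. The synthetic data, step size, and iteration count below are illustrative choices, not from the book.

```python
import numpy as np

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=50)

# Linear algebra view: solve the normal equations X^T X w = X^T y.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# Optimization view: minimize ||Xw - y||^2 by gradient descent.
w = np.zeros(3)
lr = 0.005  # step size, chosen small enough for stability here
for _ in range(2000):
    grad = 2 * X.T @ (X @ w - y)  # gradient of the squared error
    w -= lr * grad

print(np.allclose(w, w_closed, atol=1e-5))  # the two views agree
```

Replacing the squared-error objective with a hinge or logistic loss turns this same template into the support vector machine or logistic regression, which is the sense in which least-squares is the "parent problem."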

A frequent challenge faced by beginners in machine learning is the extensive background required in linear algebra and optimization. One problem is that the existing linear algebra and optimization courses are not specific to machine learning; therefore, one would typically have to complete more course material than is necessary to pick up machine learning. Furthermore, certain types of ideas and tricks from optimization and linear algebra recur more frequently in machine learning than other application-centric settings. Therefore, there is significant value in developing a view of linear algebra and optimization that is better suited to the specific perspective of machine learning.

About the authors

Charu C. Aggarwal is a Distinguished Research Staff Member (DRSM) at the IBM T. J. Watson Research Center in Yorktown Heights, New York. He completed his undergraduate degree in Computer Science from the Indian Institute of Technology at Kanpur in 1993 and his Ph.D. in Operations Research from the Massachusetts Institute of Technology in 1996. He has published more than 400 papers in refereed conferences and journals and has applied for or been granted more than 80 patents. He is the author or editor of 19 books, including textbooks on data mining, neural networks, machine learning (for text), recommender systems, and outlier analysis. Because of the commercial value of his patents, he has thrice been designated a Master Inventor at IBM. He has received several internal and external awards, including the EDBT Test-of-Time Award (2014), the IEEE ICDM Research Contributions Award (2015), and the ACM SIGKDD Innovation Award (2019). He has served as editor-in-chief of ACM SIGKDD Explorations, and is currently serving as an editor-in-chief of the ACM Transactions on Knowledge Discovery from Data. He is a fellow of SIAM, the ACM, and the IEEE, for “contributions to knowledge discovery and data mining algorithms.”

Table of contents (11 chapters)


Bibliographic Information

Book Title
Linear Algebra and Optimization for Machine Learning
Book Subtitle
A Textbook
Authors
Aggarwal, Charu
Copyright
2020
Publisher
Springer International Publishing
Copyright Holder
Springer Nature Switzerland AG
eBook ISBN
978-3-030-40344-7
DOI
10.1007/978-3-030-40344-7
Hardcover ISBN
978-3-030-40343-0
Edition Number
1
Number of Pages
XXI, 495
Number of Illustrations
67 b/w illustrations, 26 illustrations in colour
Topics

*Immediately available upon purchase, as print book shipments may be delayed due to the COVID-19 crisis. eBook access is temporary and does not include ownership of the eBook. Only valid for books with an eBook version. Springer Reference Works are not included.