
The Projected Subgradient Algorithm in Convex Optimization

  • Book
  • © 2020

Overview

  • Studies the influence of computational errors on the generalized subgradient projection algorithm
  • Contains solutions to a number of difficult and interesting problems in numerical optimization
  • Useful for experts in applications of optimization to engineering and economics
  • Focuses on the subgradient projection algorithm for minimizing convex nonsmooth functions and for computing the saddle points of convex-concave functions in the presence of computational errors

Part of the book series: SpringerBriefs in Optimization (BRIEFSOPTI)


Access this book

eBook USD 39.99
Price excludes VAT (USA)
  • Available as EPUB and PDF
  • Read on any device
  • Instant download
  • Own it forever
Softcover Book USD 54.99
Price excludes VAT (USA)
  • Compact, lightweight edition
  • Dispatched in 3 to 5 business days
  • Free shipping worldwide - see info


Table of contents (5 chapters)


About this book

This focused monograph studies subgradient algorithms for constrained minimization problems in a Hilbert space. It is of interest to experts in applications of optimization to engineering and economics. The goal is to obtain a good approximate solution of the problem in the presence of computational errors, taking into account that each iteration of an algorithm consists of several steps and that, in general, the computational errors of different steps differ. The book is especially useful because it contains solutions to a number of difficult and interesting problems in numerical optimization.

The subgradient projection algorithm is one of the most important tools in optimization theory and its applications. An optimization problem is described by an objective function and a set of feasible points. Each iteration of the algorithm consists of two steps: the first calculates a subgradient of the objective function; the second calculates a projection onto the feasible set. The computational errors of these two steps are, in general, different. This book shows that the algorithm generates a good approximate solution whenever all computational errors are bounded from above by a small positive constant. Moreover, if the computational errors of the two steps are known, one can determine an approximate solution and the number of iterations needed to obtain it. In addition to their mathematical interest, the generalizations considered in this book have significant practical meaning.
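The two-step iteration described above, an inexact subgradient step followed by an inexact projection, can be sketched as follows. This is a minimal illustration, not the book's method: the example problem (minimizing ||x - c||_1 over the unit ball), the step size `alpha`, and the error levels `delta_s` and `delta_p` are all illustrative assumptions.

```python
import numpy as np

def subgradient_l1(x, c):
    """A subgradient of f(x) = ||x - c||_1 at x."""
    return np.sign(x - c)

def project_unit_ball(x):
    """Euclidean projection onto the closed unit ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def projected_subgradient(x0, c, steps=200, alpha=0.1,
                          delta_s=1e-3, delta_p=1e-3, seed=0):
    """Projected subgradient method with bounded errors in both steps.

    delta_s and delta_p model the computational errors of the
    subgradient step and the projection step, respectively.
    """
    rng = np.random.default_rng(seed)
    x = x0
    for _ in range(steps):
        # Step 1: subgradient, computed with a small error.
        g = subgradient_l1(x, c) + delta_s * rng.standard_normal(x.shape)
        # Step 2: projection onto the feasible set, also with a small error.
        x = project_unit_ball(x - alpha * g) + delta_p * rng.standard_normal(x.shape)
    return x

# With c = (2, 0), the minimizer of f over the unit ball is (1, 0);
# the iterates settle near it, up to an error governed by alpha and the deltas.
c = np.array([2.0, 0.0])
x = projected_subgradient(np.zeros(2), c)
```

With a constant step size, the iterates do not converge exactly; they reach a neighborhood of the solution whose radius is controlled by the step size and the error bounds, which mirrors the kind of approximate-solution guarantee the book establishes.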

Reviews

“The book is rigorously written, and organized taking into account the cursiveness of reading. The long proofs of the theorems are placed in annexes to chapters, in order to emphasize the importance of every result in a generating methodology of studying and solving problems.” (Gabriela Cristescu, zbMATH 1464.90063, 2021)

Authors and Affiliations

  • Department of Mathematics, Technion – Israel Institute of Technology, Haifa, Israel

    Alexander J. Zaslavski

About the author

Alexander J. Zaslavski is professor in the Department of Mathematics, Technion – Israel Institute of Technology, Haifa, Israel.

Bibliographic Information

  • Book Title: The Projected Subgradient Algorithm in Convex Optimization

  • Authors: Alexander J. Zaslavski

  • Series Title: SpringerBriefs in Optimization

  • DOI: https://doi.org/10.1007/978-3-030-60300-7

  • Publisher: Springer Cham

  • eBook Packages: Mathematics and Statistics, Mathematics and Statistics (R0)

  • Copyright Information: The Author(s), under exclusive license to Springer Nature Switzerland AG 2020

  • Softcover ISBN: 978-3-030-60299-4, Published: 26 November 2020

  • eBook ISBN: 978-3-030-60300-7, Published: 25 November 2020

  • Series ISSN: 2190-8354

  • Series E-ISSN: 2191-575X

  • Edition Number: 1

  • Number of Pages: VI, 146

  • Topics: Optimization, Numerical Analysis
