Convergence and efficiency of subgradient methods for quasiconvex minimization

Published in: Mathematical Programming

Abstract.

We study a general subgradient projection method for minimizing a quasiconvex objective subject to a convex set constraint in a Hilbert space. Our setting is very general: the objective is only upper semicontinuous on its domain, which need not be open, and various subdifferentials may be used. We extend previous results by proving convergence in objective values and to the generalized solution set for classical stepsizes t_k → 0, ∑_k t_k = ∞, and weak or strong convergence of the iterates to a solution for {t_k} ∈ ℓ² ∖ ℓ¹ under mild regularity conditions. For bounded constraint sets and suitable stepsizes, the method finds ε-solutions with an efficiency estimate of O(ε⁻²), thus being optimal in the sense of Nemirovskii.
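The abstract describes a projected subgradient iteration with diminishing, non-summable stepsizes. The following is a minimal sketch of that kind of iteration on a toy quasiconvex problem, assuming the standard update x_{k+1} = P_C(x_k − t_k g_k/‖g_k‖); the box constraint, objective, stepsize choice, and function names are illustrative only and are not taken from the paper.

    import numpy as np

    def project_box(x, lo, hi):
        # Euclidean projection onto the box [lo, hi]^n, a simple convex constraint set.
        return np.clip(x, lo, hi)

    def subgradient_projection(x0, subgrad, project, steps):
        # Projected subgradient iteration x_{k+1} = P_C(x_k - t_k * g_k / ||g_k||).
        # Normalizing the direction g_k is a common choice in the quasiconvex setting,
        # where only the direction of the generalized subgradient carries information.
        x = np.asarray(x0, dtype=float)
        for t in steps:
            g = subgrad(x)
            norm_g = np.linalg.norm(g)
            if norm_g == 0.0:        # generalized stationary point: stop
                break
            x = project(x - t * g / norm_g)
        return x

    # Toy quasiconvex objective: f(x) = sqrt(|x_1| + |x_2|) is quasiconvex, being a
    # nondecreasing transform of the convex function |x_1| + |x_2|; a subgradient of
    # the inner convex part serves as the search direction here.
    f = lambda x: np.sqrt(np.abs(x).sum())
    subgrad = lambda x: np.sign(x)

    # Classical stepsizes: t_k -> 0 with sum_k t_k = infinity.
    steps = (1.0 / (k + 1) for k in range(5000))

    x_final = subgradient_projection(
        x0=[2.0, -3.0],
        subgrad=subgrad,
        project=lambda x: project_box(x, -5.0, 5.0),
        steps=steps,
    )
    print(x_final, f(x_final))  # the iterates approach the minimizer (0, 0)

Note that the illustrative choice t_k = 1/(k+1) happens to satisfy both stepsize regimes mentioned in the abstract: the classical rule (t_k → 0, ∑ t_k = ∞) and the ℓ² ∖ ℓ¹ rule (∑ t_k² < ∞, ∑ t_k = ∞).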


Additional information

Received: October 4, 1998 / Accepted: July 24, 2000 / Published online: January 17, 2001



Cite this article

Kiwiel, K. Convergence and efficiency of subgradient methods for quasiconvex minimization. Math. Program. 90, 1–25 (2001). https://doi.org/10.1007/PL00011414

