Line search methods

From Cornell University Computational Optimization Open Textbook - Optimization Wiki

Revision as of 12:29, 28 November 2021

Authors: Lihe Cao, Zhengyi Sui, Jiaqi Zhang, Yuqing Yan, and Yuhui Gu (6800 Fall 2021).

==Introduction==

When solving unconstrained optimization problems, the user needs to supply a starting point for the algorithm. From the initial point <math>x_0</math>, an optimization algorithm generates a sequence of iterates <math>\{x_k\}_{k=0}^{\infty}</math> that terminates when an approximate solution has been reached or no further progress can be made. Line search is one of the two fundamental strategies (the other being trust region) for locating the new iterate <math>x_{k+1}</math> given the current point.
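The iterative scheme described above can be sketched as a generic template in which the choice of search direction and step length is left open; this is an illustrative sketch (the function names and the fixed step length in the example are assumptions, not part of the article):

```python
import numpy as np

def line_search_minimize(f, grad, x0, choose_direction, choose_step,
                         tol=1e-8, max_iter=1000):
    """Generic line search template: x_{k+1} = x_k + alpha_k * p_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:         # approximate solution reached
            break
        p = choose_direction(x, g)          # search direction p_k
        alpha = choose_step(f, grad, x, p)  # step length alpha_k
        x = x + alpha * p
    return x

# Example: minimize f(x, y) = x^2 + 4 y^2 with the negative-gradient
# direction and an (assumed) fixed step length of 0.1.
f = lambda x: x[0]**2 + 4 * x[1]**2
grad = lambda x: np.array([2 * x[0], 8 * x[1]])
x_star = line_search_minimize(f, grad, [2.0, 1.0],
                              choose_direction=lambda x, g: -g,
                              choose_step=lambda f, grad, x, p: 0.1)
```

Specific methods differ only in how `choose_direction` and `choose_step` are defined, which is the subject of the sections below.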

==Generic Line Search Method==

===Basic Algorithm===

===Search Direction for Line Search===

===Step Length===

===Convergence===

==Exact Search==

===Steepest Descent Method===

Given the intuition that the negative gradient <math>-\nabla f_k</math> is an effective search direction, the steepest descent method builds on this idea to establish a systematic procedure for minimizing the objective function. Setting <math>p_k = -\nabla f_k</math> as the direction, steepest descent computes the step length <math>\alpha_k</math> by minimizing a single-variable objective function. More specifically, the steps of the steepest descent method are as follows.

# Choose a starting point <math>x_0</math> and a tolerance <math>\epsilon > 0</math>; set <math>k = 0</math>.
# If <math>\|\nabla f(x_k)\| \le \epsilon</math>, stop and return <math>x_k</math>.
# Set the search direction <math>p_k = -\nabla f(x_k)</math>.
# Compute the step length <math>\alpha_k = \arg\min_{\alpha \ge 0} f(x_k + \alpha p_k)</math>.
# Update <math>x_{k+1} = x_k + \alpha_k p_k</math>, set <math>k \leftarrow k + 1</math>, and return to step 2.
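For a strongly convex quadratic <math>f(x) = \tfrac{1}{2}x^T Q x - b^T x</math>, the one-dimensional minimization in the steps above has the closed form <math>\alpha_k = \frac{g_k^T g_k}{g_k^T Q g_k}</math>, which allows a compact sketch of the method (the function name and the particular <math>Q</math>, <math>b</math> below are illustrative assumptions):

```python
import numpy as np

def steepest_descent_quadratic(Q, b, x0, tol=1e-10, max_iter=10000):
    """Steepest descent for f(x) = 0.5 x^T Q x - b^T x, Q symmetric
    positive definite. The exact step length minimizing f along -g
    is alpha = (g^T g) / (g^T Q g)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = Q @ x - b                       # gradient of f at x
        if np.linalg.norm(g) < tol:
            return x, k
        alpha = (g @ g) / (g @ (Q @ g))     # exact 1-D minimizer
        x = x - alpha * g
    return x, max_iter

# Illustrative problem: the minimizer satisfies Q x = b.
Q = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star, iters = steepest_descent_quadratic(Q, b, np.zeros(2))
```

For non-quadratic objectives the exact step would instead require a numerical one-dimensional minimization.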

One advantage of the steepest descent method is its well-developed convergence theory: under mild assumptions, the method converges to a stationary point from any starting point.

'''Theorem (global convergence of steepest descent).'''[1] Let the gradient of <math>f \in C^1</math> be uniformly Lipschitz continuous on <math>\mathbb{R}^n</math>. Then, for the iterates generated with steepest-descent search directions, one of the following situations occurs:

* <math>\nabla f(x_k) = 0</math> for some finite <math>k</math>,
* <math>\lim_{k \to \infty} f(x_k) = -\infty</math>, or
* <math>\lim_{k \to \infty} \nabla f(x_k) = 0</math>.

The steepest descent method is a special case of gradient descent in which the step length is rigorously defined by exact minimization. More general methods are obtained by relaxing the choice of <math>\alpha</math>.
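One widely used relaxed choice of <math>\alpha</math> is the backtracking (Armijo) rule: start from a trial step and shrink it until a sufficient-decrease condition holds. A minimal sketch, where the constants <math>\alpha_0 = 1</math>, <math>\rho = 0.5</math>, and <math>c = 10^{-4}</math> are conventional illustrative values rather than values prescribed by this article:

```python
import numpy as np

def backtracking_step(f, x, g, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Backtracking (Armijo) step length: shrink alpha until the
    sufficient-decrease condition
        f(x + alpha p) <= f(x) + c * alpha * g^T p
    holds. Assumes p is a descent direction (g^T p < 0)."""
    alpha = alpha0
    while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
        alpha *= rho
    return alpha

# Usage with the steepest-descent direction p = -g.
f = lambda x: x[0]**2 + 4 * x[1]**2
grad = lambda x: np.array([2 * x[0], 8 * x[1]])
x = np.array([2.0, 1.0])
g = grad(x)
alpha = backtracking_step(f, x, g, -g)
```

Unlike the exact search, this requires only a handful of function evaluations per iteration while still guaranteeing decrease in <math>f</math>.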

==Inexact Search==

===Backtracking===

===Zoutendijk’s Theorem===

==Numeric Example==

==Reference==

  1. Raphael Hauser, ''Line Search Methods for Unconstrained Optimization'', Oxford University Computing Laboratory.