

LineSearches

Line search methods for optimization and root-finding

Description

This package provides an interface to line search algorithms implemented in Julia. The code was originally written as part of Optim, but has since been separated out into its own package.

Available line search algorithms

In the docs we show how to choose between the line search algorithms in Optim.
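As a hedged sketch of what such a choice looks like (assuming Optim.jl is installed and a recent version that accepts a `linesearch` keyword on its first-order solvers; the objective here is the standard Rosenbrock test function, chosen only for illustration):

```julia
using Optim, LineSearches

# Rosenbrock test function (illustrative choice, not from this README)
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Select the line search via the solver's `linesearch` keyword
res = optimize(f, zeros(2), LBFGS(linesearch=BackTracking()))
```

Swapping `BackTracking()` for `HagerZhang()` or `MoreThuente()` changes only the line search, leaving the rest of the solver configuration untouched.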

  • HagerZhang (taken from the conjugate gradient implementation by Hager and Zhang, 2006)
  • MoreThuente (from the algorithm in Moré and Thuente, 1994)
  • BackTracking (described in Nocedal and Wright, 2006)
  • StrongWolfe (Nocedal and Wright, 2006)
  • Static (takes the proposed initial step length as-is)
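To illustrate the kind of condition these algorithms enforce, here is a minimal Armijo backtracking sketch in plain Julia. It is an illustration of the general technique only, not the package's BackTracking implementation, and the helper name and constants are hypothetical:

```julia
# Minimal Armijo backtracking sketch (illustration only; not the
# package's BackTracking implementation).
# ϕ(α) is the objective along the search direction, ϕ0 = ϕ(0),
# and dϕ0 = ϕ'(0) < 0 is the directional derivative at zero.
function armijo_backtrack(ϕ, ϕ0, dϕ0, α; c₁=1e-4, ρ=0.5, maxiter=50)
    for _ in 1:maxiter
        # Accept α once the sufficient-decrease (Armijo) condition holds.
        ϕ(α) <= ϕ0 + c₁ * α * dϕ0 && return α, ϕ(α)
        α *= ρ  # otherwise shrink the step and try again
    end
    return α, ϕ(α)  # give up after maxiter shrinks
end

ϕ(α) = (α - π)^4
α, ϕα = armijo_backtrack(ϕ, ϕ(0.0), 4 * (0.0 - π)^3, 9.0)
```

The full algorithms in the package add safeguards and, for HagerZhang, MoreThuente, and StrongWolfe, also enforce curvature (Wolfe-type) conditions rather than sufficient decrease alone.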

Available initial step length procedures

The package provides procedures for computing the initial step length that is passed to the line search algorithm. See the docs for their usage in Optim.

  • InitialPrevious (Use the step length from the previous optimization iteration)
  • InitialStatic (Use the same initial step length each time)
  • InitialHagerZhang (Taken from Hager and Zhang, 2006)
  • InitialQuadratic (Propose initial step length based on a quadratic interpolation)
  • InitialConstantChange (Propose initial step length assuming constant change in step length)
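To give a flavor of what such procedures compute, here is one common initial-step heuristic based on quadratic interpolation (Nocedal and Wright, 2006, Eq. 3.60). It is a hedged sketch of the general idea, not necessarily what InitialQuadratic implements, and the function name is hypothetical:

```julia
# Common quadratic-interpolation initial-step heuristic
# (Nocedal & Wright, 2006, Eq. 3.60); an illustration of the idea,
# not necessarily the package's InitialQuadratic implementation.
# f_prev: objective at the previous iterate, f: current objective,
# dϕ0: directional derivative ϕ'(0) < 0 at the current iterate.
function quadratic_initial_step(f_prev, f, dϕ0)
    α = 2 * (f - f_prev) / dϕ0  # minimizer of the interpolating quadratic
    # Slightly inflate and cap at 1.0, as is common for Newton-type methods.
    return min(1.0, 1.01 * α)
end
```

The intuition is that the step is scaled so the predicted first-order decrease matches the decrease actually achieved on the previous iteration.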

Documentation

For more details and options, see the documentation:

  • STABLE — most recently tagged version of the documentation.
  • LATEST — in-development version of the documentation.

Example usage

Here is how to run a simple line search on a one-dimensional function:

using LineSearches

# Objective, derivative, and combined value-and-derivative
# along the (one-dimensional) search direction
ϕ(x) = (x - π)^4
dϕ(x) = 4 * (x - π)^3
ϕdϕ(x) = ϕ(x), dϕ(x)

# Initial step length, and the value and derivative at zero
α0 = 9.0
ϕ0 = ϕ(0.0)
dϕ0 = dϕ(0.0)

# Run each line search from the same starting data
for ls in (Static, BackTracking, HagerZhang, MoreThuente, StrongWolfe)
    res = (ls())(ϕ, dϕ, ϕdϕ, α0, ϕ0, dϕ0)
    println(ls, ": ", res)
end

For more examples, see the documentation.

References

  • Hager, W. W., and Zhang, H. (2006). "Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent." ACM Transactions on Mathematical Software 32(1): 113–137.
  • Moré, J. J., and Thuente, D. J. (1994). "Line search algorithms with guaranteed sufficient decrease." ACM Transactions on Mathematical Software 20(3): 286–307.
  • Nocedal, J., and Wright, S. J. (2006). "Numerical Optimization." Springer Science & Business Media.