update solvers to follow ProximalAlgorithms (#19)
* update solvers to follow ProximalAlgorithms

* updated title

* remove println

* REQUIRE -> Project.toml

* re-enabled tests

* increment tolerances a bit

* add Julia 1.2 to travis
lostella authored Jul 8, 2019
1 parent 2620090 commit 4dde469
Showing 13 changed files with 176 additions and 294 deletions.
1 change: 1 addition & 0 deletions .travis.yml
@@ -6,6 +6,7 @@ os:
julia:
- 1.0
- 1.1
- 1.2
- nightly
matrix:
allow_failures:
29 changes: 29 additions & 0 deletions Project.toml
@@ -0,0 +1,29 @@
name = "StructuredOptimization"
uuid = "46cd3e9d-64ff-517d-a929-236bc1a1fc9d"
version = "0.2.0"

[deps]
AbstractOperators = "d9c5613a-d543-52d8-9afd-8f241a8c3f1c"
DSP = "717857b8-e6f2-59f4-9121-6e50c889abd2"
FFTW = "7a1cc6ca-52ef-59f5-83cd-3a7055c09341"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
ProximalAlgorithms = "140ffc9f-1907-541a-a177-7475e0a401e9"
ProximalOperators = "a725b495-10eb-56fe-b38b-717eba820537"
RecursiveArrayTools = "731186ca-8d62-57ce-b412-fbd966d074cd"

[compat]
AbstractOperators = "≥ 0.1.0"
DSP = "≥ 0.5.1"
FFTW = "≥ 0.2.4"
ProximalAlgorithms = "≥ 0.3.0"
ProximalOperators = "≥ 0.8.0"
RecursiveArrayTools = "≥ 0.18.0"
julia = "1.0.0"

[extras]
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

[targets]
test = ["LinearAlgebra", "Test", "Random"]
7 changes: 0 additions & 7 deletions REQUIRE

This file was deleted.

43 changes: 16 additions & 27 deletions docs/src/solvers.md
@@ -9,7 +9,7 @@
!!! note "Problem warm-starting"

By default *warm-starting* is always enabled.
-For example, if two problems that utilize the same variables are solved consecutively,
+For example, if two problems that involve the same variables are solved consecutively,
the second one will be automatically warm-started by the solution of the first one.
That is because the variables are always linked to their respective data vectors.
If one wants to avoid this, the optimization variables need to be manually re-initialized
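A minimal sketch of the warm-starting behavior described in this note (assuming the `Variable`/`@minimize` API shown elsewhere in these docs; treat the exact constraint syntax as illustrative):

```julia
using StructuredOptimization

x = Variable(4)                 # x is linked to an underlying data vector
A, b = randn(10, 4), randn(10)

@minimize ls(A*x - b) st norm(x) <= 1   # first solve: x starts from zero
@minimize ls(A*x - b) st norm(x) <= 2   # second solve: warm-started from x's current data

~x .= 0.0   # manual re-initialization, to cold-start the next solve
```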
@@ -18,22 +18,21 @@

## Specifying solver and options

-As shown above it is possible to choose the type of algorithm and specify its options by creating a `Solver` object.
-Currently, the following algorithms are supported:
+You can pick the algorithm to use as a `Solver` object from the
+[`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl)
+package. Currently, the following algorithms are supported:

-* *Proximal Gradient (PG)* [[1]](http://www.mit.edu/~dimitrib/PTseng/papers/apgm.pdf), [[2]](http://epubs.siam.org/doi/abs/10.1137/080716542)
-* *Fast Proximal Gradient (FPG)* [[1]](http://www.mit.edu/~dimitrib/PTseng/papers/apgm.pdf), [[2]](http://epubs.siam.org/doi/abs/10.1137/080716542)
-* *ZeroFPR* [[3]](https://arxiv.org/abs/1606.06256)
-* *PANOC* [[4]](https://doi.org/10.1109/CDC.2017.8263933)
+* `ProximalAlgorithms.ForwardBackward`, also known as the *proximal gradient*
+  method [[1]](http://www.mit.edu/~dimitrib/PTseng/papers/apgm.pdf), [[2]](http://epubs.siam.org/doi/abs/10.1137/080716542). Nesterov acceleration can be enabled, which significantly
+  improves its performance on convex problems.
+* `ProximalAlgorithms.ZeroFPR`, a Newton-type forward-backward algorithm,
+  proposed in [[3]](https://arxiv.org/abs/1606.06256), using L-BFGS
+  directions to accelerate convergence.
+* `ProximalAlgorithms.PANOC`, another Newton-type forward-backward algorithm,
+  proposed in [[4]](https://doi.org/10.1109/CDC.2017.8263933), also using
+  L-BFGS directions.
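Since the solver is now a plain ProximalAlgorithms object, constructing and configuring it looks roughly like this. Keyword names such as `tol`, `maxit`, and `fast` are assumptions based on ProximalAlgorithms v0.3 and may vary between releases:

```julia
using ProximalAlgorithms

# Proximal gradient; `fast = true` would enable Nesterov acceleration.
solver = ProximalAlgorithms.ForwardBackward(tol = 1e-6, maxit = 1000)

# Newton-type variants accelerated with L-BFGS directions:
zerofpr = ProximalAlgorithms.ZeroFPR()
panoc   = ProximalAlgorithms.PANOC()
```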

-```@docs
-PG
-FPG
-ZeroFPR
-PANOC
-```

-## Build and solve
+## Parse and solve

The macro [`@minimize`](@ref) automatically parses and solves the problem.
An alternative syntax is given by the functions [`problem`](@ref) and [`solve`](@ref).
@@ -43,18 +43,8 @@ problem
solve
```
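For concreteness, a sketch of the two equivalent syntaxes, reusing the example from the docstrings in this commit:

```julia
using StructuredOptimization, ProximalAlgorithms

x = Variable(4)
A, b = randn(10, 4), randn(10)

# One-shot macro form:
@minimize ls(A*x - b) st norm(x) <= 1

# Equivalent explicit form: parse first, then solve.
p = problem(ls(A*x - b), norm(x) <= 1)
solve(p, ProximalAlgorithms.ForwardBackward())
```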

-It is important to stress that the `Solver` objects created using
-the functions above ([`PG`](@ref), [`FPG`](@ref), etc.)
-specify only the type of algorithm to be used together with its options.
-The actual solver
-(namely the one of [`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl))
-is constructed altogether with the problem formulation.
-The problem parsing procedure can be separated from the solver application using the functions [`build`](@ref) and [`solve!`](@ref).
-
-```@docs
-build
-solve!
-```
+Once again, the `Solver` object is to be picked from the
+[`ProximalAlgorithms.jl`](https://github.com/kul-forbes/ProximalAlgorithms.jl) package.

## References

15 changes: 13 additions & 2 deletions src/StructuredOptimization.jl
@@ -9,7 +9,18 @@ using ProximalOperators
using ProximalAlgorithms

include("syntax/syntax.jl")
include("calculus/precomposeNonlinear.jl") #TODO move to ProximalOperators?
include("solvers/solvers.jl")
include("calculus/precomposeNonlinear.jl") # TODO move to ProximalOperators?
include("arraypartition.jl") # TODO move to ProximalOperators?

# problem parsing
include("solvers/terms_extract.jl")
include("solvers/terms_properties.jl")
include("solvers/terms_splitting.jl")

# solver calls
include("solvers/solvers_options.jl")
include("solvers/build_solve.jl")
include("solvers/minimize.jl")


end
36 changes: 36 additions & 0 deletions src/arraypartition.jl
@@ -0,0 +1,36 @@
import ProximalOperators
import RecursiveArrayTools

@inline function ProximalOperators.prox(
h::ProximalOperators.ProximableFunction,
x::RecursiveArrayTools.ArrayPartition,
gamma...
)
# unwrap
y, fy = ProximalOperators.prox(h, x.x, gamma...)
# wrap
return RecursiveArrayTools.ArrayPartition(y), fy
end

@inline function ProximalOperators.gradient(
h::ProximalOperators.ProximableFunction,
x::RecursiveArrayTools.ArrayPartition
)
# unwrap
grad, fx = ProximalOperators.gradient(h, x.x)
# wrap
return RecursiveArrayTools.ArrayPartition(grad), fx
end

@inline ProximalOperators.prox!(
y::RecursiveArrayTools.ArrayPartition,
h::ProximalOperators.ProximableFunction,
x::RecursiveArrayTools.ArrayPartition,
gamma...
) = ProximalOperators.prox!(y.x, h, x.x, gamma...)

@inline ProximalOperators.gradient!(
y::RecursiveArrayTools.ArrayPartition,
h::ProximalOperators.ProximableFunction,
x::RecursiveArrayTools.ArrayPartition
) = ProximalOperators.gradient!(y.x, h, x.x)
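A usage sketch for the wrappers above: `SeparableSum` applies one function per block and accepts a tuple of arrays, which is what the unwrapping to `x.x` relies on. The pairing of functions to blocks here is purely illustrative:

```julia
using ProximalOperators, RecursiveArrayTools

xp = ArrayPartition(randn(5), randn(3))      # two variable blocks in one vector
h = SeparableSum(NormL1(0.1), NormL2(1.0))   # one function per block

# Thanks to the wrapper, prox now accepts and returns ArrayPartitions:
y, hy = prox(h, xp, 0.5)
y isa ArrayPartition   # true
```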
100 changes: 30 additions & 70 deletions src/solvers/build_solve.jl
@@ -1,11 +1,12 @@
export build

"""
`build(terms::Tuple, solver_opt::ForwardBackwardSolver)`
parse_problem(terms::Tuple, solver::ForwardBackwardSolver)
Takes as input a tuple containing the terms defining the problem and the solver options.
Takes as input a tuple containing the terms defining the problem and the solver.
Returns a tuple containing the optimization variables and the built solver.
Returns a tuple containing the optimization variables and the problem terms
to be fed into the solver.
# Example
@@ -18,82 +19,37 @@ julia> A, b = randn(10,4), randn(10);
julia> p = problem( ls(A*x - b ) , norm(x) <= 1 );
julia> build(p, PG());
```
"""
function build(terms::Tuple, solver::ForwardBackwardSolver)
function parse_problem(terms::Tuple, solver::T) where T <: ForwardBackwardSolver
x = extract_variables(terms)
# Separate smooth and nonsmooth
smooth, nonsmooth = split_smooth(terms)
# Separate quadratic and nonquadratic
quadratic, smooth = split_quadratic(smooth)
kwargs = Array{Any, 1}()
if is_proximable(nonsmooth)
g = extract_proximable(x, nonsmooth)
append!(kwargs, [(:g, g)])
if !isempty(quadratic)
fq = extract_functions(quadratic)
Aq = extract_operators(x, quadratic)
append!(kwargs, [(:fq, fq)])
append!(kwargs, [(:Aq, Aq)])
end
kwargs = Dict{Symbol, Any}(:g => g)
if !isempty(smooth)
if is_linear(smooth)
fs = extract_functions(smooth)
As = extract_operators(x, smooth)
append!(kwargs, [(:As, As)])
else
fs = extract_functions_nodisp(smooth)
As = extract_affines(x, smooth)
fs = PrecomposeNonlinear(fs, As)
f = extract_functions(smooth)
A = extract_operators(x, smooth)
kwargs[:A] = A
else # ??
f = extract_functions_nodisp(smooth)
A = extract_affines(x, smooth)
f = PrecomposeNonlinear(f, A)
end
append!(kwargs, [(:fs, fs)])
kwargs[:f] = f
end
return build_iterator(x, solver; kwargs...)
return (x, kwargs)
end
error("Sorry, I cannot solve this problem")
end
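Reading the new `parse_problem` together with `solve` below: it returns the variables plus a `Dict` of keyword arguments (`:g` for the proximable term, `:f` and `:A` for the smooth term and its operator) that is splatted into the solver call. A hypothetical peek, noting that `parse_problem` is internal and not exported:

```julia
using StructuredOptimization, ProximalAlgorithms

x = Variable(4)
A, b = randn(10, 4), randn(10)
p = problem(ls(A*x - b), norm(x) <= 1)

vars, kw = StructuredOptimization.parse_problem(p, ProximalAlgorithms.ForwardBackward())
sort(collect(keys(kw)))   # => [:A, :f, :g] for this problem
```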

################################################################################
export solve!

"""
`solve!( x_solver )`
Takes as input a tuple containing the optimization variables and the built solver.
Solves the problem, returning a tuple containing the iterations taken and the built solver.
# Example
```julia
julia> x = Variable(4)
Variable(Float64, (4,))
julia> A, b = randn(10,4), randn(10);
julia> p = problem( ls(A*x - b ) , norm(x) <= 1 );
julia> x_solver = build(p, PG(verbose = 0));
julia> solve!(x_solver);
```
"""
function solve!(x_and_iter::Tuple{Tuple{Vararg{Variable}}, ProximalAlgorithms.ProximalAlgorithm})
x, iterator = x_and_iter
it, x_star = ProximalAlgorithms.run!(iterator)
~x .= x_star
return it, iterator
error("Sorry, I cannot parse this problem for solver of type $(T)")
end


export solve

"""
`solve(terms::Tuple, solver_opt::ForwardBackwardSolver)`
solve(terms::Tuple, solver::ForwardBackwardSolver)
Takes as input a tuple containing the terms defining the problem and the solver options.
@@ -102,22 +58,26 @@ Solves the problem returning a tuple containing the iterations taken and the built solver.
# Example
```julia
julia> x = Variable(4)
Variable(Float64, (4,))
julia> A, b = randn(10,4), randn(10);
julia> solve(p,PG());
it | gamma | fpr |
------|------------|------------|
1 | 7.6375e-02 | 1.8690e+00 |
12 | 7.6375e-02 | 9.7599e-05 |
julia> p = problem(ls(A*x - b ), norm(x) <= 1);
```
julia> solve(p, ProximalAlgorithms.ForwardBackward());
julia> ~x
4-element Array{Float64,1}:
-0.6427139974173074
-0.29043653211431103
-0.6090539651510192
0.36279278640995494
```
"""
function solve(terms::Tuple, solver::ForwardBackwardSolver)
built_slv = build(terms, solver)
return solve!(built_slv)
x, kwargs = parse_problem(terms, solver)
x_star, it = solver(~x; kwargs...)
~x .= x_star
return x, it
end
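Note the design here: `solve` writes the minimizer back into the variable's data vector (`~x .= x_star`), which is precisely what enables the warm-starting described in the docs. A quick check, under the same assumed API:

```julia
using StructuredOptimization, ProximalAlgorithms, LinearAlgebra

x = Variable(4)
A, b = randn(10, 4), randn(10)
p = problem(ls(A*x - b), norm(x) <= 1)

solve(p, ProximalAlgorithms.ForwardBackward())
norm(~x)   # at most 1 (up to tolerance): the minimizer now lives in x's data vector
# Any further solve involving `x` will warm-start from this point.
```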
9 changes: 0 additions & 9 deletions src/solvers/solvers.jl

This file was deleted.


2 comments on commit 4dde469

@lostella (Member Author)

@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/1890

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if Julia TagBot is installed, or can be done manually through the GitHub interface, or via:

git tag -a v0.2.0 -m "<description of version>" 4dde4691217ceeb297fed77288d29790e35cf7f2
git push origin v0.2.0
