non mutating gradient #159

Draft
wants to merge 1 commit into base: master

Conversation

ArnoStrouwen (Contributor):
Prototype of a version that is compatible with Zygote (nested differentiation) and StaticArrays, to fix #148.
This works as long as the input element type and the output type are the same.
I could use some advice on how to convert the element type of an arbitrary array to the correct output type.

Quite a bit of performance might be sacrificed for this generalization.

This approach differs from how ForwardDiff handles the same problem: there, a special dispatch exists for StaticArrays, any other AbstractArray is assumed to either support mutation or be convertible to a similar mutable array, and Zygote is (or should be) handled by a custom rule.
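To make the idea concrete, here is a minimal sketch of a fully non-mutating central-difference gradient. This is illustrative only, not the PR's actual code: the function name is hypothetical, and each perturbed input is built as a fresh array so the original `x` is never written to, which is the property needed for immutable inputs (e.g. `SVector`) and Zygote-traced code.

```julia
# Hypothetical sketch (not the FiniteDiff.jl implementation): a
# non-mutating central-difference gradient for array -> scalar f.
function nonmutating_central_gradient(f, x::AbstractArray;
                                      relstep = cbrt(eps(float(eltype(x)))))
    map(eachindex(x)) do i
        ϵ = max(relstep * abs(x[i]), relstep)  # per-component step size
        # Build x ± ϵ·eᵢ as fresh arrays instead of writing into x.
        xp = [j == i ? x[j] + ϵ : x[j] for j in eachindex(x)]
        xm = [j == i ? x[j] - ϵ : x[j] for j in eachindex(x)]
        (f(xp) - f(xm)) / (2ϵ)                 # central difference
    end
end
```

As the comment above notes, allocating two fresh arrays per component is exactly where this generalization may sacrifice performance relative to the mutating cache-based path.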

```diff
@@ -130,7 +130,12 @@ function finite_difference_gradient(
        end
    end
    cache = GradientCache(df, x, fdtype, returntype, inplace)
    finite_difference_gradient!(df, f, x, cache, relstep=relstep, absstep=absstep, dir=dir)
    if typeof(x) <: AbstractArray && fdtype == Val(:central)
```
Member:

why the check for AbstractArray? I would think that the non-! call should always do a non-mutating version by default.

ArnoStrouwen (Contributor, Author):

The check for AbstractArray is only there because I changed the vector->scalar code and not the scalar->vector code.
Per #157, the latter should be replaced with jacobian calls.
Once that is done, x should be restricted to Num.
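On the element-type question raised earlier in the thread, one non-mutating option (an assumption on my part, not what the PR settles on) is to broadcast a type constructor over the array: this converts the eltype without writing into the original, and broadcasting preserves the container type, so an `SVector` input would stay an `SVector`.

```julia
# Illustrative only: converting the eltype of an arbitrary array to a
# target output type without mutation, via constructor broadcasting.
x  = [1, 2, 3]      # Vector{Int}
xf = Float64.(x)    # Vector{Float64}; x itself is untouched
```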

@ArnoStrouwen ArnoStrouwen marked this pull request as draft November 14, 2022 23:58
Successfully merging this pull request may close these issues.

finite_difference_gradient is mutating arrays