Can DifferentiationInterface be useful for Turing? #2187
Turing's current interface to autodiff backends is based on

Looks great! Is there a summary of how this is different from AbstractDifferentiation.jl somewhere? :)

I updated the summary in this issue: JuliaDiff/AbstractDifferentiation.jl#131

Hi, I think

Sounds reasonable, I have opened this issue to keep track:

Closing this in favour of tpapp/LogDensityProblemsAD.jl#29; Turing will automatically use DI when tpapp/LogDensityProblemsAD.jl#29 is merged.
Hi there!
@adrhill and I recently started https://github.com/gdalle/DifferentiationInterface.jl to provide a common interface for automatic differentiation in Julia. We're currently chatting with Lux.jl, Flux.jl and Optimization.jl to see how they can benefit from it, and so my mind went to Turing.jl as another AD power user :)
DifferentiationInterface.jl only guarantees support for functions of the form

f(x) = y

or

f!(y, x)

with standard numbers or arrays in and out. Within these restrictions, we are compatible with 13 different AD backends, including the cool kids like Enzyme.jl and even the hipsters like Tapir.jl. Do you think it could come in handy?

Ping @yebai @willtebbutt
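To make the shape of the interface concrete, here is a minimal sketch of what a call could look like. This assumes DI's `value_and_gradient` entry point and an ADTypes-style backend constructor (`AutoForwardDiff`); treat the exact names and signatures as illustrative rather than a stable API:

```julia
# Hedged sketch of DifferentiationInterface usage; names assumed, not guaranteed.
using DifferentiationInterface
import ForwardDiff  # the chosen backend package must be loaded separately

# An out-of-place function of the supported form f(x) = y,
# with a standard array in and a standard number out.
f(x) = sum(abs2, x)

# Backend objects come from ADTypes.jl; any supported backend
# (Enzyme, Tapir, ...) could be swapped in here.
backend = AutoForwardDiff()

x = [1.0, 2.0, 3.0]
y, grad = value_and_gradient(f, backend, x)  # y = f(x), grad = ∇f(x)
```

The point for Turing would be that switching AD backends means changing only the `backend` object, not the gradient-calling code.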