For operators like gradient and jacobian in DifferentiationInterface.jl, I'm wondering whether the best thing to do is simply to call the relevant AbstractDifferentiation functions. From what I can read in https://github.com/JuliaDiff/Diffractor.jl/blob/main/src/AbstractDifferentiation.jl, it all comes down to the pushforward anyway.
Related:
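To make the "it all comes down to the pushforward" point concrete, here is a minimal sketch of that reduction: given only a pushforward (JVP) closure, both a Jacobian and a gradient follow by pushing forward the standard basis vectors. The names `pf`, `jacobian_from_pushforward`, and `gradient_from_pushforward` are hypothetical illustrations, not Diffractor or DifferentiationInterface API.

```julia
# Sketch: build `jacobian` and `gradient` out of nothing but a pushforward.
# `pf(f, x, dx)` is any JVP, e.g. Diffractor's forward mode obtained through
# AbstractDifferentiation, or the finite-difference stand-in used below.

# Jacobian: one pushforward call per standard basis vector of the input;
# each call returns one column J * e_i.
function jacobian_from_pushforward(pf, f, x::AbstractVector)
    cols = map(eachindex(x)) do i
        e = zeros(eltype(x), length(x))
        e[i] = one(eltype(x))
        pf(f, x, e)
    end
    return reduce(hcat, cols)
end

# Gradient of a scalar-valued function: the 1×n Jacobian, flattened to a vector.
gradient_from_pushforward(pf, f, x) = vec(jacobian_from_pushforward(pf, f, x))

# Finite-difference stand-in for the pushforward, just to make the sketch runnable.
fd_pf(f, x, dx; h=1e-6) = (f(x .+ h .* dx) .- f(x)) ./ h

gradient_from_pushforward(fd_pf, x -> sum(abs2, x), [1.0, 2.0, 3.0])  # ≈ [2, 4, 6]
```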
I would say this is the Diffractor-native interface.
Diffractor.jl/src/AbstractDifferentiation.jl, lines 17 to 18 at a444b7f
It's what I normally work with. It's not documented as such, but it does just come down to that pushforward.
Fair enough! The idea is that when we compare autodiff backends in DifferentiationInterface, we want to have the optimal performance for each. In Diffractor's case, every one of our operators can be implemented using https://github.com/gdalle/DifferentiationInterface.jl/blob/07d272af4351bc44344c9ba90705df6c4454a9f8/ext/DifferentiationInterfaceDiffractorExt/DifferentiationInterfaceDiffractorExt.jl#L12-L16, so if there are no faster shortcuts in specific cases, let's keep it that way!
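For reference, a rough usage sketch of what that looks like from the DifferentiationInterface side, assuming the current DifferentiationInterface and ADTypes APIs (an `AutoDiffractor` backend and operators called as `op(f, backend, x)`); exact signatures may differ between versions, so treat this as illustrative rather than authoritative.

```julia
using DifferentiationInterface
using ADTypes: AutoDiffractor
import Diffractor  # loading Diffractor activates the extension linked above

backend = AutoDiffractor()
f(x) = sum(abs2, x)
x = [1.0, 2.0, 3.0]

gradient(f, backend, x)            # expected ≈ [2.0, 4.0, 6.0], via the pushforward
jacobian(x -> x .^ 2, backend, x)  # expected ≈ Diagonal-like 3×3 matrix of 2x
```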