
Devasmit/AD_gsoc #14

Closed · wants to merge 4 commits

Conversation

DevasmitDutta (Author)

No description provided.

oriolcg (Member) left a comment

@DevasmitDutta I added a minor comment. The priority and intensity need to be filled in as well.

I think it would be good to involve someone that has been involved in the latest devs of GridaDistributed. @amartinhuertas @JordiManyer , are you interested?

Review comment on 2024/ideas-list.md (outdated, resolved)
JordiManyer (Member) commented Feb 4, 2024

Hi @oriolcg , @DevasmitDutta . Sorry if I may be ruining the project, but this is done already. It's just not part of the main repo. I plan on bringing it to GridapDistributed at some point, when time allows. If it's something you need, I can speed it up.

The main idea is that automatic differentiation is a cell-wise operation, which means we want it to happen locally. However, when we have a functional defined by an anonymous function, as we generally do, the Measure attached to the weak form is a DistributedMeasure. This means we cannot evaluate that functional locally.
Because of how anonymous functions work in Julia, we cannot retrieve the DistributedMeasure from the function, so it is not possible (without rewriting most of the AD code for the distributed case) to keep using anonymous functions. This gets even trickier when multiple triangulations/measures are involved.
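As a plain-Julia illustration of the closure problem (hypothetical names, not the Gridap API): once a measure has been captured inside an anonymous function, generic code has no supported way to pull it back out.

```julia
# Plain-Julia sketch (hypothetical names, not the Gridap API):
# a functional defined as a closure bakes the measure into its body.
make_functional(dΩ) = (u, v) -> sum(u .* v) * dΩ  # dΩ captured by the closure

a = make_functional(0.5)
a([1.0, 2.0], [3.0, 4.0])  # evaluates fine; dΩ is used internally

# But given only `a`, there is no supported way to recover dΩ and swap in
# a local (per-part) measure, which is what cell-wise AD would need.
```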

All in all, the solution to this problem is quite elegant, and involves creating a new structure

struct IntegrandWithMeasure{A,B<:Tuple}
  F  :: A       # F(u,v,dΩ...)
  dΩ :: B       # Measures that F needs
end

which holds the same info as the anonymous function, but in a way that can be accessed. The evaluation and gradient functions are then quite trivial.
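A minimal sketch of how such a wrapper could work (a simplified, hypothetical version, not the actual GridapDistributed implementation):

```julia
# Hypothetical sketch of the wrapper described above, simplified:
struct IntegrandWithMeasure{A,B<:Tuple}
  F  :: A   # F(u,v,dΩ...)
  dΩ :: B   # Measures that F needs
end

# Evaluation: splat the stored measures back into the wrapped function,
# so callers no longer need access to a closure's captured variables.
(f::IntegrandWithMeasure)(args...) = f.F(args..., f.dΩ...)

# Directional derivative wrt the k-th argument: freeze the others and
# hand the resulting one-argument function to Gridap's AD machinery.
function gradient(f::IntegrandWithMeasure, uh, k::Int)
  g = u -> f.F(uh[1:k-1]..., u, uh[k+1:end]..., f.dΩ...)
  return Gridap.gradient(g, uh[k])
end
```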

Moreover, this also has benefits when considering functionals that depend on more variables than just u and v. For instance, in Topology Optimization scenarios you have functionals that depend on the level set functions. We want to be able to take directional derivatives wrt those variables as well. This structure allows us to do so.

DevasmitDutta (Author) commented Feb 4, 2024

@oriolcg it looks like there may be a new release of GridapDistributed where this is already done by @JordiManyer. That said, would you like to propose a new topic for GSoC? (@amartinhuertas @JordiManyer, you may also like to chime in with any new topics that would be good to work on with GridapDistributed for this year's GSoC.)

JordiManyer (Member) commented Feb 4, 2024

Just as another comment: in the PR currently being considered to overhaul the ODE module, Alex and I have fixed a couple of edge cases where AD in serial failed. We might also try to solve the AD-for-multifield bug.

JordiManyer (Member) commented

See changes in gridap/Gridap.jl#978 and gridap/GridapDistributed.jl#144

oriolcg (Member) commented Feb 5, 2024

@DevasmitDutta, given that these developments are already under way, I would suggest looking into another topic. I will propose the same topic as last year on adjoint-based PDE-constrained optimization.

You can close this PR and add this new one on the list.
