gradient and jacobian #40

Open · dinesh286 opened this issue Jun 24, 2021 · 1 comment
@dinesh286
How are the gradient and Jacobian computed in PSOPT to be supplied to IPOPT? I have gone through the example code, and the gradient and Jacobian computations do not appear there.

@vmbecerra (Contributor)

Hello

The gradient and Jacobian used by the NLP solver are computed by the ADOL-C library by default; alternatively, they can be computed by finite differences (for the gradient) or sparse finite differences (for the Jacobian). The finite-difference code is in the file derivatives.cxx.
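For readers unfamiliar with the finite-difference option, the sketch below shows the standard forward-difference formula that the answer refers to, g_i ≈ (f(x + h·e_i) − f(x)) / h. This is a minimal illustration of the general technique, not the actual code in derivatives.cxx; the step-size heuristic (sqrt of machine epsilon scaled by the variable magnitude) is a common choice and an assumption here, not necessarily PSOPT's exact rule.

```cpp
#include <cmath>
#include <vector>
#include <functional>

// Forward finite-difference approximation of the gradient of f at x.
// One extra function evaluation per variable, plus one base evaluation.
std::vector<double> fd_gradient(
    const std::function<double(const std::vector<double>&)>& f,
    std::vector<double> x)
{
    const double f0 = f(x);
    std::vector<double> g(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        // Step scaled to the variable magnitude (assumed heuristic).
        const double h = std::sqrt(1e-16) * std::max(1.0, std::fabs(x[i]));
        const double xi = x[i];
        x[i] = xi + h;
        g[i] = (f(x) - f0) / h;   // forward difference in coordinate i
        x[i] = xi;                // restore the perturbed coordinate
    }
    return g;
}
```

A sparse Jacobian version follows the same idea, but perturbs groups of structurally independent columns together so that far fewer function evaluations are needed than one per variable.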

The code that computes the derivatives using ADOL-C for IPOPT is included in the file IPOPT_interface.cxx.
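The basic ADOL-C pattern used for this kind of interface is: record the function once on a "tape" using adouble variables, then call a driver such as gradient() (or jacobian()/sparse_jac() for constraint Jacobians) at any point. The sketch below shows that pattern for a toy function; it is only an illustration of the ADOL-C workflow under my own assumptions, not the code in IPOPT_interface.cxx.

```cpp
#include <adolc/adolc.h>

int main()
{
    const short tag = 1;     // tape identifier
    const int n = 2;
    double x[n] = {1.0, 2.0};
    double y;

    // Taping phase: run the function once on adouble variables so that
    // ADOL-C records the operation sequence on tape `tag`.
    trace_on(tag);
    adouble ax[n], ay;
    for (int i = 0; i < n; ++i)
        ax[i] <<= x[i];                    // mark independent variables
    ay = ax[0] * ax[0] + sin(ax[0] * ax[1]);
    ay >>= y;                              // mark the dependent variable
    trace_off();

    // Derivative phase: the recorded tape can be re-evaluated at any point.
    double g[n];
    gradient(tag, n, x, g);   // g now holds df/dx0 and df/dx1
    return 0;
}
```

The advantage over finite differences is that the derivatives are exact to machine precision, and the same tape can be reused by IPOPT at every iterate.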
