
Add guards on null active eigenvalues #1102

Merged (1 commit into aiidalab:main) on Jan 17, 2025

Conversation

edan-bainglass (Member)

There were no guards on the active eigenvalues, so if someone chose to define eigenvalues but didn't set any (leaving them all at -1), an error was raised.

@AndresOrtegaGuerrero (Member)

Really? But per https://www.quantum-espresso.org/Doc/INPUT_PW.html, I thought -1 means not set by default.

@AndresOrtegaGuerrero (Member)

Could you show the error we get?

return [
tuple(eigenvalue)
for eigenvalue in eigenvalues_array.reshape(new_shape).tolist()
]

edan-bainglass (Member Author)

@AndresOrtegaGuerrero without the guard, active_eigenvalues might be empty (if all were left at -1), in which case new_shape is (0, 4).

Two errors were raised here:

  1. The shape was actually (0.0, 4), since np.prod returns a float
  2. An array of size 0 can't be reshaped: eigenvalues_array has a null shape because active_eigenvalues is empty
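The two failure modes can be sketched in a few lines. This is an illustrative reproduction with assumed names (`active_eigenvalues`, `tuples_from`), not the actual aiidalab-qe code, showing why a float dimension and a size-0 array each break `reshape`, and how an early-return guard avoids both:

```python
import numpy as np

# All eigenvalues were left at the -1 sentinel, so nothing is "active".
active_eigenvalues = np.array([], dtype=float)

# 1. A float-valued dimension (e.g. produced by np.prod over a float
#    array) makes reshape raise TypeError:
try:
    active_eigenvalues.reshape((0.0, 4))
except TypeError as exc:
    print("float shape:", exc)

# 2. Even with integer dimensions, inferring a dimension via -1 on a
#    size-0 array raises ValueError:
try:
    active_eigenvalues.reshape((-1, 4))
except ValueError as exc:
    print("empty array:", exc)

# Guard sketched in the spirit of this PR: bail out before reshaping.
def tuples_from(eigenvalues_array: np.ndarray) -> list:
    if eigenvalues_array.size == 0:  # no eigenvalues were defined
        return []
    return [
        tuple(eigenvalue)
        for eigenvalue in eigenvalues_array.reshape((-1, 4)).tolist()
    ]
```

With the guard in place, an all-default (-1) selection simply yields an empty list instead of raising.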


I see.

@AndresOrtegaGuerrero (Member) left a comment

LGTM! Thank you @edan-bainglass

@edan-bainglass edan-bainglass merged commit 2f53644 into aiidalab:main Jan 17, 2025
7 checks passed