
AMReX/pyAMReX/PICSAR: Weekly Update #5126

Merged · 5 commits · Aug 16, 2024
Conversation

ax3l (Member) commented Aug 11, 2024

Weekly update to latest AMReX.
Weekly update to latest pyAMReX.
Weekly update to latest PICSAR (no changes).

./Tools/Release/updateAMReX.py
./Tools/Release/updatepyAMReX.py
./Tools/Release/updatePICSAR.py
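For readers unfamiliar with these helpers: each script bumps the pinned version of the respective dependency across the repository. As a rough illustration only (the actual scripts in `Tools/Release/` are more involved; the file path and refs below are hypothetical placeholders), the core operation amounts to:

```python
# Hypothetical sketch of a pinned-dependency bump; this is NOT the
# actual logic of Tools/Release/updateAMReX.py, just the general idea.
from pathlib import Path

def bump_pin(pin_file: Path, old_ref: str, new_ref: str) -> None:
    """Replace a pinned git ref (tag or commit hash) in a config file."""
    text = pin_file.read_text()
    if old_ref not in text:
        raise ValueError(f"pinned ref {old_ref!r} not found in {pin_file}")
    pin_file.write_text(text.replace(old_ref, new_ref))

# Placeholder usage (file path and refs are made up for illustration):
# bump_pin(Path("cmake/dependencies/AMReX.cmake"), "24.08", "abc1234")
```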

@ax3l added the `component: third party` label (changes in WarpX that reflect a change in a third-party library) on Aug 11, 2024
@ax3l mentioned this pull request on Aug 11, 2024
@ax3l requested a review from EZoni on August 11, 2024, 03:03
ax3l (Member, Author) commented Aug 13, 2024

@WeiqunZhang there seem to be new issues in this PR from the AMReX update (UBSan failures as well as a series of failing tests; see the linked issues).

A comment by @ax3l was marked as resolved.

@ax3l force-pushed the topic-amrexWeekly branch 2 times, most recently from 111fe99 to 14b605c, on August 14, 2024, 22:42
ax3l (Member, Author) commented Aug 15, 2024

@WeiqunZhang something changed in the way ThetaImplicitPicard_1d and SemiImplicitPicard_1d are run, which triggers the following warning in WarpX:

!!! WARNING : [high][Performance] Too many resources / too little work!
             It looks like you requested more compute resources than there are
             total number of boxes of cells available (1). You started with (2)
             MPI ranks, so (1) rank(s) will have no work.
             Consider decreasing the amr.blocking_factor and amr.max_grid_size
             parameters and/or using fewer MPI ranks.
             More information:
             https://warpx.readthedocs.io/en/latest/usage/workflows/parallelization.html

WarpX treats this warning as an error and aborts on it.
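To make the abort concrete, here is the box-count arithmetic behind the warning. This is a simplified sketch: it models AMReX's 1D decomposition as a plain ceiling division and takes the 40-cell domain size from the commit notes further down; real AMReX chopping also honors `amr.blocking_factor`.

```python
import math

def n_boxes(n_cells: int, max_grid_size: int) -> int:
    """Simplified 1D box count: the domain is chopped into boxes of at
    most max_grid_size cells (ignoring blocking_factor constraints)."""
    return math.ceil(n_cells / max_grid_size)

n_cells, n_ranks = 40, 2  # 1D Picard tests, run on 2 MPI ranks
for max_grid_size in (32, 64):  # old vs. new AMReX default
    boxes = n_boxes(n_cells, max_grid_size)
    idle = max(0, n_ranks - boxes)
    print(f"max_grid_size={max_grid_size}: {boxes} box(es), {idle} idle rank(s)")
# max_grid_size=32: 2 box(es), 0 idle rank(s)  -> no warning
# max_grid_size=64: 1 box(es), 1 idle rank(s)  -> warning, WarpX aborts
```

With the new default of 64, the entire 40-cell domain fits in a single box, so the second rank has no work, which is exactly what the warning reports.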

WeiqunZhang (Member) commented Aug 15, 2024 via email

ax3l (Member, Author) commented Aug 15, 2024

@WeiqunZhang it looks like AMReX-Codes/amrex#4083 causes some test failures in WarpX.

Failing EB tests (EB compiled in and used at runtime):

  • ElectrostaticSphereEB_mixedBCs
  • Python_ElectrostaticSphereEB
  • Python_magnetostatic_eb_3d
  • embedded_boundary_cube
  • embedded_boundary_cube_2d
  • embedded_boundary_cube_macroscopic
  • embedded_circle
  • magnetostatic_eb_3d
  • ElectrostaticSphereEB

Failing EB tests in RZ (EB compiled in and used at runtime):

  • ElectrostaticSphereEB_RZ_MR
  • Python_magnetostatic_eb_rz
  • particle_boundary_interaction
  • spacecraft_charging
  • ElectrostaticSphereEB_RZ

Two review comments on Regression/WarpX-tests.ini were marked outdated and resolved.
Commit messages:

  • Binning functions were refactored from `unsigned int` to `int` in AMReX-Codes/amrex#3684 for performance reasons; this updates our usage to reflect that change.
  • Address a breaking change in AMReX: the default of `amr.max_grid_size` changed from 32 to 64. Since these tests run with 2 MPI ranks, ensure there is at least one box per MPI rank. Instead of 20 (= 40 cells / 2 ranks), set the value back to 32, which keeps the checksums the same.
ax3l (Member, Author) commented Aug 16, 2024

@WeiqunZhang this should now be ready for approval & merge :)

@ax3l enabled auto-merge (squash) on August 16, 2024, 16:44
@ax3l merged commit 9a4de9c into ECP-WarpX:development on Aug 16, 2024
47 checks passed
@ax3l deleted the topic-amrexWeekly branch on August 16, 2024, 18:24