
JDBetteridge/merge upstream #22

Merged 498 commits into firedrake on Oct 9, 2024
Conversation

@JDBetteridge (Member)

No description provided.

balay and others added 30 commits August 30, 2024 19:06
PCFIELDSPLIT: implement PCSetUpOnBlocks

See merge request petsc/petsc!7796
Knepley/fix plex extrusion normal

See merge request petsc/petsc!7793
configure: update SuiteSparse/SuperLU/CMake/zstd

See merge request petsc/petsc!7802
Knepley/fix orientation input

See merge request petsc/petsc!7625
PCFIELDSPLIT: follow-up of !7796

Closes #1641

See merge request petsc/petsc!7800
Thanks-to: Junchao Zhang <[email protected]>
do not create a name if not present
set maxnz to nz for proper display of allocated nonzeros
Minor docs fixes

See merge request petsc/petsc!7803
add extra information for PCView
minimize messages for simple repartitioning
minimize matrix permutations with simple redistribution
expose API to set graph symmetrization (this can be useful in nested solvers when the GAMG block is symmetric but the monolithic matrix is not; see the sketch after this list)
update examples output
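
As a rough sketch of what enabling graph symmetrization might look like, assuming the `-pc_gamg_sym_graph` options-database spelling (an assumption; the exact API name added by this commit is not shown in this thread):

```c
#include <petscksp.h>

int main(int argc, char **argv)
{
  KSP ksp;
  PC  pc;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCGAMG));
  /* Symmetrize the strength-of-connection graph when GAMG acts on a
     symmetric block of a nonsymmetric monolithic matrix, e.g. inside
     a fieldsplit. The option name is an assumed spelling, not
     confirmed by this thread. */
  PetscCall(PetscOptionsSetValue(NULL, "-pc_gamg_sym_graph", "true"));
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPDestroy(&ksp));
  PetscCall(PetscFinalize());
  return 0;
}
```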
configure: python-3.13 does not have xdrlib, _parse_makefile in sysconfig

Closes #1604

See merge request petsc/petsc!7790
build: 'make check' should work before 'make install' for a prefix build

See merge request petsc/petsc!7789
PCGAMG: some optimizations

See merge request petsc/petsc!7798
Closes #1638.
Reported-by: Christophe Prud'homme @prudhomm
Back-port !7787 in release

See merge request petsc/petsc!7808
balay and others added 23 commits September 27, 2024 19:06
…into 'main'

Add -mpiuni-allow-multiprocess-launch with MPIUni for special use case.

See merge request petsc/petsc!7856
Add PetscOptionsGetBool3()

See merge request petsc/petsc!7888
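
A hedged sketch of how the new three-state query might be used, assuming it mirrors the PetscOptionsGetBool() signature and the existing PetscBool3 enum (PETSC_BOOL3_TRUE/FALSE/UNKNOWN); `-my_feature` is a hypothetical option for illustration only:

```c
#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscBool3 flg = PETSC_BOOL3_UNKNOWN;
  PetscBool  set = PETSC_FALSE;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* -my_feature is a hypothetical option, used only for illustration */
  PetscCall(PetscOptionsGetBool3(NULL, NULL, "-my_feature", &flg, &set));
  if (flg == PETSC_BOOL3_TRUE) {
    /* user explicitly enabled the feature */
  } else if (flg == PETSC_BOOL3_FALSE) {
    /* user explicitly disabled it */
  } else {
    /* PETSC_BOOL3_UNKNOWN: option absent, fall back to a default */
  }
  PetscCall(PetscFinalize());
  return 0;
}
```

The point of the three-state return is that callers can distinguish "explicitly false" from "never set", which a plain PetscBool cannot express.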
CI: update linux-pgi to use nvhpc/24.7

See merge request petsc/petsc!7890
… I/O

- Add DMPlexGetDepthStratumGlobalSize()
- Add name and compression info to HDF5ReadCtx
- Add compression argument to PetscViewerHDF5ReadSizes_Private() and PetscViewerHDF5Load_Internal()
- Gather compressed info to all procs when loading
- Set name for local coordinates
- Add logging
- Add -is_view_compress to turn off compression
- Add -dm_plex_view_coordinate_section to just output coordinates
- Add -dm_plex_view_labels to turn off label output
- Support loading coordinates without section
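
As a rough sketch of exercising these new toggles: the option names below are taken from the list above, while the surrounding viewer code is standard DMPlex/HDF5 usage (requires a PETSc build with HDF5), not code from this MR:

```c
#include <petscdmplex.h>
#include <petscviewerhdf5.h>

int main(int argc, char **argv)
{
  DM          dm;
  PetscViewer viewer;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm)); /* e.g. -dm_plex_shape box */
  /* Write the mesh; compression and label output are controlled by the
     options named in the commit message above. */
  PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_WRITE, &viewer));
  PetscCall(DMView(dm, viewer));
  PetscCall(PetscViewerDestroy(&viewer));
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```

Per the commit message, running with `-is_view_compress 0` would turn off compression, and `-dm_plex_view_labels 0` would skip label output.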
CUVEC: fix copy from device to default to copy to device

See merge request petsc/petsc!7889
…nto 'main'

Add results for the mixed element types test for partitioning and overlap...

See merge request petsc/petsc!7877
IS+HDF5: Add run-length compression to IS I/O

See merge request petsc/petsc!7862
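
The idea behind run-length compression of an index set, sketched outside PETSc (illustrative only; the actual on-disk format from !7862 is not documented in this thread): contiguous runs such as 0,1,2,3 collapse to a single (start, length) pair.

```c
#include <stdio.h>

/* Encode a sorted index list as (start, length) runs and print them.
   Illustrative only; not PETSc's implementation. */
static void rle_print(const long *idx, int n)
{
  int i = 0;
  while (i < n) {
    long start = idx[i];
    int  len   = 1;
    while (i + len < n && idx[i + len] == start + len) len++;
    printf("(%ld, %d) ", start, len);
    i += len;
  }
  printf("\n");
}

int main(void)
{
  long idx[] = {0, 1, 2, 3, 10, 11, 12, 40};
  rle_print(idx, 8); /* prints: (0, 4) (10, 3) (40, 1) */
  return 0;
}
```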
Plex+HDF5: VIZ output is now the default in version 1.1.0

See merge request petsc/petsc!7885
sys: Add parens around PetscMalloc/PetscCalloc parameters

Closes #1654

See merge request petsc/petsc!7891
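
The classic hazard this fixes, shown with a toy macro (not PETSc's actual PetscMalloc definition): without parentheses around the parameter, a compound argument binds to the wrong operator after textual expansion.

```c
#include <stdlib.h>

/* Toy macros for illustration; not PETSc's definitions. */
#define ALLOC_BAD(n)  malloc(n * sizeof(double))
#define ALLOC_GOOD(n) malloc((size_t)(n) * sizeof(double))

int main(void)
{
  int a = 3, b = 4;
  /* Expands to malloc(a + b * sizeof(double)): allocates the wrong size,
     because * binds tighter than + in the expanded expression. */
  double *p = ALLOC_BAD(a + b);
  /* Expands to malloc((size_t)(a + b) * sizeof(double)): correct. */
  double *q = ALLOC_GOOD(a + b);
  free(p);
  free(q);
  return 0;
}
```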
@Ig-dolci commented Oct 4, 2024

We should update this PR with the latest PETSc changes that include the MPICH version update. This ensures PETSc installation on macOS Sequoia works. Can I do this?

@JDBetteridge (Member, Author)

I can update this and my SLEPc branch now; I'll retrigger the Firedrake CI too, since I want to test for compatibility with mpi4py==4.0.

@Ig-dolci commented Oct 4, 2024

> I can update this and my SLEPc branch now; I'll retrigger the Firedrake CI too, since I want to test for compatibility with mpi4py==4.0.

Perfect! Thanks.

@JDBetteridge (Member, Author)

Currently being tested here: firedrakeproject/firedrake#3777

@JDBetteridge (Member, Author)

Having moved things forward again, I'm running into issues:
https://github.com/firedrakeproject/firedrake/actions/runs/11181264621/job/31168588590?pr=3777
It seems like something may have changed in DMPlex:

ValueError: Mesh (firedrake_default_topology) already exists in /tmp/pytest-of-firedrake/pytest-0/popen-gw2/test_io_timestepping_setting_t0/test_io_timestepping_setting_time_dump.h5, but the global number of DMPlex points is inconsistent: 9

@dham merged commit 1c1d65e into firedrake on Oct 9, 2024