
Self Gravity with Multigrid


(Important notice: Currently the Multigrid solver is in the beta test phase and available only in the multigrid branch. It is not officially guaranteed to work until it is merged into the master branch. If you want to use this solver in your work before the official release, please contact the developer @tomidakn .)

This page explains the practical usage of the Multigrid self-gravity module. For the details of the Multigrid algorithm, please see Multigrid. For the other gravity solver, based on FFT, see Self-Gravity with FFT.

Configuration and compilation

To enable the Multigrid self-gravity solver, simply configure the code with the --grav mg option. In the pgen directory, two sample problem generators are provided: poisson.cpp is a simple benchmark test using sinusoidal waves, and jeans.cpp is a linear wave with self-gravity. Corresponding input files are provided in inputs/hydro, and these problems can also be run from the regression test script.

To configure and compile the code with poisson.cpp, use the following command:

> python configure.py --grav mg --prob poisson
> make clean ; make

Optionally, you can enable MPI parallelization with the -mpi option. OpenMP parallelization is also implemented, but it is not well tested at this moment (and I would appreciate it if someone could help me test it).
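
For example, an MPI-parallel build can be configured in the same way (the problem generator is the same as above; adjust the options to your environment as needed):

> python configure.py --grav mg --prob poisson -mpi
> make clean ; make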

Setting and input parameters

For simulations with self-gravity, the gravitational constant G or 4πG must be set in the problem generator file (e.g. in Mesh::InitUserMeshData) using the Mesh::SetGravityConstant or Mesh::SetFourPiG function. The value must be correctly scaled with the units of the other physical quantities. This design also enables a user to change the constant during a simulation, for example, to set up the initial condition without gravity and turn it on later. In poisson.cpp, this value is read from <problem> four_pi_G in the input file.
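
As a minimal sketch modeled on poisson.cpp (the structure of your own problem generator may differ), the constant can be set like this:

void Mesh::InitUserMeshData(ParameterInput *pin) {
  // Read 4*pi*G from the <problem> block of the input file and pass it to
  // the gravity solver; the value must be consistent with the code units.
  Real four_pi_G = pin->GetReal("problem", "four_pi_G");
  SetFourPiG(four_pi_G);
  return;
}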

The parameters for the gravity solver are set in the <gravity> block.

<gravity>
mgmode          = FMG       # set the solver mode: FMG or MGI
threshold       = 0.0       # set the convergence threshold
output_defect   = true      # output the defect
fill_ghost      = true      # fill all ghost cells
ix1_bc          = periodic  # Inner X1 boundary condition
ox1_bc          = periodic  # Outer X1 boundary condition
ix2_bc          = periodic  # Inner X2 boundary condition
ox2_bc          = periodic  # Outer X2 boundary condition
ix3_bc          = periodic  # Inner X3 boundary condition
ox3_bc          = periodic  # Outer X3 boundary condition
#mporder        = 4         # order of multipole expansion, 2 or 4

Solver parameters

The mgmode parameter specifies the solver mode (see Multigrid), where FMG is the full multigrid algorithm and MGI is the standard V-cycle iteration. FMG is generally recommended for both robustness and efficiency. threshold controls the accuracy of the solution and determines when the iterations stop. This value must be chosen carefully for each problem. threshold = 0.0 makes the code iterate until it reaches saturation, but this is not always necessary. Conversely, if a large threshold value is set, the code stops after the first FMG sweep. This may work for some problems, but high-frequency noise arising from the red-black iteration pattern may become visible, particularly in long-term simulations. It is therefore recommended to set a reasonable value according to your simulation setup.

As explained in Multigrid, the MeshBlock size must be a power of 2, i.e. the <meshblock> nx? parameters must be 2^n. There are no other restrictions on the Mesh structure or parallelization, and the code works with mesh refinement without any additional setting.
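
For example, the following (illustrative) block decomposes the Mesh into 64^3-cell MeshBlocks, which satisfies the power-of-2 requirement:

<meshblock>
nx1 = 64
nx2 = 64
nx3 = 64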

Output

The code outputs the gravitational potential as phi when prim, cons or phi is specified in an <output> block. In addition, when output_defect = true is set, the defect is also dumped in the file as defect-phi.
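
For instance, an <output> block like the following sketch (the block number, file type and output interval are illustrative) writes the potential along with the primitive variables:

<output1>
file_type = hdf5   # output format
variable  = prim   # prim, cons or phi; the potential phi is included
dt        = 0.1    # output interval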

The current implementation of the Multigrid solver uses one ghost cell on each side of a MeshBlock. When ghost-cell data are written to output files, only the first ghost cells contain physical values and default values are filled beyond them. However, it is sometimes necessary to fill all the ghost cells for analysis or for additional physics. The extra ghost cells can be filled by setting fill_ghost = true. Note, however, that the extra ghost cells are not fully consistent with the first ghost cells when mesh refinement is in use, because the mass-conservation formula is applicable only to the first ghost cells and simple linear interpolation is used to fill the extra cells. Also, this process incurs some cost because it requires additional communication between MeshBlocks.

Boundary conditions

The ix?_bc and ox?_bc parameters specify the boundary conditions, similar to the boundary conditions for hydrodynamics. Currently, periodic, zerograd, zerofixed, multipole and user are supported. Note that, although the zerograd and zerofixed conditions are implemented, these are very naive boundary conditions and they do not necessarily behave as expected.

The multipole boundary condition uses multipole expansion to realize isolated (also called "free" or "vacuum") boundary conditions, i.e. there is no other object outside the box and the potential approaches zero at infinity. To use this, set the mporder parameter. Currently, mporder = 2 (up to quadrupole) and mporder = 4 (up to hexadecapole) are implemented. The code assumes that the origin of the multipole expansion is the zero point of the coordinates, (x, y, z) = (0.0, 0.0, 0.0), and it calculates the dipole moment instead of shifting the origin to the center of mass. To achieve better accuracy of the expansion, it is better to use a large box and put the boundaries as far away from the region of interest as possible.
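
For an isolated system, a <gravity> block might look like the following sketch (all other parameters as described above):

<gravity>
mgmode    = FMG        # full multigrid
threshold = 0.0        # iterate to saturation
ix1_bc    = multipole  # isolated (multipole) boundaries on all sides
ox1_bc    = multipole
ix2_bc    = multipole
ox2_bc    = multipole
ix3_bc    = multipole
ox3_bc    = multipole
mporder   = 4          # expand up to hexadecapole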

Accessing the potential

The gravitational potential is stored in the Gravity class, which is accessible through the pgrav pointer of MeshBlock. For example, from MeshBlock::UserWorkInLoop, it can be accessed as follows:

void MeshBlock::UserWorkInLoop() {
  // pgrav points to the Gravity object of this MeshBlock.
  for (int k = ks; k <= ke; ++k) {
    for (int j = js; j <= je; ++j) {
      for (int i = is; i <= ie; ++i) {
        // Cell-centered potential and its neighbors in the x1 direction.
        Real potc = pgrav->phi(k,j,i);
        Real potl = pgrav->phi(k,j,i-1);
        Real potr = pgrav->phi(k,j,i+1);
        ...
      }
    }
  }
  return;
}

Performance and scalability

[Figure: Multigrid weak scaling]

This figure is a bit outdated, as it was measured on the Perseus cluster at Princeton equipped with Broadwell CPUs, but it illustrates the scaling behavior of the Multigrid solver quite well. It shows the results of a weak scaling test of the Multigrid Poisson solver on a uniform grid with periodic boundaries using different MeshBlock sizes. The FMG mode is used, with only one FMG sweep and no additional V-cycle iterations (i.e. the fastest but least accurate configuration). Different lines indicate different MeshBlock sizes assigned to each process. The red, nearly horizontal line is the weak scaling of the MHD part (VL2 + PLM + HLLD, 64^3 cells per process). With 64^3 cells per MeshBlock, the Multigrid solver (cyan) is about 10 times faster than MHD even with more than a few thousand processes. Although it takes two Multigrid sweeps per step, and a few additional V-cycle iterations are probably needed to improve the accuracy, the Multigrid solver still costs less than the MHD solver.

Note that the black line in the figure is FFT. FFT does not scale very well with many processes, but there is nothing wrong with the FFT solver; this is simply because the computational cost of FFT scales as O(N log N).
