WIP: Halo finding and halo catalog section #160

Open
wants to merge 2 commits into main
21 changes: 21 additions & 0 deletions content/50.halo_finding_and_catalogs.md
@@ -1 +1,22 @@
## Halo-Finding and Catalogs {#sec:halo_finding}

In cosmological simulations, dark matter is nearly always represented as a collection of collisionless particles. **CITE SOMETHING ABOUT VLASOV GRID SOLUTIONS**
This representation is well-suited to approximating dark matter as a collisionless fluid; however, identifying structures within that collection of particles can take several forms, addressing different use cases with different degrees of accuracy. **CITE HALO FINDING COMPARISON PAPER**
Being able to identify halos, as well as their associated baryonic content, is necessary for rapid analysis of cosmological simulations.
Furthermore, convergence studies and cross-simulation comparisons require a consistent method for identifying dark matter halos, as well as the ability to track their growth over time.

In past versions of `yt`, several specific halo finders were bundled and made available to work on any class of data `yt` was able to read.
These included the HOP halo finder, the classic Friends-of-Friends (FOF) halo finder [@doi:10.1086/191003], a scalable Parallel HOP implementation [@doi:10.1086/305535], and a wrapping of the ORIGAMI code [@doi:10.1142/9789814623995_0378] for filament identification.
To do so, `yt` utilized direct in-memory connectors to these implementations; whereas data connectors are typically written for each combination of dataset format and halo finding method, this approach required only a single connector between `yt` and each halo finder.
In addition to these bundled halo finders, a direct in-memory interface with Rockstar [@doi:10.1088/0004-637X/762/2/109] was developed that sidestepped Rockstar's built-in load balancing to minimize data duplication and transfer.

`yt` provides a unique set of functionality for accessing halo catalogs, allowing their values to be queried both *as* catalogs and as the original, underlying datasets.
This means that the same selection and analysis operations that can be conducted on a "primary" dataset can also be conducted on the halo catalog; furthermore, the halo catalog can be used as input to data selection operations.
This enables, for instance, querying original dark matter particle values in a halo (as defined by any characteristic radius of that halo) simultaneously with querying baryonic cells or particles included within.
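
A minimal sketch of this dual access pattern is shown below; the file paths, the use of a Rockstar catalog, and the specific field names are illustrative assumptions rather than a prescription:

```python
import yt

# Illustrative paths: a simulation output and a Rockstar halo catalog.
ds = yt.load("DD0046/DD0046")
halos_ds = yt.load("rockstar_halos/halos_0.0.bin")

# Query the catalog *as* a dataset: the usual selection machinery applies.
halos = halos_ds.all_data()
masses = halos["halos", "particle_mass"]
radii = halos["halos", "virial_radius"]
centers = halos["halos", "particle_position"]

# Use a catalog entry to drive selection on the primary dataset: a sphere
# centered on the most massive halo, from which dark matter particles and
# baryonic cells can be queried through the same interface.
i = int(masses.argmax())
sphere = ds.sphere(centers[i], radii[i])
dm_mass = sphere["all", "particle_mass"].sum()
gas_mass = sphere["gas", "cell_mass"].sum()
```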

As discussed in [@doi:10.5281/zenodo.8349044], this can be used as input to other tools, such as the `ytree` package, to provide sophisticated, graph-based queries of datasets and halo merger trees over cosmological time.
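
For instance, a merger tree built from such catalogs can be loaded and traversed with `ytree`; this is a minimal sketch, with a placeholder consistent-trees file name:

```python
import ytree

# Load a merger-tree "arbor" (placeholder file name).
arbor = ytree.load("consistent_trees/tree_0_0_0.dat")

# Pick the most massive root halo and walk its main progenitor line.
tree = arbor[int(arbor["mass"].argmax())]
for node in tree["prog"]:
    print(node["redshift"], node["mass"].to("Msun"))
```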

`yt` also includes an internal halo finding tool that is not widely exposed, built on its implementation of a union-find data structure for identifying topologically connected sets.
This implementation, a "particle contour tree," uses the union-find data structure to connect particles (via percolation) into simple Friends-of-Friends collections.
These can then be used as input to more sophisticated phase-space finders, such as Rockstar, as was done in [@doi:10.48550/arXiv.1407.2600].
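
As an outline of that approach, the following is a schematic friends-of-friends grouping built on union-find; it is a plain NumPy/SciPy sketch, not `yt`'s Cython particle contour tree:

```python
import numpy as np
from scipy.spatial import cKDTree

def fof_groups(positions, linking_length):
    """Assign a group id to each particle by linking all pairs closer
    than the linking length and merging them with union-find."""
    n = len(positions)
    parent = np.arange(n)

    def find(i):
        # Walk to the root of i's set, compressing the path as we go.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    # Percolation step: every pair within the linking length is "friends".
    tree = cKDTree(positions)
    for i, j in tree.query_pairs(linking_length):
        union(i, j)

    # Each particle is labeled with the root id of its group.
    return np.array([find(i) for i in range(n)])
```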

4 changes: 2 additions & 2 deletions content/60.analysis_modules.md
@@ -3,5 +3,5 @@
For much of its development history, `yt` took the approach of bundling as many analysis modules as possible in the primary repository.
This provided the advantage of having all work be centralized, and ensuring that each download or installation of `yt` was a fully-featured system for analyzing a large swath of data, but it brought with it the development overhead of the entire `yt` package for what in many cases were isolated pieces of functionality with separable responsibilities.

- As a result of the slowing in speed of development as a result of review requirements (and limited personnel to conduct those reviews), some of the analysis modules that were bundled with `yt` have been "spun out" into their own repository, `yt_astro_analysis`.
- This repository, which is developed, released and installed separately from `yt`, includes modules for cosmological observation (upon which Triden, which is discussed in @sec:trident, is based), dark matter halo finding and analysis, tools for interacting with position-position-velocity cubes, and a system for exporting from `yt` to RADMC-3D [@ascl:1202.015].
+ As development has slowed under review requirements (and the limited personnel available to conduct those reviews), some of the analysis modules that were bundled with `yt` have been "spun out" into their own repository, `yt_astro_analysis` [@doi:10.5281/zenodo.8431185].
+ This repository, which is developed, released, and installed separately from `yt`, includes modules for cosmological observation (upon which Trident, discussed in @sec:trident, is based), dark matter halo finding and analysis, tools for interacting with position-position-velocity cubes, light cone generation [@doi:10.1088/0004-637X/698/2/1795], and a system for exporting from `yt` to RADMC-3D [@ascl:1202.015].