# Healpix doc (nasa-gcn#2498)

Resolves: nasa-gcn#2035

Commit 0242bde (parent 210e916), authored by Vidushi-GitHub on Jan 20, 2025. Showing 2 changed files with 214 additions and 0 deletions.

**app/routes/docs.client.samples.md**
…and some samples from the FAQs section of the [gcn-kafka-python](https://github.com/nasa-gcn/gcn-kafka-python) documentation.

To contribute your own ideas, make a GitHub pull request to add it to [the Markdown source for this document](https://github.com/nasa-gcn/gcn.nasa.gov/blob/CodeSamples/app/routes/docs.client.samples.md), or [contact us](/contact).

- [Working with Kafka messages](#parsing)
- [HEALPix Sky Maps](#healpix-sky-maps)

## Parsing

Within your consumer loop, use the following functions to convert the raw content of each Kafka message into the format you need; for example, when consuming a range of past messages by offset:
```python
for message in consumer.consume(end[0].offset - start[0].offset, timeout=1):
    if message.error():
        # Skip messages that contain errors
        continue
    print(message.value())
```

## HEALPix Sky Maps

[HEALPix](https://healpix.sourceforge.io) (<b>H</b>ierarchical, <b>E</b>qual <b>A</b>rea, and iso-<b>L</b>atitude <b>Pix</b>elisation) is a scheme for indexing positions on the unit sphere.
For event localizations, the multi-messenger community uses both standard single-resolution HEALPix [@2005ApJ...622..759G] files (extension `.fits.gz`) and multi-resolution HEALPix [@2015A&A...578A.114F] files (extension `.multiorder.fits`). The multi-resolution format is preferred.

### Multi-Order Sky Maps

GCN strongly encourages the use of multi-order sky maps. They use a variable resolution: higher-probability regions are encoded at higher resolution and lower-probability regions at lower resolution. This makes multi-order sky maps significantly more efficient than single-resolution HEALPix sky maps in both storage footprint and read speed.
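
Concretely, each row of a multi-order sky map is addressed by a single integer NUNIQ index that packs together the resolution level and the pixel number (`uniq = ipix + 4**(level + 1)`). Here is a minimal sketch of decoding it, assuming the `astropy-healpix` package used throughout the examples below:

```python
import astropy_healpix as ah
from astropy import units as u

# Decode a NUNIQ index into its HEALPix resolution level and pixel number
uniq = 1024
level, ipix = ah.uniq_to_level_ipix(uniq)
nside = ah.level_to_nside(level)

# Each additional level divides the pixel area by four
print(level, ipix, nside, ah.nside_to_pixel_area(nside).to(u.deg**2))
```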

#### Reading Sky Maps

Sky maps can be parsed using Python and a small number of packages. While this documentation covers the use of `astropy-healpix`, there are several packages that can be used for this purpose; a number of [alternatives](#other-documentation-and-healpix-packages) are listed at the bottom of this page.

```python
import astropy_healpix as ah
import numpy as np

from astropy import units as u
from astropy.table import QTable
```

A given sky map can then be read in as:

```python
skymap = QTable.read('skymap.multiorder.fits')
```
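
To see what the file provides, you can inspect the table's columns and the first few rows. (The exact columns depend on the producing pipeline; LIGO/Virgo/KAGRA maps, for example, include `UNIQ` and `PROBDENSITY`, which the examples below assume.)

```python
# List the available columns and show the first few rows
print(skymap.colnames)
skymap[:3]
```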

#### Most Probable Sky Location

To find the sky position with the highest probability density, locate the row with the largest `PROBDENSITY`, then convert its `UNIQ` index to coordinates:

```python
hp_index = np.argmax(skymap['PROBDENSITY'])
uniq = skymap[hp_index]['UNIQ']

level, ipix = ah.uniq_to_level_ipix(uniq)
nside = ah.level_to_nside(level)

ra, dec = ah.healpix_to_lonlat(ipix, nside, order='nested')
```
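
Here `ra` and `dec` are returned as Astropy angle objects; as a quick usage example, you can express them in degrees:

```python
# Most probable sky position in degrees
print(ra.to(u.deg), dec.to(u.deg))
```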

#### Probability Density at a Known Position

Similarly, one can calculate the probability density at a known position:

```python
ra, dec = 197.450341598 * u.deg, -23.3814675445 * u.deg

level, ipix = ah.uniq_to_level_ipix(skymap['UNIQ'])
nside = ah.level_to_nside(level)

match_ipix = ah.lonlat_to_healpix(ra, dec, nside, order='nested')

match_index = np.flatnonzero(ipix == match_ipix)[0]

prob_density = skymap[match_index]['PROBDENSITY'].to_value(u.deg**-2)
```
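
As a follow-up, multiplying this density by the area of the matched pixel gives the probability contained in that single pixel; a short sketch reusing the variables defined above:

```python
# Probability contained in the single pixel that covers this position
nside_match = ah.level_to_nside(level[match_index])
pixel_prob = skymap[match_index]['PROBDENSITY'] * ah.nside_to_pixel_area(nside_match)
print(pixel_prob.to(u.dimensionless_unscaled))
```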

#### 90% Probability Region

To estimate the area of the 90% credible region, sort the pixels by descending probability density, compute each pixel's area and probability, and accumulate the probability until it reaches 90%.

```python
# Sort the pixels by descending probability density
skymap.sort('PROBDENSITY', reverse=True)

# Area of each pixel
level, ipix = ah.uniq_to_level_ipix(skymap['UNIQ'])
pixel_area = ah.nside_to_pixel_area(ah.level_to_nside(level))

# Probability contained in each pixel: area times probability density
prob = pixel_area * skymap['PROBDENSITY']

# Cumulative sum of the probability
cumprob = np.cumsum(prob)

# Index at which the cumulative probability reaches 90%
i = cumprob.searchsorted(0.9)

# Total area of the pixels up to that index
area_90 = pixel_area[:i].sum()
area_90.to_value(u.deg**2)
```
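
Building on the same sorted table, you can also test whether a given sky position lies inside the 90% credible region. A sketch, assuming `skymap`, `level`, `ipix`, and `i` are as defined above:

```python
# Check whether a known position falls inside the 90% credible region
ra, dec = 197.450341598 * u.deg, -23.3814675445 * u.deg

# Row of the sorted sky map whose pixel contains this position
match_ipix = ah.lonlat_to_healpix(ra, dec, ah.level_to_nside(level), order='nested')
match_index = np.flatnonzero(ipix == match_ipix)[0]

# Inside the region if this pixel is reached before the cumulative probability hits 90%
print(match_index < i)
```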

### Flat Resolution HEALPix Sky Maps

A sky map `.fits.gz` file can be read in using `healpy`:

```python
import healpy as hp
import numpy as np
from matplotlib import pyplot as plt

# Read both the HEALPix image data and the FITS header
hpx, header = hp.read_map('skymap.fits.gz', h=True)

# Plot a Mollweide-projection all-sky image
hp.mollview(hpx)
plt.show()
```
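
With `h=True`, `healpy` also returns the FITS header as a list of `(key, value)` pairs; turning it into a dictionary makes metadata lookups easy. (The `OBJECT` keyword below is just an example and may not be present in every sky map.)

```python
# Convert the header to a dictionary for convenient keyword access
header = dict(header)
print(header.get('OBJECT', 'unknown'))
```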

#### Most Probable Sky Location

The point on the sky with the highest probability corresponds to the maximum value in the HEALPix sky map:

```python
# Read the sky map with healpy
hpx = hp.read_map('skymap.fits.gz')
npix = len(hpx)

# Lateral resolution of the HEALPix map
nside = hp.npix2nside(npix)

# Find the highest probability pixel
ipix_max = np.argmax(hpx)

# Probability density per square degree at that position
hpx[ipix_max] / hp.nside2pixarea(nside, degrees=True)

# Right ascension and declination (in degrees) of the highest probability pixel
ra, dec = hp.pix2ang(nside, ipix_max, lonlat=True)
ra, dec
```
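
As a usage example, the result can be wrapped in an Astropy `SkyCoord` for further coordinate transformations or formatting (assuming `astropy` is installed):

```python
from astropy import units as u
from astropy.coordinates import SkyCoord

# Most probable sky position as a SkyCoord (ICRS) object
coord = SkyCoord(ra * u.deg, dec * u.deg)
print(coord.to_string('hmsdms'))
```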

#### Integrated Probability in a Circle

We can calculate the integrated probability within an arbitrary circle on the sky:

```python
# First define the right ascension, declination, and radius of the circle, in degrees
ra = 180.0
dec = -45.0
radius = 2.5

# Convert the center of the circle to a Cartesian unit vector
xyz = hp.ang2vec(ra, dec, lonlat=True)

# Obtain an array of indices for the pixels inside the circle
ipix_disc = hp.query_disc(nside, xyz, np.deg2rad(radius))

# Sum the probability in all of the matching pixels:
hpx[ipix_disc].sum()
```
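
By default, `query_disc` returns only the pixels whose centers fall inside the circle. For a slightly more conservative region that also counts pixels merely overlapping the circle, pass `inclusive=True`:

```python
# Also include pixels that only partially overlap the circle
ipix_disc_inclusive = hp.query_disc(nside, xyz, np.deg2rad(radius), inclusive=True)
hpx[ipix_disc_inclusive].sum()
```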

#### Integrated Probability in a Polygon

Similar to the integrated probability within a circle, it is possible to calculate the integrated probability of the source lying within a convex polygon on the sky:

```python
# Vertices of the polygon, given as Cartesian unit vectors on the sphere
# (here, a small quadrilateral centered near RA = 180 deg, Dec = -45 deg)
xyz = hp.ang2vec([178.0, 182.0, 182.0, 178.0],
                 [-43.0, -43.0, -47.0, -47.0],
                 lonlat=True)

# Indices of the pixels whose centers fall inside the polygon
ipix_poly = hp.query_polygon(nside, xyz)

# Sum the probability in all of the matching pixels
hpx[ipix_poly].sum()
```

#### Other Documentation and HEALPix Packages

For more information and resources on the analysis of pixelated data on a sphere, explore the following links:

- Additional information can be found in the [LIGO/Virgo/KAGRA Public Alerts User Guide](https://emfollow.docs.ligo.org/userguide/tutorial/multiorder_skymaps.html)

- [healpy](https://healpy.readthedocs.io/en/latest/): Official Python library for handling pixelated data on the sphere

- [astropy-healpix](https://pypi.org/project/astropy-healpix/): Integrates HEALPix with Astropy for data manipulation and analysis

- [mhealpy](https://mhealpy.readthedocs.io/en/latest/): Object-oriented wrapper of healpy for handling multi-resolution maps

- [MOCpy](https://cds-astro.github.io/mocpy/): Python library for easy creation, parsing, and manipulation of Multi-Order Coverage (MOC) maps

## Bibliography
The following entries were added to `references.bib`:

@ARTICLE{2005ApJ...622..759G,
author = {{G{\'o}rski}, K.~M. and {Hivon}, E. and {Banday}, A.~J. and {Wandelt}, B.~D. and {Hansen}, F.~K. and {Reinecke}, M. and {Bartelmann}, M.},
title = "{HEALPix: A Framework for High-Resolution Discretization and Fast Analysis of Data Distributed on the Sphere}",
journal = {Astrophysical Journal},
keywords = {Cosmology: Cosmic Microwave Background, Cosmology: Observations, Methods: Statistical, Astrophysics},
year = 2005,
month = apr,
volume = {622},
number = {2},
pages = {759-771},
doi = {10.1086/427976},
archivePrefix = {arXiv},
eprint = {astro-ph/0409513},
primaryClass = {astro-ph},
adsurl = {https://ui.adsabs.harvard.edu/abs/2005ApJ...622..759G},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

@ARTICLE{2015A&A...578A.114F,
author = {{Fernique}, P. and {Allen}, M.~G. and {Boch}, T. and {Oberto}, A. and {Pineau}, F. -X. and {Durand}, D. and {Bot}, C. and {Cambr{\'e}sy}, L. and {Derriere}, S. and {Genova}, F. and {Bonnarel}, F.},
title = "{Hierarchical progressive surveys. Multi-resolution HEALPix data structures for astronomical images, catalogues, and 3-dimensional data cubes}",
journal = {Astronomy and Astrophysics},
keywords = {surveys, atlases, astronomical databases: miscellaneous, catalogs, virtual observatory tools, methods: statistical, Astrophysics - Instrumentation and Methods for Astrophysics},
year = 2015,
month = jun,
volume = {578},
eid = {A114},
pages = {A114},
doi = {10.1051/0004-6361/201526075},
archivePrefix = {arXiv},
eprint = {1505.02291},
primaryClass = {astro-ph.IM},
adsurl = {https://ui.adsabs.harvard.edu/abs/2015A&A...578A.114F},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
