Simplifying detection efficiency validation notebook. #1035

Merged
202 changes: 87 additions & 115 deletions docs/notebooks/demo_DetectionEfficiencyValidation.ipynb
"source": [
"# Detection Efficiency Filter Demo\n",
"\n",
"This notebook demonstrates the detection efficiency/fading function filter. The equation for the fading function is taken from Veres & Chesley (2017):\n",
"\n",
"$$\n",
"\\epsilon(m) = \\frac{F}{1 + e^\\frac{m-m_{5\\sigma}}{w}}\n",
"$$\n",
"\n",
"where $\\epsilon(m)$ is the probability of detection, $F$ is the peak detection efficiency, $m$ and $m_{5\\sigma}$ are respectively the observed magnitude and $5\\sigma$ limiting magnitude of the pointing, and $w$ is the width of the fading function."
]
},
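A quick numerical sanity check on the fading function above (a standalone sketch, independent of Sorcha's own implementation): at $m = m_{5\sigma}$ the exponent vanishes and the detection probability is exactly $F/2$.

```python
import numpy as np

def fading_function(m, m5, peak=1.0, width=0.1):
    # Veres & Chesley (2017): eps(m) = F / (1 + exp((m - m5) / w))
    return peak / (1.0 + np.exp((m - m5) / width))

print(fading_function(24.5, 24.5))  # exactly half the peak: 0.5
print(fading_function(22.0, 24.5))  # far brighter than the limit: ~1.0
print(fading_function(27.0, 24.5))  # far fainter than the limit: ~0.0
```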
{
"metadata": {},
"outputs": [],
"source": [
"from sorcha.modules.PPFadingFunctionFilter import PPFadingFunctionFilter\n",
"from sorcha.modules.PPModuleRNG import PerModuleRNG"
]
},
"outputs": [],
"source": [
"import pandas as pd\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create a dataframe of synthetic observations"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Only the apparent magnitude and the five-sigma limiting depth are needed for this. For simplicity, we will set the five-sigma limiting depth to be constant for all observations."
]
},
{
"metadata": {},
"outputs": [],
"source": [
"nobs_per_field = 1000000"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"mags = np.random.uniform(23.0, 26.0, nobs_per_field)\n",
"five_sigma = np.zeros(nobs_per_field) + 24.5\n",
"random_obs = pd.DataFrame({\"PSFMag\":mags, \"fiveSigmaDepth_mag\":five_sigma})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"random_obs"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can apply the fading function filter implementation in Sorcha to the randomised observations."
]
},
{
"metadata": {},
"outputs": [],
"source": [
"peak_efficiency = 1.0\n",
"width = 0.1\n",
"rng = PerModuleRNG(2021)\n",
"reduced_obs = PPFadingFunctionFilter(random_obs, peak_efficiency, width, rng)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we calculate the probability that each observation survives Sorcha's fading function filter, binned by magnitude."
]
},
{
"metadata": {},
"outputs": [],
"source": [
"bin_width = 0.04\n",
"magbins = np.arange(23.0, 26.0, bin_width)\n",
"\n",
"magcounts, _ = np.histogram(random_obs['PSFMag'].values, bins=magbins)\n",
"\n",
"redmagcounts, _ = np.histogram(reduced_obs['PSFMag'].values, bins=magbins)\n",
"\n",
"sorcha_probability = redmagcounts/magcounts"
]
},
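Conceptually, a fading-function filter keeps each observation with probability $\epsilon(m)$, which is why the binned survival fraction should track the curve. The following Bernoulli-draw sketch is a toy stand-in for that idea, not Sorcha's actual `PPFadingFunctionFilter` code:

```python
import numpy as np

rng = np.random.default_rng(2021)

def fading_filter_mask(mags, m5, peak=1.0, width=0.1):
    # Each observation survives with probability eps(m); a simplified
    # stand-in for Sorcha's PPFadingFunctionFilter, not its real code.
    eps = peak / (1.0 + np.exp((mags - m5) / width))
    return rng.random(mags.size) < eps

mags = rng.uniform(23.0, 26.0, 1_000_000)
keep = fading_filter_mask(mags, 24.5)

# Near the 5-sigma limiting magnitude about half of the observations survive.
near_limit = np.abs(mags - 24.5) < 0.02
survival = keep[near_limit].mean()
```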
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And we can compare this to the detection probability of each magnitude bin calculated directly from the fading function equation above."
]
},
{
"metadata": {},
"outputs": [],
"source": [
"calculated_probability = peak_efficiency / (1.0 + np.exp((magbins - 24.5) / width))"
]
},
{
"metadata": {},
"outputs": [],
"source": [
"fig, ax = plt.subplots(1, figsize=(8, 6))\n",
"ax.bar(magbins[:-1], sorcha_probability, align=\"edge\", width=bin_width, color=\"mediumaquamarine\", label=\"Sorcha detection probability\")\n",
"ax.set_ylim((0, 1.1))\n",
"ax.set_xlim((23, 26)) \n",
"ax.set_xlabel(\"magnitude\")\n",
"ax.set_ylabel(\"detection probability\")\n",
"ax.plot(magbins, calculated_probability, color=\"darkslateblue\", label=\"calculated detection probability\")\n",
"ax.legend()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Sorcha's detection probability clearly follows the expected curve."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can also take a look at how the fading function/detection efficiency changes with its parameters."
]
},
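One useful property to keep in mind before plotting: evaluated one width away from the limiting magnitude, the efficiency takes the fixed values $F/(1+e) \approx 0.27F$ and $F/(1+e^{-1}) \approx 0.73F$ regardless of $w$, so $w$ sets the magnitude scale over which the detection probability falls off. A quick numerical check, assuming the same equation as above:

```python
import math

def deteff(m, m5, peak, width):
    return peak / (1.0 + math.exp((m - m5) / width))

for w in (0.05, 0.1, 0.15):
    lo = deteff(24.5 + w, 24.5, 1.0, w)  # one width fainter than the limit
    hi = deteff(24.5 - w, 24.5, 1.0, w)  # one width brighter than the limit
    print(f"w={w}: {lo:.3f}, {hi:.3f}")  # always ~0.269 and ~0.731
```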
{
"metadata": {},
"outputs": [],
"source": [
"def deteff(mags, fivesig, peak, width):\n",
" return peak / (1.0 + np.exp((mags - fivesig) / width))"
]
},
{
"metadata": {},
"outputs": [],
"source": [
"colors = plt.get_cmap('gnuplot', 3)\n",
"x = colors(np.arange(0,3,1))"
]
},
{
"metadata": {},
"outputs": [],
"source": [
"widths = [0.05, 0.1, 0.15]\n",
"colors = [\"tomato\", \"darkslateblue\", \"mediumturquoise\"]\n",
"styles = [\"solid\", \"dotted\", \"dashed\"]"
]
},
{
"metadata": {},
"outputs": [],
"source": [
"fig, ax = plt.subplots(1, figsize=(8, 6))\n",
"\n",
"ax.axvline(24.5, 0, 1, linestyle=\"-\", alpha=0.5, color=\"grey\", linewidth=0.8)\n",
"ax.axhline(0.5, 0, 1, linestyle=\"-\", alpha=0.5, color=\"grey\", linewidth=0.8)\n",
"\n",
"for i, width in enumerate(widths):\n",
"    y = deteff(magbins, 24.5, 1., width)\n",
"    ax.plot(magbins, y, color=colors[i], linestyle=styles[i], label=f\"$F = $ 1.0, $w = {width}$\")\n",
"\n",
"y = deteff(magbins, 24.5, .9, 0.1)\n",
"ax.plot(magbins, y, color=\"mediumpurple\", linestyle=\"dashdot\", label=\"$F = $ 0.9, $w = 0.1$\")\n",
"\n",
"ax.set_xlim((23, 26))\n",
"ax.legend()\n",
"ax.set_xlabel(\"magnitude\")\n",
"ax.set_ylabel(\"detection probability\")\n",
"\n",
"fig.tight_layout()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
Expand All @@ -282,7 +254,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.14"
}
},
"nbformat": 4,