discrepancy between specsim.simulator and CMX spectra #110
Interesting. A possibility is that the particular fiber you are looking at is on average not well positioned on its target. Can you compare the value of FIBERFLUX_R in the fibermap with the integral of the calibrated spectrum?
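Something along these lines would do that check. This is a rough, untested sketch: the cframe HDU layout, the flux units, and the use of speclite to synthesize the r-band flux are my assumptions, and the file name and fiber index are placeholders.

```python
# Hedged sketch: compare FIBERFLUX_R (nanomaggies) with the r-band flux
# synthesized from the calibrated cframe spectrum for one fiber.
import numpy as np
import astropy.units as u
from astropy.io import fits
import speclite.filters

cframe_file = 'cframe-r0-00055654.fits'  # hypothetical exposure/camera
ifiber = 100                             # hypothetical fiber index

with fits.open(cframe_file) as hdus:
    wave = hdus['WAVELENGTH'].data                              # Angstrom
    flux = hdus['FLUX'].data[ifiber] * 1e-17                    # erg/s/cm2/A
    fiberflux_r = hdus['FIBERMAP'].data['FIBERFLUX_R'][ifiber]  # nanomaggies

# Synthesize the r-band AB magnitude of the calibrated spectrum, padding the
# spectrum where it does not cover the full filter response.
rband = speclite.filters.load_filter('decam2014-r')
flux_pad, wave_pad = rband.pad_spectrum(
    flux * u.erg / (u.s * u.cm**2 * u.Angstrom), wave * u.Angstrom)
mag_r = rband.get_ab_magnitude(flux_pad, wave_pad)

# Convert to nanomaggies (m_AB = 22.5 - 2.5 log10(f / nMgy)) and compare.
spec_fiberflux_r = 10 ** ((22.5 - mag_r) / 2.5)
print(fiberflux_r, spec_fiberflux_r)
```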
Thanks for documenting this nice test @changhoonhahn. I am looking into this now...
@julienguy @dkirkby I see the same discrepancy for all of the galaxy spectra that I compared, so I don't think it's caused by a particular fiber. I've updated the notebook to print out the specific fiber number and also compared a second galaxy spectrum.
Hmm, I didn't realize transparency would be such a large effect! Regarding the fiber acceptance fraction: in my example I used the smoothed spectrum as the source, so FFRAC should not play a role since the source signal is already scaled down, right? To account for transparency, should I be scaling my exposure times by a factor of (transp/1)^2 as you did for your depth calculations?
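For context, here is my reading of where the (transp/1)^2 factor comes from (a back-of-the-envelope sketch, not from this thread; the numbers are made up for illustration): in the sky-limited regime SNR scales roughly as transp * sqrt(t), so an exposure taken at transparency below 1 has the depth of a transp^2-shorter exposure at nominal conditions.

```python
# Hedged back-of-the-envelope for the (transp/1)^2 scaling: sky-limited
# SNR ~ transp * sqrt(t), so the effective exposure time scales as transp**2.
t_real = 900.0   # actual exposure time [s] (hypothetical)
transp = 0.85    # measured transparency (hypothetical)

t_effective = t_real * transp ** 2
print(f'{t_real:.0f} s at transp={transp} ~ {t_effective:.0f} s at nominal transparency')
```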
FFRAC still plays a role since the CFRAME spectrum is corrected for the actual fiber acceptance measured with stars in this exposure. Specsim then applies the nominal fiber acceptance for 1.1" seeing, which could be more or less than the actual value. The bigger effect here is probably the large source size, which decreases the FFRAC for this target relative to a PSF-like source.
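One way to see the mismatch being described here is as a ratio of acceptance fractions. The sketch below is only an illustration with placeholder numbers (the measured and nominal FFRAC values are per exposure, fiber, and source profile, and are not taken from this thread).

```python
# Hedged illustration of the fiber-acceptance mismatch: the CFRAME flux was
# calibrated with the measured acceptance, while specsim applies a nominal
# 1.1"-seeing acceptance. Rescaling the input flux by their ratio puts the
# two on the same footing.
import numpy as np

smoothed_cframe_flux = np.ones(1000)  # placeholder for the smoothed CFRAME spectrum

ffrac_actual = 0.45    # acceptance measured from standard stars (hypothetical value)
ffrac_nominal = 0.60   # nominal 1.1" seeing acceptance for this profile (hypothetical value)

# Flux to feed to specsim so its nominal acceptance reproduces the signal
# that was actually collected down the fiber in this exposure.
source_flux_rescaled = smoothed_cframe_flux * ffrac_actual / ffrac_nominal
```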
I updated the comparison (see the notebook). Still, the inverse variance of the simulated spectrum is much higher than that of the CMX spectrum. I suspect the throughput-corrected sky brightness (blue), which by eye looks like it is overestimated in r and z. @dkirkby I derived the sky brightnesses as follows (see notebook for details):
Is this consistent with how you got the sky brightness in the comparison you had in your survey planning talk during the sci-com telecon?
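For reference, a rough sketch of the kind of derivation being described is below. This is not the notebook's actual code: the file names, HDU names, units, and the 1.5" fiber diameter are my assumptions, and, as noted in footnote [^1] of the issue, dividing out a flux calibration that includes fiberloss overestimates the sky.

```python
# Hedged sketch: calibrate the observed sky model and convert it to a
# surface brightness by dividing by the fiber area on the sky.
import numpy as np
from astropy.io import fits

sky_file = 'sky-r0-00055654.fits'           # hypothetical file names
calib_file = 'fluxcalib-r0-00055654.fits'
ifiber = 100

with fits.open(sky_file) as hdus:
    wave = hdus['WAVELENGTH'].data          # Angstrom
    sky = hdus['SKY'].data[ifiber]          # sky model in electrons per bin

with fits.open(calib_file) as hdus:
    calib = hdus['FLUXCALIB'].data[ifiber]  # electrons per (1e-17 erg/s/cm2/A)

calib = np.where(calib > 0, calib, np.nan)  # avoid dividing by zero

fiber_area = np.pi * (1.5 / 2.0) ** 2       # arcsec^2, assuming a 1.5" fiber

sky_flux = 1e-17 * sky / calib              # erg/s/cm2/A down the fiber
sky_sb = sky_flux / fiber_area              # erg/s/cm2/A/arcsec^2
```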
@dkirkby Thanks! It turns out it was a bug in my code with the wavelength binning. I reran the test with the updated sky surface brightness, and now there's better agreement between the simulated and CMX spectra. There's still some discrepancy, though.
Thanks for the update @changhoonhahn. We should still be able to achieve better agreement than this, at least for the FRAME ivar (which should dominate and is what specsim actually calculates), but is this good enough for your studies?
Is this for the same bright TARGETID 35185736613364265? What are you assuming for the throughput (transparency x fiberfrac) for this target? This doesn't look right.
I noticed that the CMX spectra from mini-SV2 and SV0 were noisier than the simulated spectra I was getting from a BGS version of `specsim`. As a test, in this jupyter notebook I compare a bright BGS galaxy spectrum from CMX to a simulated spectrum constructed using `specsim.simulator.Simulator`.
To reduce the number of moving parts, I smooth the CMX spectrum and use it as the source flux (black dashed). I also use the actual sky surface brightness from CMX as the sky surface brightness [^1]. The simulated spectrum is substantially less noisy and has much higher inverse variances.
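The skeleton of the simulated side looks roughly like the following. This is an untested sketch: the specsim calls (`Simulator('desi')`, `source.update_in`/`update_out`, `simulate`, `camera_output`) follow my reading of the specsim docs, and the exposure time, source type, and spectrum are placeholders; the notebook has the actual setup.

```python
# Hedged sketch of simulating the smoothed CMX spectrum with specsim.
import numpy as np
import astropy.units as u
import specsim.simulator

# Placeholders for the smoothed CMX spectrum (the notebook uses the real one).
wave = np.arange(3500.0, 10000.0, 0.5)       # Angstrom
smooth_flux = np.full(wave.size, 5.0)        # 1e-17 erg/s/cm2/A

desi = specsim.simulator.Simulator('desi')
desi.observation.exposure_time = 900 * u.s   # hypothetical CMX exposure time

# Feed the smoothed CMX spectrum in as the source; 'elg' is just a stand-in
# for whichever source type/profile the notebook actually uses.
desi.source.update_in(
    'smoothed CMX BGS galaxy', 'elg',
    wave * u.Angstrom,
    smooth_flux * 1e-17 * u.erg / (u.s * u.cm ** 2 * u.Angstrom))
desi.source.update_out()

desi.simulate()

# Per-camera results (simulated flux, variances, ...) come back as tables.
for output in desi.camera_output:
    print(output.colnames)
```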
tl;dr: The spectra I get from `specsim.simulator.Simulator` are substantially less noisy than the observed spectra from CMX.

[^1]: @dkirkby pointed out that the way I correct for flux calibration in the raw sky data overestimates the sky flux because it corrects for fiberloss. I didn't correct this since a fainter sky would make the problem worse.