# Instructions to run the BS ReReco using updated alignment and upload to DB

francescobrivio edited this page Oct 19, 2022 · 26 revisions
- Refit the BS using the new conditions
  - Get the latest release and the package producing the BeamSpot:

    ```
    cmsrel CMSSW_X_Y_Z
    cd CMSSW_X_Y_Z/src
    cmsenv
    git cms-addpkg RecoVertex/BeamSpotProducer
    scram b -j 8
    cd RecoVertex/BeamSpotProducer/test
    ```
  - Modify the cfg file BeamFit_LumiBased_NewAlignWorkflow.py:
    - use the correct GT
    - pay special attention to the track and vertex collections you want to use:
      - tracks and vertices can be refitted only using the RECO, FEVT or ALCARECO datatiers
      - for the 2021-2022 data taking at 900 GeV the track selection for ALCARECO was not re-optimized, leading to a low number of tracks (~300) and PVs (10-20) per LS after the refit
      - other datatiers, like AOD, can be used if no refit has to be performed
      - the track collections to be used are ALCARECOTkAlMinBias (ALCARECO datatier) and generalTracks (RECO, FEVT, AOD)
      - the PV collection to be used is always OfflinePrimaryVertices
    - useful information from Marco on the TrackRefitter:
      1. the ClusterPositionEstimation (CPE) is re-run for each cluster associated to the track
      2. the track state and covariance matrix are recomputed at each layer (this is where the new alignment comes into play)
  - Prepare the crab cfg file (example here) and submit
- Merge the LS results and create IoVs
  - get the BeamspotTools package, as described in the GitHub repo https://github.com/MilanoBicocca-pix/BeamspotTools:

    ```
    cd $CMSSW_BASE/src/RecoVertex/BeamSpotProducer/python
    git clone git@github.com:MilanoBicocca-pix/BeamspotTools.git
    cd $CMSSW_BASE/src
    scram b -r -j8
    ```
  - merge the LS results using BeamspotTools/test/test_workflow_Run2016B.py
    - remember to configure whether or not to use the slopes for the IoV splitting (here)
    - in view of the sqlite file production, it is better to produce one output .txt file per IoV
  - check the results against the Prompt Reco
    - run the BS fitter (without refitting) on the same dataset as before and prepare the by-LS .txt files
    - use the script test/compareBS_prompt_reco.py to compare the results (the input BS values to the script should be by LS)
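Whatever the exact .txt layout, the comparison above amounts to matching the two fits by (run, LS) and differencing the fitted parameters. A minimal, format-agnostic sketch; the function name, parameter names, and values below are hypothetical placeholders (the actual parsing of the .txt files is done by the existing scripts):

```python
def compare_by_ls(refit, prompt):
    """Difference two beam-spot fits keyed by (run, LS).

    refit, prompt: dict mapping (run, ls) -> {parameter: value}.
    Returns, for each LS present in both fits, the refit-minus-prompt
    difference for each parameter common to both.
    """
    diffs = {}
    for key in sorted(set(refit) & set(prompt)):
        diffs[key] = {p: refit[key][p] - prompt[key][p]
                      for p in refit[key] if p in prompt[key]}
    return diffs

# hypothetical per-LS values, for illustration only
refit  = {(274241, 542): {"x0": 0.1010, "y0": 0.0920}}
prompt = {(274241, 542): {"x0": 0.1000, "y0": 0.0925}}
diffs = compare_by_ls(refit, prompt)
```

Lumisections present in only one of the two fits are simply skipped, so the two inputs do not need to cover identical LS ranges.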
- Produce the sqlite file containing all the payloads
  - once you have the BS per IoV, produce the sqlite file with

    ```
    cd $CMSSW_BASE/src/RecoVertex/BeamSpotProducer/scripts/
    python3 createPayload.py -I lumibase -t TAG_NAME -d YOUR_DIR_WITH_INPUT_FILES -O YOUR_OUTPUT_DIR
    ```

  - the .db file will be created in the folder YOUR_OUTPUT_DIR/lastPayloads
  - you can check that its content is the right one with

    ```
    sqlite3 FILENAME.db
    select * from IOV;
    ```
    this will print out something like

    ```
    BeamSpotObjects_2015_LumiBased_v1_offline|1177856126222638|dbf0b086851e0f61627a2fd88626e5c55243df97|2016-06-28 10:33:37.145655
    BeamSpotObjects_2015_LumiBased_v1_offline|1177856126222698|e1112ab4b2d069c1b501aa9680815867c8d11ccc|2016-06-28 10:33:45.993201
    BeamSpotObjects_2015_LumiBased_v1_offline|1177856126222758|d757f147cf5d2b9cadb09fca0bca414b296154b9|2016-06-28 10:33:53.612859
    BeamSpotObjects_2015_LumiBased_v1_offline|1177856126222818|cae819ef14952361f926b89e14c5556dd600d7ab|2016-06-28 10:34:00.845266
    BeamSpotObjects_2015_LumiBased_v1_offline|1177856126222878|58f8facd618b7a7c528db5a12ff0ddd071c6ea25|2016-06-28 10:34:08.343873
    ```
    to convert a packed 'since' value to a readable run/lumi pair, in python you can do

    ```python
    def unpackLumiId(since):
        kLowMask = 0xFFFFFFFF
        run = since >> 32
        lumi = since & kLowMask
        return run, lumi

    unpackLumiId(1177856126222878)  # (274241, 542)
    ```
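The inverse operation can come in handy when defining IoV boundaries by hand: the 'since' value is simply the run number in the upper 32 bits and the lumisection in the lower 32, so packing is a shift and an OR (`packLumiId` is a name introduced here for illustration, not an existing tool):

```python
def packLumiId(run, lumi):
    # run number in the upper 32 bits, lumisection in the lower 32 bits
    return (run << 32) | (lumi & 0xFFFFFFFF)

packLumiId(274241, 542)  # 1177856126222878, the last 'since' in the printout above
```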
- Check the content of the sqlite file
  - to be plotted against what is produced from the .txt files
  - to obtain the BS values from a .db file, use the CMSSW package CondTools/BeamSpot
    - in the test folder, modify BeamSpotRcdRead_cfg.py to read your .db file (important info)
  - go back to RecoVertex/BeamSpotProducer and use the script BeamspotTools/test/compareBS_fromTxt_fromDB.py to plot the results [for now it is in the tools2CompareDB branch]
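The IOV listing can also be scripted with Python's sqlite3 module. A minimal sketch, assuming the IOV table has the four-column row layout shown in the printout above (tag | since | payload hash | insertion time); verify the actual schema with `.schema IOV` in the sqlite3 shell before relying on it:

```python
import sqlite3

def dump_iovs(db_path):
    """Read the IOV table and decode each packed 'since' into (run, LS).

    Assumes the four-column layout shown in the sqlite3 printout above:
    tag | since | payload hash | insertion time.
    """
    conn = sqlite3.connect(db_path)
    rows = []
    for tag, since, payload, insert_time in conn.execute("SELECT * FROM IOV"):
        run, lumi = since >> 32, since & 0xFFFFFFFF
        rows.append((tag, run, lumi, payload))
        print(f"{tag}  run {run}  LS {lumi}  payload {payload[:10]}")
    conn.close()
    return rows
```

Called on the example file above, each row comes out with its run/LS already unpacked, which makes it easy to cross-check the IoV boundaries against the .txt files.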
- Upload the sqlite file to the database

  **ONLY ONCE YOU ARE SUPER-SURE EVERYTHING IS CORRECT**

  - it is usually better to interact with the AlCa people before uploading the new .db files
  - .db files should be appended to previously created tags (ask AlCa for the tag name) and must be uploaded in chronological order
  - to upload the file you have created, follow the instructions on this twiki or just do

    ```
    uploadConditions.py FILENAME.db
    ```

    and follow the procedure
    - it is probably better to avoid '@' in the .db filename
- Update the twiki: https://github.com/MilanoBicocca-pix/cmssw/wiki/payloads-log-2016