# LLDJstandalones

Ntuple-based analysis package for long-lived displaced jet analyses.
```bash
# Fermilab uses tcsh by default even though it has bash!
# This framework is based on bash, and
# technically maybe you don't need this,
# but tcshers be warned
bash --login

# Set up the area
export SCRAM_ARCH=slc6_amd64_gcc530;
scram pro -n LLDJ_slc6_530_CMSSW_8_0_26_patch1 CMSSW CMSSW_8_0_26_patch1;
cd LLDJ_slc6_530_CMSSW_8_0_26_patch1/src;
cmsenv;
```

## CMSSW imports and customizations

```bash
git cms-merge-topic ikrav:egm_id_80X_v3_photons
scramv1 build -j 10;
```
## LLDJstandalones Framework checkout

```bash
# first fork the repository to make your own workspace
git clone https://github.com/<mygithubusername>/LLDJstandalones.git;
pushd LLDJstandalones;

# If you want to check out a specific branch:
# git fetch origin
# git branch -v -a                                 # list the available branches, find yours
# git checkout -b NAMEOFBRANCH origin/NAMEOFBRANCH

# add DisplacedHiggs as upstream
git remote add upstream https://github.com/DisplacedHiggs/LLDJstandalones.git

# compile a clean area
scramv1 build -j 10;
```
## Every time you log in

```bash
# set up some environment variables (bash)
source LLDJstandalones/setup.sh
```

Make sure to run `source setup.sh` from the `LLDJstandalones` directory first to set up the environment variables used in the scripts. In particular, this sets up `$nversion` and `$aversion`, sketched below.
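For reference, `setup.sh` exports the version tags along these lines. This is only a sketch; the values shown are placeholders, not the real tags:

```bash
# hypothetical sketch of the variables setup.sh provides -- see the real script
export nversion="ntuples_v1"   # ntuple version tag (placeholder value)
export aversion="analyzed_v1"  # analyzer version tag (placeholder value)
```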
## Making ntuples

```bash
cd ${CMSSW_BASE}/src/LLDJstandalones/ntuples/config
```

To run local jobs, do `cmsRun run_mc_80XAOD.py` or `cmsRun run_data_80XAOD.py`. Then submit CRAB jobs using `bash submitcrab.sh`, which uses `crab_template.py`. The CRAB directories are created in `gitignore/$nversion` under `ntuples/config`. Finished jobs appear at FNAL in `/store/group/lpchbb/LLDJntuples/$nversion`.
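To monitor the CRAB tasks and look at the finished ntuples, something like the following works; the task directory name below is a placeholder for whatever `submitcrab.sh` created:

```bash
# check on a CRAB task (placeholder task directory name)
crab status -d gitignore/$nversion/<crab_taskdir>

# list the finished ntuples on FNAL EOS
eos root://cmseos.fnal.gov ls /store/group/lpchbb/LLDJntuples/$nversion
```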
## Running the analyzer

To run the analyzer, we first need to run a few other things from `LLDJstandalones/commontools`.

- Lists (run these in order; see the sketch after this list)
  - `bash makemasterlist.sh` makes master lists of ntuples from which the other lists are derived
  - `bash makelists.sh` makes lists of files and puts them in the `lists` folder, split by sample
  - `bash countevents.sh` calls `countevents.cxx` and makes `.info` files in the `lists` folder
  - `bash findTTavgweights.sh` runs over the TTbar samples and calculates the average TTbar weight
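Concretely, assuming the list scripts are run from `commontools` in the order above:

```bash
cd ${CMSSW_BASE}/src/LLDJstandalones/commontools
bash makemasterlist.sh    # master ntuple lists
bash makelists.sh         # per-sample file lists in lists/
bash countevents.sh       # .info files in lists/
bash findTTavgweights.sh  # average TTbar weight
```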
- Scale factors
  - EGamma SFs live in `commontools/elesf`; the scale factors are provided by the POG as a TH2F histogram, from https://twiki.cern.ch/twiki/bin/view/CMS/EgammaIDRecipesRun2
  - Pileup reweighting lives in `commontools/pileup`; see its README
    - `bash collectjsons.sh` gets the JSONs from the finished CRAB jobs and compares them to the golden JSON
    - `bash makeinputhistos.sh` makes the input SF histograms using the database (on lxplus)
    - `bash runMakePUweights.sh` makes the weight histograms
  - Make sure the SF files are copied into the `LLDJstandalones/analyzers` directory (see the example after this list)
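One way to do that copy; the `.root` globs here are an assumption, so match whatever files the scripts above actually produced:

```bash
# copy the SF / pileup-weight histograms next to the analyzer (assumed file globs)
cd ${CMSSW_BASE}/src/LLDJstandalones
cp commontools/elesf/*.root  analyzers/
cp commontools/pileup/*.root analyzers/
```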
Run local jobs from the `LLDJstandalones/analyzers` folder:

- `make` compiles `main.C` into the executable `runanalyzer.exe`
- `./runanalyzer.exe --<flags>` calls the executable analyzer by hand (you must specify flags)
- or edit and run `bash runAnalyzers.sh`, which loops through different options for calling `runanalyzer.exe`
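For illustration, a hand invocation might look like the following; the flag names are hypothetical, so check `main.C` or `runAnalyzers.sh` for the real ones:

```bash
cd ${CMSSW_BASE}/src/LLDJstandalones/analyzers
make
# hypothetical flag names, for illustration only
./runanalyzer.exe --sample ZH125 --list ../commontools/lists/ZH125.list --out ZH125_hists.root
```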
## Submitting condor jobs

From the `submitters` folder (the full sequence is sketched after this list):

- in `submitjobs.sh`, set `doSubmit=false` to be safe while testing
- `bash submitjobs.sh` creates the submit area in `gitignore`; the job that actually runs on the condor nodes is `runsubmitter.sh`
- `voms-proxy-init --voms cms --valid 100:00` sets up your proxy
- set `doSubmit=true` and run `bash submitjobs.sh`
- then optionally add info about the submit to the autogenerated txt file
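Putting the pieces together, assuming the `doSubmit=false` test run looked fine; `condor_q` is the standard HTCondor command for watching the queue:

```bash
voms-proxy-init --voms cms --valid 100:00   # proxy the jobs will use
# flip doSubmit=true in submitjobs.sh, then:
bash submitjobs.sh
condor_q $USER                              # watch the condor queue
```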
## While jobs are running / finished

- `bash checkjobs.sh` checks to see if the condor jobs are done
- `bash haddjobs.sh` merges the analyzed output locally into `LLDJstandalones/roots/$aversion`
- `bash cpeos.sh` copies the hadded analyzer jobs to EOS
- delete the files that were copied over; there is no script for this (see the sketch after this list)
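A possible manual cleanup, assuming you first verify that `cpeos.sh` really copied everything; the EOS destination shown is a placeholder, so list whatever path `cpeos.sh` actually uses before deleting anything:

```bash
# verify the copies landed on EOS first (placeholder destination path)
eos root://cmseos.fnal.gov ls /store/user/$USER/<cpeos_destination>/$aversion

# then remove the local hadded files
rm ${CMSSW_BASE}/src/LLDJstandalones/roots/$aversion/*.root
```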
## Plotting

The first thing to do is merge the histograms from the analyzer; do this with `bash runPlotterStackedRegion`, then take it from there.

- run `bash runPlotterStackedRegion` over the unshifted analyzer output
- run `bash runPlotterTagvarUnc.sh` to get the plots starting with `tvu` and the values of the shifted cuts based on the integral
- put the new shift cut values in `analyzers/analyzer_config.C` as the variables like `tag_shiftXXXX`, and rerun the analyzer (see the sketch after this list)
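After updating the `tag_shiftXXXX` values, rerunning the analyzer amounts to recompiling and repeating the analyzer step:

```bash
# rerun the analyzer after editing analyzers/analyzer_config.C
cd ${CMSSW_BASE}/src/LLDJstandalones/analyzers
make                  # recompile main.C into runanalyzer.exe
bash runAnalyzers.sh  # loop over the samples again
```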