Script to make UCNtuples on the CAF

Last Update: Sep 08, 2004
(Old instructions)

A script for making UCNtuples for any dataset or list of DFC files exists. The script resides on our UChicago machines at:

/cdf/data10a/ucntUtils/ntuple

Please fill in the 4112 DataSet Table, the 531 DataSet Table, or the 533 DataSet Table for any ntuples created.


Running it without any arguments will provide the following syntax:

Syntax:
ntuple -dataset <dfc dataset name> [options]
-or-
ntuple -list <filename> [options]

*****************************************************************************************
Stephen Levy
October 13, 2004

PLEASE NOTE:

GEN 5
The default tcl file is FlatNtuple/tcl/53X_multi.tcl, which is designed for use when
running over 5.3.1-processed MC or data.

GEN 4
The default tcl file is FlatNtuple/run_ucnt_4112.tcl, which is designed for use when
running over 4.9.1 MC or data that has already been reprocessed in 4.11.2 by the
top group.

Please specify the tcl file FlatNtuple/remake_ucnt_4112.tcl when running over data
that was processed with offline release 4.8.4 (or earlier); see the example below.
*****************************************************************************************
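A hedged illustration of overriding the tcl file for such older data (the dataset name here is a placeholder, not a real dataset identifier):

> /cdf/data10a/ucntUtils/ntuple -dataset <your-4.8.4-dataset> -tcl FlatNtuple/remake_ucnt_4112.tcl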

Options:
-mc : Sets appropriate flags for MC
-notInDFC : Used with list option when using rootd for file access
-tcl <tcl file name> : Non-default tcl file to use for processing
-cafcondor : Submit job to Condor CAF (if you have permission)
-nevents <# events> : Number of events to process
-cdfrel <=5.3.3_nt> : Cdf offline release to setup
-tagname <=59719> : UCNT tag
-output <output location> : Output directory
-queue <queue name> : Current queues are short, medium (def), long, test
-email <email address> : Email address for notification
-host <host> : Host
-start <start of range> : Start range
-end <end of range> : End range
-norun : Echo all script commands but do not submit job to CAF
-savefile : Save the tarfile used for job submission

NOTE: Always specify the absolute path to the output directory.
For the tcl file, specify the path relative to the release directory.
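
As a hedged illustration of these path conventions (the dataset name and user directory below are placeholders):

> /cdf/data10a/ucntUtils/ntuple -dataset <dataset> -mc -tcl FlatNtuple/tcl/53X_multi.tcl -output /cdf/data10a/ucntUtils/<user>/output/

Here -output is an absolute path, while -tcl is given relative to the release directory.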

If -cdfrel and/or -tagname are specified, the script looks for a release of the form
/cdf/data10a/ucntUtils/ucnt_[tagname]_[release]
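
For example (a sketch using the default values quoted above; substitute your own tag and release):

> /cdf/data10a/ucntUtils/ntuple -dataset <dataset> -cdfrel 5.3.3_nt -tagname 59719

which would look for the release directory /cdf/data10a/ucntUtils/ucnt_59719_5.3.3_nt.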

The -savefile switch saves the tar file used for job submission.

If -output is a directory on the local machine, the output will be scp'd from the CAF
to the local directory.
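
For instance (a hedged sketch; the local output directory is a placeholder):

> /cdf/data10a/ucntUtils/ntuple -dataset <dataset> -mc -output /home/<user>/ucntuples/

Because that output location is a directory on the local machine, the finished ntuples would be scp'd back from the CAF into it.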

Example for ntupling a dataset:

For example, to make UCNtuples for the MC dataset identifier ttopli, one would type (from any starting directory):

> /cdf/data10a/ucntUtils/ntuple -dataset ttopli -mc

You will see various debug messages printed to the terminal that should tell you what the script is attempting to do. A few stages take a little time (such as tarring up the UCNtuple binary), so please have patience. The output for a typical command (submitted by user levys) is shown here:

/cdf/home/levys> /cdf/data10a/ucntUtils/ntuple -dataset ttopli -mc -start 1 -end 1 -queue test -savefile -output /cdf/data10a/ucntUtils/levys/output/
-- Following parameters are set:
CAF_CURRENT          caf
ntuple_binary        /cdf/data10a/ucntUtils//prodrelease_59717_4.11.2/bin/Linux2-KCC_4_0/ucntuple.exe
ntuple_shlib         /cdf/data10a/ucntUtils//prodrelease_59717_4.11.2/shlib/
ntuple_lib           /cdf/data10a/ucntUtils//prodrelease_59717_4.11.2/lib/
ntuple_tcl           /cdf/data10a/ucntUtils//prodrelease_59717_4.11.2/FlatNtuple/run_ucnt_4112.tcl
ntuple_script        /cdf/data10a/ucntUtils//ucntuple_dcache.sh
ntuple_local_output  /cdf/data10a/ucntUtils/levys/output/
--- making temporary directory /cdf/data10a/ucntUtils//levys/ntuple-ttopli-030904
--- cd /cdf/data10a/ucntUtils//levys/ntuple-ttopli-030904
--- link in appopriate binary, etc.
--- running /cdf/data10a/ucntUtils//dumpdataset ttopli
--- running AC++ Dump /cdf/data10a/ucntUtils/datasetdump.tcl > ttopli.lis
=======================================================
Error Log established 09-Mar-2004 12:58:07 CST
=======================================================
--- running /cdf/data10a/ucntUtils/fillFileset.sh ttopli.lis
fillFileset: Processing file 1 of 1
--- making tar file /cdf/data10a/ucntUtils//levys/temp-ttopli-030904-ntuple.tgz
--- number of job segments is 1
--- submitting 1 job(s) (1 additional job for every 100 segments greater than 100)
Cafcom /cdf/data10a/ucntUtils//levys/temp-ttopli-030904-ntuple.tgz icaf:ttopli-\$.tgz levys@fnal.gov test 1 1 ./ucntuple_dcache.sh ttopli.lis \$ 4.11.2 /cdf/data10a/ucntUtils//prodrelease_59717_4.11.2/FlatNtuple/run_ucnt_4112.tcl 1 /cdf/data10a/ucntUtils/levys/output/ levys cdf13 10 1 0
Example for ntupling a list of DFC files:

You may also provide a file containing a list of DFC files for which you would like to make UCNtuples. For example, if you created the file myFilesToUCNtuple.lis with the following MC DFC files:

> cat myFilesToUCNtuple.lis
wt024f8b.0064top0
wt024f8b.0063top0
wt024f8b.0062top0

Then you would run the script (from the same directory that contains myFilesToUCNtuple.lis) as follows:

> /cdf/data10a/ucntUtils/ntuple -list myFilesToUCNtuple.lis -mc

Please note: It is highly recommended that you first run the script using the -norun option. This will execute all the script commands, but the job will _not_ be submitted to the CAF, allowing you to check that the submitted command corresponds to what you expect.
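
A dry-run sketch of the list example above (the same command with -norun appended):

> /cdf/data10a/ucntUtils/ntuple -list myFilesToUCNtuple.lis -mc -norun

Once the echoed commands look correct, rerun without -norun to actually submit the job.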

The CAF web page contains additional information about the syntax of the
CafSubmit command that the script uses to launch jobs, and about how to
check the status of pending jobs.

The script is designed to be run from any UChicago machine. However,
if you encounter an error like:
       Traceback (innermost last):
         File "/cdf/code/cdfsoft/dist/releases/development/CafUtil/cdf_gui/Cafcom.py", line 5, in ?
           import krb5
       ImportError: /cdf/code/cdfsoft/dist/releases/development/CafUtil/cdf_gui/krb5module.so: undefined symbol: krb5_max_cksum
please try running the script from cdf13 or cdf10. See this email for more explanation.
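
A minimal sketch of that workaround (assuming you can log in to one of those nodes directly; the exact login procedure may differ):

> ssh cdf13
> /cdf/data10a/ucntUtils/ntuple -dataset ttopli -mc -norun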


Please email Jason Tsui or Stephen Levy if you have any problems.