This page provides a description of the cwb_condor command.


This command prepares and submits the analysis jobs to the computing cluster using the Condor batch system.


  • cwb_condor (without arguments)
    Prints help
  • cwb_condor benchmark
    Prints the help for the benchmark action
  • cwb_condor action [dag_file/cwb_stage] [input_dir]
    Prepares and submits the jobs

Further information

The following options can be passed to cwb_condor:

action =     create    : it creates the dag and sub files under the condor directory

             submit    : it submits the jobs to the computing cluster

             recovery  : it compares the list of jobs in the dag file with the jobs already
                         completed (from the history) and produces the dag file
                         data_label.dag.recovery.x (x = recovery version) listing the
                         uncompleted jobs.

             resume    : same as recovery, but performed only if the previous cwb_stage is
                         present in the output dir.
             status    : it shows the status of the jobs.

             remove    : (R/H/I) it removes the jobs with the given status (Running/Held/Idle).

             check     : it checks which jobs have been completed
                         (by reading the history reported in the output root files).

             list      : it lists the jobs reported in the dag file.

             benchmark : it shows the computation load and related statistics
                         (see cwb_condor benchmark).

             cleanup   : it removes the broken symbolic links in the condor log dir (to avoid
                         condor init failures); to be used when jobs are in the held status.

             sdag      : it reads the standard *.dag file produced with cwb_condor create and
                         produces a new dag file *.sdag.
                         The *.sdag file permits submitting N jobs per node in sequential mode.

             mtpe      : it generates the multitask mode for the Parameter Estimation analysis
                         (see How to do the Parameter Estimation Analysis).
dag_file/cwb_stage (optional) =

             dag_file  : path to the dag file to be submitted
                         (used as: cwb_condor submit dag_file).

             cwb_stage : the analysis stage used in the 2G analysis (e.g. SUPERCLUSTER, LIKELIHOOD).
input_dir (optional)  : it is used with the recovery action. The default input directory is the
                        output_dir directory defined in cwb_parameters.C
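As an aside, the cleanup action amounts to deleting dangling symbolic links. A minimal shell sketch of the equivalent operation, assuming GNU find; the condor/log layout and file names below are illustrative only (cwb_condor performs this internally):

```shell
#!/bin/sh
# Illustrative layout: broken links can accumulate in the condor log dir
# after node failures (directory and file names here are hypothetical).
set -e
mkdir -p condor/log
touch condor/log/target.txt
ln -sf target.txt  condor/log/good.lnk    # valid symlink (target exists)
ln -sf missing.txt condor/log/broken.lnk  # dangling symlink (target missing)

# GNU find: -xtype l matches symlinks whose target does not resolve
find condor/log -xtype l -delete
```

After running this, only the valid link remains, which is what allows condor to initialize the log directory without errors.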


  • The following command lines launch a full stage analysis:
  • cwb_condor create
  • cwb_condor submit
  • The following command lines launch a two-stage analysis:
  • cwb_condor create SUPERCLUSTER
  • cwb_condor submit
  • cwb_condor recovery LIKELIHOOD output
  • cwb_condor submit condor/XXX.dag.recovery.x
  • The following command line creates the condor/data_label.sub and condor/data_label.dag condor files:
  • cwb_condor create
  • The following command lines submit the analysis jobs to the computing cluster:
  • cwb_condor submit
  • cwb_condor submit condor_data/data_label.dag
  • The following command line recovers the analysis which did not get completed and creates the file condor/data_label.dag.recovery.1:
  • cwb_condor recovery
  • The following command line submits the dag files listing the recovered files:
  • cwb_condor submit condor/data_label.dag.recovery.1
  • The following command line creates the condor files for the SUPERCLUSTER stage of the 2G analysis:
  • cwb_condor create SUPERCLUSTER
The intermediate jobs are stored in the node’s temporary directory. Symbolic links to the remote job files are created in the output directory.
  • The following command lines complete an analysis conducted with the 2G cWB pipeline starting from an intermediate stage:
  • cwb_condor recovery LIKELIHOOD
  • cwb_condor submit condor/data_label.dag.recovery.1
  • The following command line creates the condor/file.sdag file from the condor/file.dag (it defines 10 sequential jobs per node):
  • cwb_condor sdag 10 condor/file.dag
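For illustration, the sequential execution that the *.sdag file enables is what HTCondor DAGMan expresses with PARENT/CHILD dependencies. A hedged sketch of such a structure, with hypothetical job names and sub file (this is standard DAGMan syntax, not necessarily the exact format cwb_condor writes):

```
# Two nodes, each running 3 jobs sequentially (job names are hypothetical)
JOB A1 data_label.sub
JOB A2 data_label.sub
JOB A3 data_label.sub
PARENT A1 CHILD A2
PARENT A2 CHILD A3

JOB B1 data_label.sub
JOB B2 data_label.sub
JOB B3 data_label.sub
PARENT B1 CHILD B2
PARENT B2 CHILD B3
```

The two chains are independent, so DAGMan can run them in parallel while keeping the jobs inside each chain strictly sequential.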