Generating concrete workflow
2022.02.19 23:04:34.935 GMT: [WARNING] --dax option is deprecated. The abstract workflow is passed via the last positional argument on the commandline.
2022.02.19 23:04:34.948 GMT: [DEBUG] Property Key pegasus.integrity.checking already set to nosymlink. Will not be set to - none
2022.02.19 23:04:34.948 GMT: [DEBUG] Property Key condor.periodic_remove already set to (JobStatus == 5) && ((CurrentTime - EnteredCurrentStatus) > 43200). Will not be set to - (JobStatus == 5) && ((CurrentTime - EnteredCurrentStatus) > 30)
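The `condor.periodic_remove` expression above is ordinary HTCondor ClassAd logic: `JobStatus == 5` means the job is Held, and the user-supplied setting removes jobs held longer than 43200 s (12 h) instead of the 30 s value the planner would otherwise apply. As an illustrative sketch (not HTCondor's actual evaluator), the policy amounts to:

```python
HELD = 5  # HTCondor JobStatus code for a held job

def periodic_remove(job_status, current_time, entered_current_status,
                    held_timeout=43200):
    """Sketch of (JobStatus == 5) && ((CurrentTime - EnteredCurrentStatus) > 43200):
    remove a job that has sat in the Held state longer than the timeout."""
    return job_status == HELD and (current_time - entered_current_status) > held_timeout
```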
2022.02.19 23:04:34.948 GMT: [INFO] Planner launched in the following directory /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output
2022.02.19 23:04:34.948 GMT: [INFO] Planner invoked with following arguments --conf ./pegasus-properties.conf --dir /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF --submit --forward nogrid --output-sites local --sites local,condorpool_shared --staging-site local=local,condorpool_shared=condorpool_shared --cluster label,horizontal --cleanup inplace --relative-dir work -vvv --dax 4ogcringdown.dax
2022.02.19 23:04:34.950 GMT: [CONFIG] Pegasus Properties set by the user
2022.02.19 23:04:34.950 GMT: [CONFIG] pegasus.catalog.replica.cache.asrc=true
2022.02.19 23:04:34.951 GMT: [CONFIG] pegasus.catalog.replica.dax.asrc=true
2022.02.19 23:04:34.951 GMT: [CONFIG] pegasus.catalog.workflow.amqp.url=amqp://friend:donatedata@msgs.pegasus.isi.edu:5672/prod/workflows
2022.02.19 23:04:34.951 GMT: [CONFIG] pegasus.dir.staging.mapper=Flat
2022.02.19 23:04:34.951 GMT: [CONFIG] pegasus.dir.storage.mapper=Replica
2022.02.19 23:04:34.951 GMT: [CONFIG] pegasus.dir.storage.mapper.replica=File
2022.02.19 23:04:34.951 GMT: [CONFIG] pegasus.dir.storage.mapper.replica.file=output.map
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.dir.submit.mapper=Named
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.file.cleanup.scope=deferred
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.home.bindir=/software/tools/pegasus/5.0/bin
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.home.schemadir=/software/tools/pegasus/5.0/share/pegasus/schema
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.home.sharedstatedir=/software/tools/pegasus/5.0/share/pegasus
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.home.sysconfdir=/software/tools/pegasus/5.0/etc
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.integrity.checking=nosymlink
2022.02.19 23:04:34.952 GMT: [CONFIG] pegasus.metrics.app=ligo-pycbc
2022.02.19 23:04:34.953 GMT: [CONFIG] pegasus.mode=development
2022.02.19 23:04:34.953 GMT: [CONFIG] pegasus.monitord.encoding=json
2022.02.19 23:04:34.953 GMT: [CONFIG] pegasus.register=False
2022.02.19 23:04:34.953 GMT: [CONFIG] pegasus.selector.replica=Regex
2022.02.19 23:04:34.953 GMT: [CONFIG] pegasus.selector.replica.regex.rank.1=file://(?!.*(cvmfs)).*
2022.02.19 23:04:34.953 GMT: [CONFIG] pegasus.selector.replica.regex.rank.2=file:///cvmfs/.*
2022.02.19 23:04:34.954 GMT: [CONFIG] pegasus.selector.replica.regex.rank.3=root://.*
2022.02.19 23:04:34.954 GMT: [CONFIG] pegasus.selector.replica.regex.rank.4=gsiftp://red-gridftp.unl.edu.*
2022.02.19 23:04:34.954 GMT: [CONFIG] pegasus.selector.replica.regex.rank.5=gridftp://.*
2022.02.19 23:04:34.954 GMT: [CONFIG] pegasus.selector.replica.regex.rank.6=.*
2022.02.19 23:04:34.954 GMT: [CONFIG] pegasus.transfer.bypass.input.staging=true
2022.02.19 23:04:34.954 GMT: [CONFIG] pegasus.transfer.links=true
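The `pegasus.selector.replica.regex.rank.*` properties above rank candidate replica PFNs by the first regex that matches: local non-CVMFS `file://` URLs first, then CVMFS, then `root://`, and so on down to the catch-all `.*`. A minimal sketch of that selection logic, assuming full-string matching (this is an illustration, not the planner's actual `RegexReplicaSelector` implementation):

```python
import re

# Ranked regexes as configured above; lower rank = higher priority.
RANKED = [
    (1, r"file://(?!.*(cvmfs)).*"),
    (2, r"file:///cvmfs/.*"),
    (3, r"root://.*"),
    (4, r"gsiftp://red-gridftp.unl.edu.*"),
    (5, r"gridftp://.*"),
    (6, r".*"),
]

def rank(pfn):
    """Return the rank of the first regex that fully matches the PFN."""
    for r, expr in RANKED:
        if re.fullmatch(expr, pfn):
            return r
    return None

def select_pfn(candidates):
    """Pick the candidate replica with the best (lowest) rank."""
    return min(candidates, key=rank)
```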
2022.02.19 23:04:35.265 GMT: [INFO] event.pegasus.add.data-dependencies dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.267 GMT: [INFO] event.pegasus.add.data-dependencies dax.id 4ogcringdown.dax-0 (0.001 seconds) - FINISHED
2022.02.19 23:04:35.300 GMT: [DEBUG] Parsed DAX with following metrics {"compute_tasks":0,"dax_tasks":2,"dag_tasks":0,"total_tasks":2,"deleted_tasks":0,"dax_input_files":4,"dax_inter_files":0,"dax_output_files":0,"dax_total_files":4,"compute_jobs":0,"clustered_jobs":0,"si_tx_jobs":0,"so_tx_jobs":0,"inter_tx_jobs":0,"reg_jobs":0,"cleanup_jobs":0,"create_dir_jobs":0,"dax_jobs":2,"dag_jobs":0,"chmod_jobs":0,"total_jobs":2,"mDAXLabel":"4ogcringdown.dax"}
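The metrics payload on the line above is plain JSON after the fixed log prefix, so it can be pulled out of a planner log for quick inspection. A small sketch (field names as printed above; the helper name is ours, not part of Pegasus):

```python
import json

def parse_dax_metrics(log_line):
    """Extract the JSON metrics object from a 'Parsed DAX' planner log line.
    The payload starts at the first '{' on the line."""
    payload = log_line[log_line.index("{"):]
    return json.loads(payload)

line = ('2022.02.19 23:04:35.300 GMT: [DEBUG] Parsed DAX with following metrics '
        '{"compute_tasks":0,"dax_tasks":2,"total_tasks":2,"dax_input_files":4}')
metrics = parse_dax_metrics(line)
```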
2022.02.19 23:04:35.302 GMT: [CONFIG] Loading site catalog file /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml
2022.02.19 23:04:35.302 GMT: [DEBUG] All sites will be loaded from the site catalog
2022.02.19 23:04:35.303 GMT: [DEBUG] event.pegasus.parse.site-catalog site-catalog.id /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml - STARTED
2022.02.19 23:04:35.400 GMT: [DEBUG] event.pegasus.parse.site-catalog site-catalog.id /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml (0.097 seconds) - FINISHED
2022.02.19 23:04:35.400 GMT: [DEBUG] Sites loaded are [osg, condorpool_shared, condorpool_symlink, condorpool_copy, local]
2022.02.19 23:04:35.401 GMT: [CONFIG] Set environment profile for local site PATH=/software/tools/pegasus/5.0/bin/:/work/yifan.wang/virtualenv/sgwb/bin:/work/yifan.wang/lscsoft/opt/accomlal/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
2022.02.19 23:04:35.401 GMT: [CONFIG] Set environment profile for local site PYTHONPATH=/software/tools/pegasus/5.0/lib/python3.7/dist-packages/:/work/yifan.wang/eccsearch/waveform/ihes-teobresum/Python:/work/yifan.wang/eccsearch/waveform/PyCBC-teobresums:/work/yifan.wang/1-ecc-waveform-PE/IMRPhenomDecc:/work/yifan.wang/lscsoft/src/TaylorF2e:
2022.02.19 23:04:35.402 GMT: [CONFIG] Constructed default site catalog entry for condorpool site condor
2022.02.19 23:04:35.462 GMT: [DEBUG] Mount Under Scratch Directories [/tmp, /var/tmp]
2022.02.19 23:04:35.463 GMT: [DEBUG] Style detected for site osg is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:04:35.463 GMT: [DEBUG] Style detected for site condorpool_shared is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:04:35.463 GMT: [DEBUG] Style detected for site condorpool is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:04:35.464 GMT: [DEBUG] Style detected for site condorpool_symlink is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:04:35.464 GMT: [DEBUG] Style detected for site condorpool_copy is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:04:35.464 GMT: [DEBUG] Style detected for site local is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:04:35.464 GMT: [DEBUG] Execution sites are [condorpool_shared, local]
2022.02.19 23:04:35.469 GMT: [CONFIG] Transformation Catalog Type used YAML TC
2022.02.19 23:04:35.470 GMT: [DEBUG] Ignoring error encountered while loading Transformation Catalog
[1]: Unable to instantiate Transformation Catalog at edu.isi.pegasus.planner.catalog.transformation.TransformationFactory.loadInstance(TransformationFactory.java:164)
[2]: edu.isi.pegasus.planner.catalog.transformation.impl.YAML caught java.lang.RuntimeException The File to be used as TC should be defined with the property pegasus.catalog.transformation.file at edu.isi.pegasus.planner.catalog.transformation.impl.YAML.connect(YAML.java:167)
2022.02.19 23:04:35.470 GMT: [DEBUG] Created a temporary transformation catalog backend /tmp/tc.12428162271857300445.txt
2022.02.19 23:04:35.471 GMT: [CONFIG] Transformation Catalog Type used Multiline Textual TC
2022.02.19 23:04:35.472 GMT: [CONFIG] Transformation Catalog File used /tmp/tc.12428162271857300445.txt
2022.02.19 23:04:35.474 GMT: [CONFIG] Data Configuration used for the workflow condorio
2022.02.19 23:04:35.475 GMT: [DEBUG] Directory to be created is /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/yifan.wang/pegasus/4ogcringdown.dax/run0001
2022.02.19 23:04:35.476 GMT: [CONFIG] Metrics file will be written out to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.metrics
2022.02.19 23:04:35.476 GMT: [CONFIG] The base submit directory for the workflow /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF
2022.02.19 23:04:35.476 GMT: [CONFIG] The relative submit directory for the workflow work
2022.02.19 23:04:35.477 GMT: [CONFIG] The relative execution directory for the workflow work
2022.02.19 23:04:35.478 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.481 GMT: [DEBUG] Written out stampede events for the abstract workflow to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.static.bp
2022.02.19 23:04:35.481 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 (0.003 seconds) - FINISHED
2022.02.19 23:04:35.482 GMT: [INFO] event.pegasus.refinement dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.489 GMT: [CONFIG] Proxy used for Replica Catalog is /tmp/x509up_u44039
2022.02.19 23:04:35.492 GMT: [DEBUG] [Replica Factory] Connect properties detected {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.02.19 23:04:35.493 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor edu.isi.pegasus.planner.catalog.replica.impl.YAML -> {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.02.19 23:04:35.495 GMT: [DEBUG] Problem while connecting with the Replica Catalog: Unable to connect to replica catalog implementation edu.isi.pegasus.planner.catalog.replica.impl.YAML with props {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.02.19 23:04:35.495 GMT: [DEBUG] Setting property dagman.registration.maxjobs to 1 to set max jobs for registrations jobs category
2022.02.19 23:04:35.499 GMT: [DEBUG] Copied /tmp/tc.12428162271857300445.txt to directory /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/catalogs
2022.02.19 23:04:35.501 GMT: [DEBUG] Copied /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml to directory /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/catalogs
2022.02.19 23:04:35.501 GMT: [DEBUG] Set Default output replica catalog properties to {pegasus.catalog.replica.output.db.driver=sqlite, pegasus.catalog.replica.output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db, pegasus.catalog.replica.output=JDBCRC}
2022.02.19 23:04:35.502 GMT: [INFO] event.pegasus.check.cyclic-dependencies dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.502 GMT: [INFO] event.pegasus.check.cyclic-dependencies dax.id 4ogcringdown.dax-0 (0.0 seconds) - FINISHED
2022.02.19 23:04:35.503 GMT: [DEBUG] 0 entries found in cache of total 4
2022.02.19 23:04:35.504 GMT: [DEBUG] 0 entries found in previous submit dirs of total 4
2022.02.19 23:04:35.504 GMT: [DEBUG] 0 entries found in input directories of total 4
2022.02.19 23:04:35.504 GMT: [DEBUG] 4 entries found in abstract workflow replica store of total 4
2022.02.19 23:04:35.504 GMT: [DEBUG] 0 entries found in inherited replica store of total 4
2022.02.19 23:04:35.504 GMT: [DEBUG] 0 entries found in input replica catalog of total 4
2022.02.19 23:04:35.504 GMT: [DEBUG] 4 entries found in all replica sources of total 4
2022.02.19 23:04:35.504 GMT: [CONFIG] Data Reuse Scope for the workflow: full
2022.02.19 23:04:35.505 GMT: [DEBUG] Reducing the workflow
2022.02.19 23:04:35.505 GMT: [INFO] event.pegasus.reduce dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.505 GMT: [DEBUG] Jobs whose o/p files already exist
2022.02.19 23:04:35.505 GMT: [DEBUG] Job pegasus-plan_finalization has no o/p files
2022.02.19 23:04:35.505 GMT: [DEBUG] Job pegasus-plan_main has no o/p files
2022.02.19 23:04:35.505 GMT: [DEBUG] Jobs whose o/p files already exist - DONE
2022.02.19 23:04:35.506 GMT: [DEBUG] pegasus-plan_main will not be deleted as its child pegasus-plan_finalization is not marked for deletion
2022.02.19 23:04:35.506 GMT: [INFO] Nodes/Jobs Deleted from the Workflow during reduction
2022.02.19 23:04:35.506 GMT: [INFO] Nodes/Jobs Deleted from the Workflow during reduction - DONE
2022.02.19 23:04:35.506 GMT: [INFO] event.pegasus.reduce dax.id 4ogcringdown.dax-0 (0.001 seconds) - FINISHED
2022.02.19 23:04:35.506 GMT: [INFO] event.pegasus.siteselection dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.509 GMT: [DEBUG] List of execution sites is [condorpool_shared, local]
2022.02.19 23:04:35.511 GMT: [DEBUG] Job pegasus-plan_main will be mapped based on selector|hints profile key execution.site
2022.02.19 23:04:35.512 GMT: [DEBUG] Job pegasus-plan_finalization will be mapped based on selector|hints profile key execution.site
2022.02.19 23:04:35.512 GMT: [DEBUG] Setting up site mapping for job pegasus-plan_finalization
2022.02.19 23:04:35.512 GMT: [DEBUG] Job pegasus-plan_finalization was mapped to site local
2022.02.19 23:04:35.513 GMT: [DEBUG] Setting up site mapping for job pegasus-plan_main
2022.02.19 23:04:35.514 GMT: [DEBUG] Job pegasus-plan_main was mapped to site local
2022.02.19 23:04:35.514 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.515 GMT: [DEBUG] Written out stampede metadata events for the mapped workflow to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.static.bp
2022.02.19 23:04:35.515 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 (0.001 seconds) - FINISHED
2022.02.19 23:04:35.515 GMT: [INFO] event.pegasus.siteselection dax.id 4ogcringdown.dax-0 (0.009 seconds) - FINISHED
2022.02.19 23:04:35.517 GMT: [DEBUG] User set mapper is not a stageable mapper. Loading a stageable mapper
2022.02.19 23:04:35.518 GMT: [DEBUG] Deployment of Worker Package needed
2022.02.19 23:04:35.527 GMT: [CONFIG] No Replica Registration Jobs will be created.
2022.02.19 23:04:35.531 GMT: [CONFIG] Transfer Implementation loaded for Stage-In [Python based Transfer Script]
2022.02.19 23:04:35.531 GMT: [CONFIG] Transfer Implementation loaded for symbolic linking Stage-In [Python based Transfer Script]
2022.02.19 23:04:35.531 GMT: [CONFIG] Transfer Implementation loaded for Inter Site [Python based Transfer Script]
2022.02.19 23:04:35.531 GMT: [CONFIG] Transfer Implementation loaded for Stage-Out [Python based Transfer Script]
2022.02.19 23:04:35.531 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource ALL of type STAGEABLE
2022.02.19 23:04:35.532 GMT: [DEBUG] System information for pegasus-worker-5.0.1-x86_64_deb_10.tar.gz is {arch=x86_64 os=linux osrelease=deb osversion=10}
2022.02.19 23:04:35.532 GMT: [DEBUG] Compute site sysinfo local {arch=x86_64 os=linux}
2022.02.19 23:04:35.532 GMT: [DEBUG] Worker Package Entry used for site local
Logical Namespace : pegasus
Logical Name : worker
Version : null
Resource Id : local
Physical Name : file:///local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz
SysInfo : {arch=x86_64 os=linux}
TYPE : STAGEABLE
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.532 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource local of type STAGEABLE
2022.02.19 23:04:35.533 GMT: [DEBUG] Staging site for site local for worker package deployment - local
2022.02.19 23:04:35.533 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource ALL of type STAGEABLE
2022.02.19 23:04:35.533 GMT: [DEBUG] Selected entry
Logical Namespace : pegasus
Logical Name : worker
Version : null
Resource Id : local
Physical Name : file:///local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz
SysInfo : {arch=x86_64 os=linux}
TYPE : STAGEABLE
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.533 GMT: [DEBUG] Creating a default TC entry for pegasus::transfer at site local
2022.02.19 23:04:35.533 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.02.19 23:04:35.533 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED
2022.02.19 23:04:35.534 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : transfer
Version : null
Resource Id : local
Physical Name : pegasus-transfer
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.534 GMT: [DEBUG] Creating a default TC entry for pegasus::kickstart at site local
2022.02.19 23:04:35.534 GMT: [DEBUG] Remote Path set is pegasus-kickstart
2022.02.19 23:04:35.534 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:04:35.534 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : kickstart
Version : null
Resource Id : local
Physical Name : pegasus-kickstart
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.534 GMT: [DEBUG] Creating a default TC entry for pegasus::cleanup at site local
2022.02.19 23:04:35.534 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.02.19 23:04:35.534 GMT: [DEBUG] Trying to get TCEntries for pegasus::cleanup on resource local of type INSTALLED
2022.02.19 23:04:35.534 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : cleanup
Version : null
Resource Id : local
Physical Name : pegasus-transfer
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.535 GMT: [DEBUG] Creating a default TC entry for pegasus::seqexec at site local
2022.02.19 23:04:35.535 GMT: [DEBUG] Remote Path set is pegasus-cluster
2022.02.19 23:04:35.535 GMT: [DEBUG] Trying to get TCEntries for pegasus::seqexec on resource local of type INSTALLED
2022.02.19 23:04:35.535 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : seqexec
Version : null
Resource Id : local
Physical Name : pegasus-cluster
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.535 GMT: [DEBUG] Creating a default TC entry for pegasus::dirmanager at site local
2022.02.19 23:04:35.535 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.02.19 23:04:35.535 GMT: [DEBUG] Trying to get TCEntries for pegasus::dirmanager on resource local of type INSTALLED
2022.02.19 23:04:35.535 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : dirmanager
Version : null
Resource Id : local
Physical Name : pegasus-transfer
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.535 GMT: [DEBUG] Creating a default TC entry for pegasus::keg at site local
2022.02.19 23:04:35.535 GMT: [DEBUG] Remote Path set is pegasus-keg
2022.02.19 23:04:35.535 GMT: [DEBUG] Trying to get TCEntries for pegasus::keg on resource local of type INSTALLED
2022.02.19 23:04:35.535 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : keg
Version : null
Resource Id : local
Physical Name : pegasus-keg
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:04:35.536 GMT: [INFO] event.pegasus.cluster dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.537 GMT: [DEBUG] Adding job to graph pegasus-plan_finalization
2022.02.19 23:04:35.538 GMT: [DEBUG] Adding job to graph pegasus-plan_main
2022.02.19 23:04:35.538 GMT: [DEBUG] Adding parents for child finalization
2022.02.19 23:04:35.539 GMT: [CONFIG] Partitioner loaded is Label Based Partitioning
2022.02.19 23:04:35.547 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:04:35.548 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:04:35.548 GMT: [CONFIG] Clusterer loaded is Topological based Vertical Clustering
2022.02.19 23:04:35.549 GMT: [INFO] Starting Graph Traversal
2022.02.19 23:04:35.549 GMT: [DEBUG] Adding to level 0 dummy
2022.02.19 23:04:35.549 GMT: [DEBUG] Adding to queue main
2022.02.19 23:04:35.549 GMT: [DEBUG] Removed dummy
2022.02.19 23:04:35.549 GMT: [DEBUG] Adding to level 1 main
2022.02.19 23:04:35.549 GMT: [DEBUG] Adding to queue finalization
2022.02.19 23:04:35.549 GMT: [DEBUG] Removed main
2022.02.19 23:04:35.549 GMT: [DEBUG] Adding to level 2 finalization
2022.02.19 23:04:35.550 GMT: [DEBUG] Removed finalization
2022.02.19 23:04:35.550 GMT: [INFO] Starting Graph Traversal - DONE
2022.02.19 23:04:35.550 GMT: [DEBUG] Partition is [finalization] corresponding to label finalization
2022.02.19 23:04:35.550 GMT: [DEBUG] Clustering jobs in partition ID2 [finalization]
2022.02.19 23:04:35.550 GMT: [DEBUG] No clustering for partition ID2
2022.02.19 23:04:35.550 GMT: [DEBUG] Partition is [main] corresponding to label main
2022.02.19 23:04:35.550 GMT: [DEBUG] Clustering jobs in partition ID1 [main]
2022.02.19 23:04:35.550 GMT: [DEBUG] No clustering for partition ID1
2022.02.19 23:04:35.551 GMT: [INFO] Determining relations between partitions
2022.02.19 23:04:35.551 GMT: [INFO] Determining relations between partitions - DONE
2022.02.19 23:04:35.551 GMT: [DEBUG] Adding job to graph pegasus-plan_finalization
2022.02.19 23:04:35.551 GMT: [DEBUG] Adding job to graph pegasus-plan_main
2022.02.19 23:04:35.551 GMT: [DEBUG] Adding parents for child finalization
2022.02.19 23:04:35.552 GMT: [CONFIG] Partitioner loaded is Level Based Partitioning
2022.02.19 23:04:35.553 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:04:35.553 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:04:35.553 GMT: [CONFIG] Clusterer loaded is Horizontal Clustering
2022.02.19 23:04:35.553 GMT: [DEBUG] Adding to level 0 dummy
2022.02.19 23:04:35.554 GMT: [DEBUG] Adding to queue main
2022.02.19 23:04:35.554 GMT: [DEBUG] Removed dummy
2022.02.19 23:04:35.554 GMT: [DEBUG] Adding to level 1 main
2022.02.19 23:04:35.554 GMT: [DEBUG] Adding to queue finalization
2022.02.19 23:04:35.554 GMT: [DEBUG] Removed main
2022.02.19 23:04:35.554 GMT: [DEBUG] Partition ID1 is :[main]
2022.02.19 23:04:35.554 GMT: [DEBUG] Clustering jobs in partition ID1 [main]
2022.02.19 23:04:35.554 GMT: [DEBUG] Clustering jobs of type pegasus-pegasus-plan-5_0_1
2022.02.19 23:04:35.555 GMT: [DEBUG] No clustering of jobs mapped to execution site local
2022.02.19 23:04:35.555 GMT: [DEBUG] Adding to level 2 finalization
2022.02.19 23:04:35.555 GMT: [DEBUG] Removed finalization
2022.02.19 23:04:35.555 GMT: [DEBUG] Partition ID2 is :[finalization]
2022.02.19 23:04:35.555 GMT: [DEBUG] Clustering jobs in partition ID2 [finalization]
2022.02.19 23:04:35.555 GMT: [DEBUG] Clustering jobs of type pegasus-pegasus-plan-5_0_1
2022.02.19 23:04:35.555 GMT: [DEBUG] No clustering of jobs mapped to execution site local
2022.02.19 23:04:35.555 GMT: [DEBUG] Replacing {pegasus-plan_main [] -> pegasus-plan_finalization [],false} with {pegasus-plan_main [] -> pegasus-plan_finalization [],false}. Add to set : true
2022.02.19 23:04:35.555 GMT: [DEBUG] All clustered jobs removed from the workflow
2022.02.19 23:04:35.555 GMT: [INFO] event.pegasus.cluster dax.id 4ogcringdown.dax-0 (0.019 seconds) - FINISHED
2022.02.19 23:04:35.556 GMT: [DEBUG] Initialising Replica Catalog for Planner Cache
2022.02.19 23:04:35.556 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.getcache, read.only=true, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:04:35.557 GMT: [DEBUG] Initialising Replica Catalog for Planner Cache
2022.02.19 23:04:35.557 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.putcache, read.only=true, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:04:35.557 GMT: [INFO] Grafting transfer nodes in the workflow
2022.02.19 23:04:35.557 GMT: [INFO] event.pegasus.generate.transfer-nodes dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.559 GMT: [DEBUG] Data Reuse Engine no longer tracks deleted leaf jobs. Returning empty list
2022.02.19 23:04:35.563 GMT: [CONFIG] No Replica Registration Jobs will be created.
2022.02.19 23:04:35.564 GMT: [DEBUG] Number of transfer jobs for -1 are 0
2022.02.19 23:04:35.564 GMT: [DEBUG] Number of transfer jobs for 0 are 1
2022.02.19 23:04:35.564 GMT: [DEBUG] Number of transfer jobs for 1 are 1
2022.02.19 23:04:35.565 GMT: [CONFIG] Transfer Implementation loaded for Stage-In [Python based Transfer Script]
2022.02.19 23:04:35.565 GMT: [CONFIG] Transfer Implementation loaded for symbolic linking Stage-In [Python based Transfer Script]
2022.02.19 23:04:35.565 GMT: [CONFIG] Transfer Implementation loaded for Inter Site [Python based Transfer Script]
2022.02.19 23:04:35.565 GMT: [CONFIG] Transfer Implementation loaded for Stage-Out [Python based Transfer Script]
2022.02.19 23:04:35.567 GMT: [DEBUG] Rank ( rank => 1 priority => 500 expr => file://(?!.*(cvmfs)).*)
2022.02.19 23:04:35.567 GMT: [DEBUG] Rank ( rank => 2 priority => 400 expr => file:///cvmfs/.*)
2022.02.19 23:04:35.567 GMT: [DEBUG] Rank ( rank => 3 priority => 300 expr => root://.*)
2022.02.19 23:04:35.567 GMT: [DEBUG] Rank ( rank => 4 priority => 200 expr => gsiftp://red-gridftp.unl.edu.*)
2022.02.19 23:04:35.567 GMT: [DEBUG] Rank ( rank => 5 priority => 100 expr => gridftp://.*)
2022.02.19 23:04:35.567 GMT: [DEBUG] Rank ( rank => 6 priority => 0 expr => .*)
2022.02.19 23:04:35.567 GMT: [CONFIG] [RegexReplicaSelector] User Provided Ranked regexes are [( rank => 1 priority => 500 expr => file://(?!.*(cvmfs)).*), ( rank => 2 priority => 400 expr => file:///cvmfs/.*), ( rank => 3 priority => 300 expr => root://.*), ( rank => 4 priority => 200 expr => gsiftp://red-gridftp.unl.edu.*), ( rank => 5 priority => 100 expr => gridftp://.*), ( rank => 6 priority => 0 expr => .*)]
2022.02.19 23:04:35.570 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor File -> {file=output.map, read.only=true}
2022.02.19 23:04:35.570 GMT: [CONFIG] Output Mapper loaded is [Replica Catalog Mapper]
2022.02.19 23:04:35.571 GMT: [DEBUG] Initialising Workflow Cache File in the Submit Directory
2022.02.19 23:04:35.571 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:04:35.572 GMT: [CONFIG] Transfer Refiner loaded is [Balanced Cluster Transfer Refiner( round robin distribution at file level)]
2022.02.19 23:04:35.572 GMT: [CONFIG] ReplicaSelector loaded is [Regex]
2022.02.19 23:04:35.572 GMT: [CONFIG] Submit Directory Mapper loaded is [Relative Submit Directory Mapper]
2022.02.19 23:04:35.572 GMT: [CONFIG] Staging Mapper loaded is [Flat Directory Staging Mapper]
2022.02.19 23:04:35.573 GMT: [DEBUG] SRM Server map is {}
2022.02.19 23:04:35.574 GMT: [DEBUG] SRM Server map is {}
2022.02.19 23:04:35.574 GMT: [DEBUG] Directory for job pegasus-plan_main is .
2022.02.19 23:04:35.574 GMT: [DEBUG]
2022.02.19 23:04:35.574 GMT: [DEBUG] Job being traversed is pegasus-plan_main
2022.02.19 23:04:35.574 GMT: [DEBUG] To be run at local
2022.02.19 23:04:35.574 GMT: [DEBUG] Parents of job:{}
2022.02.19 23:04:35.574 GMT: [DEBUG] Initialising Workflow Cache File for job pegasus-plan_main
2022.02.19 23:04:35.574 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./pegasus-plan_main.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:04:35.575 GMT: [DEBUG] [RegexReplicaSelector] Selecting a pfn for lfn main.dax at site local amongst main.dax regex false -> {(/work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax,{site=local}),}
2022.02.19 23:04:35.575 GMT: [DEBUG] Job Input files : Removed file main.dax for job pegasus-plan_main
2022.02.19 23:04:35.575 GMT: [DEBUG] Job Search Files : Removed file main.dax for job pegasus-plan_main
2022.02.19 23:04:35.575 GMT: [DEBUG] Set arguments for DAX job pegasus-plan_main to -Dpegasus.dir.storage.mapper.replica.file=main.map --basename main --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:04:35.577 GMT: [DEBUG] Directory for job pegasus-plan_finalization is .
2022.02.19 23:04:35.577 GMT: [DEBUG]
2022.02.19 23:04:35.577 GMT: [DEBUG] Job being traversed is pegasus-plan_finalization
2022.02.19 23:04:35.577 GMT: [DEBUG] To be run at local
2022.02.19 23:04:35.577 GMT: [DEBUG] Parents of job:{pegasus-plan_main,}
2022.02.19 23:04:35.577 GMT: [DEBUG] Initialising Workflow Cache File for job pegasus-plan_finalization
2022.02.19 23:04:35.578 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./pegasus-plan_finalization.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:04:35.578 GMT: [DEBUG] [RegexReplicaSelector] Selecting a pfn for lfn finalization.dax at site local amongst finalization.dax regex false -> {(/work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax,{site=local}),}
2022.02.19 23:04:35.578 GMT: [DEBUG] Job Input files : Removed file finalization.dax for job pegasus-plan_finalization
2022.02.19 23:04:35.578 GMT: [DEBUG] Job Search Files : Removed file finalization.dax for job pegasus-plan_finalization
2022.02.19 23:04:35.578 GMT: [DEBUG] Set arguments for DAX job pegasus-plan_finalization to -Dpegasus.dir.storage.mapper.replica.file=finalization.map --basename finalization --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:04:35.579 GMT: [INFO] event.pegasus.generate.transfer-nodes dax.id 4ogcringdown.dax-0 (0.022 seconds) - FINISHED
2022.02.19 23:04:35.580 GMT: [DEBUG] Adding worker package deployment node for local
2022.02.19 23:04:35.581 GMT: [DEBUG] Skipping stage worker job for site local as worker package already in submit directory file:///local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz
2022.02.19 23:04:35.581 GMT: [INFO] event.pegasus.generate.workdir-nodes dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.584 GMT: [DEBUG] Trying to get TCEntries for pegasus::dirmanager on resource local of type INSTALLED
2022.02.19 23:04:35.585 GMT: [DEBUG] Creating create dir node create_dir_4ogcringdown.dax_0_local
2022.02.19 23:04:35.585 GMT: [DEBUG] Need to add edge create_dir_4ogcringdown.dax_0_local -> pegasus-plan_main
2022.02.19 23:04:35.585 GMT: [DEBUG] Adding node to the workflow create_dir_4ogcringdown.dax_0_local
2022.02.19 23:04:35.585 GMT: [INFO] event.pegasus.generate.workdir-nodes dax.id 4ogcringdown.dax-0 (0.004 seconds) - FINISHED
2022.02.19 23:04:35.585 GMT: [INFO] event.pegasus.generate.cleanup-nodes dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.589 GMT: [CONFIG] Setting property dagman.cleanup.maxjobs to 4 to set max jobs for cleanup jobs category
2022.02.19 23:04:35.589 GMT: [DEBUG] Number of sites 1
2022.02.19 23:04:35.589 GMT: [DEBUG] Site local count jobs = 3
2022.02.19 23:04:35.589 GMT: [DEBUG] * pegasus-plan_finalization
2022.02.19 23:04:35.589 GMT: [DEBUG] * create_dir_4ogcringdown.dax_0_local
2022.02.19 23:04:35.589 GMT: [DEBUG] * pegasus-plan_main
2022.02.19 23:04:35.589 GMT: [DEBUG] local 3
2022.02.19 23:04:35.590 GMT: [DEBUG] Leaf jobs scheduled at site local are pegasus-plan_finalization,create_dir_4ogcringdown.dax_0_local,pegasus-plan_main,
2022.02.19 23:04:35.590 GMT: [DEBUG] File finalization.map will not be cleaned up for job pegasus-plan_finalization
2022.02.19 23:04:35.590 GMT: [DEBUG] File main.map will not be cleaned up for job pegasus-plan_main
2022.02.19 23:04:35.590 GMT: [DEBUG]
2022.02.19 23:04:35.590 GMT: [INFO] For site: local number of files cleaned up - 0
2022.02.19 23:04:35.590 GMT: [DEBUG] CLEANUP LIST
2022.02.19 23:04:35.590 GMT: [INFO] event.pegasus.generate.cleanup-nodes dax.id 4ogcringdown.dax-0 (0.005 seconds) - FINISHED
2022.02.19 23:04:35.590 GMT: [INFO] Adding Leaf Cleanup Jobs dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.592 GMT: [DEBUG] Directory URL is a file url for site local [file:///work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/local-site-scratch/work]
2022.02.19 23:04:35.592 GMT: [DEBUG] Trying to get TCEntries for pegasus::cleanup on resource local of type INSTALLED
2022.02.19 23:04:35.592 GMT: [DEBUG] Creating remove directory node cleanup_4ogcringdown.dax_0_local
2022.02.19 23:04:35.592 GMT: [DEBUG] Need to add edge for DAX|DAG job pegasus-plan_finalization -> cleanup_4ogcringdown.dax_0_local
2022.02.19 23:04:35.592 GMT: [DEBUG] Need to add edge pegasus-plan_finalization -> cleanup_4ogcringdown.dax_0_local
2022.02.19 23:04:35.592 GMT: [DEBUG] Need to add edge for DAX|DAG job pegasus-plan_main -> cleanup_4ogcringdown.dax_0_local
2022.02.19 23:04:35.592 GMT: [DEBUG] Adding node to the workflow cleanup_4ogcringdown.dax_0_local
2022.02.19 23:04:35.592 GMT: [INFO] Adding Leaf Cleanup Jobs dax.id 4ogcringdown.dax-0 (0.002 seconds) - FINISHED
2022.02.19 23:04:35.594 GMT: [INFO] event.pegasus.refinement dax.id 4ogcringdown.dax-0 (0.112 seconds) - FINISHED
2022.02.19 23:04:35.624 GMT: [DEBUG] Condor Version as string 8.8.9
2022.02.19 23:04:35.624 GMT: [DEBUG] Condor Version detected is 80809
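The planner logs the Condor version both as a string and as a single integer ("8.8.9" detected as 80809). That looks like a `major*10000 + minor*100 + patch` encoding; the sketch below reproduces it under that assumption (inferred from the log output, not from the planner's actual source).

```python
def condor_version_to_int(version: str) -> int:
    """Encode a dotted Condor version as one comparable integer.

    Assumption (inferred from the log): "8.8.9" -> 80809, i.e.
    major*10000 + minor*100 + patch.
    """
    major, minor, patch = (int(part) for part in version.split("."))
    return major * 10000 + minor * 100 + patch

print(condor_version_to_int("8.8.9"))  # 80809
```

Encoding versions this way makes "is the installed Condor at least X" a plain integer comparison.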
2022.02.19 23:04:35.624 GMT: [INFO] Generating codes for the executable workflow
2022.02.19 23:04:35.624 GMT: [INFO] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.625 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:04:35.626 GMT: [DEBUG] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:04:35.653 GMT: [DEBUG] Condor Version as string 8.8.9
2022.02.19 23:04:35.653 GMT: [DEBUG] Condor Version detected is 80809
2022.02.19 23:04:35.655 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:04:35.657 GMT: [DEBUG] Applying priority of 800 to create_dir_4ogcringdown.dax_0_local
2022.02.19 23:04:35.690 GMT: [DEBUG] Mount Under Scratch Directories [/tmp, /var/tmp]
2022.02.19 23:04:35.691 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:04:35.693 GMT: [DEBUG] Postscript constructed is /software/tools/pegasus/5.0/bin/pegasus-exitcode
2022.02.19 23:04:35.695 GMT: [DEBUG] Unquoted arguments are pegasus-kickstart -n pegasus::dirmanager -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer
2022.02.19 23:04:35.696 GMT: [DEBUG] Quoted arguments are "pegasus-kickstart -n pegasus::dirmanager -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer "
2022.02.19 23:04:35.697 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./create_dir_4ogcringdown.dax_0_local.sub
2022.02.19 23:04:35.697 GMT: [DEBUG] Applying priority of 10 to pegasus-plan_main
2022.02.19 23:04:35.698 GMT: [DEBUG] Generating code for DAX job pegasus-plan_main
2022.02.19 23:04:35.698 GMT: [DEBUG] Arguments passed to SUBDAX Generator are -Dpegasus.dir.storage.mapper.replica.file=main.map --basename main --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:04:35.701 GMT: [DEBUG] Retrieving Metadata from the DAX file /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:04:35.717 GMT: [DEBUG] Submit directory in sub dax specified is ./main.dax_main
2022.02.19 23:04:35.718 GMT: [DEBUG] Base Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF
2022.02.19 23:04:35.718 GMT: [DEBUG] Relative Submit Directory for inner workflow set to work/././main.dax_main
2022.02.19 23:04:35.718 GMT: [DEBUG] Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/././main.dax_main
2022.02.19 23:04:35.718 GMT: [DEBUG] Setting list of execution sites to the same as outer workflow
2022.02.19 23:04:35.718 GMT: [DEBUG] Submit Directory for SUB DAX is /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/././main.dax_main
2022.02.19 23:04:35.718 GMT: [DEBUG] Relative Execution Directory for SUB DAX is work/./main.dax_main
2022.02.19 23:04:35.720 GMT: [DEBUG] Trying to get TCEntries for pegasus::pegasus-plan on resource local of type INSTALLED
2022.02.19 23:04:35.720 GMT: [DEBUG] Constructing the default path to the pegasus-plan
2022.02.19 23:04:35.721 GMT: [DEBUG] pegasus-plan invocation for job pegasus-plan_main determined to be
/software/tools/pegasus/5.0/bin/pegasus-plan -Dpegasus.log.*=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus-plan_main.pre.log -Dpegasus.workflow.root.uuid=b8a9fa01-ae1f-42d3-8580-2d867c4cf570 -Dpegasus.dir.storage.mapper.replica.file=main.map --conf /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus.9477561762409151663.properties --dir /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF --relative-dir work/./main.dax_main --relative-submit-dir work/././main.dax_main --basename main --sites condorpool_shared,local --staging-site condorpool_shared=condorpool_shared,local=local, --cache /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./pegasus-plan_main.cache --inherited-rc-files /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replica.store --cluster label,horizontal --output-sites local --cleanup inplace --verbose --verbose --verbose --deferred /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:04:35.721 GMT: [DEBUG] Basename prefix for the sub workflow is main
2022.02.19 23:04:35.721 GMT: [DEBUG] Cache File for the sub workflow is /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/././main.dax_main/main.cache
2022.02.19 23:04:35.721 GMT: [DEBUG] Trying to get TCEntries for condor::dagman on resource local of type INSTALLED
2022.02.19 23:04:35.721 GMT: [DEBUG] condor::dagman not catalogued in the Transformation Catalog. Trying to construct from the Site Catalog
2022.02.19 23:04:35.721 GMT: [DEBUG] DAGMan not catalogued in the Transformation Catalog or the Site Catalog. Trying to construct from the environment
2022.02.19 23:04:35.722 GMT: [DEBUG] Constructing path to dagman on basis of env variable CONDOR_LOCATION
2022.02.19 23:04:35.722 GMT: [DEBUG] Number of Rescue retries 999
2022.02.19 23:04:35.722 GMT: [DEBUG] Constructing arguments to dagman in 7.1.0 and later style
2022.02.19 23:04:35.731 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:04:35.737 GMT: [DEBUG] Setting job pegasus-plan_main.pre to run via No container wrapping
2022.02.19 23:04:35.737 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:04:35.738 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED
2022.02.19 23:04:35.741 GMT: [DEBUG] Unquoted arguments are -p 0 -f -l . -Notification never -Debug 3 -Lockfile main.dag.lock -Dag main.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1
2022.02.19 23:04:35.741 GMT: [DEBUG] Quoted arguments are " -p 0 -f -l . -Notification never -Debug 3 -Lockfile main.dag.lock -Dag main.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1"
2022.02.19 23:04:35.741 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./pegasus-plan_main.sub
2022.02.19 23:04:35.742 GMT: [DEBUG] Applying priority of 20 to pegasus-plan_finalization
2022.02.19 23:04:35.742 GMT: [DEBUG] Generating code for DAX job pegasus-plan_finalization
2022.02.19 23:04:35.742 GMT: [DEBUG] Arguments passed to SUBDAX Generator are -Dpegasus.dir.storage.mapper.replica.file=finalization.map --basename finalization --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:04:35.743 GMT: [DEBUG] Retrieving Metadata from the DAX file /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:04:35.749 GMT: [DEBUG] Submit directory in sub dax specified is ./finalization.dax_finalization
2022.02.19 23:04:35.749 GMT: [DEBUG] Base Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF
2022.02.19 23:04:35.749 GMT: [DEBUG] Relative Submit Directory for inner workflow set to work/././finalization.dax_finalization
2022.02.19 23:04:35.749 GMT: [DEBUG] Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/././finalization.dax_finalization
2022.02.19 23:04:35.749 GMT: [DEBUG] Setting list of execution sites to the same as outer workflow
2022.02.19 23:04:35.749 GMT: [DEBUG] Parent DAX Jobs Transient RC's are [null]
2022.02.19 23:04:35.749 GMT: [DEBUG] Submit Directory for SUB DAX is /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/././finalization.dax_finalization
2022.02.19 23:04:35.749 GMT: [DEBUG] Relative Execution Directory for SUB DAX is work/./finalization.dax_finalization
2022.02.19 23:04:35.749 GMT: [DEBUG] Trying to get TCEntries for pegasus::pegasus-plan on resource local of type INSTALLED
2022.02.19 23:04:35.749 GMT: [DEBUG] Constructing the default path to the pegasus-plan
2022.02.19 23:04:35.749 GMT: [DEBUG] pegasus-plan invocation for job pegasus-plan_finalization determined to be
/software/tools/pegasus/5.0/bin/pegasus-plan -Dpegasus.log.*=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus-plan_finalization.pre.log -Dpegasus.workflow.root.uuid=b8a9fa01-ae1f-42d3-8580-2d867c4cf570 -Dpegasus.dir.storage.mapper.replica.file=finalization.map --conf /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus.9477561762409151663.properties --dir /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF --relative-dir work/./finalization.dax_finalization --relative-submit-dir work/././finalization.dax_finalization --basename finalization --sites condorpool_shared,local --staging-site condorpool_shared=condorpool_shared,local=local, --cache null,/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./pegasus-plan_finalization.cache --inherited-rc-files /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replica.store --cluster label,horizontal --output-sites local --cleanup inplace --verbose --verbose --verbose --deferred /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:04:35.749 GMT: [DEBUG] Basename prefix for the sub workflow is finalization
2022.02.19 23:04:35.749 GMT: [DEBUG] Cache File for the sub workflow is /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/././finalization.dax_finalization/finalization.cache
2022.02.19 23:04:35.749 GMT: [DEBUG] Trying to get TCEntries for condor::dagman on resource local of type INSTALLED
2022.02.19 23:04:35.749 GMT: [DEBUG] condor::dagman not catalogued in the Transformation Catalog. Trying to construct from the Site Catalog
2022.02.19 23:04:35.750 GMT: [DEBUG] DAGMan not catalogued in the Transformation Catalog or the Site Catalog. Trying to construct from the environment
2022.02.19 23:04:35.750 GMT: [DEBUG] Constructing path to dagman on basis of env variable CONDOR_LOCATION
2022.02.19 23:04:35.750 GMT: [DEBUG] Number of Rescue retries 999
2022.02.19 23:04:35.750 GMT: [DEBUG] Constructing arguments to dagman in 7.1.0 and later style
2022.02.19 23:04:35.751 GMT: [DEBUG] Setting job pegasus-plan_finalization.pre to run via No container wrapping
2022.02.19 23:04:35.751 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:04:35.751 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED
2022.02.19 23:04:35.752 GMT: [DEBUG] Unquoted arguments are -p 0 -f -l . -Notification never -Debug 3 -Lockfile finalization.dag.lock -Dag finalization.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1
2022.02.19 23:04:35.752 GMT: [DEBUG] Quoted arguments are " -p 0 -f -l . -Notification never -Debug 3 -Lockfile finalization.dag.lock -Dag finalization.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1"
2022.02.19 23:04:35.752 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./pegasus-plan_finalization.sub
2022.02.19 23:04:35.752 GMT: [DEBUG] Applying priority of 1000 to cleanup_4ogcringdown.dax_0_local
2022.02.19 23:04:35.752 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:04:35.752 GMT: [DEBUG] Postscript constructed is /software/tools/pegasus/5.0/bin/pegasus-exitcode
2022.02.19 23:04:35.753 GMT: [DEBUG] Unquoted arguments are pegasus-kickstart -n pegasus::cleanup -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer
2022.02.19 23:04:35.753 GMT: [DEBUG] Quoted arguments are "pegasus-kickstart -n pegasus::cleanup -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer "
2022.02.19 23:04:35.753 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/./cleanup_4ogcringdown.dax_0_local.sub
2022.02.19 23:04:35.753 GMT: [DEBUG] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 (0.127 seconds) - FINISHED
2022.02.19 23:04:35.753 GMT: [DEBUG] Written Dag File : 4ogcringdown.dax-0.dag.tmp
2022.02.19 23:04:35.753 GMT: [DEBUG] Writing out the DOT file
2022.02.19 23:04:35.764 GMT: [DEBUG] Written out notifications to 4ogcringdown.dax-0.notify
2022.02.19 23:04:35.765 GMT: [DEBUG] Writing out the DAX Replica Store to file /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replica.store
2022.02.19 23:04:35.765 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replica.store, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:04:35.766 GMT: [DEBUG] Written out dax replica store to 4ogcringdown.dax-0.replica.store
2022.02.19 23:04:35.769 GMT: [DEBUG] Written out stampede events for the executable workflow to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.static.bp
2022.02.19 23:04:35.775 GMT: [DEBUG] Proxy whose DN will be logged in the braindump file /tmp/x509up_u44039
2022.02.19 23:04:35.823 GMT: [DEBUG] Unable to determine GRID DN class org.globus.gsi.gssapi.GlobusGSSException: Defective credential detected [Caused by: proxy not found]
2022.02.19 23:04:35.851 GMT: [DEBUG] Written out braindump to /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/braindump.yml
2022.02.19 23:04:35.851 GMT: [DEBUG] Renamed temporary dag file to : /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.dag
2022.02.19 23:04:35.853 GMT: [DEBUG] Executing /usr/bin/condor_submit_dag -append executable=/software/tools/pegasus/5.0/bin/pegasus-dagman -no_submit -MaxPre 1 -MaxPost 20 -append +pegasus_wf_uuid="b8a9fa01-ae1f-42d3-8580-2d867c4cf570" -append +pegasus_root_wf_uuid="b8a9fa01-ae1f-42d3-8580-2d867c4cf570" -append +pegasus_wf_name="4ogcringdown.dax-0" -append +pegasus_wf_time="20220219T230434+0000" -append +pegasus_version="5.0.1" -append +pegasus_job_class=11 -append +pegasus_cluster_size=1 -append +pegasus_site="local" -append +pegasus_execution_sites="condorpool_shared,local" -append +pegasus_wf_xformation="pegasus::dagman" 4ogcringdown.dax-0.dag with environment = PATH=/software/tools/pegasus/5.0/bin/:/work/yifan.wang/virtualenv/sgwb/bin:/work/yifan.wang/lscsoft/opt/accomlal/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games;PKG_CONFIG_PATH=/work/yifan.wang/lscsoft/opt/accomlal/lib/pkgconfig:;LAL_DATA_PATH=/atlas/recent/cbc/ROM_data/;TZ=:/etc/localtime;MODULEPATH=/etc/environment-modules/modules:/usr/share/modules/versions:/usr/share/modules/$MODULE_VERSION/modulefiles:/usr/share/modules/modulefiles;PEGASUS_PERL_DIR=/software/tools/pegasus/5.0/lib/pegasus/perl;DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/44039/bus;MAIL=/var/mail/yifan.wang;LD_LIBRARY_PATH=/work/yifan.wang/eccsearch/C:/work/yifan.wang/1-ecc-waveform-PE/IMRPhenomDecc:/work/yifan.wang/lscsoft/opt/accomlal/lib/:/work/yifan.wang/lscsoft/MultiNest/lib:;LOGNAME=yifan.wang;PWD=/work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output;PYTHONPATH=/software/tools/pegasus/5.0/lib/python3.7/dist-packages/:/work/yifan.wang/eccsearch/waveform/ihes-teobresum/Python:/work/yifan.wang/eccsearch/waveform/PyCBC-teobresums:/work/yifan.wang/1-ecc-waveform-PE/IMRPhenomDecc:/work/yifan.wang/lscsoft/src/TaylorF2e:;SHELL=/bin/bash;BASH_ENV=/usr/share/modules/init/bash;LM_LICENSE_FILE=/opt/matlab/default/etc/license.dat;OLDPWD=/work/yifan.wang/ringdown/GW200224/220_330;TMPDIR=/local/user/yifan.wang;VIRTUAL_ENV
=/work/yifan.wang/virtualenv/sgwb;MODULEPATH_modshare=/etc/environment-modules/modules:1:/usr/share/modules/$MODULE_VERSION/modulefiles:1:/usr/share/modules/modulefiles:1:/usr/share/modules/versions:1;LC_ALL=C;PEGASUS_PYTHON_DIR=/software/tools/pegasus/5.0/lib/python3.7/dist-packages;LC_CTYPE=en_US.UTF-8;SHLVL=3;SLACK_BOT_TOKEN=xoxp-1889174914644-1876230893430-1893291262468-e9d8766407c79a32862f209441631a1f;CONDOR_LOCATION=/usr;LOADEDMODULES=;SCRATCH=/local/user/yifan.wang;PEGASUS_SCHEMA_DIR=/software/tools/pegasus/5.0/share/pegasus/schema;JAVA_HOME=/usr/lib/jvm/default-java;TERM=xterm-256color;ENV=/usr/share/modules/init/profile.sh;LANG=en_US.UTF-8;XDG_SESSION_ID=149717;XDG_SESSION_TYPE=tty;PEGASUS_PYTHON_EXTERNALS_DIR=/software/tools/pegasus/5.0/lib/pegasus/externals/python;XDG_SESSION_CLASS=user;_=/software/tools/pegasus/5.0/bin/pegasus-plan;PEGASUS_JAVA_DIR=/software/tools/pegasus/5.0/share/pegasus/java;SSH_TTY=/dev/pts/5;SSH_CLIENT=130.75.117.49 62564 22;USER=yifan.wang;CLASSPATH=/software/tools/pegasus/5.0/share/pegasus/java/accessors.jar:/software/tools/pegasus/5.0/share/pegasus/java/bcprov-jdk15on-150.jar:/software/tools/pegasus/5.0/share/pegasus/java/btf-1.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-lang3-3.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-logging.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-pool.jar:/software/tools/pegasus/5.0/share/pegasus/java/exist-optional.jar:/software/tools/pegasus/5.0/share/pegasus/java/exist.jar:/software/tools/pegasus/5.0/share/pegasus/java/gram-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/gridftp-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/gson-2.2.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/gss-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/guava-16.0.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/hamcrest-core-1.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/io-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/jav
a/jackson-annotations-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-core-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-coreutils-1.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-databind-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-dataformat-yaml-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jakarta-oro.jar:/software/tools/pegasus/5.0/share/pegasus/java/java-getopt-1.0.9.jar:/software/tools/pegasus/5.0/share/pegasus/java/javax.json-1.0.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/jcodings-1.0.46.jar:/software/tools/pegasus/5.0/share/pegasus/java/joda-time-2.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/joni-2.1.31.jar:/software/tools/pegasus/5.0/share/pegasus/java/json-schema-validator-1.0.41.jar:/software/tools/pegasus/5.0/share/pegasus/java/jsse-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/libphonenumber-6.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/log4j-1.2.17.jar:/software/tools/pegasus/5.0/share/pegasus/java/mailapi-1.4.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/msg-simple-1.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/myproxy-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/mysql-connector-java-5.1.47.jar:/software/tools/pegasus/5.0/share/pegasus/java/pegasus-aws-batch.jar:/software/tools/pegasus/5.0/share/pegasus/java/pegasus.jar:/software/tools/pegasus/5.0/share/pegasus/java/postgresql-8.1dev-400.jdbc3.jar:/software/tools/pegasus/5.0/share/pegasus/java/resolver.jar:/software/tools/pegasus/5.0/share/pegasus/java/rhino-1.7R4.jar:/software/tools/pegasus/5.0/share/pegasus/java/snakeyaml-1.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/sqlite-jdbc-3.8.11.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/ssl-proxies-2.1.0-patched.jar:/software/tools/pegasus/5.0/share/pegasus/java/super-csv-2.4.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/uri-template-0.9.jar:/software/tools/pegasus/5.0/share/pegasu
s/java/vdl.jar:/software/tools/pegasus/5.0/share/pegasus/java/xercesImpl.jar:/software/tools/pegasus/5.0/share/pegasus/java/xml-apis-1.4.01.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmlParserAPIs.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmldb.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmlrpc.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/apache-client-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/batch-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/commons-logging-api-1.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/core-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/http-client-spi-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/httpclient-4.5.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/httpcore-4.4.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-annotations-2.8.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-databind-2.8.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-jr-objects-2.9.0.pr4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/joda-time-2.8.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jopt-simple-5.0.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/logs-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/metrics-spi-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/netty-nio-client-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/s3-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/slf4j-api-1.7.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/slf4j-log4j12-1.7.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/thirdparty-logging.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/thirdparty.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/utils-2.0.0-preview-2.jar;C_INCLUDE_PATH=/work/yifan.wang/lscsoft/opt/accomlal/incl
ude:;PEGASUS_BIN_DIR=/software/tools/pegasus/5.0/bin;SSH_CONNECTION=130.75.117.49 62564 130.75.116.17 22;MODULESHOME=/usr/share/modules;PEGASUS_CONF_DIR=/software/tools/pegasus/5.0/etc;TMP=/local/user/yifan.wang;PYCBC_WAVEFORM=taylorf2e;PEGASUS_ORIG_CLASSPATH=;MODULES_CMD=/usr/lib/x86_64-linux-gnu/modulecmd.tcl;LIGO_DATAFIND_SERVER=ldr.atlas.local:80;GW_DATAFIND_SERVER=;XDG_RUNTIME_DIR=/run/user/44039;PEGASUS_SHARE_DIR=/software/tools/pegasus/5.0/share/pegasus;HOME=/work/yifan.wang;
2022.02.19 23:04:35.881 GMT:
2022.02.19 23:04:35.886 GMT: -----------------------------------------------------------------------
2022.02.19 23:04:35.891 GMT: File for submitting this DAG to HTCondor : 4ogcringdown.dax-0.dag.condor.sub
2022.02.19 23:04:35.897 GMT: Log of DAGMan debugging messages : 4ogcringdown.dax-0.dag.dagman.out
2022.02.19 23:04:35.902 GMT: Log of HTCondor library output : 4ogcringdown.dax-0.dag.lib.out
2022.02.19 23:04:35.907 GMT: Log of HTCondor library error messages : 4ogcringdown.dax-0.dag.lib.err
2022.02.19 23:04:35.912 GMT: Log of the life of condor_dagman itself : 4ogcringdown.dax-0.dag.dagman.log
2022.02.19 23:04:35.917 GMT:
2022.02.19 23:04:35.923 GMT: -no_submit given, not submitting DAG to HTCondor. You can do this with:
2022.02.19 23:04:35.933 GMT: -----------------------------------------------------------------------
2022.02.19 23:04:35.938 GMT: [DEBUG] condor_submit_dag exited with status 0
2022.02.19 23:04:35.947 GMT: [DEBUG] Updated environment for dagman is environment = _CONDOR_SCHEDD_ADDRESS_FILE=/local/condor/spool/.schedd_address;_CONDOR_MAX_DAGMAN_LOG=0;_CONDOR_SCHEDD_DAEMON_AD_FILE=/local/condor/spool/.schedd_classad;_CONDOR_DAGMAN_LOG=4ogcringdown.dax-0.dag.dagman.out;PEGASUS_METRICS=true;
2022.02.19 23:04:35.947 GMT: [INFO] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 (0.323 seconds) - FINISHED
2022.02.19 23:04:35.949 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-db-admin update -t master -c /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus.9477561762409151663.properties
2022.02.19 23:04:36.951 GMT: Database version: '5.0.1' (sqlite:////work/yifan.wang/.pegasus/workflow.db)
2022.02.19 23:04:37.006 GMT: [DEBUG] pegasus-db-admin exited with status 0
2022.02.19 23:04:37.010 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-db-admin create -Dpegasus.catalog.replica.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db -Dpegasus.catalog.replica=JDBCRC -Dpegasus.catalog.replica.db.driver=sqlite -t jdbcrc -c /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/pegasus.9477561762409151663.properties
2022.02.19 23:04:38.930 GMT: Pegasus database was successfully created.
2022.02.19 23:04:38.935 GMT: Database version: '5.0.1' (sqlite:////local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db)
2022.02.19 23:04:38.988 GMT: [DEBUG] pegasus-db-admin exited with status 0
2022.02.19 23:04:38.988 GMT: Output replica catalog set to jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work/4ogcringdown.dax-0.replicas.db
2022.02.19 23:04:38.988 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-run --nogrid /local/user/yifan.wang/pycbc-tmp.GEJUmm9gCF/work
2022.02.19 23:04:39.300 GMT: Submitting to condor 4ogcringdown.dax-0.dag.condor.sub
2022.02.19 23:05:04.508 GMT: Error: Running ('/usr/bin/condor_submit', '4ogcringdown.dax-0.dag.condor.sub') failed with 1
2022.02.19 23:05:04.526 GMT: [DEBUG] Submission of workflow exited with status 1
2022.02.19 23:05:04.527 GMT: [FATAL ERROR] java.lang.RuntimeException: Unable to submit the workflow using pegasus-run
at edu.isi.pegasus.planner.client.CPlanner.executeCommand(CPlanner.java:667)
at edu.isi.pegasus.planner.client.CPlanner.executeCommand(CPlanner.java:322)
at edu.isi.pegasus.planner.client.CPlanner.main(CPlanner.java:209)
2022.02.19 23:05:04.536 GMT: [DEBUG] Sending Planner Metrics to [1 of 1] http://metrics.pegasus.isi.edu/metrics
2022.02.19 23:05:04.913 GMT: [DEBUG] Metrics successfully sent to the server
2022.02.19 23:05:04.913 GMT: [DEBUG] Exiting with non-zero exit-code 1
2022.02.19 23:05:04.913 GMT: [INFO] event.pegasus.planner planner.version 5.0.1 (29.986 seconds) - FINISHED
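This first run ends with `condor_submit` exiting non-zero and the planner aborting with a `[FATAL ERROR]`. When scanning a long planner log for the lines that actually report the failure, a small heuristic filter over the `[FATAL ERROR]` and "failed with N" patterns (formats exactly as they appear in this transcript) can help; this is a sketch, not a Pegasus tool:

```python
import re

# Heuristic: match the two failure shapes seen in this log.
FAILURE_PATTERN = re.compile(r"\[FATAL ERROR\]|failed with \d+")

def find_failures(log_lines):
    """Return the log lines that report a failure."""
    return [line.strip() for line in log_lines if FAILURE_PATTERN.search(line)]

# Two lines copied verbatim from the log above.
log = [
    "2022.02.19 23:05:04.508 GMT:  Error: Running ('/usr/bin/condor_submit', "
    "'4ogcringdown.dax-0.dag.condor.sub') failed with 1",
    "2022.02.19 23:05:04.527 GMT: [FATAL ERROR] java.lang.RuntimeException: "
    "Unable to submit the workflow using pegasus-run",
]
for hit in find_failures(log):
    print(hit)
```

The actual reason for the `condor_submit` failure is not in the planner log itself; the schedd-side error usually has to be chased in the `.dag.lib.err` and scheduler logs listed earlier in the output.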
Generating concrete workflow
2022.02.19 23:05:21.052 GMT: [WARNING] --dax option is deprecated. The abstract workflow is passed via the last positional argument on the commandline.
2022.02.19 23:05:21.065 GMT: [DEBUG] Property Key pegasus.integrity.checking already set to nosymlink. Will not be set to - none
2022.02.19 23:05:21.066 GMT: [DEBUG] Property Key condor.periodic_remove already set to (JobStatus == 5) && ((CurrentTime - EnteredCurrentStatus) > 43200). Will not be set to - (JobStatus == 5) && ((CurrentTime - EnteredCurrentStatus) > 30)
2022.02.19 23:05:21.066 GMT: [INFO] Planner launched in the following directory /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output
2022.02.19 23:05:21.066 GMT: [INFO] Planner invoked with following arguments --conf ./pegasus-properties.conf --dir /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR --submit --forward nogrid --output-sites local --sites local,condorpool_shared --staging-site local=local,condorpool_shared=condorpool_shared --cluster label,horizontal --cleanup inplace --relative-dir work -vvv --dax 4ogcringdown.dax
2022.02.19 23:05:21.068 GMT: [CONFIG] Pegasus Properties set by the user
2022.02.19 23:05:21.068 GMT: [CONFIG] pegasus.catalog.replica.cache.asrc=true
2022.02.19 23:05:21.069 GMT: [CONFIG] pegasus.catalog.replica.dax.asrc=true
2022.02.19 23:05:21.069 GMT: [CONFIG] pegasus.catalog.workflow.amqp.url=amqp://friend:donatedata@msgs.pegasus.isi.edu:5672/prod/workflows
2022.02.19 23:05:21.069 GMT: [CONFIG] pegasus.dir.staging.mapper=Flat
2022.02.19 23:05:21.069 GMT: [CONFIG] pegasus.dir.storage.mapper=Replica
2022.02.19 23:05:21.069 GMT: [CONFIG] pegasus.dir.storage.mapper.replica=File
2022.02.19 23:05:21.069 GMT: [CONFIG] pegasus.dir.storage.mapper.replica.file=output.map
2022.02.19 23:05:21.069 GMT: [CONFIG] pegasus.dir.submit.mapper=Named
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.file.cleanup.scope=deferred
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.home.bindir=/software/tools/pegasus/5.0/bin
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.home.schemadir=/software/tools/pegasus/5.0/share/pegasus/schema
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.home.sharedstatedir=/software/tools/pegasus/5.0/share/pegasus
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.home.sysconfdir=/software/tools/pegasus/5.0/etc
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.integrity.checking=nosymlink
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.metrics.app=ligo-pycbc
2022.02.19 23:05:21.070 GMT: [CONFIG] pegasus.mode=development
2022.02.19 23:05:21.071 GMT: [CONFIG] pegasus.monitord.encoding=json
2022.02.19 23:05:21.071 GMT: [CONFIG] pegasus.register=False
2022.02.19 23:05:21.071 GMT: [CONFIG] pegasus.selector.replica=Regex
2022.02.19 23:05:21.071 GMT: [CONFIG] pegasus.selector.replica.regex.rank.1=file://(?!.*(cvmfs)).*
2022.02.19 23:05:21.071 GMT: [CONFIG] pegasus.selector.replica.regex.rank.2=file:///cvmfs/.*
2022.02.19 23:05:21.071 GMT: [CONFIG] pegasus.selector.replica.regex.rank.3=root://.*
2022.02.19 23:05:21.071 GMT: [CONFIG] pegasus.selector.replica.regex.rank.4=gsiftp://red-gridftp.unl.edu.*
2022.02.19 23:05:21.072 GMT: [CONFIG] pegasus.selector.replica.regex.rank.5=gridftp://.*
2022.02.19 23:05:21.072 GMT: [CONFIG] pegasus.selector.replica.regex.rank.6=.*
2022.02.19 23:05:21.072 GMT: [CONFIG] pegasus.transfer.bypass.input.staging=true
2022.02.19 23:05:21.072 GMT: [CONFIG] pegasus.transfer.links=true
2022.02.19 23:05:21.413 GMT: [INFO] event.pegasus.add.data-dependencies dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.413 GMT: [INFO] event.pegasus.add.data-dependencies dax.id 4ogcringdown.dax-0 (0.0 seconds) - FINISHED
2022.02.19 23:05:21.448 GMT: [DEBUG] Parsed DAX with following metrics {"compute_tasks":0,"dax_tasks":2,"dag_tasks":0,"total_tasks":2,"deleted_tasks":0,"dax_input_files":4,"dax_inter_files":0,"dax_output_files":0,"dax_total_files":4,"compute_jobs":0,"clustered_jobs":0,"si_tx_jobs":0,"so_tx_jobs":0,"inter_tx_jobs":0,"reg_jobs":0,"cleanup_jobs":0,"create_dir_jobs":0,"dax_jobs":2,"dag_jobs":0,"chmod_jobs":0,"total_jobs":2,"mDAXLabel":"4ogcringdown.dax"}
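The DAX metrics the planner prints above are plain JSON, so they can be pulled out and sanity-checked directly. A minimal sketch, using the metrics string copied verbatim from the log line above:

```python
import json

# The metrics object emitted by the planner (verbatim from the log).
metrics_json = (
    '{"compute_tasks":0,"dax_tasks":2,"dag_tasks":0,"total_tasks":2,'
    '"deleted_tasks":0,"dax_input_files":4,"dax_inter_files":0,'
    '"dax_output_files":0,"dax_total_files":4,"compute_jobs":0,'
    '"clustered_jobs":0,"si_tx_jobs":0,"so_tx_jobs":0,"inter_tx_jobs":0,'
    '"reg_jobs":0,"cleanup_jobs":0,"create_dir_jobs":0,"dax_jobs":2,'
    '"dag_jobs":0,"chmod_jobs":0,"total_jobs":2,'
    '"mDAXLabel":"4ogcringdown.dax"}'
)

metrics = json.loads(metrics_json)
# The outer workflow contains only the two sub-workflow planning tasks
# (pegasus-plan_main and pegasus-plan_finalization): no compute tasks.
print(metrics["dax_tasks"], metrics["compute_tasks"], metrics["total_tasks"])
```

This confirms the hierarchical structure: the top-level workflow does nothing but plan and launch the `main` and `finalization` sub-workflows.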
2022.02.19 23:05:21.449 GMT: [CONFIG] Loading site catalog file /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml
2022.02.19 23:05:21.449 GMT: [DEBUG] All sites will be loaded from the site catalog
2022.02.19 23:05:21.450 GMT: [DEBUG] event.pegasus.parse.site-catalog site-catalog.id /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml - STARTED
2022.02.19 23:05:21.544 GMT: [DEBUG] event.pegasus.parse.site-catalog site-catalog.id /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml (0.093 seconds) - FINISHED
2022.02.19 23:05:21.545 GMT: [DEBUG] Sites loaded are [osg, condorpool_shared, condorpool_symlink, condorpool_copy, local]
2022.02.19 23:05:21.545 GMT: [CONFIG] Set environment profile for local site PATH=/software/tools/pegasus/5.0/bin/:/work/yifan.wang/virtualenv/sgwb/bin:/work/yifan.wang/lscsoft/opt/accomlal/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
2022.02.19 23:05:21.545 GMT: [CONFIG] Set environment profile for local site PYTHONPATH=/software/tools/pegasus/5.0/lib/python3.7/dist-packages/:/work/yifan.wang/eccsearch/waveform/ihes-teobresum/Python:/work/yifan.wang/eccsearch/waveform/PyCBC-teobresums:/work/yifan.wang/1-ecc-waveform-PE/IMRPhenomDecc:/work/yifan.wang/lscsoft/src/TaylorF2e:
2022.02.19 23:05:21.546 GMT: [CONFIG] Constructed default site catalog entry for condorpool site condor
2022.02.19 23:05:21.601 GMT: [DEBUG] Mount Under Scratch Directories [/tmp, /var/tmp]
2022.02.19 23:05:21.601 GMT: [DEBUG] Style detected for site osg is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:05:21.601 GMT: [DEBUG] Style detected for site condorpool_shared is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:05:21.602 GMT: [DEBUG] Style detected for site condorpool is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:05:21.602 GMT: [DEBUG] Style detected for site condorpool_symlink is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:05:21.602 GMT: [DEBUG] Style detected for site condorpool_copy is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:05:21.602 GMT: [DEBUG] Style detected for site local is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.02.19 23:05:21.602 GMT: [DEBUG] Execution sites are [condorpool_shared, local]
2022.02.19 23:05:21.605 GMT: [CONFIG] Transformation Catalog Type used YAML TC
2022.02.19 23:05:21.605 GMT: [DEBUG] Ignoring error encountered while loading Transformation Catalog
[1]: Unable to instantiate Transformation Catalog at edu.isi.pegasus.planner.catalog.transformation.TransformationFactory.loadInstance(TransformationFactory.java:164)
[2]: edu.isi.pegasus.planner.catalog.transformation.impl.YAML caught java.lang.RuntimeException The File to be used as TC should be defined with the property pegasus.catalog.transformation.file at edu.isi.pegasus.planner.catalog.transformation.impl.YAML.connect(YAML.java:167)
2022.02.19 23:05:21.606 GMT: [DEBUG] Created a temporary transformation catalog backend /tmp/tc.14910202026922531189.txt
2022.02.19 23:05:21.607 GMT: [CONFIG] Transformation Catalog Type used Multiline Textual TC
2022.02.19 23:05:21.607 GMT: [CONFIG] Transformation Catalog File used /tmp/tc.14910202026922531189.txt
2022.02.19 23:05:21.609 GMT: [CONFIG] Data Configuration used for the workflow condorio
2022.02.19 23:05:21.610 GMT: [DEBUG] Directory to be created is /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/yifan.wang/pegasus/4ogcringdown.dax/run0001
2022.02.19 23:05:21.611 GMT: [CONFIG] Metrics file will be written out to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.metrics
2022.02.19 23:05:21.611 GMT: [CONFIG] The base submit directory for the workflow /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR
2022.02.19 23:05:21.611 GMT: [CONFIG] The relative submit directory for the workflow work
2022.02.19 23:05:21.611 GMT: [CONFIG] The relative execution directory for the workflow work
2022.02.19 23:05:21.613 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.616 GMT: [DEBUG] Written out stampede events for the abstract workflow to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.static.bp
2022.02.19 23:05:21.616 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 (0.003 seconds) - FINISHED
2022.02.19 23:05:21.617 GMT: [INFO] event.pegasus.refinement dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.622 GMT: [CONFIG] Proxy used for Replica Catalog is /tmp/x509up_u44039
2022.02.19 23:05:21.624 GMT: [DEBUG] [Replica Factory] Connect properties detected {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.02.19 23:05:21.625 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor edu.isi.pegasus.planner.catalog.replica.impl.YAML -> {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.02.19 23:05:21.626 GMT: [DEBUG] Problem while connecting with the Replica Catalog: Unable to connect to replica catalog implementation edu.isi.pegasus.planner.catalog.replica.impl.YAML with props {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.02.19 23:05:21.626 GMT: [DEBUG] Setting property dagman.registration.maxjobs to 1 to set max jobs for registrations jobs category
2022.02.19 23:05:21.629 GMT: [DEBUG] Copied /tmp/tc.14910202026922531189.txt to directory /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/catalogs
2022.02.19 23:05:21.630 GMT: [DEBUG] Copied /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/sites.yml to directory /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/catalogs
2022.02.19 23:05:21.630 GMT: [DEBUG] Set Default output replica catalog properties to {pegasus.catalog.replica.output.db.driver=sqlite, pegasus.catalog.replica.output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db, pegasus.catalog.replica.output=JDBCRC}
2022.02.19 23:05:21.630 GMT: [INFO] event.pegasus.check.cyclic-dependencies dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.631 GMT: [INFO] event.pegasus.check.cyclic-dependencies dax.id 4ogcringdown.dax-0 (0.001 seconds) - FINISHED
2022.02.19 23:05:21.632 GMT: [DEBUG] 0 entries found in cache of total 4
2022.02.19 23:05:21.632 GMT: [DEBUG] 0 entries found in previous submit dirs of total 4
2022.02.19 23:05:21.632 GMT: [DEBUG] 0 entries found in input directories of total 4
2022.02.19 23:05:21.632 GMT: [DEBUG] 4 entries found in abstract workflow replica store of total 4
2022.02.19 23:05:21.632 GMT: [DEBUG] 0 entries found in inherited replica store of total 4
2022.02.19 23:05:21.632 GMT: [DEBUG] 0 entries found in input replica catalog of total 4
2022.02.19 23:05:21.632 GMT: [DEBUG] 4 entries found in all replica sources of total 4
2022.02.19 23:05:21.632 GMT: [CONFIG] Data Reuse Scope for the workflow: full
2022.02.19 23:05:21.632 GMT: [DEBUG] Reducing the workflow
2022.02.19 23:05:21.633 GMT: [INFO] event.pegasus.reduce dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.633 GMT: [DEBUG] Jobs whose o/p files already exist
2022.02.19 23:05:21.633 GMT: [DEBUG] Job pegasus-plan_finalization has no o/p files
2022.02.19 23:05:21.633 GMT: [DEBUG] Job pegasus-plan_main has no o/p files
2022.02.19 23:05:21.633 GMT: [DEBUG] Jobs whose o/p files already exist - DONE
2022.02.19 23:05:21.634 GMT: [DEBUG] pegasus-plan_main will not be deleted as its child pegasus-plan_finalization is not marked for deletion
2022.02.19 23:05:21.634 GMT: [INFO] Nodes/Jobs Deleted from the Workflow during reduction
2022.02.19 23:05:21.634 GMT: [INFO] Nodes/Jobs Deleted from the Workflow during reduction - DONE
2022.02.19 23:05:21.634 GMT: [INFO] event.pegasus.reduce dax.id 4ogcringdown.dax-0 (0.001 seconds) - FINISHED
2022.02.19 23:05:21.634 GMT: [INFO] event.pegasus.siteselection dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.637 GMT: [DEBUG] List of executions sites is [condorpool_shared, local]
2022.02.19 23:05:21.639 GMT: [DEBUG] Job pegasus-plan_main will be mapped based on selector|hints profile key execution.site
2022.02.19 23:05:21.639 GMT: [DEBUG] Job pegasus-plan_finalization will be mapped based on selector|hints profile key execution.site
2022.02.19 23:05:21.639 GMT: [DEBUG] Setting up site mapping for job pegasus-plan_finalization
2022.02.19 23:05:21.640 GMT: [DEBUG] Job pegasus-plan_finalization was mapped to site local
2022.02.19 23:05:21.641 GMT: [DEBUG] Setting up site mapping for job pegasus-plan_main
2022.02.19 23:05:21.641 GMT: [DEBUG] Job pegasus-plan_main was mapped to site local
2022.02.19 23:05:21.641 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.642 GMT: [DEBUG] Written out stampede metadata events for the mapped workflow to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.static.bp
2022.02.19 23:05:21.642 GMT: [INFO] event.pegasus.stampede.events dax.id 4ogcringdown.dax-0 (0.001 seconds) - FINISHED
2022.02.19 23:05:21.642 GMT: [INFO] event.pegasus.siteselection dax.id 4ogcringdown.dax-0 (0.008 seconds) - FINISHED
2022.02.19 23:05:21.645 GMT: [DEBUG] User set mapper is not a stageable mapper. Loading a stageable mapper
2022.02.19 23:05:21.646 GMT: [DEBUG] Deployment of Worker Package needed
2022.02.19 23:05:21.654 GMT: [CONFIG] No Replica Registration Jobs will be created.
2022.02.19 23:05:21.658 GMT: [CONFIG] Transfer Implementation loaded for Stage-In [Python based Transfer Script]
2022.02.19 23:05:21.658 GMT: [CONFIG] Transfer Implementation loaded for symbolic linking Stage-In [Python based Transfer Script]
2022.02.19 23:05:21.658 GMT: [CONFIG] Transfer Implementation loaded for Inter Site [Python based Transfer Script]
2022.02.19 23:05:21.658 GMT: [CONFIG] Transfer Implementation loaded for Stage-Out [Python based Transfer Script]
2022.02.19 23:05:21.658 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource ALL of type STAGEABLE
2022.02.19 23:05:21.659 GMT: [DEBUG] System information for pegasus-worker-5.0.1-x86_64_deb_10.tar.gz is {arch=x86_64 os=linux osrelease=deb osversion=10}
2022.02.19 23:05:21.659 GMT: [DEBUG] Compute site sysinfo local {arch=x86_64 os=linux}
2022.02.19 23:05:21.659 GMT: [DEBUG] Worker Package Entry used for site local
Logical Namespace : pegasus
Logical Name : worker
Version : null
Resource Id : local
Physical Name : file:///local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz
SysInfo : {arch=x86_64 os=linux}
TYPE : STAGEABLE
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.659 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource local of type STAGEABLE
2022.02.19 23:05:21.659 GMT: [DEBUG] Staging site for site local for worker package deployment - local
2022.02.19 23:05:21.660 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource ALL of type STAGEABLE
2022.02.19 23:05:21.660 GMT: [DEBUG] Selected entry
Logical Namespace : pegasus
Logical Name : worker
Version : null
Resource Id : local
Physical Name : file:///local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz
SysInfo : {arch=x86_64 os=linux}
TYPE : STAGEABLE
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.660 GMT: [DEBUG] Creating a default TC entry for pegasus::transfer at site local
2022.02.19 23:05:21.660 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.02.19 23:05:21.660 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED
2022.02.19 23:05:21.661 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : transfer
Version : null
Resource Id : local
Physical Name : pegasus-transfer
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.661 GMT: [DEBUG] Creating a default TC entry for pegasus::kickstart at site local
2022.02.19 23:05:21.661 GMT: [DEBUG] Remote Path set is pegasus-kickstart
2022.02.19 23:05:21.661 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:05:21.661 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : kickstart
Version : null
Resource Id : local
Physical Name : pegasus-kickstart
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.661 GMT: [DEBUG] Creating a default TC entry for pegasus::cleanup at site local
2022.02.19 23:05:21.661 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.02.19 23:05:21.661 GMT: [DEBUG] Trying to get TCEntries for pegasus::cleanup on resource local of type INSTALLED
2022.02.19 23:05:21.661 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : cleanup
Version : null
Resource Id : local
Physical Name : pegasus-transfer
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.661 GMT: [DEBUG] Creating a default TC entry for pegasus::seqexec at site local
2022.02.19 23:05:21.661 GMT: [DEBUG] Remote Path set is pegasus-cluster
2022.02.19 23:05:21.662 GMT: [DEBUG] Trying to get TCEntries for pegasus::seqexec on resource local of type INSTALLED
2022.02.19 23:05:21.662 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : seqexec
Version : null
Resource Id : local
Physical Name : pegasus-cluster
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.662 GMT: [DEBUG] Creating a default TC entry for pegasus::dirmanager at site local
2022.02.19 23:05:21.662 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.02.19 23:05:21.662 GMT: [DEBUG] Trying to get TCEntries for pegasus::dirmanager on resource local of type INSTALLED
2022.02.19 23:05:21.662 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : dirmanager
Version : null
Resource Id : local
Physical Name : pegasus-transfer
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.662 GMT: [DEBUG] Creating a default TC entry for pegasus::keg at site local
2022.02.19 23:05:21.662 GMT: [DEBUG] Remote Path set is pegasus-keg
2022.02.19 23:05:21.662 GMT: [DEBUG] Trying to get TCEntries for pegasus::keg on resource local of type INSTALLED
2022.02.19 23:05:21.662 GMT: [DEBUG] Entry constructed
Logical Namespace : pegasus
Logical Name : keg
Version : null
Resource Id : local
Physical Name : pegasus-keg
SysInfo : {arch=x86_64 os=linux}
TYPE : INSTALLED
BYPASS : false
Notifications:
Container : null
Compound Tx : null
2022.02.19 23:05:21.663 GMT: [INFO] event.pegasus.cluster dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.664 GMT: [DEBUG] Adding job to graph pegasus-plan_finalization
2022.02.19 23:05:21.664 GMT: [DEBUG] Adding job to graph pegasus-plan_main
2022.02.19 23:05:21.665 GMT: [DEBUG] Adding parents for child finalization
2022.02.19 23:05:21.666 GMT: [CONFIG] Partitioner loaded is Label Based Partitioning
2022.02.19 23:05:21.673 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:05:21.675 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:05:21.675 GMT: [CONFIG] Clusterer loaded is Topological based Vertical Clustering
2022.02.19 23:05:21.675 GMT: [INFO] Starting Graph Traversal
2022.02.19 23:05:21.676 GMT: [DEBUG] Adding to level 0 dummy
2022.02.19 23:05:21.676 GMT: [DEBUG] Adding to queue main
2022.02.19 23:05:21.676 GMT: [DEBUG] Removed dummy
2022.02.19 23:05:21.676 GMT: [DEBUG] Adding to level 1 main
2022.02.19 23:05:21.676 GMT: [DEBUG] Adding to queue finalization
2022.02.19 23:05:21.676 GMT: [DEBUG] Removed main
2022.02.19 23:05:21.676 GMT: [DEBUG] Adding to level 2 finalization
2022.02.19 23:05:21.676 GMT: [DEBUG] Removed finalization
2022.02.19 23:05:21.676 GMT: [INFO] Starting Graph Traversal - DONE
2022.02.19 23:05:21.676 GMT: [DEBUG] Partition is [finalization] corresponding to label finalization
2022.02.19 23:05:21.677 GMT: [DEBUG] Clustering jobs in partition ID2 [finalization]
2022.02.19 23:05:21.677 GMT: [DEBUG] No clustering for partition ID2
2022.02.19 23:05:21.677 GMT: [DEBUG] Partition is [main] corresponding to label main
2022.02.19 23:05:21.677 GMT: [DEBUG] Clustering jobs in partition ID1 [main]
2022.02.19 23:05:21.677 GMT: [DEBUG] No clustering for partition ID1
2022.02.19 23:05:21.677 GMT: [INFO] Determining relations between partitions
2022.02.19 23:05:21.678 GMT: [INFO] Determining relations between partitions - DONE
2022.02.19 23:05:21.678 GMT: [DEBUG] Adding job to graph pegasus-plan_finalization
2022.02.19 23:05:21.678 GMT: [DEBUG] Adding job to graph pegasus-plan_main
2022.02.19 23:05:21.678 GMT: [DEBUG] Adding parents for child finalization
2022.02.19 23:05:21.678 GMT: [CONFIG] Partitioner loaded is Level Based Partitioning
2022.02.19 23:05:21.680 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:05:21.680 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:05:21.680 GMT: [CONFIG] Clusterer loaded is Horizontal Clustering
2022.02.19 23:05:21.680 GMT: [DEBUG] Adding to level 0 dummy
2022.02.19 23:05:21.680 GMT: [DEBUG] Adding to queue main
2022.02.19 23:05:21.680 GMT: [DEBUG] Removed dummy
2022.02.19 23:05:21.680 GMT: [DEBUG] Adding to level 1 main
2022.02.19 23:05:21.681 GMT: [DEBUG] Adding to queue finalization
2022.02.19 23:05:21.681 GMT: [DEBUG] Removed main
2022.02.19 23:05:21.681 GMT: [DEBUG] Partition ID1 is :[main]
2022.02.19 23:05:21.681 GMT: [DEBUG] Clustering jobs in partition ID1 [main]
2022.02.19 23:05:21.681 GMT: [DEBUG] Clustering jobs of type pegasus-pegasus-plan-5_0_1
2022.02.19 23:05:21.682 GMT: [DEBUG] No clustering of jobs mapped to execution site local
2022.02.19 23:05:21.682 GMT: [DEBUG] Adding to level 2 finalization
2022.02.19 23:05:21.682 GMT: [DEBUG] Removed finalization
2022.02.19 23:05:21.682 GMT: [DEBUG] Partition ID2 is :[finalization]
2022.02.19 23:05:21.682 GMT: [DEBUG] Clustering jobs in partition ID2 [finalization]
2022.02.19 23:05:21.682 GMT: [DEBUG] Clustering jobs of type pegasus-pegasus-plan-5_0_1
2022.02.19 23:05:21.682 GMT: [DEBUG] No clustering of jobs mapped to execution site local
2022.02.19 23:05:21.683 GMT: [DEBUG]
Replacing {pegasus-plan_main [] -> pegasus-plan_finalization [],false} with {pegasus-plan_main [] -> pegasus-plan_finalization [],false}. Add to set : true
2022.02.19 23:05:21.683 GMT: [DEBUG] All clustered jobs removed from the workflow
2022.02.19 23:05:21.683 GMT: [INFO] event.pegasus.cluster dax.id 4ogcringdown.dax-0 (0.02 seconds) - FINISHED
2022.02.19 23:05:21.684 GMT: [DEBUG] Initialising Replica Catalog for Planner Cache
2022.02.19 23:05:21.685 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.getcache, read.only=true, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:05:21.686 GMT: [DEBUG] Initialising Replica Catalog for Planner Cache
2022.02.19 23:05:21.686 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.putcache, read.only=true, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:05:21.687 GMT: [INFO] Grafting transfer nodes in the workflow
2022.02.19 23:05:21.687 GMT: [INFO] event.pegasus.generate.transfer-nodes dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.689 GMT: [DEBUG] Data Reuse Engine no longer tracks deleted leaf jobs. Returning empty list
2022.02.19 23:05:21.696 GMT: [CONFIG] No Replica Registration Jobs will be created.
2022.02.19 23:05:21.696 GMT: [DEBUG] Number of transfer jobs for -1 are 0
2022.02.19 23:05:21.697 GMT: [DEBUG] Number of transfer jobs for 0 are 1
2022.02.19 23:05:21.697 GMT: [DEBUG] Number of transfer jobs for 1 are 1
2022.02.19 23:05:21.698 GMT: [CONFIG] Transfer Implementation loaded for Stage-In [Python based Transfer Script]
2022.02.19 23:05:21.698 GMT: [CONFIG] Transfer Implementation loaded for symbolic linking Stage-In [Python based Transfer Script]
2022.02.19 23:05:21.698 GMT: [CONFIG] Transfer Implementation loaded for Inter Site [Python based Transfer Script]
2022.02.19 23:05:21.698 GMT: [CONFIG] Transfer Implementation loaded for Stage-Out [Python based Transfer Script]
2022.02.19 23:05:21.700 GMT: [DEBUG] Rank ( rank => 1 priority => 500 expr => file://(?!.*(cvmfs)).*)
2022.02.19 23:05:21.700 GMT: [DEBUG] Rank ( rank => 2 priority => 400 expr => file:///cvmfs/.*)
2022.02.19 23:05:21.700 GMT: [DEBUG] Rank ( rank => 3 priority => 300 expr => root://.*)
2022.02.19 23:05:21.700 GMT: [DEBUG] Rank ( rank => 4 priority => 200 expr => gsiftp://red-gridftp.unl.edu.*)
2022.02.19 23:05:21.700 GMT: [DEBUG] Rank ( rank => 5 priority => 100 expr => gridftp://.*)
2022.02.19 23:05:21.700 GMT: [DEBUG] Rank ( rank => 6 priority => 0 expr => .*)
2022.02.19 23:05:21.701 GMT: [CONFIG] [RegexReplicaSelector] User Provided Ranked regexes are [( rank => 1 priority => 500 expr => file://(?!.*(cvmfs)).*), ( rank => 2 priority => 400 expr => file:///cvmfs/.*), ( rank => 3 priority => 300 expr => root://.*), ( rank => 4 priority => 200 expr => gsiftp://red-gridftp.unl.edu.*), ( rank => 5 priority => 100 expr => gridftp://.*), ( rank => 6 priority => 0 expr => .*)]
2022.02.19 23:05:21.703 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor File -> {file=output.map, read.only=true}
2022.02.19 23:05:21.703 GMT: [CONFIG] Output Mapper loaded is [Replica Catalog Mapper]
2022.02.19 23:05:21.703 GMT: [DEBUG] Initialising Workflow Cache File in the Submit Directory
2022.02.19 23:05:21.704 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:05:21.705 GMT: [CONFIG] Transfer Refiner loaded is [Balanced Cluster Transfer Refiner( round robin distribution at file level)]
2022.02.19 23:05:21.705 GMT: [CONFIG] ReplicaSelector loaded is [Regex]
2022.02.19 23:05:21.705 GMT: [CONFIG] Submit Directory Mapper loaded is [Relative Submit Directory Mapper]
2022.02.19 23:05:21.705 GMT: [CONFIG] Staging Mapper loaded is [Flat Directory Staging Mapper]
2022.02.19 23:05:21.706 GMT: [DEBUG] SRM Server map is {}
2022.02.19 23:05:21.706 GMT: [DEBUG] SRM Server map is {}
2022.02.19 23:05:21.707 GMT: [DEBUG] Directory for job pegasus-plan_main is .
2022.02.19 23:05:21.707 GMT: [DEBUG]
2022.02.19 23:05:21.707 GMT: [DEBUG] Job being traversed is pegasus-plan_main
2022.02.19 23:05:21.707 GMT: [DEBUG] To be run at local
2022.02.19 23:05:21.707 GMT: [DEBUG] Parents of job:{}
2022.02.19 23:05:21.707 GMT: [DEBUG] Initialising Workflow Cache File for job pegasus-plan_main
2022.02.19 23:05:21.707 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./pegasus-plan_main.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:05:21.708 GMT: [DEBUG] [RegexReplicaSelector] Selecting a pfn for lfn main.dax at site local
amongst main.dax regex false -> {(/work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax,{site=local}),}
2022.02.19 23:05:21.708 GMT: [DEBUG] Job Input files : Removed file main.dax for job pegasus-plan_main
2022.02.19 23:05:21.708 GMT: [DEBUG] Job Search Files : Removed file main.dax for job pegasus-plan_main
2022.02.19 23:05:21.708 GMT: [DEBUG] Set arguments for DAX job pegasus-plan_main to -Dpegasus.dir.storage.mapper.replica.file=main.map --basename main --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:05:21.709 GMT: [DEBUG] Directory for job pegasus-plan_finalization is .
2022.02.19 23:05:21.709 GMT: [DEBUG]
2022.02.19 23:05:21.710 GMT: [DEBUG] Job being traversed is pegasus-plan_finalization
2022.02.19 23:05:21.710 GMT: [DEBUG] To be run at local
2022.02.19 23:05:21.710 GMT: [DEBUG] Parents of job:{pegasus-plan_main,}
2022.02.19 23:05:21.710 GMT: [DEBUG] Initialising Workflow Cache File for job pegasus-plan_finalization
2022.02.19 23:05:21.710 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./pegasus-plan_finalization.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:05:21.710 GMT: [DEBUG] [RegexReplicaSelector] Selecting a pfn for lfn finalization.dax at site local
amongst finalization.dax regex false -> {(/work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax,{site=local}),}
2022.02.19 23:05:21.710 GMT: [DEBUG] Job Input files : Removed file finalization.dax for job pegasus-plan_finalization
2022.02.19 23:05:21.711 GMT: [DEBUG] Job Search Files : Removed file finalization.dax for job pegasus-plan_finalization
2022.02.19 23:05:21.711 GMT: [DEBUG] Set arguments for DAX job pegasus-plan_finalization to -Dpegasus.dir.storage.mapper.replica.file=finalization.map --basename finalization --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:05:21.712 GMT: [INFO] event.pegasus.generate.transfer-nodes dax.id 4ogcringdown.dax-0 (0.025 seconds) - FINISHED
2022.02.19 23:05:21.712 GMT: [DEBUG] Adding worker package deployment node for local
2022.02.19 23:05:21.712 GMT: [DEBUG] Skipping stage worker job for site local as worker package already in submit directory file:///local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz
2022.02.19 23:05:21.712 GMT: [INFO] event.pegasus.generate.workdir-nodes dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.715 GMT: [DEBUG] Trying to get TCEntries for pegasus::dirmanager on resource local of type INSTALLED
2022.02.19 23:05:21.715 GMT: [DEBUG] Creating create dir node create_dir_4ogcringdown.dax_0_local
2022.02.19 23:05:21.716 GMT: [DEBUG] Need to add edge create_dir_4ogcringdown.dax_0_local -> pegasus-plan_main
2022.02.19 23:05:21.716 GMT: [DEBUG] Adding node to the workflow create_dir_4ogcringdown.dax_0_local
2022.02.19 23:05:21.716 GMT: [INFO] event.pegasus.generate.workdir-nodes dax.id 4ogcringdown.dax-0 (0.004 seconds) - FINISHED
2022.02.19 23:05:21.716 GMT: [INFO] event.pegasus.generate.cleanup-nodes dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.722 GMT: [CONFIG] Setting property dagman.cleanup.maxjobs to 4 to set max jobs for cleanup jobs category
2022.02.19 23:05:21.722 GMT: [DEBUG] Number of sites 1
2022.02.19 23:05:21.723 GMT: [DEBUG] Site local count jobs = 3
2022.02.19 23:05:21.723 GMT: [DEBUG] * pegasus-plan_finalization
2022.02.19 23:05:21.723 GMT: [DEBUG] * create_dir_4ogcringdown.dax_0_local
2022.02.19 23:05:21.723 GMT: [DEBUG] * pegasus-plan_main
2022.02.19 23:05:21.723 GMT: [DEBUG] local 3
2022.02.19 23:05:21.723 GMT: [DEBUG] Leaf jobs scheduled at site local are pegasus-plan_finalization,create_dir_4ogcringdown.dax_0_local,pegasus-plan_main,
2022.02.19 23:05:21.723 GMT: [DEBUG] File finalization.map will not be cleaned up for job pegasus-plan_finalization
2022.02.19 23:05:21.723 GMT: [DEBUG] File main.map will not be cleaned up for job pegasus-plan_main
2022.02.19 23:05:21.723 GMT: [DEBUG]
2022.02.19 23:05:21.724 GMT: [INFO] For site: local number of files cleaned up - 0
2022.02.19 23:05:21.724 GMT: [DEBUG] CLEANUP LIST
2022.02.19 23:05:21.724 GMT: [INFO] event.pegasus.generate.cleanup-nodes dax.id 4ogcringdown.dax-0 (0.008 seconds) - FINISHED
2022.02.19 23:05:21.724 GMT: [INFO] Adding Leaf Cleanup Jobs dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.726 GMT: [DEBUG] Directory URL is a file url for site local [file:///work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/local-site-scratch/work]
2022.02.19 23:05:21.726 GMT: [DEBUG] Trying to get TCEntries for pegasus::cleanup on resource local of type INSTALLED
2022.02.19 23:05:21.726 GMT: [DEBUG] Creating remove directory node cleanup_4ogcringdown.dax_0_local
2022.02.19 23:05:21.727 GMT: [DEBUG] Need to add edge for DAX|DAG job pegasus-plan_finalization -> cleanup_4ogcringdown.dax_0_local
2022.02.19 23:05:21.727 GMT: [DEBUG] Need to add edge pegasus-plan_finalization -> cleanup_4ogcringdown.dax_0_local
2022.02.19 23:05:21.727 GMT: [DEBUG] Need to add edge for DAX|DAG job pegasus-plan_main -> cleanup_4ogcringdown.dax_0_local
2022.02.19 23:05:21.727 GMT: [DEBUG] Adding node to the workflow cleanup_4ogcringdown.dax_0_local
2022.02.19 23:05:21.727 GMT: [INFO] Adding Leaf Cleanup Jobs dax.id 4ogcringdown.dax-0 (0.003 seconds) - FINISHED
2022.02.19 23:05:21.729 GMT: [INFO] event.pegasus.refinement dax.id 4ogcringdown.dax-0 (0.112 seconds) - FINISHED
2022.02.19 23:05:21.764 GMT: [DEBUG] Condor Version as string 8.8.9
2022.02.19 23:05:21.764 GMT: [DEBUG] Condor Version detected is 80809
2022.02.19 23:05:21.765 GMT: [INFO] Generating codes for the executable workflow
2022.02.19 23:05:21.765 GMT: [INFO] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.766 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:05:21.766 GMT: [DEBUG] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 - STARTED
2022.02.19 23:05:21.796 GMT: [DEBUG] Condor Version as string 8.8.9
2022.02.19 23:05:21.797 GMT: [DEBUG] Condor Version detected is 80809
2022.02.19 23:05:21.798 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:05:21.800 GMT: [DEBUG] Applying priority of 800 to create_dir_4ogcringdown.dax_0_local
2022.02.19 23:05:21.834 GMT: [DEBUG] Mount Under Scratch Directories [/tmp, /var/tmp]
2022.02.19 23:05:21.835 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:05:21.836 GMT: [DEBUG] Postscript constructed is /software/tools/pegasus/5.0/bin/pegasus-exitcode
2022.02.19 23:05:21.838 GMT: [DEBUG] Unquoted arguments are pegasus-kickstart -n pegasus::dirmanager -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer
2022.02.19 23:05:21.839 GMT: [DEBUG] Quoted arguments are "pegasus-kickstart -n pegasus::dirmanager -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer "
2022.02.19 23:05:21.841 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./create_dir_4ogcringdown.dax_0_local.sub
2022.02.19 23:05:21.841 GMT: [DEBUG] Applying priority of 10 to pegasus-plan_main
2022.02.19 23:05:21.841 GMT: [DEBUG] Generating code for DAX job pegasus-plan_main
2022.02.19 23:05:21.841 GMT: [DEBUG] Arguments passed to SUBDAX Generator are -Dpegasus.dir.storage.mapper.replica.file=main.map --basename main --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:05:21.844 GMT: [DEBUG] Retrieving Metadata from the DAX file /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:05:21.858 GMT: [DEBUG] Submit directory in sub dax specified is ./main.dax_main
2022.02.19 23:05:21.858 GMT: [DEBUG] Base Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR
2022.02.19 23:05:21.858 GMT: [DEBUG] Relative Submit Directory for inner workflow set to work/././main.dax_main
2022.02.19 23:05:21.859 GMT: [DEBUG] Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/././main.dax_main
2022.02.19 23:05:21.859 GMT: [DEBUG] Setting list of execution sites to the same as outer workflow
2022.02.19 23:05:21.859 GMT: [DEBUG] Submit Directory for SUB DAX is /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/././main.dax_main
2022.02.19 23:05:21.859 GMT: [DEBUG] Relative Execution Directory for SUB DAX is work/./main.dax_main
2022.02.19 23:05:21.859 GMT: [DEBUG] Trying to get TCEntries for pegasus::pegasus-plan on resource local of type INSTALLED
2022.02.19 23:05:21.859 GMT: [DEBUG] Constructing the default path to the pegasus-plan
2022.02.19 23:05:21.860 GMT: [DEBUG] pegasus-plan invocation for job pegasus-plan_main determined to be
/software/tools/pegasus/5.0/bin/pegasus-plan -Dpegasus.log.*=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus-plan_main.pre.log -Dpegasus.workflow.root.uuid=44004f20-9e9b-42e2-a1ec-084c27eb3e84 -Dpegasus.dir.storage.mapper.replica.file=main.map --conf /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus.7922074547850858695.properties --dir /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR --relative-dir work/./main.dax_main --relative-submit-dir work/././main.dax_main --basename main --sites condorpool_shared,local --staging-site condorpool_shared=condorpool_shared,local=local, --cache /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./pegasus-plan_main.cache --inherited-rc-files /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replica.store --cluster label,horizontal --output-sites local --cleanup inplace --verbose --verbose --verbose --deferred /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/main.dax
2022.02.19 23:05:21.860 GMT: [DEBUG] Basename prefix for the sub workflow is main
2022.02.19 23:05:21.860 GMT: [DEBUG] Cache File for the sub workflow is /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/././main.dax_main/main.cache
2022.02.19 23:05:21.860 GMT: [DEBUG] Trying to get TCEntries for condor::dagman on resource local of type INSTALLED
2022.02.19 23:05:21.860 GMT: [DEBUG] condor::dagman not catalogued in the Transformation Catalog. Trying to construct from the Site Catalog
2022.02.19 23:05:21.860 GMT: [DEBUG] DAGMan not catalogued in the Transformation Catalog or the Site Catalog. Trying to construct from the environment
2022.02.19 23:05:21.860 GMT: [DEBUG] Constructing path to dagman on basis of env variable CONDOR_LOCATION
2022.02.19 23:05:21.860 GMT: [DEBUG] Number of Rescue retries 999
2022.02.19 23:05:21.860 GMT: [DEBUG] Constructing arguments to dagman in 7.1.0 and later style
2022.02.19 23:05:21.865 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.02.19 23:05:21.870 GMT: [DEBUG] Setting job pegasus-plan_main.pre to run via No container wrapping
2022.02.19 23:05:21.870 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:05:21.870 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED
2022.02.19 23:05:21.873 GMT: [DEBUG] Unquoted arguments are -p 0 -f -l . -Notification never -Debug 3 -Lockfile main.dag.lock -Dag main.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1
2022.02.19 23:05:21.873 GMT: [DEBUG] Quoted arguments are " -p 0 -f -l . -Notification never -Debug 3 -Lockfile main.dag.lock -Dag main.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1"
2022.02.19 23:05:21.873 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./pegasus-plan_main.sub
2022.02.19 23:05:21.874 GMT: [DEBUG] Applying priority of 20 to pegasus-plan_finalization
2022.02.19 23:05:21.874 GMT: [DEBUG] Generating code for DAX job pegasus-plan_finalization
2022.02.19 23:05:21.874 GMT: [DEBUG] Arguments passed to SUBDAX Generator are -Dpegasus.dir.storage.mapper.replica.file=finalization.map --basename finalization --cluster label,horizontal --output-sites local --staging-site local=local,condorpool_shared=condorpool_shared --cleanup inplace -vvv /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:05:21.875 GMT: [DEBUG] Retrieving Metadata from the DAX file /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:05:21.879 GMT: [DEBUG] Submit directory in sub dax specified is ./finalization.dax_finalization
2022.02.19 23:05:21.880 GMT: [DEBUG] Base Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR
2022.02.19 23:05:21.880 GMT: [DEBUG] Relative Submit Directory for inner workflow set to work/././finalization.dax_finalization
2022.02.19 23:05:21.880 GMT: [DEBUG] Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/././finalization.dax_finalization
2022.02.19 23:05:21.880 GMT: [DEBUG] Setting list of execution sites to the same as outer workflow
2022.02.19 23:05:21.880 GMT: [DEBUG] Parent DAX Jobs Transient RC's are [null]
2022.02.19 23:05:21.880 GMT: [DEBUG] Submit Directory for SUB DAX is /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/././finalization.dax_finalization
2022.02.19 23:05:21.880 GMT: [DEBUG] Relative Execution Directory for SUB DAX is work/./finalization.dax_finalization
2022.02.19 23:05:21.880 GMT: [DEBUG] Trying to get TCEntries for pegasus::pegasus-plan on resource local of type INSTALLED
2022.02.19 23:05:21.880 GMT: [DEBUG] Constructing the default path to the pegasus-plan
2022.02.19 23:05:21.880 GMT: [DEBUG] pegasus-plan invocation for job pegasus-plan_finalization determined to be
/software/tools/pegasus/5.0/bin/pegasus-plan -Dpegasus.log.*=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus-plan_finalization.pre.log -Dpegasus.workflow.root.uuid=44004f20-9e9b-42e2-a1ec-084c27eb3e84 -Dpegasus.dir.storage.mapper.replica.file=finalization.map --conf /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus.7922074547850858695.properties --dir /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR --relative-dir work/./finalization.dax_finalization --relative-submit-dir work/././finalization.dax_finalization --basename finalization --sites condorpool_shared,local --staging-site condorpool_shared=condorpool_shared,local=local, --cache null,/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./pegasus-plan_finalization.cache --inherited-rc-files /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replica.store --cluster label,horizontal --output-sites local --cleanup inplace --verbose --verbose --verbose --deferred /work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output/finalization.dax
2022.02.19 23:05:21.880 GMT: [DEBUG] Basename prefix for the sub workflow is finalization
2022.02.19 23:05:21.880 GMT: [DEBUG] Cache File for the sub workflow is /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/././finalization.dax_finalization/finalization.cache
2022.02.19 23:05:21.880 GMT: [DEBUG] Trying to get TCEntries for condor::dagman on resource local of type INSTALLED
2022.02.19 23:05:21.880 GMT: [DEBUG] condor::dagman not catalogued in the Transformation Catalog. Trying to construct from the Site Catalog
2022.02.19 23:05:21.880 GMT: [DEBUG] DAGMan not catalogued in the Transformation Catalog or the Site Catalog. Trying to construct from the environment
2022.02.19 23:05:21.880 GMT: [DEBUG] Constructing path to dagman on basis of env variable CONDOR_LOCATION
2022.02.19 23:05:21.880 GMT: [DEBUG] Number of Rescue retries 999
2022.02.19 23:05:21.880 GMT: [DEBUG] Constructing arguments to dagman in 7.1.0 and later style
2022.02.19 23:05:21.881 GMT: [DEBUG] Setting job pegasus-plan_finalization.pre to run via No container wrapping
2022.02.19 23:05:21.881 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:05:21.881 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED
2022.02.19 23:05:21.882 GMT: [DEBUG] Unquoted arguments are -p 0 -f -l . -Notification never -Debug 3 -Lockfile finalization.dag.lock -Dag finalization.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1
2022.02.19 23:05:21.882 GMT: [DEBUG] Quoted arguments are " -p 0 -f -l . -Notification never -Debug 3 -Lockfile finalization.dag.lock -Dag finalization.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1"
2022.02.19 23:05:21.882 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./pegasus-plan_finalization.sub
2022.02.19 23:05:21.882 GMT: [DEBUG] Applying priority of 1000 to cleanup_4ogcringdown.dax_0_local
2022.02.19 23:05:21.883 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.02.19 23:05:21.883 GMT: [DEBUG] Postscript constructed is /software/tools/pegasus/5.0/bin/pegasus-exitcode
2022.02.19 23:05:21.883 GMT: [DEBUG] Unquoted arguments are pegasus-kickstart -n pegasus::cleanup -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer
2022.02.19 23:05:21.883 GMT: [DEBUG] Quoted arguments are "pegasus-kickstart -n pegasus::cleanup -N null -i - -R local -L 4ogcringdown.dax -T 2022-02-19T23:04:30+00:00 pegasus-transfer "
2022.02.19 23:05:21.883 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/./cleanup_4ogcringdown.dax_0_local.sub
2022.02.19 23:05:21.883 GMT: [DEBUG] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 (0.117 seconds) - FINISHED
2022.02.19 23:05:21.884 GMT: [DEBUG] Written Dag File : 4ogcringdown.dax-0.dag.tmp
2022.02.19 23:05:21.884 GMT: [DEBUG] Writing out the DOT file
2022.02.19 23:05:21.891 GMT: [DEBUG] Written out notifications to 4ogcringdown.dax-0.notify
2022.02.19 23:05:21.891 GMT: [DEBUG] Writing out the DAX Replica Store to file /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replica.store
2022.02.19 23:05:21.891 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replica.store, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.02.19 23:05:21.892 GMT: [DEBUG] Written out dax replica store to 4ogcringdown.dax-0.replica.store
2022.02.19 23:05:21.894 GMT: [DEBUG] Written out stampede events for the executable workflow to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.static.bp
2022.02.19 23:05:21.900 GMT: [DEBUG] Proxy whose DN will be logged in the braindump file /tmp/x509up_u44039
2022.02.19 23:05:21.950 GMT: [DEBUG] Unable to determine GRID DN class org.globus.gsi.gssapi.GlobusGSSException: Defective credential detected [Caused by: proxy not found]
2022.02.19 23:05:21.978 GMT: [DEBUG] Written out braindump to /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/braindump.yml
2022.02.19 23:05:21.978 GMT: [DEBUG] Renamed temporary dag file to : /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.dag
2022.02.19 23:05:21.980 GMT: [DEBUG] Executing /usr/bin/condor_submit_dag -append executable=/software/tools/pegasus/5.0/bin/pegasus-dagman -no_submit -MaxPre 1 -MaxPost 20 -append +pegasus_wf_uuid="44004f20-9e9b-42e2-a1ec-084c27eb3e84" -append +pegasus_root_wf_uuid="44004f20-9e9b-42e2-a1ec-084c27eb3e84" -append +pegasus_wf_name="4ogcringdown.dax-0" -append +pegasus_wf_time="20220219T230521+0000" -append +pegasus_version="5.0.1" -append +pegasus_job_class=11 -append +pegasus_cluster_size=1 -append +pegasus_site="local" -append +pegasus_execution_sites="condorpool_shared,local" -append +pegasus_wf_xformation="pegasus::dagman" 4ogcringdown.dax-0.dag with environment = PATH=/software/tools/pegasus/5.0/bin/:/work/yifan.wang/virtualenv/sgwb/bin:/work/yifan.wang/lscsoft/opt/accomlal/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games;PKG_CONFIG_PATH=/work/yifan.wang/lscsoft/opt/accomlal/lib/pkgconfig:;LAL_DATA_PATH=/atlas/recent/cbc/ROM_data/;TZ=:/etc/localtime;MODULEPATH=/etc/environment-modules/modules:/usr/share/modules/versions:/usr/share/modules/$MODULE_VERSION/modulefiles:/usr/share/modules/modulefiles;PEGASUS_PERL_DIR=/software/tools/pegasus/5.0/lib/pegasus/perl;DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/44039/bus;MAIL=/var/mail/yifan.wang;LD_LIBRARY_PATH=/work/yifan.wang/eccsearch/C:/work/yifan.wang/1-ecc-waveform-PE/IMRPhenomDecc:/work/yifan.wang/lscsoft/opt/accomlal/lib/:/work/yifan.wang/lscsoft/MultiNest/lib:;LOGNAME=yifan.wang;PWD=/work/yifan.wang/ringdown/GW200224/220_330/4ogcringdown_output;PYTHONPATH=/software/tools/pegasus/5.0/lib/python3.7/dist-packages/:/work/yifan.wang/eccsearch/waveform/ihes-teobresum/Python:/work/yifan.wang/eccsearch/waveform/PyCBC-teobresums:/work/yifan.wang/1-ecc-waveform-PE/IMRPhenomDecc:/work/yifan.wang/lscsoft/src/TaylorF2e:;SHELL=/bin/bash;BASH_ENV=/usr/share/modules/init/bash;LM_LICENSE_FILE=/opt/matlab/default/etc/license.dat;OLDPWD=/work/yifan.wang/ringdown/GW200224/220_330;TMPDIR=/local/user/yifan.wang;VIRTUAL_ENV
=/work/yifan.wang/virtualenv/sgwb;MODULEPATH_modshare=/etc/environment-modules/modules:1:/usr/share/modules/$MODULE_VERSION/modulefiles:1:/usr/share/modules/modulefiles:1:/usr/share/modules/versions:1;LC_ALL=C;PEGASUS_PYTHON_DIR=/software/tools/pegasus/5.0/lib/python3.7/dist-packages;LC_CTYPE=en_US.UTF-8;SHLVL=2;SLACK_BOT_TOKEN=xoxp-1889174914644-1876230893430-1893291262468-e9d8766407c79a32862f209441631a1f;CONDOR_LOCATION=/usr;LOADEDMODULES=;SCRATCH=/local/user/yifan.wang;PEGASUS_SCHEMA_DIR=/software/tools/pegasus/5.0/share/pegasus/schema;JAVA_HOME=/usr/lib/jvm/default-java;TERM=xterm-256color;ENV=/usr/share/modules/init/profile.sh;LANG=en_US.UTF-8;XDG_SESSION_ID=149717;XDG_SESSION_TYPE=tty;PEGASUS_PYTHON_EXTERNALS_DIR=/software/tools/pegasus/5.0/lib/pegasus/externals/python;XDG_SESSION_CLASS=user;_=/software/tools/pegasus/5.0/bin/pegasus-plan;PEGASUS_JAVA_DIR=/software/tools/pegasus/5.0/share/pegasus/java;SSH_TTY=/dev/pts/5;SSH_CLIENT=130.75.117.49 62564 22;USER=yifan.wang;CLASSPATH=/software/tools/pegasus/5.0/share/pegasus/java/accessors.jar:/software/tools/pegasus/5.0/share/pegasus/java/bcprov-jdk15on-150.jar:/software/tools/pegasus/5.0/share/pegasus/java/btf-1.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-lang3-3.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-logging.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-pool.jar:/software/tools/pegasus/5.0/share/pegasus/java/exist-optional.jar:/software/tools/pegasus/5.0/share/pegasus/java/exist.jar:/software/tools/pegasus/5.0/share/pegasus/java/gram-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/gridftp-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/gson-2.2.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/gss-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/guava-16.0.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/hamcrest-core-1.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/io-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/jav
a/jackson-annotations-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-core-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-coreutils-1.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-databind-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-dataformat-yaml-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jakarta-oro.jar:/software/tools/pegasus/5.0/share/pegasus/java/java-getopt-1.0.9.jar:/software/tools/pegasus/5.0/share/pegasus/java/javax.json-1.0.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/jcodings-1.0.46.jar:/software/tools/pegasus/5.0/share/pegasus/java/joda-time-2.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/joni-2.1.31.jar:/software/tools/pegasus/5.0/share/pegasus/java/json-schema-validator-1.0.41.jar:/software/tools/pegasus/5.0/share/pegasus/java/jsse-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/libphonenumber-6.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/log4j-1.2.17.jar:/software/tools/pegasus/5.0/share/pegasus/java/mailapi-1.4.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/msg-simple-1.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/myproxy-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/mysql-connector-java-5.1.47.jar:/software/tools/pegasus/5.0/share/pegasus/java/pegasus-aws-batch.jar:/software/tools/pegasus/5.0/share/pegasus/java/pegasus.jar:/software/tools/pegasus/5.0/share/pegasus/java/postgresql-8.1dev-400.jdbc3.jar:/software/tools/pegasus/5.0/share/pegasus/java/resolver.jar:/software/tools/pegasus/5.0/share/pegasus/java/rhino-1.7R4.jar:/software/tools/pegasus/5.0/share/pegasus/java/snakeyaml-1.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/sqlite-jdbc-3.8.11.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/ssl-proxies-2.1.0-patched.jar:/software/tools/pegasus/5.0/share/pegasus/java/super-csv-2.4.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/uri-template-0.9.jar:/software/tools/pegasus/5.0/share/pegasu
s/java/vdl.jar:/software/tools/pegasus/5.0/share/pegasus/java/xercesImpl.jar:/software/tools/pegasus/5.0/share/pegasus/java/xml-apis-1.4.01.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmlParserAPIs.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmldb.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmlrpc.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/apache-client-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/batch-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/commons-logging-api-1.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/core-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/http-client-spi-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/httpclient-4.5.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/httpcore-4.4.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-annotations-2.8.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-databind-2.8.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-jr-objects-2.9.0.pr4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/joda-time-2.8.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jopt-simple-5.0.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/logs-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/metrics-spi-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/netty-nio-client-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/s3-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/slf4j-api-1.7.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/slf4j-log4j12-1.7.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/thirdparty-logging.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/thirdparty.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/utils-2.0.0-preview-2.jar;C_INCLUDE_PATH=/work/yifan.wang/lscsoft/opt/accomlal/incl
ude:;PEGASUS_BIN_DIR=/software/tools/pegasus/5.0/bin;SSH_CONNECTION=130.75.117.49 62564 130.75.116.17 22;MODULESHOME=/usr/share/modules;PEGASUS_CONF_DIR=/software/tools/pegasus/5.0/etc;TMP=/local/user/yifan.wang;PYCBC_WAVEFORM=taylorf2e;PEGASUS_ORIG_CLASSPATH=;MODULES_CMD=/usr/lib/x86_64-linux-gnu/modulecmd.tcl;LIGO_DATAFIND_SERVER=ldr.atlas.local:80;GW_DATAFIND_SERVER=;XDG_RUNTIME_DIR=/run/user/44039;PEGASUS_SHARE_DIR=/software/tools/pegasus/5.0/share/pegasus;HOME=/work/yifan.wang;
2022.02.19 23:05:22.005 GMT:
2022.02.19 23:05:22.011 GMT: -----------------------------------------------------------------------
2022.02.19 23:05:22.016 GMT: File for submitting this DAG to HTCondor : 4ogcringdown.dax-0.dag.condor.sub
2022.02.19 23:05:22.021 GMT: Log of DAGMan debugging messages : 4ogcringdown.dax-0.dag.dagman.out
2022.02.19 23:05:22.026 GMT: Log of HTCondor library output : 4ogcringdown.dax-0.dag.lib.out
2022.02.19 23:05:22.031 GMT: Log of HTCondor library error messages : 4ogcringdown.dax-0.dag.lib.err
2022.02.19 23:05:22.037 GMT: Log of the life of condor_dagman itself : 4ogcringdown.dax-0.dag.dagman.log
2022.02.19 23:05:22.042 GMT:
2022.02.19 23:05:22.047 GMT: -no_submit given, not submitting DAG to HTCondor. You can do this with:
2022.02.19 23:05:22.057 GMT: -----------------------------------------------------------------------
2022.02.19 23:05:22.062 GMT: [DEBUG] condor_submit_dag exited with status 0
2022.02.19 23:05:22.070 GMT: [DEBUG] Updated environment for dagman is environment = _CONDOR_SCHEDD_ADDRESS_FILE=/local/condor/spool/.schedd_address;_CONDOR_MAX_DAGMAN_LOG=0;_CONDOR_SCHEDD_DAEMON_AD_FILE=/local/condor/spool/.schedd_classad;_CONDOR_DAGMAN_LOG=4ogcringdown.dax-0.dag.dagman.out;PEGASUS_METRICS=true;
2022.02.19 23:05:22.071 GMT: [INFO] event.pegasus.code.generation dax.id 4ogcringdown.dax-0 (0.306 seconds) - FINISHED
2022.02.19 23:05:22.073 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-db-admin update -t master -c /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus.7922074547850858695.properties
2022.02.19 23:05:23.165 GMT: Database version: '5.0.1' (sqlite:////work/yifan.wang/.pegasus/workflow.db)
2022.02.19 23:05:23.216 GMT: [DEBUG] pegasus-db-admin exited with status 0
2022.02.19 23:05:23.219 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-db-admin create -Dpegasus.catalog.replica.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db -Dpegasus.catalog.replica=JDBCRC -Dpegasus.catalog.replica.db.driver=sqlite -t jdbcrc -c /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/pegasus.7922074547850858695.properties
2022.02.19 23:05:24.979 GMT: Pegasus database was successfully created.
2022.02.19 23:05:24.984 GMT: Database version: '5.0.1' (sqlite:////local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db)
2022.02.19 23:05:25.030 GMT: [DEBUG] pegasus-db-admin exited with status 0
2022.02.19 23:05:25.031 GMT: Output replica catalog set to jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work/4ogcringdown.dax-0.replicas.db
2022.02.19 23:05:25.031 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-run --nogrid /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work
2022.02.19 23:05:25.341 GMT: Submitting to condor 4ogcringdown.dax-0.dag.condor.sub
2022.02.19 23:05:36.995 GMT:
2022.02.19 23:05:37.000 GMT: Your workflow has been started and is running in the base directory:
2022.02.19 23:05:37.005 GMT:
2022.02.19 23:05:37.010 GMT: /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work
2022.02.19 23:05:37.015 GMT:
2022.02.19 23:05:37.020 GMT: *** To monitor the workflow you can run ***
2022.02.19 23:05:37.026 GMT:
2022.02.19 23:05:37.031 GMT: pegasus-status -l /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work
2022.02.19 23:05:37.036 GMT:
2022.02.19 23:05:37.041 GMT: *** To remove your workflow run ***
2022.02.19 23:05:37.046 GMT:
2022.02.19 23:05:37.051 GMT: pegasus-remove /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work
2022.02.19 23:05:37.057 GMT: [DEBUG] Submission of workflow exited with status 0
2022.02.19 23:05:37.066 GMT: [DEBUG] Sending Planner Metrics to [1 of 1] http://metrics.pegasus.isi.edu/metrics
2022.02.19 23:05:37.551 GMT: [DEBUG] Metrics successfully sent to the server
2022.02.19 23:05:37.552 GMT: Time taken to execute is 16.154 seconds
2022.02.19 23:05:37.552 GMT: [INFO] event.pegasus.planner planner.version 5.0.1 (16.509 seconds) - FINISHED
Querying Pegasus database for workflow stored in /local/user/yifan.wang/pycbc-tmp.09h3f7gSyR/work
This may take up to 120 seconds. Please wait......... Done.
Workflow submission completed successfully.
The Pegasus dashboard URL for this workflow is:
https://condor2.atlas.local/pegasus/u/yifan.wang/r/41/w?wf_uuid=44004f20-9e9b-42e2-a1ec-084c27eb3e84
Note that it may take a while for the dashboard entry to appear while the workflow
is parsed by the dashboard. The delay can be on the order of one hour for very large
workflows.