Generating concrete workflow
2022.10.05 22:09:46.344 GMT: [INFO] Planner launched in the following directory /work/yifan.wang/search-high-spin/prod/runs/O1/7
2022.10.05 22:09:46.364 GMT: [INFO] Planner invoked with following arguments --conf ./pegasus-properties.conf --output-sites local --sites local --staging-site local=local --cluster label,horizontal --cleanup inplace --relative-dir work -vvv --dir /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p --submit --forward nogrid gw.dax
2022.10.05 22:09:46.367 GMT: [CONFIG] Pegasus Properties set by the user
2022.10.05 22:09:46.367 GMT: [CONFIG] pegasus.catalog.replica.cache.asrc=true
2022.10.05 22:09:46.368 GMT: [CONFIG] pegasus.catalog.replica.dax.asrc=true
2022.10.05 22:09:46.368 GMT: [CONFIG] pegasus.catalog.workflow.amqp.url=amqp://friend:donatedata@msgs.pegasus.isi.edu:5672/prod/workflows
2022.10.05 22:09:46.368 GMT: [CONFIG] pegasus.dir.staging.mapper=Flat
2022.10.05 22:09:46.368 GMT: [CONFIG] pegasus.dir.storage.mapper=Replica
2022.10.05 22:09:46.369 GMT: [CONFIG] pegasus.dir.storage.mapper.replica=File
2022.10.05 22:09:46.369 GMT: [CONFIG] pegasus.dir.storage.mapper.replica.file=output.map
2022.10.05 22:09:46.369 GMT: [CONFIG] pegasus.dir.submit.mapper=Named
2022.10.05 22:09:46.369 GMT: [CONFIG] pegasus.home.bindir=/software/tools/pegasus/5.0/bin
2022.10.05 22:09:46.370 GMT: [CONFIG] pegasus.home.schemadir=/software/tools/pegasus/5.0/share/pegasus/schema
2022.10.05 22:09:46.370 GMT: [CONFIG] pegasus.home.sharedstatedir=/software/tools/pegasus/5.0/share/pegasus
2022.10.05 22:09:46.370 GMT: [CONFIG] pegasus.home.sysconfdir=/software/tools/pegasus/5.0/etc
2022.10.05 22:09:46.371 GMT: [CONFIG] pegasus.integrity.checking=none
2022.10.05 22:09:46.371 GMT: [CONFIG] pegasus.metrics.app=ligo-pycbc
2022.10.05 22:09:46.371 GMT: [CONFIG] pegasus.mode=production
2022.10.05 22:09:46.371 GMT: [CONFIG] pegasus.monitord.encoding=json
2022.10.05 22:09:46.371 GMT: [CONFIG] pegasus.register=False
2022.10.05 22:09:46.372 GMT: [CONFIG] pegasus.selector.replica=Regex
2022.10.05 22:09:46.372 GMT: [CONFIG] pegasus.selector.replica.regex.rank.1=file://(?!.*(cvmfs)).*
2022.10.05 22:09:46.372 GMT: [CONFIG] pegasus.selector.replica.regex.rank.2=file:///cvmfs/.*
2022.10.05 22:09:46.372 GMT: [CONFIG] pegasus.selector.replica.regex.rank.3=root://.*
2022.10.05 22:09:46.373 GMT: [CONFIG] pegasus.selector.replica.regex.rank.4=gsiftp://red-gridftp.unl.edu.*
2022.10.05 22:09:46.373 GMT: [CONFIG] pegasus.selector.replica.regex.rank.5=gridftp://.*
2022.10.05 22:09:46.373 GMT: [CONFIG] pegasus.selector.replica.regex.rank.6=.*
2022.10.05 22:09:46.373 GMT: [CONFIG] pegasus.transfer.bypass.input.staging=true
2022.10.05 22:09:46.373 GMT: [CONFIG] pegasus.transfer.links=true
2022.10.05 22:09:47.086 GMT: [INFO] event.pegasus.add.data-dependencies dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.086 GMT: [INFO] event.pegasus.add.data-dependencies dax.id gw.dax-0 (0.0 seconds) - FINISHED
2022.10.05 22:09:47.157 GMT: [DEBUG] Parsed DAX with following metrics {"compute_tasks":0,"dax_tasks":2,"dag_tasks":0,"total_tasks":2,"deleted_tasks":0,"dax_input_files":4,"dax_inter_files":0,"dax_output_files":0,"dax_total_files":4,"compute_jobs":0,"clustered_jobs":0,"si_tx_jobs":0,"so_tx_jobs":0,"inter_tx_jobs":0,"reg_jobs":0,"cleanup_jobs":0,"create_dir_jobs":0,"dax_jobs":2,"dag_jobs":0,"chmod_jobs":0,"total_jobs":2,"mDAXLabel":"gw.dax"}
2022.10.05 22:09:47.168 GMT: [CONFIG] Loading site catalog file /work/yifan.wang/search-high-spin/prod/runs/O1/7/sites.yml
2022.10.05 22:09:47.168 GMT: [DEBUG] All sites will be loaded from the site catalog
2022.10.05 22:09:47.169 GMT: [DEBUG] event.pegasus.parse.site-catalog site-catalog.id /work/yifan.wang/search-high-spin/prod/runs/O1/7/sites.yml - STARTED
2022.10.05 22:09:47.455 GMT: [DEBUG] event.pegasus.parse.site-catalog site-catalog.id /work/yifan.wang/search-high-spin/prod/runs/O1/7/sites.yml (0.284 seconds) - FINISHED
2022.10.05 22:09:47.456 GMT: [DEBUG] Sites loaded are [osg, condorpool_shared, condorpool_symlink, condorpool_copy, local]
2022.10.05 22:09:47.458 GMT: [CONFIG] Set environment profile for local site PATH=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/bin/intel64:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin:/software/tools/pegasus/5.0/bin/:/work/ahnitz/projects/4ogc/env/bin:/work/yifan.wang/lscsoft/opt/accomlal/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
2022.10.05 22:09:47.459 GMT: [CONFIG] Set environment profile for local site PYTHONPATH=/software/tools/pegasus/5.0/lib/python3.7/dist-packages/:/work/yifan.wang/eccentricity/gitlab-eccsearch/waveform/ihes-teobresum/Python:/work/yifan.wang/eccentricity/gitlab-eccsearch//waveform/PyCBC-teobresums:/work/yifan.wang/lscsoft/src/TaylorF2e:
2022.10.05 22:09:47.460 GMT: [CONFIG] Constructed default site catalog entry for condorpool site condor
2022.10.05 22:09:47.596 GMT: [DEBUG] Mount Under Scratch Directories [/tmp, /var/tmp]
2022.10.05 22:09:47.597 GMT: [DEBUG] Style detected for site osg is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.10.05 22:09:47.597 GMT: [DEBUG] Style detected for site condorpool_shared is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.10.05 22:09:47.598 GMT: [DEBUG] Style detected for site condorpool is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.10.05 22:09:47.598 GMT: [DEBUG] Style detected for site condorpool_symlink is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.10.05 22:09:47.598 GMT: [DEBUG] Style detected for site condorpool_copy is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.10.05 22:09:47.598 GMT: [DEBUG] Style detected for site local is class edu.isi.pegasus.planner.code.generator.condor.style.Condor
2022.10.05 22:09:47.598 GMT: [DEBUG] Execution sites are [local]
2022.10.05 22:09:47.604 GMT: [CONFIG] Transformation Catalog Type used YAML TC
2022.10.05 22:09:47.612 GMT: [DEBUG] Ignoring error encountered while loading Transformation Catalog [1]: Unable to instantiate Transformation Catalog at edu.isi.pegasus.planner.catalog.transformation.TransformationFactory.loadInstance(TransformationFactory.java:164) [2]: edu.isi.pegasus.planner.catalog.transformation.impl.YAML caught java.lang.RuntimeException The File to be used as TC should be defined with the property pegasus.catalog.transformation.file at edu.isi.pegasus.planner.catalog.transformation.impl.YAML.connect(YAML.java:167)
2022.10.05 22:09:47.615 GMT: [DEBUG] Created a temporary transformation catalog backend /tmp/tc.11427274609685488692.txt
2022.10.05 22:09:47.617 GMT: [CONFIG] Transformation Catalog Type used Multiline Textual TC
2022.10.05 22:09:47.617 GMT: [CONFIG] Transformation Catalog File used /tmp/tc.11427274609685488692.txt
2022.10.05 22:09:47.628 GMT: [CONFIG] Data Configuration used for the workflow condorio
2022.10.05 22:09:47.630 GMT: [DEBUG] Directory to be created is /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/yifan.wang/pegasus/gw.dax/run0001
2022.10.05 22:09:47.632 GMT: [CONFIG] Metrics file will be written out to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.metrics
2022.10.05 22:09:47.632 GMT: [CONFIG] The base submit directory for the workflow /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p
2022.10.05 22:09:47.632 GMT: [CONFIG] The relative submit directory for the workflow work
2022.10.05 22:09:47.632 GMT: [CONFIG] The relative execution directory for the workflow work
2022.10.05 22:09:47.635 GMT: [INFO] event.pegasus.stampede.events dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.656 GMT: [DEBUG] Written out stampede events for the abstract workflow to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.static.bp
2022.10.05 22:09:47.657 GMT: [INFO] event.pegasus.stampede.events dax.id gw.dax-0 (0.022 seconds) - FINISHED
2022.10.05 22:09:47.658 GMT: [INFO] event.pegasus.refinement dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.669 GMT: [CONFIG] Proxy used for Replica Catalog is /tmp/x509up_u44039
2022.10.05 22:09:47.672 GMT: [DEBUG] [Replica Factory] Connect properties detected {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.10.05 22:09:47.676 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor edu.isi.pegasus.planner.catalog.replica.impl.YAML -> {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.10.05 22:09:47.678 GMT: [DEBUG] Problem while connecting with the Replica Catalog: Unable to connect to replica catalog implementation edu.isi.pegasus.planner.catalog.replica.impl.YAML with props {proxy=/tmp/x509up_u44039, read.only=true, dax.asrc=true, cache.asrc=true}
2022.10.05 22:09:47.678 GMT: [DEBUG] Setting property dagman.registration.maxjobs to 1 to set max jobs for registrations jobs category
2022.10.05 22:09:47.681 GMT: [DEBUG] Copied /tmp/tc.11427274609685488692.txt to directory /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/catalogs
2022.10.05 22:09:47.684 GMT: [DEBUG] Copied /work/yifan.wang/search-high-spin/prod/runs/O1/7/sites.yml to directory /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/catalogs
2022.10.05 22:09:47.684 GMT: [DEBUG] Set Default output replica catalog properties to {pegasus.catalog.replica.output.db.driver=sqlite, pegasus.catalog.replica.output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db, pegasus.catalog.replica.output=JDBCRC}
2022.10.05 22:09:47.684 GMT: [INFO] event.pegasus.check.cyclic-dependencies dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.685 GMT: [INFO] event.pegasus.check.cyclic-dependencies dax.id gw.dax-0 (0.001 seconds) - FINISHED
2022.10.05 22:09:47.686 GMT: [DEBUG] 0 entries found in cache of total 4
2022.10.05 22:09:47.686 GMT: [DEBUG] 0 entries found in previous submit dirs of total 4
2022.10.05 22:09:47.687 GMT: [DEBUG] 0 entries found in input directories of total 4
2022.10.05 22:09:47.687 GMT: [DEBUG] 4 entries found in abstract workflow replica store of total 4
2022.10.05 22:09:47.687 GMT: [DEBUG] 0 entries found in inherited replica store of total 4
2022.10.05 22:09:47.687 GMT: [DEBUG] 0 entries found in input replica catalog of total 4
2022.10.05 22:09:47.687 GMT: [DEBUG] 4 entries found in all replica sources of total 4
2022.10.05 22:09:47.687 GMT: [CONFIG] Data Reuse Scope for the workflow: full
2022.10.05 22:09:47.687 GMT: [DEBUG] Reducing the workflow
2022.10.05 22:09:47.687 GMT: [INFO] event.pegasus.reduce dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.687 GMT: [DEBUG] Jobs whose o/p files already exist
2022.10.05 22:09:47.687 GMT: [DEBUG] Job pegasus-plan_gw-main has no o/p files
2022.10.05 22:09:47.688 GMT: [DEBUG] Job pegasus-plan_gw-finalization has no o/p files
2022.10.05 22:09:47.688 GMT: [DEBUG] Jobs whose o/p files already exist - DONE
2022.10.05 22:09:47.689 GMT: [DEBUG] pegasus-plan_gw-main will not be deleted as not as child pegasus-plan_gw-finalization is not marked for deletion
2022.10.05 22:09:47.689 GMT: [INFO] Nodes/Jobs Deleted from the Workflow during reduction
2022.10.05 22:09:47.689 GMT: [INFO] Nodes/Jobs Deleted from the Workflow during reduction - DONE
2022.10.05 22:09:47.689 GMT: [INFO] event.pegasus.reduce dax.id gw.dax-0 (0.002 seconds) - FINISHED
2022.10.05 22:09:47.689 GMT: [INFO] event.pegasus.siteselection dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.694 GMT: [DEBUG] List of executions sites is [local]
2022.10.05 22:09:47.697 GMT: [DEBUG] Job pegasus-plan_gw-main will be mapped based on selector|hints profile key execution.site
2022.10.05 22:09:47.697 GMT: [DEBUG] Job pegasus-plan_gw-finalization will be mapped based on selector|hints profile key execution.site
2022.10.05 22:09:47.698 GMT: [DEBUG] Setting up site mapping for job pegasus-plan_gw-main
2022.10.05 22:09:47.698 GMT: [DEBUG] Job was mapped to pegasus-plan_gw-main to site local
2022.10.05 22:09:47.700 GMT: [DEBUG] Setting up site mapping for job pegasus-plan_gw-finalization
2022.10.05 22:09:47.700 GMT: [DEBUG] Job was mapped to pegasus-plan_gw-finalization to site local
2022.10.05 22:09:47.701 GMT: [INFO] event.pegasus.stampede.events dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.702 GMT: [DEBUG] Written out stampede metadata events for the mapped workflow to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.static.bp
2022.10.05 22:09:47.702 GMT: [INFO] event.pegasus.stampede.events dax.id gw.dax-0 (0.001 seconds) - FINISHED
2022.10.05 22:09:47.702 GMT: [INFO] event.pegasus.siteselection dax.id gw.dax-0 (0.013 seconds) - FINISHED
2022.10.05 22:09:47.706 GMT: [DEBUG] User set mapper is not a stageable mapper. Loading a stageable mapper
2022.10.05 22:09:47.712 GMT: [DEBUG] Deployment of Worker Package needed
2022.10.05 22:09:47.737 GMT: [CONFIG] No Replica Registration Jobs will be created .
2022.10.05 22:09:47.742 GMT: [CONFIG] Transfer Implementation loaded for Stage-In [Python based Transfer Script]
2022.10.05 22:09:47.743 GMT: [CONFIG] Transfer Implementation loaded for symbolic linking Stage-In [Python based Transfer Script]
2022.10.05 22:09:47.743 GMT: [CONFIG] Transfer Implementation loaded for Inter Site [Python based Transfer Script]
2022.10.05 22:09:47.743 GMT: [CONFIG] Transfer Implementation loaded for Stage-Out [Python based Transfer Script]
2022.10.05 22:09:47.743 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource ALL of type STAGEABLE
2022.10.05 22:09:47.749 GMT: [DEBUG] System information for pegasus-worker-5.0.1-x86_64_deb_10.tar.gz is {arch=x86_64 os=linux osrelease=deb osversion=10}
2022.10.05 22:09:47.749 GMT: [DEBUG] Compute site sysinfo local {arch=x86_64 os=linux}
2022.10.05 22:09:47.749 GMT: [DEBUG] Worker Package Entry used for site local Logical Namespace : pegasus Logical Name : worker Version : null Resource Id : local Physical Name : file:///local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz SysInfo : {arch=x86_64 os=linux} TYPE : STAGEABLE BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.749 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource local of type STAGEABLE
2022.10.05 22:09:47.750 GMT: [DEBUG] Staging site for site local for worker package deployment - local
2022.10.05 22:09:47.750 GMT: [DEBUG] Trying to get TCEntries for pegasus::worker on resource ALL of type STAGEABLE
2022.10.05 22:09:47.750 GMT: [DEBUG] Selected entry Logical Namespace : pegasus Logical Name : worker Version : null Resource Id : local Physical Name : file:///local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz SysInfo : {arch=x86_64 os=linux} TYPE : STAGEABLE BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.751 GMT: [DEBUG] Creating a default TC entry for pegasus::transfer at site local
2022.10.05 22:09:47.751 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.10.05 22:09:47.751 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED
2022.10.05 22:09:47.751 GMT: [DEBUG] Entry constructed Logical Namespace : pegasus Logical Name : transfer Version : null Resource Id : local Physical Name : pegasus-transfer SysInfo : {arch=x86_64 os=linux} TYPE : INSTALLED BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.751 GMT: [DEBUG] Creating a default TC entry for pegasus::kickstart at site local
2022.10.05 22:09:47.751 GMT: [DEBUG] Remote Path set is pegasus-kickstart
2022.10.05 22:09:47.751 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.10.05 22:09:47.752 GMT: [DEBUG] Entry constructed Logical Namespace : pegasus Logical Name : kickstart Version : null Resource Id : local Physical Name : pegasus-kickstart SysInfo : {arch=x86_64 os=linux} TYPE : INSTALLED BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.752 GMT: [DEBUG] Creating a default TC entry for pegasus::cleanup at site local
2022.10.05 22:09:47.752 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.10.05 22:09:47.752 GMT: [DEBUG] Trying to get TCEntries for pegasus::cleanup on resource local of type INSTALLED
2022.10.05 22:09:47.752 GMT: [DEBUG] Entry constructed Logical Namespace : pegasus Logical Name : cleanup Version : null Resource Id : local Physical Name : pegasus-transfer SysInfo : {arch=x86_64 os=linux} TYPE : INSTALLED BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.752 GMT: [DEBUG] Creating a default TC entry for pegasus::seqexec at site local
2022.10.05 22:09:47.752 GMT: [DEBUG] Remote Path set is pegasus-cluster
2022.10.05 22:09:47.753 GMT: [DEBUG] Trying to get TCEntries for pegasus::seqexec on resource local of type INSTALLED
2022.10.05 22:09:47.753 GMT: [DEBUG] Entry constructed Logical Namespace : pegasus Logical Name : seqexec Version : null Resource Id : local Physical Name : pegasus-cluster SysInfo : {arch=x86_64 os=linux} TYPE : INSTALLED BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.753 GMT: [DEBUG] Creating a default TC entry for pegasus::dirmanager at site local
2022.10.05 22:09:47.753 GMT: [DEBUG] Remote Path set is pegasus-transfer
2022.10.05 22:09:47.753 GMT: [DEBUG] Trying to get TCEntries for pegasus::dirmanager on resource local of type INSTALLED
2022.10.05 22:09:47.753 GMT: [DEBUG] Entry constructed Logical Namespace : pegasus Logical Name : dirmanager Version : null Resource Id : local Physical Name : pegasus-transfer SysInfo : {arch=x86_64 os=linux} TYPE : INSTALLED BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.753 GMT: [DEBUG] Creating a default TC entry for pegasus::keg at site local
2022.10.05 22:09:47.753 GMT: [DEBUG] Remote Path set is pegasus-keg
2022.10.05 22:09:47.753 GMT: [DEBUG] Trying to get TCEntries for pegasus::keg on resource local of type INSTALLED
2022.10.05 22:09:47.754 GMT: [DEBUG] Entry constructed Logical Namespace : pegasus Logical Name : keg Version : null Resource Id : local Physical Name : pegasus-keg SysInfo : {arch=x86_64 os=linux} TYPE : INSTALLED BYPASS : false Notifications: Container : null Compound Tx : null
2022.10.05 22:09:47.754 GMT: [INFO] event.pegasus.cluster dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.756 GMT: [DEBUG] Adding job to graph pegasus-plan_gw-main
2022.10.05 22:09:47.757 GMT: [DEBUG] Adding job to graph pegasus-plan_gw-finalization
2022.10.05 22:09:47.757 GMT: [DEBUG] Adding parents for child gw-finalization
2022.10.05 22:09:47.759 GMT: [CONFIG] Partitioner loaded is Label Based Partitioning
2022.10.05 22:09:47.770 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.10.05 22:09:47.773 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.10.05 22:09:47.773 GMT: [CONFIG] Clusterer loaded is Topological based Vertical Clustering
2022.10.05 22:09:47.774 GMT: [INFO] Starting Graph Traversal
2022.10.05 22:09:47.775 GMT: [DEBUG] Adding to level 0 dummy
2022.10.05 22:09:47.775 GMT: [DEBUG] Adding to queue gw-main
2022.10.05 22:09:47.775 GMT: [DEBUG] Removed dummy
2022.10.05 22:09:47.775 GMT: [DEBUG] Adding to level 1 gw-main
2022.10.05 22:09:47.775 GMT: [DEBUG] Adding to queue gw-finalization
2022.10.05 22:09:47.775 GMT: [DEBUG] Removed gw-main
2022.10.05 22:09:47.775 GMT: [DEBUG] Adding to level 2 gw-finalization
2022.10.05 22:09:47.775 GMT: [DEBUG] Removed gw-finalization
2022.10.05 22:09:47.775 GMT: [INFO] Starting Graph Traversal - DONE
2022.10.05 22:09:47.775 GMT: [DEBUG] Partition is [gw-main] corresponding to label gw-main
2022.10.05 22:09:47.778 GMT: [DEBUG] Clustering jobs in partition ID1 [gw-main]
2022.10.05 22:09:47.778 GMT: [DEBUG] No clustering for partition ID1
2022.10.05 22:09:47.778 GMT: [DEBUG] Partition is [gw-finalization] corresponding to label gw-finalization
2022.10.05 22:09:47.778 GMT: [DEBUG] Clustering jobs in partition ID2 [gw-finalization]
2022.10.05 22:09:47.778 GMT: [DEBUG] No clustering for partition ID2
2022.10.05 22:09:47.778 GMT: [INFO] Determining relations between partitions
2022.10.05 22:09:47.779 GMT: [INFO] Determining relations between partitions - DONE
2022.10.05 22:09:47.779 GMT: [DEBUG] Adding job to graph pegasus-plan_gw-main
2022.10.05 22:09:47.779 GMT: [DEBUG] Adding job to graph pegasus-plan_gw-finalization
2022.10.05 22:09:47.779 GMT: [DEBUG] Adding parents for child gw-finalization
2022.10.05 22:09:47.780 GMT: [CONFIG] Partitioner loaded is Level Based Partitioning
2022.10.05 22:09:47.782 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.10.05 22:09:47.783 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.10.05 22:09:47.783 GMT: [CONFIG] Clusterer loaded is Horizontal Clustering
2022.10.05 22:09:47.783 GMT: [DEBUG] Adding to level 0 dummy
2022.10.05 22:09:47.783 GMT: [DEBUG] Adding to queue gw-main
2022.10.05 22:09:47.783 GMT: [DEBUG] Removed dummy
2022.10.05 22:09:47.783 GMT: [DEBUG] Adding to level 1 gw-main
2022.10.05 22:09:47.783 GMT: [DEBUG] Adding to queue gw-finalization
2022.10.05 22:09:47.783 GMT: [DEBUG] Removed gw-main
2022.10.05 22:09:47.784 GMT: [DEBUG] Partition ID1 is :[gw-main]
2022.10.05 22:09:47.786 GMT: [DEBUG] Clustering jobs in partition ID1 [gw-main]
2022.10.05 22:09:47.787 GMT: [DEBUG] Clustering jobs of type pegasus-pegasus-plan-5_0_1
2022.10.05 22:09:47.787 GMT: [DEBUG] No clustering of jobs mapped to execution site local
2022.10.05 22:09:47.787 GMT: [DEBUG] Adding to level 2 gw-finalization
2022.10.05 22:09:47.788 GMT: [DEBUG] Removed gw-finalization
2022.10.05 22:09:47.788 GMT: [DEBUG] Partition ID2 is :[gw-finalization]
2022.10.05 22:09:47.788 GMT: [DEBUG] Clustering jobs in partition ID2 [gw-finalization]
2022.10.05 22:09:47.788 GMT: [DEBUG] Clustering jobs of type pegasus-pegasus-plan-5_0_1
2022.10.05 22:09:47.788 GMT: [DEBUG] No clustering of jobs mapped to execution site local
2022.10.05 22:09:47.788 GMT: [DEBUG] Replacing {pegasus-plan_gw-main [] -> pegasus-plan_gw-finalization [],false} with {pegasus-plan_gw-main [] -> pegasus-plan_gw-finalization [],false}Add to set : true
2022.10.05 22:09:47.789 GMT: [DEBUG] All clustered jobs removed from the workflow
2022.10.05 22:09:47.789 GMT: [INFO] event.pegasus.cluster dax.id gw.dax-0 (0.035 seconds) - FINISHED
2022.10.05 22:09:47.790 GMT: [DEBUG] Initialising Replica Catalog for Planner Cache
2022.10.05 22:09:47.790 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.getcache, read.only=true, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.10.05 22:09:47.791 GMT: [DEBUG] Initialising Replica Catalog for Planner Cache
2022.10.05 22:09:47.792 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.putcache, read.only=true, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.10.05 22:09:47.794 GMT: [INFO] Grafting transfer nodes in the workflow
2022.10.05 22:09:47.795 GMT: [INFO] event.pegasus.generate.transfer-nodes dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.796 GMT: [DEBUG] Date Reuse Engine no longer tracks deleted leaf jobs. Returning empty list
2022.10.05 22:09:47.803 GMT: [CONFIG] No Replica Registration Jobs will be created .
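The ranked `pegasus.selector.replica.regex.rank.*` properties logged above drive Pegasus's Regex replica selector: each candidate PFN is matched against the ranked expressions, and the replica matching the highest-priority rank wins (rank 1 = 500 down to the catch-all rank 6 = 0). A minimal illustrative re-implementation of that selection, using the exact regexes from this log (not Pegasus code; `select_pfn` is a hypothetical helper):

```python
import re

# Ranked regexes exactly as logged by the RegexReplicaSelector above,
# paired with the priorities the planner reports (rank 1 -> 500, ... rank 6 -> 0).
RANKS = [
    (500, r"file://(?!.*(cvmfs)).*"),           # local file, not on CVMFS
    (400, r"file:///cvmfs/.*"),                 # file on CVMFS
    (300, r"root://.*"),                        # XRootD
    (200, r"gsiftp://red-gridftp.unl.edu.*"),   # preferred GridFTP endpoint
    (100, r"gridftp://.*"),                     # any other GridFTP
    (0,   r".*"),                               # catch-all
]

def select_pfn(pfns):
    """Return the candidate PFN whose best-matching regex has the
    highest priority, mimicking the ranked selection logged above."""
    def priority(pfn):
        return max(p for p, expr in RANKS if re.fullmatch(expr, pfn))
    return max(pfns, key=priority)
```

So a plain `file://` replica outranks a CVMFS copy, which in turn outranks any remote protocol, matching the preference order the planner prints at the `event.pegasus.generate.transfer-nodes` stage.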
2022.10.05 22:09:47.804 GMT: [DEBUG] Number of transfer jobs for -1 are 0
2022.10.05 22:09:47.804 GMT: [DEBUG] Number of transfer jobs for 0 are 1
2022.10.05 22:09:47.804 GMT: [DEBUG] Number of transfer jobs for 1 are 1
2022.10.05 22:09:47.806 GMT: [CONFIG] Transfer Implementation loaded for Stage-In [Python based Transfer Script]
2022.10.05 22:09:47.808 GMT: [CONFIG] Transfer Implementation loaded for symbolic linking Stage-In [Python based Transfer Script]
2022.10.05 22:09:47.808 GMT: [CONFIG] Transfer Implementation loaded for Inter Site [Python based Transfer Script]
2022.10.05 22:09:47.808 GMT: [CONFIG] Transfer Implementation loaded for Stage-Out [Python based Transfer Script]
2022.10.05 22:09:47.812 GMT: [DEBUG] Rank ( rank => 1 priority => 500 expr => file://(?!.*(cvmfs)).*)
2022.10.05 22:09:47.812 GMT: [DEBUG] Rank ( rank => 2 priority => 400 expr => file:///cvmfs/.*)
2022.10.05 22:09:47.812 GMT: [DEBUG] Rank ( rank => 3 priority => 300 expr => root://.*)
2022.10.05 22:09:47.812 GMT: [DEBUG] Rank ( rank => 4 priority => 200 expr => gsiftp://red-gridftp.unl.edu.*)
2022.10.05 22:09:47.812 GMT: [DEBUG] Rank ( rank => 5 priority => 100 expr => gridftp://.*)
2022.10.05 22:09:47.813 GMT: [DEBUG] Rank ( rank => 6 priority => 0 expr => .*)
2022.10.05 22:09:47.813 GMT: [CONFIG] [RegexReplicaSelector] User Provided Ranked regexes are [( rank => 1 priority => 500 expr => file://(?!.*(cvmfs)).*), ( rank => 2 priority => 400 expr => file:///cvmfs/.*), ( rank => 3 priority => 300 expr => root://.*), ( rank => 4 priority => 200 expr => gsiftp://red-gridftp.unl.edu.*), ( rank => 5 priority => 100 expr => gridftp://.*), ( rank => 6 priority => 0 expr => .*)]
2022.10.05 22:09:47.816 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor File -> {file=output.map, read.only=true}
2022.10.05 22:09:47.828 GMT: [CONFIG] Output Mapper loaded is [Replica Catalog Mapper]
2022.10.05 22:09:47.828 GMT: [DEBUG] Initialising Workflow Cache File in the Submit Directory
2022.10.05 22:09:47.829 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.10.05 22:09:47.831 GMT: [CONFIG] Transfer Refiner loaded is [Balanced Cluster Transfer Refiner( round robin distribution at file level)]
2022.10.05 22:09:47.831 GMT: [CONFIG] ReplicaSelector loaded is [Regex]
2022.10.05 22:09:47.831 GMT: [CONFIG] Submit Directory Mapper loaded is [Relative Submit Directory Mapper]
2022.10.05 22:09:47.832 GMT: [CONFIG] Staging Mapper loaded is [Flat Directory Staging Mapper]
2022.10.05 22:09:47.834 GMT: [DEBUG] SRM Server map is {}
2022.10.05 22:09:47.837 GMT: [DEBUG] SRM Server map is {}
2022.10.05 22:09:47.837 GMT: [DEBUG] Directory for job pegasus-plan_gw-main is .
2022.10.05 22:09:47.838 GMT: [DEBUG]
2022.10.05 22:09:47.838 GMT: [DEBUG] Job being traversed is pegasus-plan_gw-main
2022.10.05 22:09:47.838 GMT: [DEBUG] To be run at local
2022.10.05 22:09:47.838 GMT: [DEBUG] Parents of job:{}
2022.10.05 22:09:47.838 GMT: [DEBUG] Initialising Workflow Cache File for job pegasus-plan_gw-main
2022.10.05 22:09:47.838 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./pegasus-plan_gw-main.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.10.05 22:09:47.839 GMT: [DEBUG] [RegexReplicaSelector] Selecting a pfn for lfn gw-main.dax at site local amongst gw-main.dax regex false -> {(/work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-main.dax,{site=local}),}
2022.10.05 22:09:47.840 GMT: [DEBUG] Job Input files : Removed file gw-main.dax for job pegasus-plan_gw-main
2022.10.05 22:09:47.840 GMT: [DEBUG] Job Search Files : Removed file gw-main.dax for job pegasus-plan_gw-main
2022.10.05 22:09:47.840 GMT: [DEBUG] Set arguments for DAX job pegasus-plan_gw-main to -Dpegasus.dir.storage.mapper.replica.file=gw-main.map --basename gw-main --cluster label,horizontal --output-sites local --staging-site local=local --cache /work/yifan.wang/search-high-spin/prod/runs/O1/7/_reuse.cache --cleanup inplace -vvv /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-main.dax
2022.10.05 22:09:47.852 GMT: [DEBUG] Directory for job pegasus-plan_gw-finalization is .
2022.10.05 22:09:47.853 GMT: [DEBUG]
2022.10.05 22:09:47.866 GMT: [DEBUG] Job being traversed is pegasus-plan_gw-finalization
2022.10.05 22:09:47.866 GMT: [DEBUG] To be run at local
2022.10.05 22:09:47.867 GMT: [DEBUG] Parents of job:{pegasus-plan_gw-main,}
2022.10.05 22:09:47.867 GMT: [DEBUG] Initialising Workflow Cache File for job pegasus-plan_gw-finalization
2022.10.05 22:09:47.867 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor FlushedCache -> {output=JDBCRC, output.db.create=true, file=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./pegasus-plan_gw-finalization.cache, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true}
2022.10.05 22:09:47.868 GMT: [DEBUG] [RegexReplicaSelector] Selecting a pfn for lfn gw-finalization.dax at site local amongst gw-finalization.dax regex false -> {(/work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-finalization.dax,{site=local}),}
2022.10.05 22:09:47.869 GMT: [DEBUG] Job Input files : Removed file gw-finalization.dax for job pegasus-plan_gw-finalization
2022.10.05 22:09:47.869 GMT: [DEBUG] Job Search Files : Removed file gw-finalization.dax for job pegasus-plan_gw-finalization
2022.10.05 22:09:47.870 GMT: [DEBUG] Set arguments for DAX job pegasus-plan_gw-finalization to -Dpegasus.dir.storage.mapper.replica.file=gw-finalization.map --basename gw-finalization --cluster label,horizontal --output-sites local --staging-site local=local --cache /work/yifan.wang/search-high-spin/prod/runs/O1/7/_reuse.cache --cleanup inplace -vvv /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-finalization.dax
2022.10.05 22:09:47.871 GMT: [INFO] event.pegasus.generate.transfer-nodes dax.id gw.dax-0 (0.076 seconds) - FINISHED
2022.10.05 22:09:47.872 GMT: [DEBUG] Adding worker package deployment node for local
2022.10.05 22:09:47.879 GMT: [DEBUG] Skipping stage worker job for site local as worker package already in submit directory file:///local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus-worker-5.0.1-x86_64_deb_10.tar.gz
2022.10.05 22:09:47.879 GMT: [INFO] event.pegasus.generate.workdir-nodes dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.883 GMT: [DEBUG] Trying to get TCEntries for pegasus::dirmanager on resource local of type INSTALLED
2022.10.05 22:09:47.897 GMT: [DEBUG] Creating create dir node create_dir_gw.dax_0_local
2022.10.05 22:09:47.897 GMT: [DEBUG] Need to add edge create_dir_gw.dax_0_local -> pegasus-plan_gw-main
2022.10.05 22:09:47.897 GMT: [DEBUG] Adding node to the worfklow create_dir_gw.dax_0_local
2022.10.05 22:09:47.907 GMT: [INFO] event.pegasus.generate.workdir-nodes dax.id gw.dax-0 (0.028 seconds) - FINISHED
2022.10.05 22:09:47.907 GMT: [INFO] event.pegasus.generate.cleanup-nodes dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.917 GMT: [CONFIG] Setting property dagman.cleanup.maxjobs to 4 to set max jobs for cleanup jobs category
2022.10.05 22:09:47.920 GMT: [DEBUG] Number of sites 1
2022.10.05 22:09:47.920 GMT: [DEBUG] Site local count jobs = 3
2022.10.05 22:09:47.921 GMT: [DEBUG] * pegasus-plan_gw-main
2022.10.05 22:09:47.921 GMT: [DEBUG] * pegasus-plan_gw-finalization
2022.10.05 22:09:47.921 GMT: [DEBUG] * create_dir_gw.dax_0_local
2022.10.05 22:09:47.922 GMT: [DEBUG] local 3
2022.10.05 22:09:47.922 GMT: [DEBUG] Leaf jobs scheduled at site local are pegasus-plan_gw-main,pegasus-plan_gw-finalization,create_dir_gw.dax_0_local,
2022.10.05 22:09:47.922 GMT: [DEBUG] File gw-finalization.map will not be cleaned up for job pegasus-plan_gw-finalization
2022.10.05 22:09:47.922 GMT: [DEBUG] File gw-main.map will not be cleaned up for job pegasus-plan_gw-main
2022.10.05 22:09:47.922 GMT: [DEBUG]
2022.10.05 22:09:47.922 GMT: [INFO] For site: local number of files cleaned up - 0
2022.10.05 22:09:47.922 GMT: [DEBUG] CLEANUP LIST
2022.10.05 22:09:47.922 GMT: [INFO] event.pegasus.generate.cleanup-nodes dax.id gw.dax-0 (0.015 seconds) - FINISHED
2022.10.05 22:09:47.922 GMT: [INFO] Adding Leaf Cleanup Jobs dax.id gw.dax-0 - STARTED
2022.10.05 22:09:47.930 GMT: [DEBUG] Directory URL is a file url for site local [file:///atlas/user/scratch/yifan.wang/pycbc-tmp_uq8rtj4c/local-site-scratch/work]
2022.10.05 22:09:47.930 GMT: [DEBUG] Trying to get TCEntries for pegasus::cleanup on resource local of type INSTALLED
2022.10.05 22:09:47.931 GMT: [DEBUG] Creating remove directory node cleanup_gw.dax_0_local
2022.10.05 22:09:47.932 GMT: [DEBUG] Need to add edge for DAX|DAG job pegasus-plan_gw-finalization -> cleanup_gw.dax_0_local
2022.10.05 22:09:47.932 GMT: [DEBUG] Need to add edge pegasus-plan_gw-finalization -> cleanup_gw.dax_0_local
2022.10.05 22:09:47.932 GMT: [DEBUG] Need to add edge for DAX|DAG job pegasus-plan_gw-main -> cleanup_gw.dax_0_local
2022.10.05 22:09:47.932 GMT: [DEBUG] Adding node to the worklfow cleanup_gw.dax_0_local
2022.10.05 22:09:47.932 GMT: [INFO] Adding Leaf Cleanup Jobs dax.id gw.dax-0 (0.01 seconds) - FINISHED
2022.10.05 22:09:47.934 GMT: [INFO] event.pegasus.refinement dax.id gw.dax-0 (0.276 seconds) - FINISHED
2022.10.05 22:09:48.028 GMT: [DEBUG] Condor Version as string 8.8.9
2022.10.05 22:09:48.028 GMT: [DEBUG] Condor Version detected is 80809
2022.10.05 22:09:48.028 GMT: [INFO] Generating codes for the executable workflow
2022.10.05 22:09:48.028 GMT: [INFO] event.pegasus.code.generation dax.id gw.dax-0 - STARTED
2022.10.05 22:09:48.029 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.10.05 22:09:48.029 GMT: [DEBUG] event.pegasus.code.generation dax.id gw.dax-0 - STARTED
2022.10.05 22:09:48.076 GMT: [DEBUG] Condor Version as string 8.8.9
2022.10.05 22:09:48.076 GMT: [DEBUG] Condor Version detected is 80809
2022.10.05 22:09:48.077 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.10.05 22:09:48.079 GMT: [DEBUG] Applying priority of 800 to create_dir_gw.dax_0_local
2022.10.05 22:09:48.120 GMT: [DEBUG] Mount Under Scratch Directories [/tmp, /var/tmp]
2022.10.05 22:09:48.137 GMT: [CONFIG] Kickstart Stating Disabled Completely - false
2022.10.05 22:09:48.144 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED
2022.10.05 22:09:48.145 GMT: [DEBUG] Postscript constructed is /software/tools/pegasus/5.0/bin/pegasus-exitcode
2022.10.05 22:09:48.146 GMT: [WARNING] Removing unsupported key request_cpus in local universe for job create_dir_gw.dax_0_local
2022.10.05 22:09:48.146 GMT: [WARNING] Removing unsupported key request_memory in local universe for job create_dir_gw.dax_0_local
2022.10.05 22:09:48.147 GMT: [DEBUG] Unquoted arguments are pegasus-kickstart -n pegasus::dirmanager -N null -i - -R local -L gw.dax -T 2022-10-05T22:09:41+00:00 pegasus-transfer
2022.10.05 22:09:48.147 GMT: [DEBUG] Quoted arguments are "pegasus-kickstart -n pegasus::dirmanager -N null -i - -R local -L gw.dax -T 2022-10-05T22:09:41+00:00 pegasus-transfer "
2022.10.05 22:09:48.149 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./create_dir_gw.dax_0_local.sub
2022.10.05 22:09:48.149 GMT: [DEBUG] Applying priority of 10 to pegasus-plan_gw-main
2022.10.05 22:09:48.149 GMT: [DEBUG] Generating code for DAX job pegasus-plan_gw-main
2022.10.05 22:09:48.149 GMT: [DEBUG] Arguments passed to SUBDAX Generator are -Dpegasus.dir.storage.mapper.replica.file=gw-main.map --basename gw-main --cluster label,horizontal --output-sites local --staging-site local=local --cache /work/yifan.wang/search-high-spin/prod/runs/O1/7/_reuse.cache --cleanup inplace -vvv /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-main.dax
2022.10.05 22:09:48.159 GMT: [DEBUG] Retrieving Metadata from the DAX file /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-main.dax
2022.10.05 22:09:57.349 GMT: [DEBUG] Submit directory in sub dax specified is ./gw-main.dax_gw-main
2022.10.05 22:09:57.350 GMT: [DEBUG] Base Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p
2022.10.05 22:09:57.351 GMT: [DEBUG] Relative Submit Directory for inner workflow set to work/././gw-main.dax_gw-main
2022.10.05 22:09:57.351 GMT: [DEBUG] Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-main.dax_gw-main
2022.10.05 22:09:57.351 GMT: [DEBUG] Setting list of execution sites to the same as outer workflow
2022.10.05 22:09:57.351 GMT: [DEBUG] Submit Directory for SUB DAX is /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-main.dax_gw-main
2022.10.05 22:09:57.353 GMT: [DEBUG] Relative Execution Directory for SUB DAX is work/./gw-main.dax_gw-main
2022.10.05 22:09:57.356 GMT: [DEBUG] Trying to get TCEntries for pegasus::pegasus-plan on resource local of type INSTALLED
2022.10.05 22:09:57.356 GMT: [DEBUG] Constructing the default path to the pegasus-plan
2022.10.05 22:09:57.357 GMT: [DEBUG] pegasus-plan invocation for job pegasus-plan_gw-main determined to be /software/tools/pegasus/5.0/bin/pegasus-plan -Dpegasus.log.*=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus-plan_gw-main.pre.log -Dpegasus.workflow.root.uuid=788b2966-4d62-4b86-8e24-27ec3cc664a4 -Dpegasus.dir.storage.mapper.replica.file=gw-main.map --conf /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus.1845231942488000471.properties --dir
/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p --relative-dir work/./gw-main.dax_gw-main --relative-submit-dir work/././gw-main.dax_gw-main --basename gw-main --sites local --staging-site local=local, --cache /work/yifan.wang/search-high-spin/prod/runs/O1/7/_reuse.cache,/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./pegasus-plan_gw-main.cache --inherited-rc-files /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replica.store --cluster label,horizontal --output-sites local --cleanup none --verbose --verbose --verbose --deferred /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-main.dax 2022.10.05 22:09:57.357 GMT: [DEBUG] Basename prefix for the sub workflow is gw-main 2022.10.05 22:09:57.357 GMT: [DEBUG] Cache File for the sub workflow is /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-main.dax_gw-main/gw-main.cache 2022.10.05 22:09:57.358 GMT: [DEBUG] Trying to get TCEntries for condor::dagman on resource local of type INSTALLED 2022.10.05 22:09:57.358 GMT: [DEBUG] condor::dagman not catalogued in the Transformation Catalog. Trying to construct from the Site Catalog 2022.10.05 22:09:57.358 GMT: [DEBUG] DAGMan not catalogued in the Transformation Catalog or the Site Catalog. 
Trying to construct from the environment 2022.10.05 22:09:57.358 GMT: [DEBUG] Constructing path to dagman on basis of env variable CONDOR_LOCATION 2022.10.05 22:09:57.359 GMT: [DEBUG] Number of Rescue retries 999 2022.10.05 22:09:57.359 GMT: [DEBUG] Constructing arguments to dagman in 7.1.0 and later style 2022.10.05 22:09:57.363 GMT: [CONFIG] Kickstart Stating Disabled Completely - false 2022.10.05 22:09:57.366 GMT: [DEBUG] Setting job pegasus-plan_gw-main.pre to run via No container wrapping 2022.10.05 22:09:57.366 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED 2022.10.05 22:09:57.366 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED 2022.10.05 22:09:57.372 GMT: [DEBUG] Unquoted arguments are -p 0 -f -l . -Notification never -Debug 3 -Lockfile gw-main.dag.lock -Dag gw-main.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1 2022.10.05 22:09:57.372 GMT: [DEBUG] Quoted arguments are " -p 0 -f -l . 
-Notification never -Debug 3 -Lockfile gw-main.dag.lock -Dag gw-main.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1" 2022.10.05 22:09:57.373 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./pegasus-plan_gw-main.sub 2022.10.05 22:09:57.373 GMT: [DEBUG] Applying priority of 20 to pegasus-plan_gw-finalization 2022.10.05 22:09:57.373 GMT: [DEBUG] Generating code for DAX job pegasus-plan_gw-finalization 2022.10.05 22:09:57.373 GMT: [DEBUG] Arguments passed to SUBDAX Generator are -Dpegasus.dir.storage.mapper.replica.file=gw-finalization.map --basename gw-finalization --cluster label,horizontal --output-sites local --staging-site local=local --cache /work/yifan.wang/search-high-spin/prod/runs/O1/7/_reuse.cache --cleanup inplace -vvv /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-finalization.dax 2022.10.05 22:09:57.376 GMT: [DEBUG] Retrieving Metadata from the DAX file /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-finalization.dax 2022.10.05 22:09:57.384 GMT: [DEBUG] Submit directory in sub dax specified is ./gw-finalization.dax_gw-finalization 2022.10.05 22:09:57.385 GMT: [DEBUG] Base Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p 2022.10.05 22:09:57.385 GMT: [DEBUG] Relative Submit Directory for inner workflow set to work/././gw-finalization.dax_gw-finalization 2022.10.05 22:09:57.385 GMT: [DEBUG] Submit directory for inner workflow set to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-finalization.dax_gw-finalization 2022.10.05 22:09:57.385 GMT: [DEBUG] Setting list of execution sites to the same as outer workflow 2022.10.05 22:09:57.385 GMT: [DEBUG] Parent DAX Jobs Transient RC's are [/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-main.dax_gw-main/gw-main.cache] 2022.10.05 22:09:57.385 GMT: [DEBUG] Submit Directory for SUB DAX is /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-finalization.dax_gw-finalization 2022.10.05 22:09:57.385 
GMT: [DEBUG] Relative Execution Directory for SUB DAX is work/./gw-finalization.dax_gw-finalization 2022.10.05 22:09:57.385 GMT: [DEBUG] Trying to get TCEntries for pegasus::pegasus-plan on resource local of type INSTALLED 2022.10.05 22:09:57.385 GMT: [DEBUG] Constructing the default path to the pegasus-plan 2022.10.05 22:09:57.385 GMT: [DEBUG] pegasus-plan invocation for job pegasus-plan_gw-finalization determined to be /software/tools/pegasus/5.0/bin/pegasus-plan -Dpegasus.log.*=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus-plan_gw-finalization.pre.log -Dpegasus.workflow.root.uuid=788b2966-4d62-4b86-8e24-27ec3cc664a4 -Dpegasus.dir.storage.mapper.replica.file=gw-finalization.map --conf /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus.1845231942488000471.properties --dir /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p --relative-dir work/./gw-finalization.dax_gw-finalization --relative-submit-dir work/././gw-finalization.dax_gw-finalization --basename gw-finalization --sites local --staging-site local=local, --cache /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./pegasus-plan_gw-finalization.cache,/work/yifan.wang/search-high-spin/prod/runs/O1/7/_reuse.cache,/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-main.dax_gw-main/gw-main.cache --inherited-rc-files /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replica.store --cluster label,horizontal --output-sites local --cleanup none --verbose --verbose --verbose --deferred /work/yifan.wang/search-high-spin/prod/runs/O1/7/gw-finalization.dax 2022.10.05 22:09:57.386 GMT: [DEBUG] Basename prefix for the sub workflow is gw-finalization 2022.10.05 22:09:57.386 GMT: [DEBUG] Cache File for the sub workflow is /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/././gw-finalization.dax_gw-finalization/gw-finalization.cache 2022.10.05 22:09:57.386 GMT: [DEBUG] Trying to get TCEntries for condor::dagman on resource local of type INSTALLED 2022.10.05 22:09:57.386 GMT: [DEBUG] condor::dagman not 
catalogued in the Transformation Catalog. Trying to construct from the Site Catalog 2022.10.05 22:09:57.386 GMT: [DEBUG] DAGMan not catalogued in the Transformation Catalog or the Site Catalog. Trying to construct from the environment 2022.10.05 22:09:57.386 GMT: [DEBUG] Constructing path to dagman on basis of env variable CONDOR_LOCATION 2022.10.05 22:09:57.386 GMT: [DEBUG] Number of Rescue retries 999 2022.10.05 22:09:57.386 GMT: [DEBUG] Constructing arguments to dagman in 7.1.0 and later style 2022.10.05 22:09:57.388 GMT: [DEBUG] Setting job pegasus-plan_gw-finalization.pre to run via No container wrapping 2022.10.05 22:09:57.388 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED 2022.10.05 22:09:57.389 GMT: [DEBUG] Trying to get TCEntries for pegasus::transfer on resource local of type INSTALLED 2022.10.05 22:09:57.390 GMT: [DEBUG] Unquoted arguments are -p 0 -f -l . -Notification never -Debug 3 -Lockfile gw-finalization.dag.lock -Dag gw-finalization.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1 2022.10.05 22:09:57.390 GMT: [DEBUG] Quoted arguments are " -p 0 -f -l . 
-Notification never -Debug 3 -Lockfile gw-finalization.dag.lock -Dag gw-finalization.dag -AllowVersionMismatch -AutoRescue 1 -DoRescueFrom 0 -MaxPre 1" 2022.10.05 22:09:57.390 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./pegasus-plan_gw-finalization.sub 2022.10.05 22:09:57.390 GMT: [DEBUG] Applying priority of 1000 to cleanup_gw.dax_0_local 2022.10.05 22:09:57.391 GMT: [DEBUG] Trying to get TCEntries for pegasus::kickstart on resource local of type INSTALLED 2022.10.05 22:09:57.391 GMT: [DEBUG] Postscript constructed is /software/tools/pegasus/5.0/bin/pegasus-exitcode 2022.10.05 22:09:57.391 GMT: [WARNING] Removing unsupported key request_cpus in local universe for job cleanup_gw.dax_0_local 2022.10.05 22:09:57.391 GMT: [WARNING] Removing unsupported key request_memory in local universe for job cleanup_gw.dax_0_local 2022.10.05 22:09:57.392 GMT: [DEBUG] Unquoted arguments are pegasus-kickstart -n pegasus::cleanup -N null -i - -R local -L gw.dax -T 2022-10-05T22:09:41+00:00 pegasus-transfer 2022.10.05 22:09:57.392 GMT: [DEBUG] Quoted arguments are "pegasus-kickstart -n pegasus::cleanup -N null -i - -R local -L gw.dax -T 2022-10-05T22:09:41+00:00 pegasus-transfer " 2022.10.05 22:09:57.392 GMT: [DEBUG] Written Submit file : /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/./cleanup_gw.dax_0_local.sub 2022.10.05 22:09:57.392 GMT: [DEBUG] event.pegasus.code.generation dax.id gw.dax-0 (9.363 seconds) - FINISHED 2022.10.05 22:09:57.393 GMT: [DEBUG] Written Dag File : gw.dax-0.dag.tmp 2022.10.05 22:09:57.393 GMT: [DEBUG] Writing out the DOT file 2022.10.05 22:09:57.409 GMT: [DEBUG] Written out notifications to gw.dax-0.notify 2022.10.05 22:09:57.409 GMT: [DEBUG] Writing out the DAX Replica Store to file /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replica.store 2022.10.05 22:09:57.409 GMT: [DEBUG] [Replica Factory] Connect properties detected for implementor SimpleFile -> {output=JDBCRC, output.db.create=true, 
file=/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replica.store, dax.asrc=true, output.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db, output.db.driver=sqlite, cache.asrc=true} 2022.10.05 22:09:57.412 GMT: [DEBUG] Written out dax replica store to gw.dax-0.replica.store 2022.10.05 22:09:57.415 GMT: [DEBUG] Written out stampede events for the executable workflow to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.static.bp 2022.10.05 22:09:57.423 GMT: [DEBUG] Proxy whose DN will be logged in the braindump file /tmp/x509up_u44039 2022.10.05 22:09:57.528 GMT: [DEBUG] Unable to determine GRID DN class org.globus.gsi.gssapi.GlobusGSSException: Defective credential detected [Caused by: proxy not found] 2022.10.05 22:09:57.567 GMT: [DEBUG] Written out braindump to /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/braindump.yml 2022.10.05 22:09:57.567 GMT: [DEBUG] Renamed temporary dag file to : /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.dag 2022.10.05 22:09:57.569 GMT: [DEBUG] Executing /usr/bin/condor_submit_dag -append executable=/software/tools/pegasus/5.0/bin/pegasus-dagman -no_submit -MaxPre 1 -MaxPost 20 -append +pegasus_wf_uuid="788b2966-4d62-4b86-8e24-27ec3cc664a4" -append +pegasus_root_wf_uuid="788b2966-4d62-4b86-8e24-27ec3cc664a4" -append +pegasus_wf_name="gw.dax-0" -append +pegasus_wf_time="20221005T220946+0000" -append +pegasus_version="5.0.1" -append +pegasus_job_class=11 -append +pegasus_cluster_size=1 -append +pegasus_site="local" -append +pegasus_execution_sites="local" -append +pegasus_wf_xformation="pegasus::dagman" gw.dax-0.dag with environment = 
PATH=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/bin/intel64:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/bin:/software/tools/pegasus/5.0/bin/:/work/ahnitz/projects/4ogc/env/bin:/work/yifan.wang/lscsoft/opt/accomlal/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games;PKG_CONFIG_PATH=/work/yifan.wang/lscsoft/opt/accomlal/lib/pkgconfig:;LAL_DATA_PATH=/atlas/recent/cbc/ROM_data/;TBBROOT=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/tbb;TZ=:/etc/localtime;MODULEPATH=/etc/environment-modules/modules:/usr/share/modules/versions:/usr/share/modules/$MODULE_VERSION/modulefiles:/usr/share/modules/modulefiles;PEGASUS_PERL_DIR=/software/tools/pegasus/5.0/lib/pegasus/perl;DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/44039/bus;INTEL_LICENSE_FILE=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/licenses:/opt/intel/licenses:/work/yifan.wang/intel/licenses;MPM_LAUNCHER=/opt/intel/2018/debugger_2018/mpm/mic/bin/start_mpm.sh;MAIL=/var/mail/yifan.wang;LD_LIBRARY_PATH=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/lib:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi/mic/lib:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/ipp/lib/intel64:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mkl/lib/intel64_lin:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/tbb/lib/intel64/gcc4.7:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/tbb/lib/intel64/gcc4.7:/opt/intel/2018/debugger_2018/iga/lib:/opt/intel/2018/debugger_2018/libipt/intel64/lib:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal/lib/intel64_lin:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal/../tbb/lib/intel64_lin/gcc4.4:/work/yifan.wang/lscso
ft/opt/accomlal/lib/:/work/yifan.wang/lscsoft/MultiNest/lib:;LOGNAME=yifan.wang;PWD=/work/yifan.wang/search-high-spin/prod/runs/O1/7;INFOPATH=/opt/intel/2018/documentation_2018/en/debugger//gdb-ia/info/:/opt/intel/2018/documentation_2018/en/debugger//gdb-igfx/info/;PYTHONPATH=/software/tools/pegasus/5.0/lib/python3.7/dist-packages/:/work/yifan.wang/eccentricity/gitlab-eccsearch/waveform/ihes-teobresum/Python:/work/yifan.wang/eccentricity/gitlab-eccsearch//waveform/PyCBC-teobresums:/work/yifan.wang/lscsoft/src/TaylorF2e:;SHELL=/bin/bash;BASH_ENV=/usr/share/modules/init/bash;INTEL_PYTHONHOME=/opt/intel/2018/debugger_2018/python/intel64/;LM_LICENSE_FILE=/opt/matlab/default/etc/license.dat;OLDPWD=/work/yifan.wang/search-high-spin/prod/runs;TMPDIR=/local/user/yifan.wang;LIBRARY_PATH=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/ipp/lib/intel64:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64_lin:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mkl/lib/intel64_lin:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/tbb/lib/intel64/gcc4.7:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/tbb/lib/intel64/gcc4.7:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal/lib/intel64_lin:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal/../tbb/lib/intel64_lin/gcc4.4;VIRTUAL_ENV=/work/ahnitz/projects/4ogc/env;PSTLROOT=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/pstl;MODULEPATH_modshare=/etc/environment-modules/modules:1:/usr/share/modules/$MODULE_VERSION/modulefiles:1:/usr/share/modules/modulefiles:1:/usr/share/modules/versions:1;LC_ALL=C;PEGASUS_PYTHON_DIR=/software/tools/pegasus/5.0/lib/python3.7/dist-packages;LC_CTYPE=en_US.UTF-8;CPATH=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/ipp/include:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mkl/include:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/pstl/include:/opt/intel/2018/compilers_and_libraries_2018.0.12
8/linux/tbb/include:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/tbb/include:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal/include;SHLVL=2;SLACK_BOT_TOKEN=[REDACTED];MKLROOT=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mkl;CONDOR_LOCATION=/usr;LOADEDMODULES=;MANPATH=/opt/intel/2018/man/common:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi/man:/opt/intel/2018/documentation_2018/en/debugger//gdb-ia/man/:/opt/intel/2018/documentation_2018/en/debugger//gdb-igfx/man/:/work/ahnitz/projects/4ogc/env/share/man:/usr/local/man:/usr/local/share/man:/usr/share/man:;SCRATCH=/local/user/yifan.wang;I_MPI_ROOT=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi;PEGASUS_SCHEMA_DIR=/software/tools/pegasus/5.0/share/pegasus/schema;JAVA_HOME=/usr/lib/jvm/default-java;TERM=xterm-256color;ENV=/usr/share/modules/init/profile.sh;LANG=en_US.UTF-8;XDG_SESSION_ID=9334;XDG_SESSION_TYPE=tty;DAALROOT=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal;IPPROOT=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/ipp;GDBSERVER_MIC=/opt/intel/2018/debugger_2018/gdb/targets/intel64/x200/bin/gdbserver;PEGASUS_PYTHON_EXTERNALS_DIR=/software/tools/pegasus/5.0/lib/pegasus/externals/python;XDG_SESSION_CLASS=user;_=/software/tools/pegasus/5.0/bin/pegasus-plan;PEGASUS_JAVA_DIR=/software/tools/pegasus/5.0/share/pegasus/java;SSH_TTY=/dev/pts/0;SSH_CLIENT=93.233.51.163 56400 
22;USER=yifan.wang;CLASSPATH=/software/tools/pegasus/5.0/share/pegasus/java/accessors.jar:/software/tools/pegasus/5.0/share/pegasus/java/bcprov-jdk15on-150.jar:/software/tools/pegasus/5.0/share/pegasus/java/btf-1.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-lang3-3.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-logging.jar:/software/tools/pegasus/5.0/share/pegasus/java/commons-pool.jar:/software/tools/pegasus/5.0/share/pegasus/java/exist-optional.jar:/software/tools/pegasus/5.0/share/pegasus/java/exist.jar:/software/tools/pegasus/5.0/share/pegasus/java/gram-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/gridftp-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/gson-2.2.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/gss-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/guava-16.0.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/hamcrest-core-1.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/io-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-annotations-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-core-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-coreutils-1.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-databind-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jackson-dataformat-yaml-2.9.10.jar:/software/tools/pegasus/5.0/share/pegasus/java/jakarta-oro.jar:/software/tools/pegasus/5.0/share/pegasus/java/java-getopt-1.0.9.jar:/software/tools/pegasus/5.0/share/pegasus/java/javax.json-1.0.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/jcodings-1.0.46.jar:/software/tools/pegasus/5.0/share/pegasus/java/joda-time-2.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/joni-2.1.31.jar:/software/tools/pegasus/5.0/share/pegasus/java/json-schema-validator-1.0.41.jar:/software/tools/pegasus/5.0/share/pegasus/java/jsse-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/libphonenumber-6.2.jar:/software/tools/pegasus/5.0/s
hare/pegasus/java/log4j-1.2.17.jar:/software/tools/pegasus/5.0/share/pegasus/java/mailapi-1.4.3.jar:/software/tools/pegasus/5.0/share/pegasus/java/msg-simple-1.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/myproxy-2.1.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/mysql-connector-java-5.1.47.jar:/software/tools/pegasus/5.0/share/pegasus/java/pegasus-aws-batch.jar:/software/tools/pegasus/5.0/share/pegasus/java/pegasus.jar:/software/tools/pegasus/5.0/share/pegasus/java/postgresql-8.1dev-400.jdbc3.jar:/software/tools/pegasus/5.0/share/pegasus/java/resolver.jar:/software/tools/pegasus/5.0/share/pegasus/java/rhino-1.7R4.jar:/software/tools/pegasus/5.0/share/pegasus/java/snakeyaml-1.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/sqlite-jdbc-3.8.11.2.jar:/software/tools/pegasus/5.0/share/pegasus/java/ssl-proxies-2.1.0-patched.jar:/software/tools/pegasus/5.0/share/pegasus/java/super-csv-2.4.0.jar:/software/tools/pegasus/5.0/share/pegasus/java/uri-template-0.9.jar:/software/tools/pegasus/5.0/share/pegasus/java/vdl.jar:/software/tools/pegasus/5.0/share/pegasus/java/xercesImpl.jar:/software/tools/pegasus/5.0/share/pegasus/java/xml-apis-1.4.01.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmlParserAPIs.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmldb.jar:/software/tools/pegasus/5.0/share/pegasus/java/xmlrpc.jar:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/lib/mpi.jar:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal/lib/daal.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/apache-client-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/batch-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/commons-logging-api-1.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/core-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/http-client-spi-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/httpclient-4.5.2.jar:/software/tool
s/pegasus/5.0/share/pegasus/java/aws/httpcore-4.4.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-annotations-2.8.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-databind-2.8.8.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jackson-jr-objects-2.9.0.pr4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/joda-time-2.8.1.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/jopt-simple-5.0.4.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/logs-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/metrics-spi-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/netty-nio-client-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/s3-2.0.0-preview-2.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/slf4j-api-1.7.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/slf4j-log4j12-1.7.25.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/thirdparty-logging.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/thirdparty.jar:/software/tools/pegasus/5.0/share/pegasus/java/aws/utils-2.0.0-preview-2.jar;C_INCLUDE_PATH=/work/yifan.wang/lscsoft/opt/accomlal/include:;PEGASUS_BIN_DIR=/software/tools/pegasus/5.0/bin;SSH_CONNECTION=93.233.51.163 56400 130.75.116.16 
22;MODULESHOME=/usr/share/modules;PEGASUS_CONF_DIR=/software/tools/pegasus/5.0/etc;TMP=/local/user/yifan.wang;PYCBC_WAVEFORM=taylorf2e;NLSPATH=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/compiler/lib/intel64/locale/%l_%t/%N:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mkl/lib/intel64_lin/locale/%l_%t/%N:/opt/intel/2018/debugger_2018/gdb/intel64/share/locale/%l_%t/%N;PEGASUS_ORIG_CLASSPATH=/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/mpi/intel64/lib/mpi.jar:/opt/intel/2018/compilers_and_libraries_2018.0.128/linux/daal/lib/daal.jar;MODULES_CMD=/usr/lib/x86_64-linux-gnu/modulecmd.tcl;LIGO_DATAFIND_SERVER=datafind.atlas.local:80;GW_DATAFIND_SERVER=datafind.atlas.local:80;XDG_RUNTIME_DIR=/run/user/44039;GDB_CROSS=/opt/intel/2018/debugger_2018/gdb/intel64/bin/gdb-ia;PEGASUS_SHARE_DIR=/software/tools/pegasus/5.0/share/pegasus;HOME=/work/yifan.wang; 2022.10.05 22:09:57.636 GMT: 2022.10.05 22:09:57.660 GMT: ----------------------------------------------------------------------- 2022.10.05 22:09:57.668 GMT: File for submitting this DAG to HTCondor : gw.dax-0.dag.condor.sub 2022.10.05 22:09:57.673 GMT: Log of DAGMan debugging messages : gw.dax-0.dag.dagman.out 2022.10.05 22:09:57.684 GMT: Log of HTCondor library output : gw.dax-0.dag.lib.out 2022.10.05 22:09:57.692 GMT: Log of HTCondor library error messages : gw.dax-0.dag.lib.err 2022.10.05 22:09:57.700 GMT: Log of the life of condor_dagman itself : gw.dax-0.dag.dagman.log 2022.10.05 22:09:57.708 GMT: 2022.10.05 22:09:57.718 GMT: -no_submit given, not submitting DAG to HTCondor. 
You can do this with: 2022.10.05 22:09:57.731 GMT: ----------------------------------------------------------------------- 2022.10.05 22:09:57.740 GMT: [DEBUG] condor_submit_dag exited with status 0 2022.10.05 22:09:57.749 GMT: [DEBUG] Updated environment for dagman is environment = _CONDOR_SCHEDD_ADDRESS_FILE=/local/condor/spool/.schedd_address;_CONDOR_MAX_DAGMAN_LOG=0;_CONDOR_SCHEDD_DAEMON_AD_FILE=/local/condor/spool/.schedd_classad;_CONDOR_DAGMAN_LOG=gw.dax-0.dag.dagman.out;PEGASUS_METRICS=true; 2022.10.05 22:09:57.750 GMT: [INFO] event.pegasus.code.generation dax.id gw.dax-0 (9.722 seconds) - FINISHED 2022.10.05 22:09:57.752 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-db-admin update -t master -c /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus.1845231942488000471.properties 2022.10.05 22:09:59.536 GMT: Database version: '5.0.1' (sqlite:////work/yifan.wang/.pegasus/workflow.db) 2022.10.05 22:09:59.633 GMT: [DEBUG] pegasus-db-admin exited with status 0 2022.10.05 22:09:59.637 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-db-admin create -Dpegasus.catalog.replica.db.url=jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db -Dpegasus.catalog.replica=JDBCRC -Dpegasus.catalog.replica.db.driver=sqlite -t jdbcrc -c /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/pegasus.1845231942488000471.properties 2022.10.05 22:10:02.864 GMT: Pegasus database was successfully created. 
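Both environment dumps in this log, the large one handed to condor_submit_dag and the shorter dagman environment above, use HTCondor's semicolon-separated KEY=VALUE format. A minimal sketch for splitting such a string when debugging (the helper name is mine, not a Pegasus API; the naive split assumes values contain no literal semicolons, which holds for the dumps here):

```python
def parse_condor_environment(env: str) -> dict:
    """Split a Condor-style 'environment =' value into a dict.

    Pairs are separated by ';'; values themselves may contain '='
    (e.g. PATH-like entries), so split each pair only on its first '='.
    """
    pairs = {}
    for item in env.strip().strip(";").split(";"):
        if item:
            key, _, value = item.partition("=")
            pairs[key] = value
    return pairs

# Values taken from the dagman environment logged above:
dagman_env = parse_condor_environment(
    "_CONDOR_MAX_DAGMAN_LOG=0;"
    "_CONDOR_DAGMAN_LOG=gw.dax-0.dag.dagman.out;"
    "PEGASUS_METRICS=true;"
)
```

The same split applies to the full environment passed to condor_submit_dag earlier in the log.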
2022.10.05 22:10:02.872 GMT: Database version: '5.0.1' (sqlite:////local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db) 2022.10.05 22:10:02.946 GMT: [DEBUG] pegasus-db-admin exited with status 0 2022.10.05 22:10:02.947 GMT: Output replica catalog set to jdbc:sqlite:/local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work/gw.dax-0.replicas.db 2022.10.05 22:10:02.947 GMT: [DEBUG] Executing /software/tools/pegasus/5.0/bin/pegasus-run --nogrid /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work 2022.10.05 22:10:03.454 GMT: Submitting to condor gw.dax-0.dag.condor.sub 2022.10.05 22:10:05.587 GMT: 2022.10.05 22:10:05.596 GMT: Your workflow has been started and is running in the base directory: 2022.10.05 22:10:05.601 GMT: 2022.10.05 22:10:05.606 GMT: /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work 2022.10.05 22:10:05.611 GMT: 2022.10.05 22:10:05.617 GMT: *** To monitor the workflow you can run *** 2022.10.05 22:10:05.622 GMT: 2022.10.05 22:10:05.627 GMT: pegasus-status -l /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work 2022.10.05 22:10:05.632 GMT: 2022.10.05 22:10:05.637 GMT: *** To remove your workflow run *** 2022.10.05 22:10:05.643 GMT: 2022.10.05 22:10:05.648 GMT: pegasus-remove /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work 2022.10.05 22:10:05.653 GMT: [DEBUG] Submission of workflow exited with status 0 2022.10.05 22:10:05.662 GMT: [DEBUG] Sending Planner Metrics to [1 of 1] http://metrics.pegasus.isi.edu/metrics 2022.10.05 22:10:06.046 GMT: [DEBUG] Metrics successfully sent to the server 2022.10.05 22:10:06.047 GMT: Time taken to execute is 19.625 seconds 2022.10.05 22:10:06.048 GMT: [INFO] event.pegasus.planner planner.version 5.0.1 (19.741 seconds) - FINISHED Querying Pegasus database for workflow stored in /local/user/yifan.wang/pycbc-tmp.1RDOkHIy6p/work This may take up to 120 seconds. Please wait............. Done. Workflow submission completed successfully. 
The Pegasus dashboard URL for this workflow is: https://condor1.atlas.local/pegasus/u/yifan.wang/r/144/w?wf_uuid=788b2966-4d62-4b86-8e24-27ec3cc664a4 Note that it may take a while for the dashboard entry to appear while the workflow is parsed by the dashboard. The delay can be on the order of one hour for very large workflows.
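Every planner line above follows the fixed shape `YYYY.MM.DD HH:MM:SS.mmm GMT: [LEVEL] message`, with the `[LEVEL]` tag absent on pass-through output such as the condor_submit_dag banner. When triaging a long -vvv run like this one, filtering by level helps; a minimal sketch (the regex and function name are my own, not part of Pegasus):

```python
import re

# Planner lines look like:
#   2022.10.05 22:09:57.392 GMT: [DEBUG] Written Submit file : ...
# Pass-through lines carry no [LEVEL] tag and yield level None.
_LINE = re.compile(
    r"^(?P<ts>\d{4}\.\d{2}\.\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) GMT:"
    r"(?: \[(?P<level>[A-Z]+)\])? (?P<msg>.*)$"
)

def parse_planner_line(line: str):
    """Return (timestamp, level, message), or None if the line doesn't match."""
    m = _LINE.match(line)
    if m is None:
        return None
    return m.group("ts"), m.group("level"), m.group("msg")
```

Filtering for `[WARNING]` this way surfaces, for example, the `request_cpus`/`request_memory` removals noted earlier in the log.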