+ PROC_LSST_SUBMIT_PATH=/mmfs1/home/stevengs/dirac/DEEP/submit
+ bps submit /gscratch/dirac/shared/opt/proc_lsst/pipelines/submit.yaml -b /mmfs1/home/stevengs/dirac/DEEP/repo -i DEEP/20190827/B0b --output-run DEEP/20190827/B0b/science#step1/20240313T064306Z --qgraph pipeline.qgraph
lsst.ctrl.bps.drivers INFO: DISCLAIMER: All values regarding memory consumption reported below are approximate and may not accurately reflect actual memory usage by the bps process.
lsst.ctrl.bps.drivers INFO: Starting submission process
lsst.ctrl.bps.drivers INFO: Initializing execution environment
lsst.ctrl.bps.drivers INFO: Initializing execution environment completed: Took 0.9652 seconds; current memory usage: 0.179 Gibyte, delta: 0.038 Gibyte, peak delta: 0.052 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.193 Gibyte (main), 0.000 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: Starting acquire stage (generating and/or reading quantum graph)
lsst.ctrl.bps.pre_transform INFO: Copying quantum graph from 'pipeline.qgraph'
lsst.ctrl.bps.pre_transform INFO: Completed copying quantum graph: Took 0.0024 seconds
lsst.ctrl.bps.pre_transform INFO: Backing up quantum graph from '/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/pipeline.qgraph'
lsst.ctrl.bps.pre_transform INFO: Completed backing up quantum graph: Took 0.0057 seconds
lsst.ctrl.bps.pre_transform INFO: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/share/eups/Linux64/ctrl_mpexec/g1ce94f1343+c79f27626b/bin/pipetask --long-log --log-level=VERBOSE update-graph-run /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/pipeline_orig.qgraph DEEP/20190827/B0b/science#step1/20240313T064306Z /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/pipeline.qgraph
lsst.ctrl.bps.pre_transform INFO: Reading quantum graph from '/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/pipeline.qgraph'
lsst.ctrl.bps.pre_transform INFO: Completed reading quantum graph: Took 3.7001 seconds
lsst.ctrl.bps.drivers INFO: Acquire stage completed: Took 8.8677 seconds; current memory usage: 0.344 Gibyte, delta: 0.165 Gibyte, peak delta: 0.151 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.344 Gibyte (main), 0.332 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: Starting cluster stage (grouping quanta into jobs)
lsst.ctrl.bps.drivers INFO: Cluster stage completed: Took 0.0081 seconds; current memory usage: 0.344 Gibyte, delta: 0.000 Gibyte, peak delta: 0.000 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.344 Gibyte (main), 0.332 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: ClusteredQuantumGraph contains 28 cluster(s)
lsst.ctrl.bps.drivers INFO: Starting transform stage (creating generic workflow)
lsst.ctrl.bps.drivers INFO: Generic workflow name 'DEEP_20190827_B0b_science#step1_20240313T064306Z'
lsst.ctrl.bps.drivers INFO: Transform stage completed: Took 0.0236 seconds; current memory usage: 0.344 Gibyte, delta: 0.000 Gibyte, peak delta: 0.000 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.344 Gibyte (main), 0.332 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: GenericWorkflow contains 30 job(s) (including final)
lsst.ctrl.bps.drivers INFO: Starting prepare stage (creating specific implementation of workflow)
parsl.addresses ERROR: Ignoring failure to fetch address from interface eno2
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/addresses.py", line 111, in get_all_addresses
    s_addresses.add(address_by_interface(interface))
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/typeguard/__init__.py", line 1033, in wrapper
    retval = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/addresses.py", line 93, in address_by_interface
    return socket.inet_ntoa(fcntl.ioctl(
           ^^^^^^^^^^^^
OSError: [Errno 99] Cannot assign requested address
lsst.ctrl.bps.parsl INFO: Writing workflow with ID=/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z
lsst.ctrl.bps.drivers INFO: Prepare stage completed: Took 0.1122 seconds; current memory usage: 0.345 Gibyte, delta: 0.001 Gibyte, peak delta: 0.001 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.345 Gibyte (main), 0.332 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: Starting submit stage
lsst.ctrl.bps.submit INFO: Submitting run to a workflow management system for execution
parsl.dataflow.rundirs DEBUG: Parsl run initializing in rundir: runinfo/000
parsl.dataflow.dflow INFO: Starting DataFlowKernel with config
Config(
    app_cache=True,
    checkpoint_files=None,
    checkpoint_mode='task_exit',
    checkpoint_period=None,
    executors=[MultiHighThroughputExecutor()],
    garbage_collect=True,
    initialize_logging=True,
    internal_tasks_max_threads=10,
    max_idletime=120.0,
    monitoring=None,
    retries=1,
    retry_handler=None,
    run_dir='runinfo',
    strategy='simple',
    usage_tracking=False
)
parsl.dataflow.dflow INFO: Parsl version: 2023.06.12
parsl.usage_tracking.usage DEBUG: Tracking status: False
parsl.dataflow.dflow INFO: Run id is: ffe10c25-6ac0-4705-a492-45bfb34b4b4c
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/typeguard/__init__.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/typeguard/__init__.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/share/eups/Linux64/ctrl_bps_parsl/g145af14111+5b908e21bc/python/lsst/ctrl/bps/parsl/workflow.py
parsl.dataflow.dflow DEBUG: Using workflow.py as workflow name
parsl.dataflow.memoization INFO: App caching initialized
parsl.dataflow.strategy DEBUG: Scaling strategy: simple
parsl.executors.high_throughput.executor DEBUG: Starting queue management thread
parsl.executors.high_throughput.executor DEBUG: queue management worker starting
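The parsl.addresses ERROR above is noisy but harmless: get_all_addresses() probes every network interface on the submit node, and any interface it cannot read (here eno2) is logged and skipped. A minimal sketch of that probing loop, using parsl's public parsl.addresses helpers and psutil for interface enumeration (the exact enumeration call inside parsl may differ):

import logging
import psutil  # assumption: used here only to list interface names
from parsl.addresses import address_by_interface, address_by_hostname

logger = logging.getLogger(__name__)

def probe_addresses() -> set:
    """Collect every IPv4 address workers could use to reach the submit node."""
    addresses = set()
    for interface in psutil.net_if_addrs():
        try:
            addresses.add(address_by_interface(interface))
        except Exception:
            # Mirrors "Ignoring failure to fetch address from interface eno2":
            # a broken interface is logged and ignored, not fatal.
            logger.exception("Ignoring failure to fetch address from interface %s", interface)
    try:
        addresses.add(address_by_hostname())
    except Exception:
        logger.exception("Ignoring failure to fetch address from hostname")
    return addresses

print(probe_addresses())

Every address that survives the probe is handed to the worker pool, which is why the -a flag in the launch command that follows lists several candidates (n3009, 198.48.92.26, 127.0.0.1, ...).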
parsl.executors.high_throughput.executor DEBUG: Started queue management thread
Submit dir: /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z
parsl.executors.high_throughput.executor DEBUG: Created management thread:
parsl.executors.high_throughput.executor DEBUG: Launch command: process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id={block_id} --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn
parsl.executors.high_throughput.executor DEBUG: Starting HighThroughputExecutor with provider:
parsl.executors.status_handling INFO: Scaling out by 1 blocks
parsl.executors.status_handling INFO: Allocated block ID 0
parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-0
2024-03-13 06:43:35 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-0
proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-0
2024-03-13 06:43:35 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2
proc_lsst.multi INFO: [multi] local process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2
2024-03-13 06:43:35 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 0
proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 0
2024-03-13 06:43:35 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1
proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1
2024-03-13 06:43:35 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877
--logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 to local proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 to local parsl.providers.local.local DEBUG: Launching in remote mode 2024-03-13 06:43:35 proc_lsst.multi:170 [INFO] [multi] job_id 48154 proc_lsst.multi INFO: [multi] job_id 48154 2024-03-13 06:43:35 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 06:43:35 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 06:43:35 proc_lsst.multi:178 [INFO] [multi] provider local accepted submit and returned 48154 proc_lsst.multi INFO: [multi] provider local accepted submit and returned 48154 parsl.executors.status_handling DEBUG: Launched block 0 on executor multi with job ID 48154 parsl.dataflow.job_status_poller DEBUG: Adding executor multi parsl.dataflow.dflow DEBUG: Task 0 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 0 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 0 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 0 has memoization hash 066974e73335b84bd5a3e4da3c94b901 parsl.dataflow.memoization INFO: Task 0 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9300> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 0 try 0 launched on executor multi with executor id 1 parsl.dataflow.dflow INFO: Standard output for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stdout parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stderr parsl.dataflow.dflow DEBUG: Task 1 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 1 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 1 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr 
parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 1 has memoization hash 195b7a20d72e3f8ba2243ab3dc63d265 parsl.dataflow.memoization INFO: Task 1 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce93a0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 1 try 0 launched on executor multi with executor id 2 parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stdout parsl.dataflow.dflow INFO: Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stderr parsl.dataflow.dflow DEBUG: Task 2 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 2 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 2 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 2 has memoization hash 1917517d4584c3803db50e1fe8f53203 parsl.dataflow.memoization INFO: Task 2 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce98a0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 2 try 0 launched on executor multi with executor id 3 parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stdout parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stderr parsl.dataflow.dflow DEBUG: Task 3 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 3 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 3 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 3 has memoization hash c79aac85763efa2a323bf02a99d5064a parsl.dataflow.memoization INFO: Task 3 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9940> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) 
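For reference, the Config(...) echoed by the DataFlowKernel above can be written out by hand roughly as below. The real run uses proc_lsst's MultiHighThroughputExecutor, which fans blocks out to a local provider (max_blocks 1, --max_workers 2) and a Slurm "astro" provider (max_blocks 30); a plain HighThroughputExecutor over a SlurmProvider is shown here only as a hedged stand-in, and the partition name is an assumption rather than something taken from submit.yaml.

from parsl.config import Config
from parsl.executors import HighThroughputExecutor
from parsl.providers import SlurmProvider

config = Config(
    executors=[
        HighThroughputExecutor(
            label="multi",            # executor name that appears throughout the log
            max_workers=2,            # matches --max_workers 2 on the worker-pool command
            heartbeat_period=30,      # --hb_period=30
            heartbeat_threshold=120,  # --hb_threshold=120
            cpu_affinity="none",      # --cpu-affinity none
            provider=SlurmProvider(   # stand-in for the "astro" provider
                partition="astro",    # assumption; the real partition is configured elsewhere
                nodes_per_block=1,    # "Requesting one block with 1 nodes"
                max_blocks=30,        # max_blocks reported for the astro provider
            ),
        ),
    ],
    app_cache=True,
    checkpoint_mode="task_exit",
    retries=1,                        # each failed quantum gets exactly one retry
    run_dir="runinfo",
    strategy="simple",
    usage_tracking=False,
)

Two settings matter for reading the rest of the log: retries=1 explains the single retry each failing task gets later on, and strategy='simple' drives the automatic scale-out once many tasks are queued against few active slots.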
parsl.dataflow.dflow INFO: Parsl task 3 try 0 launched on executor multi with executor id 4 parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stderr parsl.dataflow.dflow DEBUG: Task 4 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 4 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 4 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 4 has memoization hash c8402befe8520c3274b69ed563a18646 parsl.dataflow.memoization INFO: Task 4 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9800> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 4 try 0 launched on executor multi with executor id 5 parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stdout parsl.dataflow.dflow INFO: Standard error for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stderr parsl.dataflow.dflow DEBUG: Task 5 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 5 submitted for App calibrate, waiting on task 4 parsl.dataflow.dflow DEBUG: Task 5 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 5 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 6 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 6 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 6 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 6 has memoization hash 29e9c5beeab127c767e96112023a28bf parsl.dataflow.memoization INFO: Task 6 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce99e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow 
INFO: Parsl task 6 try 0 launched on executor multi with executor id 6 parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stdout parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stderr parsl.dataflow.dflow DEBUG: Task 7 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 7 submitted for App calibrate, waiting on task 6 parsl.dataflow.dflow DEBUG: Task 7 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 7 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 8 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 8 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 8 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 8 has memoization hash 33798d6799b5e8dfe132833655f330fd parsl.dataflow.memoization INFO: Task 8 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9760> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 8 try 0 launched on executor multi with executor id 7 parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stdout parsl.dataflow.dflow INFO: Standard error for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stderr parsl.dataflow.dflow DEBUG: Task 9 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 9 submitted for App calibrate, waiting on task 8 parsl.dataflow.dflow DEBUG: Task 9 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 9 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 10 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 10 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 10 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout 
parsl.dataflow.memoization DEBUG: Task 10 has memoization hash 7573b578282609e4c18db779aca250b0 parsl.dataflow.memoization INFO: Task 10 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce94e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 10 try 0 launched on executor multi with executor id 8 parsl.dataflow.dflow INFO: Standard output for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stdout parsl.dataflow.dflow INFO: Standard error for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stderr parsl.dataflow.dflow DEBUG: Task 11 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 11 submitted for App calibrate, waiting on task 10 parsl.dataflow.dflow DEBUG: Task 11 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 11 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 12 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 12 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 12 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 12 has memoization hash 6d6bf02f714bd4f7218acd2d623618ae parsl.dataflow.memoization INFO: Task 12 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea200> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 12 try 0 launched on executor multi with executor id 9 parsl.dataflow.dflow INFO: Standard output for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stdout parsl.dataflow.dflow INFO: Standard error for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stderr parsl.dataflow.dflow DEBUG: Task 13 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 13 submitted for App calibrate, waiting on task 12 parsl.dataflow.dflow DEBUG: Task 13 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 13 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 14 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding 
output dependencies parsl.dataflow.dflow INFO: Task 14 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 14 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 14 has memoization hash f57d9a889b0d0cefa53a0b5335ff6c8d parsl.dataflow.memoization INFO: Task 14 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea480> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 14 try 0 launched on executor multi with executor id 10 parsl.dataflow.dflow INFO: Standard output for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stdout parsl.dataflow.dflow INFO: Standard error for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stderr parsl.dataflow.dflow DEBUG: Task 15 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 15 submitted for App calibrate, waiting on task 14 parsl.dataflow.dflow DEBUG: Task 15 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 15 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 16 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 16 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 16 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 16 has memoization hash 2b23b109b9eb5318ea3acf54740c7092 parsl.dataflow.memoization INFO: Task 16 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea340> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 16 try 0 launched on executor multi with executor id 11 parsl.dataflow.dflow INFO: Standard output for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stdout parsl.dataflow.dflow INFO: Standard error for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stderr parsl.dataflow.dflow DEBUG: Task 17 will be sent to executor multi 
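The repeating pattern above (characterizeImage submitted "not waiting on any dependency", then calibrate "waiting on" it) is the quantum graph expressed as parsl apps: each job's pipetask run-qbb command is wrapped in a bash app, and the upstream AppFuture is passed as an input so calibrate only starts once its characterizeImage finishes. A self-contained toy sketch of that pattern, not the ctrl_bps_parsl source, using a local-threads config and echo placeholders instead of the real pipetask commands:

import parsl
from parsl import bash_app
from parsl.configs.local_threads import config  # toy stand-in for the "multi" executor

parsl.load(config)

@bash_app
def run_quantum(command: str, inputs=(), stdout=None, stderr=None):
    # parsl runs the returned string through a shell on a worker; in the real
    # workflow this is "${CTRL_MPEXEC_DIR}/bin/pipetask ... run-qbb ..."
    return command

# characterizeImage has no dependency ("not waiting on any dependency")
char = run_quantum("echo characterizeImage", stdout="char.stdout", stderr="char.stderr")

# calibrate depends on characterizeImage ("submitted for App calibrate, waiting on task ...")
cal = run_quantum("echo calibrate", inputs=[char], stdout="cal.stdout", stderr="cal.stderr")

cal.result()  # blocks until both apps have run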
parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 17 submitted for App calibrate, waiting on task 16 parsl.dataflow.dflow DEBUG: Task 17 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 17 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 18 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 18 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 18 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 18 has memoization hash d6e90bb14cb84d111216874eb2510706 parsl.dataflow.memoization INFO: Task 18 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea660> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 18 try 0 launched on executor multi with executor id 12 parsl.dataflow.dflow INFO: Standard output for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stdout parsl.dataflow.dflow INFO: Standard error for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stderr parsl.dataflow.dflow DEBUG: Task 19 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 19 submitted for App calibrate, waiting on task 18 parsl.dataflow.dflow DEBUG: Task 19 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 19 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 20 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 20 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 20 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 20 has memoization hash 6b1e40f6fdf43ea8731b5b32c55c7b9b parsl.dataflow.memoization INFO: Task 20 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea980> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 20 try 0 launched on executor multi with executor id 13 parsl.dataflow.dflow INFO: Standard output for task 20 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stdout parsl.dataflow.dflow INFO: Standard error for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stderr parsl.dataflow.dflow DEBUG: Task 21 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 21 submitted for App calibrate, waiting on task 20 parsl.dataflow.dflow DEBUG: Task 21 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 21 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 22 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 22 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 22 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 22 has memoization hash 60a77a506e43ab04aa31ee7315e3803a parsl.dataflow.memoization INFO: Task 22 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ceade0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 22 try 0 launched on executor multi with executor id 14 parsl.dataflow.dflow INFO: Standard output for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stdout parsl.dataflow.dflow INFO: Standard error for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stderr parsl.dataflow.dflow DEBUG: Task 23 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 23 submitted for App calibrate, waiting on task 22 parsl.dataflow.dflow DEBUG: Task 23 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 23 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 24 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 24 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 24 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 24 has memoization hash e756fbf308d81e5fe99ac55a69e044a1 parsl.dataflow.memoization INFO: Task 24 had no 
result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ceafc0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 24 try 0 launched on executor multi with executor id 15 parsl.dataflow.dflow INFO: Standard output for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stdout parsl.dataflow.dflow INFO: Standard error for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stderr parsl.dataflow.dflow DEBUG: Task 25 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 25 submitted for App calibrate, waiting on task 24 parsl.dataflow.dflow DEBUG: Task 25 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 25 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 26 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 26 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 26 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 26 has memoization hash 7d829c3909add40875ca5697aa79a662 parsl.dataflow.memoization INFO: Task 26 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ceb060> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 26 try 0 launched on executor multi with executor id 16 parsl.dataflow.dflow INFO: Standard output for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stdout parsl.dataflow.dflow INFO: Standard error for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stderr parsl.dataflow.dflow DEBUG: Task 27 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 27 submitted for App calibrate, waiting on task 26 parsl.dataflow.dflow DEBUG: Task 27 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 27 has outstanding dependencies, so launch_if_ready skipping 2024-03-13 06:43:36 proc_lsst.multi:146 [INFO] found job 48154 in provider local proc_lsst.multi INFO: found job 48154 in provider local parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors 
parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 1, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 1/0 running/pending blocks, and 2 connected workers parsl.dataflow.strategy DEBUG: Strategy case 2: slots are overloaded - (slot_ratio = active_slots/active_tasks) < parallelism parsl.dataflow.strategy DEBUG: Strategy case 2b: active_blocks 1 < max_blocks 312 so scaling out parsl.dataflow.strategy DEBUG: Requesting 3 more blocks parsl.executors.status_handling INFO: Scaling out by 3 blocks parsl.executors.status_handling INFO: Allocated block ID 1 parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-1 2024-03-13 06:43:37 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-1 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-1 2024-03-13 06:43:37 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 06:43:37 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 06:43:37 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 06:43:37 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 
--result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 06:43:37 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 0 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 0 2024-03-13 06:43:37 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 06:43:37 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 06:43:37 proc_lsst.multi:170 [INFO] [multi] job_id 17008089 proc_lsst.multi INFO: [multi] job_id 17008089 2024-03-13 06:43:37 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 06:43:37 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 06:43:37 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17008089 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17008089 parsl.executors.status_handling DEBUG: Launched block 1 on executor multi with job ID 17008089 parsl.executors.status_handling INFO: Allocated block ID 2 parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-2 2024-03-13 06:43:37 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-2 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-2 2024-03-13 06:43:37 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a 
n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 06:43:37 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 06:43:37 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 06:43:37 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 06:43:37 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 06:43:37 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 06:43:37 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 06:43:37 proc_lsst.multi:170 [INFO] [multi] job_id 17008090 proc_lsst.multi INFO: [multi] job_id 17008090 2024-03-13 06:43:37 
proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 2 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 2 2024-03-13 06:43:37 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 06:43:37 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17008090 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17008090 parsl.executors.status_handling DEBUG: Launched block 2 on executor multi with job ID 17008090 parsl.executors.status_handling INFO: Allocated block ID 3 parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-3 2024-03-13 06:43:37 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-3 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-3 2024-03-13 06:43:37 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 06:43:37 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 06:43:37 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 06:43:37 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 
--logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 06:43:37 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 2 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 2 2024-03-13 06:43:37 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 06:43:37 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a n3009,198.48.92.26,127.0.0.1,10.64.65.9,169.254.95.120,10.64.129.9 -p 0 -c 1.0 -m None --poll 10 --task_port=54440 --result_port=54877 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 06:43:37 proc_lsst.multi:170 [INFO] [multi] job_id 17008091 proc_lsst.multi INFO: [multi] job_id 17008091 2024-03-13 06:43:37 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 3 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 3 2024-03-13 06:43:37 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 06:43:37 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17008091 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17008091 parsl.executors.status_handling DEBUG: Launched block 3 on executor multi with job ID 17008091 parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 1/3 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 1/3 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: 
Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 1/3 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 4 try 0 failed parsl.dataflow.dflow INFO: Task 4 marked for retry parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stdout parsl.dataflow.dflow INFO: Standard error for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 4 has memoization hash c8402befe8520c3274b69ed563a18646 parsl.dataflow.memoization INFO: Task 4 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9800> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 4 try 1 launched on executor multi with executor id 17 parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stdout parsl.dataflow.dflow INFO: Standard error for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 1/3 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 0 try 0 failed parsl.dataflow.dflow INFO: Task 0 marked for retry parsl.dataflow.dflow INFO: Standard output for task 0 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stdout parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 0 has memoization hash 066974e73335b84bd5a3e4da3c94b901 parsl.dataflow.memoization INFO: Task 0 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9300> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 0 try 1 launched on executor multi with executor id 18 parsl.dataflow.dflow INFO: Standard output for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stdout parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 1 try 0 failed parsl.dataflow.dflow INFO: Task 1 marked for retry parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stdout parsl.dataflow.dflow INFO: Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 1 has memoization hash 195b7a20d72e3f8ba2243ab3dc63d265 parsl.dataflow.memoization INFO: Task 1 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce93a0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 1 try 1 launched on executor multi with executor id 19 parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stdout parsl.dataflow.dflow INFO: 
Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 6 try 0 failed parsl.dataflow.dflow INFO: Task 6 marked for retry parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stdout parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 6 has memoization hash 29e9c5beeab127c767e96112023a28bf parsl.dataflow.memoization INFO: Task 6 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce99e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 6 try 1 launched on executor multi with executor id 20 parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stdout parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 1/3 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 3 try 0 failed parsl.dataflow.dflow INFO: Task 3 marked for retry parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 3 has memoization hash c79aac85763efa2a323bf02a99d5064a parsl.dataflow.memoization INFO: Task 3 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9940> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 3 try 1 launched on executor multi with executor id 21 parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 8 try 0 failed parsl.dataflow.dflow INFO: Task 8 marked for retry parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stdout parsl.dataflow.dflow INFO: Standard error for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 8 has memoization hash 33798d6799b5e8dfe132833655f330fd parsl.dataflow.memoization INFO: Task 8 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce9760> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 8 try 1 launched on executor multi with executor id 22 parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stdout parsl.dataflow.dflow INFO: Standard error for task 8 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 10 try 0 failed parsl.dataflow.dflow INFO: Task 10 marked for retry parsl.dataflow.dflow INFO: Standard output for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stdout parsl.dataflow.dflow INFO: Standard error for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 10 has memoization hash 7573b578282609e4c18db779aca250b0 parsl.dataflow.memoization INFO: Task 10 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce94e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 10 try 1 launched on executor multi with executor id 23 parsl.dataflow.dflow INFO: Standard output for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stdout parsl.dataflow.dflow INFO: Standard error for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 12 try 0 failed parsl.dataflow.dflow INFO: Task 12 marked for retry parsl.dataflow.dflow INFO: Standard output for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stdout parsl.dataflow.dflow INFO: Standard error for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 12 has memoization hash 6d6bf02f714bd4f7218acd2d623618ae parsl.dataflow.memoization INFO: Task 12 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 
0x1541e8cea200> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 12 try 1 launched on executor multi with executor id 24 parsl.dataflow.dflow INFO: Standard output for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stdout parsl.dataflow.dflow INFO: Standard error for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stderr 2024-03-13 06:44:06 proc_lsst.multi:146 [INFO] found job 48154 in provider local proc_lsst.multi INFO: found job 48154 in provider local 2024-03-13 06:44:07 proc_lsst.multi:146 [INFO] found job 17008089 in provider astro proc_lsst.multi INFO: found job 17008089 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17008089,17008090,17008091' parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 2 try 0 failed parsl.dataflow.dflow INFO: Task 2 marked for retry parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stdout parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 2 has memoization hash 1917517d4584c3803db50e1fe8f53203 parsl.dataflow.memoization INFO: Task 2 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ce98a0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 2 try 1 launched on executor multi with executor id 25 parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stdout parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stderr parsl.providers.slurm.slurm DEBUG: squeue returned 17008089 R 17008090 R 17008091 R parsl.providers.slurm.slurm DEBUG: Updating job 17008089 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008090 with slurm status R to parsl 
state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008091 with slurm status R to parsl state JobState.RUNNING 2024-03-13 06:44:07 proc_lsst.multi:146 [INFO] found job 17008090 in provider astro proc_lsst.multi INFO: found job 17008090 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17008089,17008090,17008091' parsl.providers.slurm.slurm DEBUG: squeue returned 17008089 R 17008090 R 17008091 R parsl.providers.slurm.slurm DEBUG: Updating job 17008089 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008090 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008091 with slurm status R to parsl state JobState.RUNNING 2024-03-13 06:44:08 proc_lsst.multi:146 [INFO] found job 17008091 in provider astro proc_lsst.multi INFO: found job 17008091 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17008089,17008090,17008091' parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 14 try 0 failed parsl.dataflow.dflow INFO: Task 14 marked for retry parsl.dataflow.dflow INFO: Standard output for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stdout parsl.dataflow.dflow INFO: Standard error for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 14 has memoization hash f57d9a889b0d0cefa53a0b5335ff6c8d parsl.dataflow.memoization INFO: Task 14 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea480> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 14 try 1 launched on executor multi with executor id 26 parsl.dataflow.dflow INFO: Standard output for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stdout parsl.dataflow.dflow INFO: Standard error for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stderr parsl.providers.slurm.slurm DEBUG: squeue returned 17008089 R 17008090 R 17008091 R parsl.providers.slurm.slurm DEBUG: Updating job 17008089 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008090 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008091 with slurm status R to parsl state 
JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 4/0 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 16 try 0 failed parsl.dataflow.dflow INFO: Task 16 marked for retry parsl.dataflow.dflow INFO: Standard output for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stdout parsl.dataflow.dflow INFO: Standard error for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 16 has memoization hash 2b23b109b9eb5318ea3acf54740c7092 parsl.dataflow.memoization INFO: Task 16 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea340> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 16 try 1 launched on executor multi with executor id 27 parsl.dataflow.dflow INFO: Standard output for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stdout parsl.dataflow.dflow INFO: Standard error for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 4/0 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 20 try 0 failed parsl.dataflow.dflow INFO: Task 20 marked for retry parsl.dataflow.dflow INFO: Standard output for task 20 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stdout parsl.dataflow.dflow INFO: Standard error for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 20 has memoization hash 6b1e40f6fdf43ea8731b5b32c55c7b9b parsl.dataflow.memoization INFO: Task 20 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea980> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 20 try 1 launched on executor multi with executor id 28 parsl.dataflow.dflow INFO: Standard output for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stdout parsl.dataflow.dflow INFO: Standard error for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 18 try 0 failed parsl.dataflow.dflow INFO: Task 18 marked for retry parsl.dataflow.dflow INFO: Standard output for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stdout parsl.dataflow.dflow INFO: Standard error for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 18 has memoization hash d6e90bb14cb84d111216874eb2510706 parsl.dataflow.memoization INFO: Task 18 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8cea660> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 18 try 1 launched on executor multi with executor id 29 parsl.dataflow.dflow INFO: Standard output for task 18 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stdout parsl.dataflow.dflow INFO: Standard error for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 22 try 0 failed parsl.dataflow.dflow INFO: Task 22 marked for retry parsl.dataflow.dflow INFO: Standard output for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stdout parsl.dataflow.dflow INFO: Standard error for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 22 has memoization hash 60a77a506e43ab04aa31ee7315e3803a parsl.dataflow.memoization INFO: Task 22 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ceade0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 22 try 1 launched on executor multi with executor id 30 parsl.dataflow.dflow INFO: Standard output for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stdout parsl.dataflow.dflow INFO: Standard error for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 24 try 0 failed parsl.dataflow.dflow INFO: Task 24 marked for retry parsl.dataflow.dflow INFO: Standard output for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stdout parsl.dataflow.dflow INFO: Standard error for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: 
Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 24 has memoization hash e756fbf308d81e5fe99ac55a69e044a1 parsl.dataflow.memoization INFO: Task 24 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ceafc0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 24 try 1 launched on executor multi with executor id 31 parsl.dataflow.dflow INFO: Standard output for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stdout parsl.dataflow.dflow INFO: Standard error for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 26 try 0 failed parsl.dataflow.dflow INFO: Task 26 marked for retry parsl.dataflow.dflow INFO: Standard output for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stdout parsl.dataflow.dflow INFO: Standard error for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 26 has memoization hash 7d829c3909add40875ca5697aa79a662 parsl.dataflow.memoization INFO: Task 26 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x1541e8ceb060> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 26 try 1 launched on executor multi with executor id 32 parsl.dataflow.dflow INFO: Standard output for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stdout parsl.dataflow.dflow INFO: Standard error for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 16 parsl.dataflow.strategy DEBUG: 
Executor multi has 16 active tasks, 4/0 running/pending blocks, and 5 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 6 try 1 failed
parsl.dataflow.dflow ERROR: Task 6 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
          ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 29e9c5beeab127c767e96112023a28bf with result from task 6
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Task 7 failed due to dependency failure
parsl.dataflow.dflow DEBUG: Task 7 try 0 failed
parsl.dataflow.dflow INFO: Task 7 failed due to dependency failure so skipping retries
parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/833ba478-aba0-4d2f-ab68-caa110598922_calibrate_890940_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/833ba478-aba0-4d2f-ab68-caa110598922_calibrate_890940_2.stderr
parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890940/ae8ab3af-0bec-442b-a18c-fe585de0c308_characterizeImage_890940_2.stderr
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 15
parsl.dataflow.strategy DEBUG: Executor multi has 15 active tasks, 4/0 running/pending blocks, and 5 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 8 try 1 failed
parsl.dataflow.dflow ERROR: Task 8 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
          ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 33798d6799b5e8dfe132833655f330fd with result from task 8
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Task 9 failed due to dependency failure
parsl.dataflow.dflow DEBUG: Task 9 try 0 failed
parsl.dataflow.dflow INFO: Task 9 failed due to dependency failure so skipping retries
parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/c3d7cf3b-9465-4739-88db-4bcf820001c0_calibrate_890937_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/c3d7cf3b-9465-4739-88db-4bcf820001c0_calibrate_890937_2.stderr
parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890937/1f16a1d8-2ee4-4f0b-b820-9d0f913898ba_characterizeImage_890937_2.stderr
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 4 try 1 failed
parsl.dataflow.dflow ERROR: Task 4 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
          ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry c8402befe8520c3274b69ed563a18646 with result from task 4
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run.
Please ensure caching is enabled
parsl.dataflow.dflow INFO: Task 5 failed due to dependency failure
parsl.dataflow.dflow DEBUG: Task 5 try 0 failed
parsl.dataflow.dflow INFO: Task 5 failed due to dependency failure so skipping retries
parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890936/b0b38f3d-6ab1-4b72-b0c1-750a6e44f47d_calibrate_890936_61.stdout
parsl.dataflow.dflow INFO: Standard error for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890936/b0b38f3d-6ab1-4b72-b0c1-750a6e44f47d_calibrate_890936_61.stderr
parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stdout
parsl.dataflow.dflow INFO: Standard error for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/aed96731-2ed8-4b64-b635-469dd34d0331_characterizeImage_890936_61.stderr
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 13
parsl.dataflow.strategy DEBUG: Executor multi has 13 active tasks, 4/0 running/pending blocks, and 5 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 10 try 1 failed
parsl.dataflow.dflow ERROR: Task 10 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
          ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 7573b578282609e4c18db779aca250b0 with result from task 10
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Task 11 failed due to dependency failure
parsl.dataflow.dflow DEBUG: Task 11 try 0 failed
parsl.dataflow.dflow INFO: Task 11 failed due to dependency failure so skipping retries
parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890942/413c5f7b-d3d8-4283-8cc2-a8e3be2caa7d_calibrate_890942_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890942/413c5f7b-d3d8-4283-8cc2-a8e3be2caa7d_calibrate_890942_2.stderr
parsl.dataflow.dflow INFO: Standard output for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890942/2ce7a7e4-5757-4164-8090-046ef96f1a00_characterizeImage_890942_2.stderr
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 12 try 1 failed
parsl.dataflow.dflow ERROR: Task 12 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
          ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 6d6bf02f714bd4f7218acd2d623618ae with result from task 12
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Task 13 failed due to dependency failure
parsl.dataflow.dflow DEBUG: Task 13 try 0 failed
parsl.dataflow.dflow INFO: Task 13 failed due to dependency failure so skipping retries
parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890936/1704fef6-5443-449d-ac1a-e932f9070305_calibrate_890936_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890936/1704fef6-5443-449d-ac1a-e932f9070305_calibrate_890936_2.stderr
parsl.dataflow.dflow INFO: Standard output for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890936/37f041e3-382c-455c-bf65-05022dc758bf_characterizeImage_890936_2.stderr
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 11
parsl.dataflow.strategy DEBUG: Executor multi has 11 active tasks, 4/0 running/pending blocks, and 5 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 0 try 1 failed
parsl.dataflow.dflow ERROR: Task 0 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
          ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 066974e73335b84bd5a3e4da3c94b901 with result from task 0
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stdout
parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/d6874642-13c5-415c-929b-08bc9969eaf1_calibrate_890939_61.stderr
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 3 try 1 failed
parsl.dataflow.dflow ERROR: Task 3 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
          ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry c79aac85763efa2a323bf02a99d5064a with result from task 3
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run.
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890937/68cc28c3-61c8-4d74-a6e4-b86339a9769b_calibrate_890937_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 1 try 1 failed parsl.dataflow.dflow ERROR: Task 1 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 195b7a20d72e3f8ba2243ab3dc63d265 with result from task 1 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stdout parsl.dataflow.dflow INFO: Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/8c16f2c6-023d-4a22-b2ab-74808e864101_calibrate_890938_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 14 try 1 failed parsl.dataflow.dflow ERROR: Task 14 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry f57d9a889b0d0cefa53a0b5335ff6c8d with result from task 14 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 15 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 15 try 0 failed parsl.dataflow.dflow INFO: Task 15 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/99bc7662-2d6f-4ec2-b90a-bfba53954118_calibrate_890939_2.stdout parsl.dataflow.dflow INFO: Standard error for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890939/99bc7662-2d6f-4ec2-b90a-bfba53954118_calibrate_890939_2.stderr parsl.dataflow.dflow INFO: Standard output for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stdout parsl.dataflow.dflow INFO: Standard error for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890939/41749cfd-040e-4f5d-b162-dd0362b5f651_characterizeImage_890939_2.stderr 2024-03-13 06:44:36 proc_lsst.multi:146 [INFO] found job 48154 in provider local proc_lsst.multi INFO: found job 48154 in provider local 2024-03-13 06:44:37 proc_lsst.multi:146 [INFO] found job 17008089 in provider astro proc_lsst.multi INFO: found job 17008089 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17008089,17008090,17008091' parsl.providers.slurm.slurm DEBUG: squeue returned 17008089 R 17008090 R 17008091 R parsl.providers.slurm.slurm DEBUG: Updating job 17008089 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008090 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008091 with slurm status R to parsl state JobState.RUNNING 2024-03-13 06:44:37 proc_lsst.multi:146 [INFO] found job 17008090 in provider astro proc_lsst.multi INFO: found job 17008090 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17008089,17008090,17008091' parsl.providers.slurm.slurm DEBUG: squeue returned 17008089 R 17008090 R 17008091 R parsl.providers.slurm.slurm DEBUG: Updating job 17008089 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008090 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008091 with slurm status R to parsl state JobState.RUNNING 2024-03-13 06:44:38 proc_lsst.multi:146 [INFO] found job 17008091 in provider astro proc_lsst.multi INFO: found job 17008091 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17008089,17008090,17008091' parsl.providers.slurm.slurm DEBUG: squeue returned 17008089 R 17008090 R 17008091 R parsl.providers.slurm.slurm DEBUG: Updating job 17008089 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008090 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17008091 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor 
multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 7 parsl.dataflow.strategy DEBUG: Executor multi has 7 active tasks, 4/0 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 16 try 1 failed parsl.dataflow.dflow ERROR: Task 16 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 2b23b109b9eb5318ea3acf54740c7092 with result from task 16 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 17 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 17 try 0 failed parsl.dataflow.dflow INFO: Task 17 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890943/559b2a78-c2d3-450c-b59a-9528bc606adb_calibrate_890943_2.stdout parsl.dataflow.dflow INFO: Standard error for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890943/559b2a78-c2d3-450c-b59a-9528bc606adb_calibrate_890943_2.stderr parsl.dataflow.dflow INFO: Standard output for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stdout parsl.dataflow.dflow INFO: Standard error for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890943/7ac1413d-ee0a-45df-976a-f933bed31e3a_characterizeImage_890943_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 20 try 1 failed parsl.dataflow.dflow ERROR: Task 20 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 6b1e40f6fdf43ea8731b5b32c55c7b9b with result from task 20 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 21 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 21 try 0 failed parsl.dataflow.dflow INFO: Task 21 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/891064/5d561c31-ce54-4ec0-baa8-9b8acdf44b77_calibrate_891064_2.stdout parsl.dataflow.dflow INFO: Standard error for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/891064/5d561c31-ce54-4ec0-baa8-9b8acdf44b77_calibrate_891064_2.stderr parsl.dataflow.dflow INFO: Standard output for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stdout parsl.dataflow.dflow INFO: Standard error for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891064/8dd5bcbc-1a01-4986-8f24-79dd541b8098_characterizeImage_891064_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 18 try 1 failed parsl.dataflow.dflow ERROR: Task 18 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry d6e90bb14cb84d111216874eb2510706 with result from task 18 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 19 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 19 try 0 failed parsl.dataflow.dflow INFO: Task 19 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/891062/0f45a08c-7310-422e-9816-4aa7bb4d2fc7_calibrate_891062_2.stdout parsl.dataflow.dflow INFO: Standard error for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/891062/0f45a08c-7310-422e-9816-4aa7bb4d2fc7_calibrate_891062_2.stderr parsl.dataflow.dflow INFO: Standard output for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stdout parsl.dataflow.dflow INFO: Standard error for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891062/325ce171-0a6e-4427-b0b2-9ed9f8b3e611_characterizeImage_891062_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 22 try 1 failed parsl.dataflow.dflow ERROR: Task 22 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 60a77a506e43ab04aa31ee7315e3803a with result from task 22 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 23 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 23 try 0 failed parsl.dataflow.dflow INFO: Task 23 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/891063/88805368-562c-4480-8291-c47848842ea7_calibrate_891063_2.stdout parsl.dataflow.dflow INFO: Standard error for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/891063/88805368-562c-4480-8291-c47848842ea7_calibrate_891063_2.stderr parsl.dataflow.dflow INFO: Standard output for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stdout parsl.dataflow.dflow INFO: Standard error for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/891063/b5eda604-1ff6-4bec-932e-7249b2fd2d68_characterizeImage_891063_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 3 parsl.dataflow.strategy DEBUG: Executor multi has 3 active tasks, 4/0 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 24 try 1 failed parsl.dataflow.dflow ERROR: Task 24 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry e756fbf308d81e5fe99ac55a69e044a1 with result from task 24 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 25 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 25 try 0 failed parsl.dataflow.dflow INFO: Task 25 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/24e7a208-47f2-480b-aea6-0ec0e3bb6720_calibrate_890938_2.stdout parsl.dataflow.dflow INFO: Standard error for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890938/24e7a208-47f2-480b-aea6-0ec0e3bb6720_calibrate_890938_2.stderr parsl.dataflow.dflow INFO: Standard output for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stdout parsl.dataflow.dflow INFO: Standard error for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890938/0c8ec344-4794-4861-9dde-58d7a0ccae6b_characterizeImage_890938_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 26 try 1 failed parsl.dataflow.dflow ERROR: Task 26 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 7d829c3909add40875ca5697aa79a662 with result from task 26 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 27 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 27 try 0 failed parsl.dataflow.dflow INFO: Task 27 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890941/f54b41ce-2df9-4831-84f1-4c9808232cf0_calibrate_890941_2.stdout parsl.dataflow.dflow INFO: Standard error for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890941/f54b41ce-2df9-4831-84f1-4c9808232cf0_calibrate_890941_2.stderr parsl.dataflow.dflow INFO: Standard output for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stdout parsl.dataflow.dflow INFO: Standard error for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/characterizeImage/890941/fc71c8fe-4ced-4b33-9a13-6f8e1cd7a9a1_characterizeImage_890941_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 4/0 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 4, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 4/0 running/pending blocks, and 5 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-23372821068240 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 2 try 1 failed parsl.dataflow.dflow ERROR: Task 2 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper 
result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
    ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 1917517d4584c3803db50e1fe8f53203 with result from task 2
parsl.dataflow.dflow INFO: DFK cleanup initiated
parsl.dataflow.dflow INFO: Summary of tasks in DFK:
parsl.dataflow.dflow INFO: Tasks in state States.unsched: 0
parsl.dataflow.dflow INFO: Tasks in state States.pending: 0
parsl.dataflow.dflow INFO: Tasks in state States.running: 0
parsl.dataflow.dflow INFO: Tasks in state States.exec_done: 0
parsl.dataflow.dflow INFO: Tasks in state States.failed: 16
parsl.dataflow.dflow INFO: Tasks in state States.dep_fail: 12
parsl.dataflow.dflow INFO: Tasks in state States.launched: 0
parsl.dataflow.dflow INFO: Tasks in state States.fail_retryable: 0
parsl.dataflow.dflow INFO: Tasks in state States.memo_done: 0
parsl.dataflow.dflow INFO: Tasks in state States.joining: 0
parsl.dataflow.dflow INFO: Tasks in state States.running_ended: 0
parsl.dataflow.dflow INFO: End of summary
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stdout
parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs/calibrate/890940/5e0b598b-92cc-40e4-8974-08534fbf3894_calibrate_890940_61.stderr
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run.
Please ensure caching is enabled parsl.dataflow.dflow INFO: Closing job status poller parsl.dataflow.dflow INFO: Terminated job status poller parsl.dataflow.dflow INFO: Scaling in and shutting down executors parsl.dataflow.dflow INFO: Scaling in executor multi parsl.executors.high_throughput.executor DEBUG: Scale in called, blocks=4, block_ids=[] parsl.executors.high_throughput.executor DEBUG: Scale in selecting from 4 blocks parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: edf5b07c3118 parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: edf5b07c3118 parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: 39cf322bcae4 parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: 39cf322bcae4 parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: 53f633cc8d5b parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: 53f633cc8d5b parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: fb9c4a6b8633 parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: fb9c4a6b8633 2024-03-13 06:44:54 proc_lsst.multi:146 [INFO] found job 17008089 in provider astro proc_lsst.multi INFO: found job 17008089 in provider astro 2024-03-13 06:44:54 proc_lsst.multi:201 [INFO] cancelling 17008089 on provider astro proc_lsst.multi INFO: cancelling 17008089 on provider astro 2024-03-13 06:44:54 proc_lsst.multi:146 [INFO] found job 48154 in provider local proc_lsst.multi INFO: found job 48154 in provider local 2024-03-13 06:44:54 proc_lsst.multi:201 [INFO] cancelling 48154 on provider local proc_lsst.multi INFO: cancelling 48154 on provider local parsl.providers.local.local DEBUG: Terminating job/proc_id: 48154 2024-03-13 06:44:54 proc_lsst.multi:146 [INFO] found job 17008091 in provider astro proc_lsst.multi INFO: found job 17008091 in provider astro 2024-03-13 06:44:54 proc_lsst.multi:201 [INFO] cancelling 17008091 on provider astro proc_lsst.multi INFO: cancelling 17008091 on provider astro 2024-03-13 06:44:54 proc_lsst.multi:146 [INFO] found job 17008090 in provider astro proc_lsst.multi INFO: found job 17008090 in provider astro 2024-03-13 06:44:54 proc_lsst.multi:201 [INFO] cancelling 17008090 on provider astro proc_lsst.multi INFO: cancelling 17008090 on provider astro parsl.dataflow.dflow INFO: Shutting down executor multi 2024-03-13 06:44:54 proc_lsst.multi:40 [INFO] Cancelling all provider resources proc_lsst.multi INFO: Cancelling all provider resources 2024-03-13 06:44:54 proc_lsst.multi:47 [INFO] new jobs since last cancel ['17008089', '17008090', '17008091', '48154'] proc_lsst.multi INFO: new jobs since last cancel ['17008089', '17008090', '17008091', '48154'] 2024-03-13 06:44:54 proc_lsst.multi:146 [INFO] found job 17008089 in provider astro proc_lsst.multi INFO: found job 17008089 in provider astro 2024-03-13 06:44:54 proc_lsst.multi:201 [INFO] cancelling 17008089 on provider astro proc_lsst.multi INFO: cancelling 17008089 on provider astro 2024-03-13 06:44:54 proc_lsst.multi:146 [INFO] found job 17008090 in provider astro proc_lsst.multi INFO: found job 17008090 in provider astro 2024-03-13 06:44:54 proc_lsst.multi:201 [INFO] cancelling 17008090 on provider astro proc_lsst.multi INFO: cancelling 17008090 on provider astro 2024-03-13 06:44:55 proc_lsst.multi:146 [INFO] found job 17008091 in provider astro proc_lsst.multi INFO: found job 17008091 in provider astro 2024-03-13 06:44:55 proc_lsst.multi:201 [INFO] cancelling 17008091 on provider astro 
proc_lsst.multi INFO: cancelling 17008091 on provider astro
2024-03-13 06:44:55 proc_lsst.multi:146 [INFO] found job 48154 in provider local
proc_lsst.multi INFO: found job 48154 in provider local
2024-03-13 06:44:55 proc_lsst.multi:201 [INFO] cancelling 48154 on provider local
proc_lsst.multi INFO: cancelling 48154 on provider local
parsl.providers.local.local DEBUG: Terminating job/proc_id: 48154
parsl.providers.local.local WARNING: Failed to kill PID: 48154 and child processes on local
2024-03-13 06:44:56 proc_lsst.multi:50 [INFO] no new jobs since last cancel, resuming executor shutdown
proc_lsst.multi INFO: no new jobs since last cancel, resuming executor shutdown
parsl.executors.high_throughput.executor INFO: Attempting HighThroughputExecutor shutdown
parsl.executors.high_throughput.executor INFO: Finished HighThroughputExecutor shutdown attempt
parsl.dataflow.dflow INFO: Shut down executor multi
parsl.dataflow.dflow INFO: Shutting down executor _parsl_internal
parsl.executors.threads DEBUG: Shutting down executor, which involves waiting for running tasks to complete
parsl.executors.threads DEBUG: Done with executor shutdown
parsl.dataflow.dflow INFO: Shut down executor _parsl_internal
parsl.dataflow.dflow INFO: Terminated executors
parsl.dataflow.dflow INFO: DFK cleanup complete
parsl.process_loggers DEBUG: Normal ending for cleanup on thread MainThread
lsst.ctrl.bps.submit INFO: Completed submitting to a workflow management system: Took 92.1282 seconds
lsst.ctrl.bps.drivers INFO: Run 'DEEP_20190827_B0b_science#step1_20240313T064306Z' submitted for execution with id 'None'
lsst.ctrl.bps.drivers INFO: Completed submit stage: Took 92.1346 seconds; current memory usage: 0.346 Gibyte, delta: 0.001 Gibyte, peak delta: 0.001 Gibyte
lsst.ctrl.bps.drivers INFO: Completed entire submission process: Took 102.1629 seconds; current memory usage: 0.346 Gibyte, delta: 0.206 Gibyte, peak delta: 0.206 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.346 Gibyte (main), 0.346 Gibyte (largest child process)
Run Id: None
Run Name: DEEP_20190827_B0b_science#step1_20240313T064306Z
parsl.dataflow.dflow INFO: python process is exiting, but DFK has already been cleaned up

real    1m44.218s
user    0m21.181s
sys     0m4.837s
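
Note on the failures above: every characterizeImage quantum exited with code 1. Parsl surfaces a non-zero exit from a bash_app as BashExitFailure, retries the task once (retries=1 in the Config shown earlier in this log), and then marks each dependent calibrate task as a dependency failure without ever running it. The following is a minimal, self-contained sketch of that behaviour, not the proc_lsst/ctrl_bps_parsl code; the app names fail and child are hypothetical stand-ins.

import parsl
from parsl import bash_app
from parsl.config import Config
from parsl.executors.threads import ThreadPoolExecutor

# Small local config with the same retry setting as the run above.
parsl.load(Config(executors=[ThreadPoolExecutor()], retries=1))

@bash_app
def fail(stdout='fail.stdout', stderr='fail.stderr'):
    return 'exit 1'            # stands in for a failing characterizeImage quantum

@bash_app
def child(parent, stdout='child.stdout', stderr='child.stderr'):
    return 'echo never runs'   # stands in for the dependent calibrate quantum

f = fail()
c = child(f)                   # passing the future makes child depend on fail
for fut in (f, c):
    try:
        fut.result()
    except Exception as exc:   # BashExitFailure for f, a dependency error for c
        print(type(exc).__name__, exc)

parsl.dfk().cleanup()          # mirrors the "DFK cleanup" lines in the log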
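
All 28 quanta in this graph ended as either failed (16) or dep_fail (12), so the root cause is not in this submission log but in the per-task stderr files whose paths are printed above. A hypothetical helper to skim them; the logs directory is taken verbatim from those "Standard error for task N available at ..." lines.

from pathlib import Path

LOG_ROOT = Path("/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/"
                "submit/DEEP/20190827/B0b/science#step1/20240313T064306Z/logs")

def tail_stderr(n_lines=20):
    # Print the last n_lines of every per-task stderr file (layout: logs/<task>/<exposure>/<uuid>_<task>_<exposure>_<detector>.stderr).
    for err in sorted(LOG_ROOT.glob("*/*/*.stderr")):
        print(f"===== {err.relative_to(LOG_ROOT)} =====")
        print("\n".join(err.read_text().splitlines()[-n_lines:]))

tail_stderr()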
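
The proc_lsst.multi provider keeps tabs on the Slurm blocks by polling squeue exactly as logged above (squeue --noheader --format='%i %t' --job ...) and translating the short state codes into Parsl job states (R -> RUNNING). A standalone sketch of that check; poll_slurm_jobs is a hypothetical helper, not the provider's actual method.

import subprocess

SLURM_STATE = {'PD': 'PENDING', 'R': 'RUNNING', 'CG': 'COMPLETING',
               'CD': 'COMPLETED', 'F': 'FAILED', 'CA': 'CANCELLED'}

def poll_slurm_jobs(job_ids):
    # Same squeue invocation as in the log, run once and parsed into a dict.
    cmd = ["squeue", "--noheader", "--format=%i %t", "--job", ",".join(job_ids)]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    states = {}
    for line in out.splitlines():
        job_id, code = line.split()
        states[job_id] = SLURM_STATE.get(code, code)
    return states

# e.g. poll_slurm_jobs(["17008089", "17008090", "17008091"])
# -> {'17008089': 'RUNNING', '17008090': 'RUNNING', '17008091': 'RUNNING'}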
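
Finally, a quick way to tally how the run ended directly from this transcript (assuming it was saved as submit.log, a hypothetical filename); the counts should line up with the DFK summary above: 16 failed, 12 dep_fail.

import re
from collections import Counter

text = open("submit.log").read()

failed = set(re.findall(r"Task (\d+) failed after \d+ retry attempts", text))
dep_fail = set(re.findall(r"Task (\d+) failed due to dependency failure", text))
by_app = Counter(re.findall(r"bash_app (\w+) failed with unix exit code", text))

print(f"failed: {len(failed)}, dep_fail: {len(dep_fail)}, per app: {dict(by_app)}")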