+ PROC_LSST_SUBMIT_PATH=/mmfs1/home/stevengs/dirac/DEEP/submit
+ bps submit /gscratch/dirac/shared/opt/proc_lsst/pipelines/submit.yaml -b /mmfs1/home/stevengs/dirac/DEEP/repo -i DEEP/20190504/A1c --output-run DEEP/20190504/A1c/science#step1/20240313T004552Z --qgraph pipeline.qgraph
lsst.ctrl.bps.drivers INFO: DISCLAIMER: All values regarding memory consumption reported below are approximate and may not accurately reflect actual memory usage by the bps process.
lsst.ctrl.bps.drivers INFO: Starting submission process
lsst.ctrl.bps.drivers INFO: Initializing execution environment
lsst.ctrl.bps.drivers INFO: Initializing execution environment completed: Took 1.6728 seconds; current memory usage: 0.178 Gibyte, delta: 0.041 Gibyte, peak delta: 0.054 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.191 Gibyte (main), 0.000 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: Starting acquire stage (generating and/or reading quantum graph)
lsst.ctrl.bps.pre_transform INFO: Copying quantum graph from 'pipeline.qgraph'
lsst.ctrl.bps.pre_transform INFO: Completed copying quantum graph: Took 0.0029 seconds
lsst.ctrl.bps.pre_transform INFO: Backing up quantum graph from '/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/pipeline.qgraph'
lsst.ctrl.bps.pre_transform INFO: Completed backing up quantum graph: Took 0.0058 seconds
lsst.ctrl.bps.pre_transform INFO: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/share/eups/Linux64/ctrl_mpexec/g1ce94f1343+c79f27626b/bin/pipetask --long-log --log-level=VERBOSE update-graph-run /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/pipeline_orig.qgraph DEEP/20190504/A1c/science#step1/20240313T004552Z /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/pipeline.qgraph
lsst.ctrl.bps.pre_transform INFO: Reading quantum graph from '/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/pipeline.qgraph'
lsst.ctrl.bps.pre_transform INFO: Completed reading quantum graph: Took 5.6264 seconds
lsst.ctrl.bps.drivers INFO: Acquire stage completed: Took 13.4032 seconds; current memory usage: 0.337 Gibyte, delta: 0.159 Gibyte, peak delta: 0.147 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.337 Gibyte (main), 0.328 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: Starting cluster stage (grouping quanta into jobs)
lsst.ctrl.bps.drivers INFO: Cluster stage completed: Took 0.0122 seconds; current memory usage: 0.337 Gibyte, delta: 0.000 Gibyte, peak delta: 0.000 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.337 Gibyte (main), 0.328 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: ClusteredQuantumGraph contains 39 cluster(s)
lsst.ctrl.bps.drivers INFO: Starting transform stage (creating generic workflow)
lsst.ctrl.bps.drivers INFO: Generic workflow name 'DEEP_20190504_A1c_science#step1_20240313T004552Z'
lsst.ctrl.bps.drivers INFO: Transform stage completed: Took 0.0389 seconds; current memory usage: 0.337 Gibyte, delta: 0.000 Gibyte, peak delta: 0.000 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.337 Gibyte (main), 0.328 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: GenericWorkflow contains 41 job(s) (including final)
lsst.ctrl.bps.drivers INFO: Starting prepare stage (creating specific implementation of workflow)
parsl.addresses ERROR: Ignoring failure to fetch address from interface eno2
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/addresses.py", line 111, in get_all_addresses
    s_addresses.add(address_by_interface(interface))
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/typeguard/__init__.py", line 1033, in wrapper
    retval = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/addresses.py", line 93, in address_by_interface
    return socket.inet_ntoa(fcntl.ioctl(
                            ^^^^^^^^^^^^
OSError: [Errno 99] Cannot assign requested address
lsst.ctrl.bps.parsl INFO: Writing workflow with ID=/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z
lsst.ctrl.bps.drivers INFO: Prepare stage completed: Took 0.1274 seconds; current memory usage: 0.338 Gibyte, delta: 0.001 Gibyte, peak delta: 0.001 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.338 Gibyte (main), 0.328 Gibyte (largest child process)
lsst.ctrl.bps.drivers INFO: Starting submit stage
lsst.ctrl.bps.submit INFO: Submitting run to a workflow management system for execution
parsl.dataflow.rundirs DEBUG: Parsl run initializing in rundir: runinfo/000
parsl.dataflow.dflow INFO: Starting DataFlowKernel with config Config(
    app_cache=True,
    checkpoint_files=None,
    checkpoint_mode='task_exit',
    checkpoint_period=None,
    executors=[MultiHighThroughputExecutor()],
    garbage_collect=True,
    initialize_logging=True,
    internal_tasks_max_threads=10,
    max_idletime=120.0,
    monitoring=None,
    retries=1,
    retry_handler=None,
    run_dir='runinfo',
    strategy='simple',
    usage_tracking=False
)
parsl.dataflow.dflow INFO: Parsl version: 2023.06.12
parsl.usage_tracking.usage DEBUG: Tracking status: False
parsl.dataflow.dflow INFO: Run id is: 0c855455-40cf-42d4-af99-a08f3024448f
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/typeguard/__init__.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/typeguard/__init__.py
parsl.dataflow.dflow DEBUG: Considering candidate for workflow name: /mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/share/eups/Linux64/ctrl_bps_parsl/g145af14111+5b908e21bc/python/lsst/ctrl/bps/parsl/workflow.py
parsl.dataflow.dflow DEBUG: Using workflow.py as workflow name
parsl.dataflow.memoization INFO: App caching initialized
parsl.dataflow.strategy DEBUG: Scaling strategy: simple
parsl.executors.high_throughput.executor DEBUG: Starting queue management thread
parsl.executors.high_throughput.executor DEBUG: queue management worker starting
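
Note on the parsl.addresses ERROR above: it is non-fatal ("Ignoring failure"). Parsl probes every network interface on the submit node to build the list of addresses handed to the worker pools with -a, and an interface without a usable IPv4 address (here eno2) raises OSError and is skipped. If that noise is unwanted, a Parsl executor can be pinned to a single address instead of probing. The following is only a minimal sketch with Parsl's public API; it is not the proc_lsst MultiHighThroughputExecutor configuration actually used in this run, and the label and provider choice are placeholders.

    # Hedged sketch: pin the interchange address so Parsl does not probe every
    # interface (avoids the "Ignoring failure to fetch address" message).
    from parsl.addresses import address_by_hostname
    from parsl.config import Config
    from parsl.executors import HighThroughputExecutor
    from parsl.providers import LocalProvider

    config = Config(
        executors=[
            HighThroughputExecutor(
                label="multi",                  # placeholder label, not proc_lsst's executor
                address=address_by_hostname(),  # one address instead of get_all_addresses()
                max_workers=2,                  # mirrors the --max_workers 2 seen for the local pool
                provider=LocalProvider(),
            )
        ],
        strategy="simple",
        retries=1,
    )
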
parsl.executors.high_throughput.executor DEBUG: Started queue management thread Submit dir: /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z parsl.executors.high_throughput.executor DEBUG: Created management thread: parsl.executors.high_throughput.executor DEBUG: Launch command: process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id={block_id} --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn parsl.executors.high_throughput.executor DEBUG: Starting HighThroughputExecutor with provider: parsl.executors.status_handling INFO: Scaling out by 1 blocks parsl.executors.status_handling INFO: Allocated block ID 0 parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-0 2024-03-13 00:46:33 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-0 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-0 2024-03-13 00:46:33 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 00:46:33 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 0 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 0 2024-03-13 00:46:33 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 00:46:33 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 
--logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 to local proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=0 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 to local parsl.providers.local.local DEBUG: Launching in remote mode 2024-03-13 00:46:33 proc_lsst.multi:170 [INFO] [multi] job_id 45623 proc_lsst.multi INFO: [multi] job_id 45623 2024-03-13 00:46:33 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:33 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 00:46:33 proc_lsst.multi:178 [INFO] [multi] provider local accepted submit and returned 45623 proc_lsst.multi INFO: [multi] provider local accepted submit and returned 45623 parsl.executors.status_handling DEBUG: Launched block 0 on executor multi with job ID 45623 parsl.dataflow.job_status_poller DEBUG: Adding executor multi parsl.dataflow.dflow DEBUG: Task 0 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 0 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 0 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 0 has memoization hash 511835a8acc02d39e3b6d1d2225bb1a4 parsl.dataflow.memoization INFO: Task 0 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59be980> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 0 try 0 launched on executor multi with executor id 1 parsl.dataflow.dflow INFO: Standard output for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stdout parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stderr parsl.dataflow.dflow DEBUG: Task 1 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 1 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 1 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr 
parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 1 has memoization hash 12f351a87d7e6897258f8bd74860e3af parsl.dataflow.memoization INFO: Task 1 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59beca0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 1 try 0 launched on executor multi with executor id 2 parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stdout parsl.dataflow.dflow INFO: Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stderr parsl.dataflow.dflow DEBUG: Task 2 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 2 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 2 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 2 has memoization hash e7aa3ba2175fddb779a27d0452be8cc6 parsl.dataflow.memoization INFO: Task 2 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59befc0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 2 try 0 launched on executor multi with executor id 3 parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stdout parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stderr parsl.dataflow.dflow DEBUG: Task 3 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 3 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 3 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 3 has memoization hash 8c01d02f1b63178e7b043bc26f1e657d parsl.dataflow.memoization INFO: Task 3 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bee80> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) 
parsl.dataflow.dflow INFO: Parsl task 3 try 0 launched on executor multi with executor id 4 parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stderr parsl.dataflow.dflow DEBUG: Task 4 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 4 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 4 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 4 has memoization hash 7ac17bf061ed83a782dd7ceca00ba027 parsl.dataflow.memoization INFO: Task 4 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bef20> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 4 try 0 launched on executor multi with executor id 5 parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stdout parsl.dataflow.dflow INFO: Standard error for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stderr parsl.dataflow.dflow DEBUG: Task 5 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 5 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 5 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 5 has memoization hash 35e6072f8e12c92b613d91f28ff6e335 parsl.dataflow.memoization INFO: Task 5 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bf100> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 5 try 0 launched on executor multi with executor id 6 parsl.dataflow.dflow INFO: Standard output for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stdout parsl.dataflow.dflow INFO: Standard error for 
task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stderr parsl.dataflow.dflow DEBUG: Task 6 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 6 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 6 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 6 has memoization hash 4ee1b38b81c097313c1a0968eaf59c68 parsl.dataflow.memoization INFO: Task 6 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bf2e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 6 try 0 launched on executor multi with executor id 7 parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stdout parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stderr parsl.dataflow.dflow DEBUG: Task 7 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 7 submitted for App calibrate, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 7 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 7 has memoization hash 1311e7b153c8c919a4c655b0827f90d9 parsl.dataflow.memoization INFO: Task 7 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59be340> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 7 try 0 launched on executor multi with executor id 8 parsl.dataflow.dflow INFO: Standard output for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stdout parsl.dataflow.dflow INFO: Standard error for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stderr parsl.dataflow.dflow DEBUG: Task 8 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 8 submitted for App calibrate, not 
waiting on any dependency parsl.dataflow.dflow DEBUG: Task 8 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 8 has memoization hash aa318d7095aa430df1e119729183fe2f parsl.dataflow.memoization INFO: Task 8 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59beb60> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 8 try 0 launched on executor multi with executor id 9 parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stdout parsl.dataflow.dflow INFO: Standard error for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stderr parsl.dataflow.dflow DEBUG: Task 9 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 9 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 9 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 9 has memoization hash c4cd801f9c64185dde7dd1142756316e parsl.dataflow.memoization INFO: Task 9 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59beac0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 9 try 0 launched on executor multi with executor id 10 parsl.dataflow.dflow INFO: Standard output for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stdout parsl.dataflow.dflow INFO: Standard error for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stderr parsl.dataflow.dflow DEBUG: Task 10 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 10 submitted for App calibrate, waiting on task 9 parsl.dataflow.dflow DEBUG: Task 10 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 10 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 11 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 11 submitted for App characterizeImage, not waiting on any 
dependency parsl.dataflow.dflow DEBUG: Task 11 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 11 has memoization hash cf7d32aa1f863d471ab92eb8df888da0 parsl.dataflow.memoization INFO: Task 11 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bf060> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 11 try 0 launched on executor multi with executor id 11 parsl.dataflow.dflow INFO: Standard output for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stdout parsl.dataflow.dflow INFO: Standard error for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stderr parsl.dataflow.dflow DEBUG: Task 12 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 12 submitted for App calibrate, waiting on task 11 parsl.dataflow.dflow DEBUG: Task 12 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 12 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 13 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 13 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 13 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 13 has memoization hash 6a9ce1f72afaeac33952c5e7ad7b2553 parsl.dataflow.memoization INFO: Task 13 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bfa60> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 13 try 0 launched on executor multi with executor id 12 parsl.dataflow.dflow INFO: Standard output for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stdout parsl.dataflow.dflow INFO: Standard error for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stderr parsl.dataflow.dflow DEBUG: Task 14 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 14 submitted for App calibrate, 
waiting on task 13 parsl.dataflow.dflow DEBUG: Task 14 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 14 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 15 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 15 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 15 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 15 has memoization hash 543e14b89f388ed808ff36e721483df2 parsl.dataflow.memoization INFO: Task 15 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bfba0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 15 try 0 launched on executor multi with executor id 13 parsl.dataflow.dflow INFO: Standard output for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stdout parsl.dataflow.dflow INFO: Standard error for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stderr parsl.dataflow.dflow DEBUG: Task 16 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 16 submitted for App calibrate, waiting on task 15 parsl.dataflow.dflow DEBUG: Task 16 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 16 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 17 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 17 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 17 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 17 has memoization hash 3d34f213e14b3d59740fb65a96697482 parsl.dataflow.memoization INFO: Task 17 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bfec0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 17 try 0 launched on executor multi with executor id 14 parsl.dataflow.dflow INFO: Standard output for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stdout parsl.dataflow.dflow INFO: Standard error for task 17 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stderr parsl.dataflow.dflow DEBUG: Task 18 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 18 submitted for App calibrate, waiting on task 17 parsl.dataflow.dflow DEBUG: Task 18 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 18 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 19 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 19 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 19 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 19 has memoization hash cfb31922038d574e4c7bf5532b3cefff parsl.dataflow.memoization INFO: Task 19 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a140e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 19 try 0 launched on executor multi with executor id 15 parsl.dataflow.dflow INFO: Standard output for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stdout parsl.dataflow.dflow INFO: Standard error for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stderr parsl.dataflow.dflow DEBUG: Task 20 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 20 submitted for App calibrate, waiting on task 19 parsl.dataflow.dflow DEBUG: Task 20 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 20 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 21 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 21 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 21 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 21 has memoization hash e5bff4b603a3fb4276ac94ffaccbec95 parsl.dataflow.memoization INFO: Task 21 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14180> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 21 try 0 launched on executor multi 
with executor id 16 parsl.dataflow.dflow INFO: Standard output for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stdout parsl.dataflow.dflow INFO: Standard error for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stderr parsl.dataflow.dflow DEBUG: Task 22 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 22 submitted for App calibrate, waiting on task 21 parsl.dataflow.dflow DEBUG: Task 22 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 22 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 23 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 23 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 23 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 23 has memoization hash 06b6c076abfa94b9413c6d6e7e064e18 parsl.dataflow.memoization INFO: Task 23 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a142c0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 23 try 0 launched on executor multi with executor id 17 parsl.dataflow.dflow INFO: Standard output for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stdout parsl.dataflow.dflow INFO: Standard error for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stderr parsl.dataflow.dflow DEBUG: Task 24 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 24 submitted for App calibrate, waiting on task 23 parsl.dataflow.dflow DEBUG: Task 24 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 24 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 25 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 25 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 25 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 25 has 
memoization hash bfdecf62681aa65ffe26b17739ad7ac2 parsl.dataflow.memoization INFO: Task 25 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a144a0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 25 try 0 launched on executor multi with executor id 18 parsl.dataflow.dflow INFO: Standard output for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stdout parsl.dataflow.dflow INFO: Standard error for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stderr parsl.dataflow.dflow DEBUG: Task 26 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 26 submitted for App calibrate, waiting on task 25 parsl.dataflow.dflow DEBUG: Task 26 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 26 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 27 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 27 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 27 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 27 has memoization hash 44db16090aa4aaf3a35355dc07aebd18 parsl.dataflow.memoization INFO: Task 27 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14680> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 27 try 0 launched on executor multi with executor id 19 parsl.dataflow.dflow INFO: Standard output for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stdout parsl.dataflow.dflow INFO: Standard error for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stderr parsl.dataflow.dflow DEBUG: Task 28 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 28 submitted for App calibrate, waiting on task 27 parsl.dataflow.dflow DEBUG: Task 28 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 28 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 29 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: 
Task 29 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 29 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 29 has memoization hash b84bc45020e63591ded62a69b711a669 parsl.dataflow.memoization INFO: Task 29 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14a40> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 29 try 0 launched on executor multi with executor id 20 parsl.dataflow.dflow INFO: Standard output for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stdout parsl.dataflow.dflow INFO: Standard error for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stderr parsl.dataflow.dflow DEBUG: Task 30 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 30 submitted for App calibrate, waiting on task 29 parsl.dataflow.dflow DEBUG: Task 30 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 30 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 31 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 31 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 31 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 31 has memoization hash a067278971b1cbc6f9ee7563d832fe8d parsl.dataflow.memoization INFO: Task 31 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14860> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 31 try 0 launched on executor multi with executor id 21 parsl.dataflow.dflow INFO: Standard output for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stdout parsl.dataflow.dflow INFO: Standard error for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stderr parsl.dataflow.dflow DEBUG: Task 32 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies 
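
As the entries above show, every characterizeImage task is launched immediately, while its matching calibrate task is held back ("waiting on task N ... has outstanding dependencies, so launch_if_ready skipping") until the characterizeImage AppFuture resolves. In plain Parsl the same ordering comes from passing one app's future into the next call; the sketch below is generic (the function bodies and return values are invented for illustration), not the ctrl_bps_parsl job wrapper.

    # Hedged sketch: calibrate() stays pending until the AppFuture returned by
    # characterize_image() completes, mirroring characterizeImage -> calibrate.
    import parsl
    from parsl import python_app
    from parsl.config import Config
    from parsl.executors import ThreadPoolExecutor

    parsl.load(Config(executors=[ThreadPoolExecutor(label="local")]))

    @python_app
    def characterize_image(exposure):
        return {"exposure": exposure, "psf": "measured"}

    @python_app
    def calibrate(char_result):
        return {"exposure": char_result["exposure"], "calibrated": True}

    char_future = characterize_image(855240)   # launched right away
    calib_future = calibrate(char_future)      # pending until char_future resolves
    print(calib_future.result())
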
parsl.dataflow.dflow INFO: Task 32 submitted for App calibrate, waiting on task 31 parsl.dataflow.dflow DEBUG: Task 32 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 32 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 33 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 33 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 33 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 33 has memoization hash c55044e0b9b4ddf47a2be94fdc8cf8ba parsl.dataflow.memoization INFO: Task 33 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14c20> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 33 try 0 launched on executor multi with executor id 22 parsl.dataflow.dflow INFO: Standard output for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stdout parsl.dataflow.dflow INFO: Standard error for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stderr parsl.dataflow.dflow DEBUG: Task 34 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 34 submitted for App calibrate, waiting on task 33 parsl.dataflow.dflow DEBUG: Task 34 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 34 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 35 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 35 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 35 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 35 has memoization hash 0bdca18ddf82aa9cb0ccc4e02c2c9e8e parsl.dataflow.memoization INFO: Task 35 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14e00> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 35 try 0 launched on executor multi with executor id 23 parsl.dataflow.dflow INFO: Standard output for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stdout parsl.dataflow.dflow INFO: 
Standard error for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stderr parsl.dataflow.dflow DEBUG: Task 36 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 36 submitted for App calibrate, waiting on task 35 parsl.dataflow.dflow DEBUG: Task 36 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 36 has outstanding dependencies, so launch_if_ready skipping parsl.dataflow.dflow DEBUG: Task 37 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 37 submitted for App characterizeImage, not waiting on any dependency parsl.dataflow.dflow DEBUG: Task 37 set to pending state with AppFuture: parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 37 has memoization hash 3bda4ff7e2ebbb3b265ec953a4b02846 parsl.dataflow.memoization INFO: Task 37 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14fe0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 37 try 0 launched on executor multi with executor id 24 parsl.dataflow.dflow INFO: Standard output for task 37 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stdout parsl.dataflow.dflow INFO: Standard error for task 37 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stderr parsl.dataflow.dflow DEBUG: Task 38 will be sent to executor multi parsl.dataflow.dflow DEBUG: Adding output dependencies parsl.dataflow.dflow INFO: Task 38 submitted for App calibrate, waiting on task 37 parsl.dataflow.dflow DEBUG: Task 38 set to pending state with AppFuture: parsl.dataflow.dflow DEBUG: Task 38 has outstanding dependencies, so launch_if_ready skipping 2024-03-13 00:46:34 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 1, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 1/0 running/pending blocks, and 0 connected workers parsl.dataflow.strategy DEBUG: Strategy case 2: slots are overloaded - (slot_ratio = active_slots/active_tasks) < parallelism parsl.dataflow.strategy DEBUG: Strategy case 2b: active_blocks 1 < max_blocks 312 so scaling out parsl.dataflow.strategy DEBUG: Requesting 5 more blocks parsl.executors.status_handling INFO: Scaling out by 5 blocks parsl.executors.status_handling INFO: Allocated block ID 1 
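
The "simple" strategy decision just above can be reproduced by hand: with 24 active tasks, 1 active slot, and the default parallelism of 1, the slot ratio 1/24 is below parallelism, so the executor is overloaded; because only 1 of up to 312 blocks is active, it scales out. The arithmetic below is an approximate reconstruction, not Parsl source, and slots_per_block = 5 is an assumption chosen only because it makes the numbers match the logged "Requesting 5 more blocks"; the real value depends on the proc_lsst executor configuration.

    # Hedged back-of-the-envelope reconstruction of the scale-out decision.
    import math

    active_tasks = 24      # "Executor multi has 24 active tasks"
    active_slots = 1       # "active_slots = 1"
    parallelism = 1.0      # Parsl default
    active_blocks = 1
    max_blocks = 312       # "active_blocks 1 < max_blocks 312"
    slots_per_block = 5    # assumed, to reproduce the logged request of 5 blocks

    overloaded = (active_slots / active_tasks) < parallelism               # True -> case 2
    excess_tasks = math.ceil(active_tasks * parallelism - active_slots)    # 23
    blocks_needed = math.ceil(excess_tasks / slots_per_block)              # 5
    blocks_to_request = min(blocks_needed, max_blocks - active_blocks)     # 5
    print(overloaded, blocks_to_request)                                   # True 5
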
parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-1 2024-03-13 00:46:34 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-1 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-1 2024-03-13 00:46:34 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 00:46:34 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:34 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 00:46:34 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 00:46:34 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 0 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 0 2024-03-13 00:46:34 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:34 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a 
169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=1 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 00:46:34 proc_lsst.multi:170 [INFO] [multi] job_id 17006648 proc_lsst.multi INFO: [multi] job_id 17006648 2024-03-13 00:46:34 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:34 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:34 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17006648 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17006648 parsl.executors.status_handling DEBUG: Launched block 1 on executor multi with job ID 17006648 parsl.executors.status_handling INFO: Allocated block ID 2 parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-2 2024-03-13 00:46:34 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-2 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-2 2024-03-13 00:46:34 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi 
--block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 00:46:34 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:34 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 00:46:34 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 00:46:34 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:34 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:34 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=2 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 00:46:34 proc_lsst.multi:170 [INFO] [multi] job_id 17006649 proc_lsst.multi INFO: [multi] job_id 17006649 2024-03-13 00:46:34 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 2 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 2 2024-03-13 00:46:34 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:34 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17006649 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17006649 parsl.executors.status_handling DEBUG: Launched block 2 on executor multi with job ID 17006649 parsl.executors.status_handling INFO: Allocated block ID 3 parsl.executors.status_handling 
DEBUG: Submitting to provider with job_name parsl.multi.block-3 2024-03-13 00:46:34 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-3 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-3 2024-03-13 00:46:34 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 00:46:34 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:34 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 00:46:34 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 00:46:34 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 2 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 2 2024-03-13 00:46:34 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:34 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a 
169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=3 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 00:46:35 proc_lsst.multi:170 [INFO] [multi] job_id 17006650 proc_lsst.multi INFO: [multi] job_id 17006650 2024-03-13 00:46:35 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 3 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 3 2024-03-13 00:46:35 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:35 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17006650 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17006650 parsl.executors.status_handling DEBUG: Launched block 3 on executor multi with job ID 17006650 parsl.executors.status_handling INFO: Allocated block ID 4 parsl.executors.status_handling DEBUG: Submitting to provider with job_name parsl.multi.block-4 2024-03-13 00:46:35 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-4 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-4 2024-03-13 00:46:35 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi 
--block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 00:46:35 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:35 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 00:46:35 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 00:46:35 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 3 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 3 2024-03-13 00:46:35 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:35 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=4 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 00:46:35 proc_lsst.multi:170 [INFO] [multi] job_id 17006651 proc_lsst.multi INFO: [multi] job_id 17006651 2024-03-13 00:46:35 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 4 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 4 2024-03-13 00:46:35 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:35 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17006651 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17006651 parsl.executors.status_handling DEBUG: Launched block 4 on executor multi with job ID 17006651 parsl.executors.status_handling INFO: Allocated block ID 5 parsl.executors.status_handling 
DEBUG: Submitting to provider with job_name parsl.multi.block-5 2024-03-13 00:46:35 proc_lsst.multi:153 [INFO] [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-5 proc_lsst.multi INFO: [multi] got submit process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 1 parsl.multi.block-5 2024-03-13 00:46:35 proc_lsst.multi:162 [INFO] [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 proc_lsst.multi INFO: [multi] local process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn --max_workers 2 2024-03-13 00:46:35 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 1 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 1 2024-03-13 00:46:35 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 1 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 1 2024-03-13 00:46:35 proc_lsst.multi:162 [INFO] [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn proc_lsst.multi INFO: [multi] astro process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn 2024-03-13 00:46:35 proc_lsst.multi:163 [INFO] [multi] len(self.providers[provider].resources) 4 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 4 2024-03-13 00:46:35 proc_lsst.multi:164 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:35 proc_lsst.multi:166 [INFO] [multi] submitting process_worker_pool.py -a 
169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro proc_lsst.multi INFO: [multi] submitting process_worker_pool.py -a 169.254.95.120,127.0.0.1,10.64.129.9,n3009,10.64.65.9,198.48.92.26 -p 0 -c 1.0 -m None --poll 10 --task_port=54335 --result_port=54097 --logdir=/mmfs1/gscratch/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/runinfo/000/multi --block_id=5 --hb_period=30 --hb_threshold=120 --cpu-affinity none --available-accelerators --start-method spawn to astro parsl.providers.slurm.slurm DEBUG: Requesting one block with 1 nodes parsl.providers.slurm.slurm DEBUG: Writing submit script parsl.providers.slurm.slurm DEBUG: moving files 2024-03-13 00:46:36 proc_lsst.multi:170 [INFO] [multi] job_id 17006652 proc_lsst.multi INFO: [multi] job_id 17006652 2024-03-13 00:46:36 proc_lsst.multi:171 [INFO] [multi] len(self.providers[provider].resources) 5 proc_lsst.multi INFO: [multi] len(self.providers[provider].resources) 5 2024-03-13 00:46:36 proc_lsst.multi:172 [INFO] [multi] self.providers[provider].max_blocks 30 proc_lsst.multi INFO: [multi] self.providers[provider].max_blocks 30 2024-03-13 00:46:36 proc_lsst.multi:178 [INFO] [multi] provider astro accepted submit and returned 17006652 proc_lsst.multi INFO: [multi] provider astro accepted submit and returned 17006652 parsl.executors.status_handling DEBUG: Launched block 5 on executor multi with job ID 17006652 parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 1/5 running/pending blocks, and 4 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 1/5 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 1/5 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread 
JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 1/5 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 1/5 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 2024-03-13 00:47:04 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:47:04 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:05 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:05 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 
17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:06 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:06 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread 
JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 0 try 0 failed parsl.dataflow.dflow INFO: Task 0 marked for retry parsl.dataflow.dflow INFO: Standard output for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stdout parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 0 has memoization hash 511835a8acc02d39e3b6d1d2225bb1a4 parsl.dataflow.memoization INFO: Task 0 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function 
.wrapper at 0x14a0a59be980> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 0 try 1 launched on executor multi with executor id 25 parsl.dataflow.dflow INFO: Standard output for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stdout parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 6 try 0 failed parsl.dataflow.dflow INFO: Task 6 marked for retry parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stdout parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 6 has memoization hash 4ee1b38b81c097313c1a0968eaf59c68 parsl.dataflow.memoization INFO: Task 6 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bf2e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 6 try 1 launched on executor multi with executor id 26 parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stdout parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 3 try 0 failed parsl.dataflow.dflow INFO: Task 3 marked for retry parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 3 has memoization hash 8c01d02f1b63178e7b043bc26f1e657d parsl.dataflow.memoization INFO: Task 3 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bee80> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 3 try 1 launched on executor multi with executor id 27 parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 5 try 0 failed parsl.dataflow.dflow INFO: Task 5 marked for retry parsl.dataflow.dflow INFO: Standard output for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stdout parsl.dataflow.dflow INFO: Standard error for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 5 has memoization hash 35e6072f8e12c92b613d91f28ff6e335 parsl.dataflow.memoization INFO: Task 5 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bf100> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 5 try 1 launched on executor multi with executor id 28 parsl.dataflow.dflow INFO: Standard output for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stdout parsl.dataflow.dflow INFO: Standard error for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stderr 2024-03-13 00:47:34 
proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:47:34 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:35 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:35 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:36 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 
with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:47:36 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 9 try 0 failed parsl.dataflow.dflow INFO: Task 9 marked for retry parsl.dataflow.dflow INFO: Standard output for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stdout parsl.dataflow.dflow INFO: Standard error for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 9 has memoization hash c4cd801f9c64185dde7dd1142756316e parsl.dataflow.memoization INFO: Task 9 had no result in cache parsl.executors.high_throughput.executor DEBUG: 
Pushing function .wrapper at 0x14a0a59beac0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 9 try 1 launched on executor multi with executor id 29 parsl.dataflow.dflow INFO: Standard output for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stdout parsl.dataflow.dflow INFO: Standard error for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 11 try 0 failed parsl.dataflow.dflow INFO: Task 11 marked for retry parsl.dataflow.dflow INFO: Standard output for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stdout parsl.dataflow.dflow INFO: Standard error for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 11 has memoization hash cf7d32aa1f863d471ab92eb8df888da0 parsl.dataflow.memoization INFO: Task 11 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bf060> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 11 try 1 launched on executor multi with executor id 30 parsl.dataflow.dflow INFO: Standard output for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stdout parsl.dataflow.dflow INFO: Standard error for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread 
JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 2 try 0 failed parsl.dataflow.dflow INFO: Task 2 marked for retry parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stdout parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 2 has memoization hash e7aa3ba2175fddb779a27d0452be8cc6 parsl.dataflow.memoization INFO: Task 2 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59befc0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 2 try 1 launched on executor multi with executor id 31 parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stdout parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 13 try 0 failed parsl.dataflow.dflow INFO: Task 13 marked for retry parsl.dataflow.dflow INFO: Standard output for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stdout parsl.dataflow.dflow INFO: Standard error for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 13 has memoization hash 6a9ce1f72afaeac33952c5e7ad7b2553 parsl.dataflow.memoization INFO: Task 13 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bfa60> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 13 try 1 launched on executor multi with executor id 32 parsl.dataflow.dflow INFO: 
Standard output for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stdout parsl.dataflow.dflow INFO: Standard error for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 1 try 0 failed parsl.dataflow.dflow INFO: Task 1 marked for retry parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stdout parsl.dataflow.dflow INFO: Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 1 has memoization hash 12f351a87d7e6897258f8bd74860e3af parsl.dataflow.memoization INFO: Task 1 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59beca0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 1 try 1 launched on executor multi with executor id 33 parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stdout parsl.dataflow.dflow INFO: Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 15 try 0 failed parsl.dataflow.dflow INFO: Task 15 marked for retry parsl.dataflow.dflow INFO: Standard output for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stdout parsl.dataflow.dflow INFO: Standard error for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr 
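These records trace the retry path: each failed quantum (try 0) is marked for retry and relaunched as try 1 with a fresh executor id, after recomputing its memoization hash with the stderr/stdout kwargs excluded so the log paths do not affect caching. The sketch below illustrates that retry-plus-memoization pattern under stated assumptions; the function names (memo_hash, run_with_retries, launch) are hypothetical and the hashing details do not match parsl's internals exactly.

    import hashlib
    import pickle

    IGNORED_KWARGS = {"stderr", "stdout"}   # the log shows these excluded from checkpointing

    def memo_hash(func_name, args, kwargs):
        """Hash the task's inputs, ignoring kwargs that only control log placement."""
        filtered = {k: v for k, v in kwargs.items() if k not in IGNORED_KWARGS}
        payload = pickle.dumps((func_name, args, sorted(filtered.items())))
        return hashlib.md5(payload).hexdigest()

    def run_with_retries(launch, func_name, args, kwargs, cache, retries=1):
        """Return a cached result if present; otherwise launch and retry on failure."""
        key = memo_hash(func_name, args, kwargs)
        if key in cache:                     # otherwise: "Task N had no result in cache"
            return cache[key]
        last_exc = None
        for attempt in range(retries + 1):   # try 0, then one retry when retries=1
            try:
                result = launch(*args, **kwargs)   # pushed to the executor queue
                cache[key] = result
                return result
            except Exception as exc:         # "Task N try 0 failed ... marked for retry"
                last_exc = exc
        raise last_exc

    # Example: a launch callable that fails once and then succeeds would be retried,
    # mirroring "Task N try 0 failed ... Parsl task N try 1 launched" in the log above.
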
parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 15 has memoization hash 543e14b89f388ed808ff36e721483df2 parsl.dataflow.memoization INFO: Task 15 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bfba0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 15 try 1 launched on executor multi with executor id 34 parsl.dataflow.dflow INFO: Standard output for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stdout parsl.dataflow.dflow INFO: Standard error for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 17 try 0 failed parsl.dataflow.dflow INFO: Task 17 marked for retry parsl.dataflow.dflow INFO: Standard output for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stdout parsl.dataflow.dflow INFO: Standard error for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 17 has memoization hash 3d34f213e14b3d59740fb65a96697482 parsl.dataflow.memoization INFO: Task 17 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bfec0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 17 try 1 launched on executor multi with executor id 35 parsl.dataflow.dflow INFO: Standard output for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stdout parsl.dataflow.dflow INFO: Standard error 
for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 19 try 0 failed parsl.dataflow.dflow INFO: Task 19 marked for retry parsl.dataflow.dflow INFO: Standard output for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stdout parsl.dataflow.dflow INFO: Standard error for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 19 has memoization hash cfb31922038d574e4c7bf5532b3cefff parsl.dataflow.memoization INFO: Task 19 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a140e0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 19 try 1 launched on executor multi with executor id 36 parsl.dataflow.dflow INFO: Standard output for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stdout parsl.dataflow.dflow INFO: Standard error for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 21 try 0 failed parsl.dataflow.dflow INFO: Task 21 marked for retry parsl.dataflow.dflow INFO: Standard output for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stdout parsl.dataflow.dflow INFO: Standard error for task 21 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 21 has memoization hash e5bff4b603a3fb4276ac94ffaccbec95 parsl.dataflow.memoization INFO: Task 21 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14180> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 21 try 1 launched on executor multi with executor id 37 parsl.dataflow.dflow INFO: Standard output for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stdout parsl.dataflow.dflow INFO: Standard error for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 4 try 0 failed parsl.dataflow.dflow INFO: Task 4 marked for retry parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stdout parsl.dataflow.dflow INFO: Standard error for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 4 has memoization hash 7ac17bf061ed83a782dd7ceca00ba027 parsl.dataflow.memoization INFO: Task 4 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59bef20> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 4 try 1 launched on executor multi with executor id 38 parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stdout parsl.dataflow.dflow INFO: Standard error for task 4 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 23 try 0 failed parsl.dataflow.dflow INFO: Task 23 marked for retry parsl.dataflow.dflow INFO: Standard output for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stdout parsl.dataflow.dflow INFO: Standard error for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 23 has memoization hash 06b6c076abfa94b9413c6d6e7e064e18 parsl.dataflow.memoization INFO: Task 23 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a142c0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 23 try 1 launched on executor multi with executor id 39 parsl.dataflow.dflow INFO: Standard output for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stdout parsl.dataflow.dflow INFO: Standard error for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 25 try 0 failed parsl.dataflow.dflow INFO: Task 25 marked for retry parsl.dataflow.dflow INFO: Standard output for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stdout parsl.dataflow.dflow INFO: Standard error for task 25 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 25 has memoization hash bfdecf62681aa65ffe26b17739ad7ac2 parsl.dataflow.memoization INFO: Task 25 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a144a0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 25 try 1 launched on executor multi with executor id 40 parsl.dataflow.dflow INFO: Standard output for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stdout parsl.dataflow.dflow INFO: Standard error for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 27 try 0 failed parsl.dataflow.dflow INFO: Task 27 marked for retry parsl.dataflow.dflow INFO: Standard output for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stdout parsl.dataflow.dflow INFO: Standard error for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 27 has memoization hash 44db16090aa4aaf3a35355dc07aebd18 parsl.dataflow.memoization INFO: Task 27 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14680> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 27 try 1 launched on executor multi with executor id 41 parsl.dataflow.dflow INFO: Standard output for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stdout parsl.dataflow.dflow INFO: Standard error for task 27 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 31 try 0 failed parsl.dataflow.dflow INFO: Task 31 marked for retry parsl.dataflow.dflow INFO: Standard output for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stdout parsl.dataflow.dflow INFO: Standard error for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 31 has memoization hash a067278971b1cbc6f9ee7563d832fe8d parsl.dataflow.memoization INFO: Task 31 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14860> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 31 try 1 launched on executor multi with executor id 42 parsl.dataflow.dflow INFO: Standard output for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stdout parsl.dataflow.dflow INFO: Standard error for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 29 try 0 failed parsl.dataflow.dflow INFO: Task 29 marked for retry parsl.dataflow.dflow INFO: Standard output for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stdout parsl.dataflow.dflow INFO: Standard error for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 29 has memoization hash b84bc45020e63591ded62a69b711a669 parsl.dataflow.memoization INFO: Task 29 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 
0x14a0a5a14a40> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 29 try 1 launched on executor multi with executor id 43 parsl.dataflow.dflow INFO: Standard output for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stdout parsl.dataflow.dflow INFO: Standard error for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 33 try 0 failed parsl.dataflow.dflow INFO: Task 33 marked for retry parsl.dataflow.dflow INFO: Standard output for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stdout parsl.dataflow.dflow INFO: Standard error for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 33 has memoization hash c55044e0b9b4ddf47a2be94fdc8cf8ba parsl.dataflow.memoization INFO: Task 33 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14c20> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 33 try 1 launched on executor multi with executor id 44 parsl.dataflow.dflow INFO: Standard output for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stdout parsl.dataflow.dflow INFO: Standard error for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stderr 2024-03-13 00:48:04 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:48:04 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl 
state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:05 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:05 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:06 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:06 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.app.errors DEBUG: Reraising 
exception of type parsl.dataflow.dflow DEBUG: Task 35 try 0 failed parsl.dataflow.dflow INFO: Task 35 marked for retry parsl.dataflow.dflow INFO: Standard output for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stdout parsl.dataflow.dflow INFO: Standard error for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 35 has memoization hash 0bdca18ddf82aa9cb0ccc4e02c2c9e8e parsl.dataflow.memoization INFO: Task 35 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14e00> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 35 try 1 launched on executor multi with executor id 45 parsl.dataflow.dflow INFO: Standard output for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stdout parsl.dataflow.dflow INFO: Standard error for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stderr parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24 parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 37 try 0 failed parsl.dataflow.dflow INFO: Task 37 marked for retry parsl.dataflow.dflow INFO: Standard output for task 37 available at 
/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 37 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stderr
parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout']
parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr
parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout
parsl.dataflow.memoization DEBUG: Task 37 has memoization hash 3bda4ff7e2ebbb3b265ec953a4b02846
parsl.dataflow.memoization INFO: Task 37 had no result in cache
parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a5a14fe0> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",)
parsl.dataflow.dflow INFO: Parsl task 37 try 1 launched on executor multi with executor id 46
parsl.dataflow.dflow INFO: Standard output for task 37 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 37 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stderr
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24
parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 24
parsl.dataflow.strategy DEBUG: Executor multi has 24 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 9 try 1 failed
parsl.dataflow.dflow ERROR: Task 9 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
    ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry c4cd801f9c64185dde7dd1142756316e with result from task 9
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Task 10 failed due to dependency failure
parsl.dataflow.dflow DEBUG: Task 10 try 0 failed
parsl.dataflow.dflow INFO: Task 10 failed due to dependency failure so skipping retries
parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855155/9d91b8cb-5974-4b65-b875-324d0ce12713_calibrate_855155_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 10 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855155/9d91b8cb-5974-4b65-b875-324d0ce12713_calibrate_855155_2.stderr
parsl.dataflow.dflow INFO: Standard output for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stdout
parsl.dataflow.dflow INFO: Standard error for task 9 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855155/2086e029-091d-4067-95a7-9c7b4cca13bd_characterizeImage_855155_2.stderr
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 23
parsl.dataflow.strategy DEBUG: Executor multi has 23 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 7 try 0 failed
parsl.dataflow.dflow
INFO: Task 7 marked for retry parsl.dataflow.dflow INFO: Standard output for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stdout parsl.dataflow.dflow INFO: Standard error for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 7 has memoization hash 1311e7b153c8c919a4c655b0827f90d9 parsl.dataflow.memoization INFO: Task 7 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59be340> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 7 try 1 launched on executor multi with executor id 47 parsl.dataflow.dflow INFO: Standard output for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stdout parsl.dataflow.dflow INFO: Standard error for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 11 try 1 failed parsl.dataflow.dflow ERROR: Task 11 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry cf7d32aa1f863d471ab92eb8df888da0 with result from task 11 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 12 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 12 try 0 failed parsl.dataflow.dflow INFO: Task 12 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/b980e855-01fb-4f5a-941a-57bbd00d763c_calibrate_855232_2.stdout parsl.dataflow.dflow INFO: Standard error for task 12 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/b980e855-01fb-4f5a-941a-57bbd00d763c_calibrate_855232_2.stderr parsl.dataflow.dflow INFO: Standard output for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stdout parsl.dataflow.dflow INFO: Standard error for task 11 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855232/347540aa-acb4-4849-9813-60eb3425dbad_characterizeImage_855232_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 22 parsl.dataflow.strategy DEBUG: Executor multi has 22 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 22 parsl.dataflow.strategy DEBUG: Executor multi has 22 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 13 try 1 failed parsl.dataflow.dflow ERROR: Task 13 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File 
"/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 6a9ce1f72afaeac33952c5e7ad7b2553 with result from task 13 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 14 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 14 try 0 failed parsl.dataflow.dflow INFO: Task 14 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/403ab666-45b7-4a27-bf7a-1e3c7969b9c5_calibrate_855233_2.stdout parsl.dataflow.dflow INFO: Standard error for task 14 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/403ab666-45b7-4a27-bf7a-1e3c7969b9c5_calibrate_855233_2.stderr parsl.dataflow.dflow INFO: Standard output for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stdout parsl.dataflow.dflow INFO: Standard error for task 13 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855233/6e2c2141-714d-4569-9ba9-525ba24a832e_characterizeImage_855233_2.stderr 2024-03-13 00:48:34 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:48:34 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating 
job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:35 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:35 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:36 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:48:36 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING 
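Each poll cycle above runs the same squeue command for the five Slurm blocks and maps the state code R to JobState.RUNNING. A rough sketch of that status check follows; the job IDs are placeholders and only a small set of state codes is translated, so this is an illustration of the squeue call shown in the log, not the SlurmProvider implementation itself.

# Sketch of the squeue poll logged above; the job IDs are examples, not live jobs.
import subprocess

def poll_slurm_jobs(job_ids):
    """Return {job_id: state_code} using the squeue invocation shown in the log."""
    cmd = "squeue --noheader --format='%i %t' --job '{}'".format(",".join(job_ids))
    out = subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout
    states = {}
    for line in out.splitlines():  # squeue prints one "<jobid> <code>" pair per line
        parts = line.split()
        if len(parts) == 2:
            states[parts[0]] = parts[1]
    return states

TRANSLATE = {"R": "RUNNING", "PD": "PENDING", "CG": "COMPLETING"}  # subset of Slurm state codes
for job, code in poll_slurm_jobs(["17006648", "17006649", "17006650"]).items():
    print(job, TRANSLATE.get(code, "UNKNOWN"))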
parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 21 parsl.dataflow.strategy DEBUG: Executor multi has 21 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 21 parsl.dataflow.strategy DEBUG: Executor multi has 21 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 21 parsl.dataflow.strategy DEBUG: Executor multi has 21 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 6 try 1 failed parsl.dataflow.dflow ERROR: Task 6 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1 parsl.dataflow.memoization 
DEBUG: Storing app cache entry 4ee1b38b81c097313c1a0968eaf59c68 with result from task 6 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stdout parsl.dataflow.dflow INFO: Standard error for task 6 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/77c764d1-8a76-4375-b57a-c3586164fc7d_calibrate_855242_61.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 20 parsl.dataflow.strategy DEBUG: Executor multi has 20 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 15 try 1 failed parsl.dataflow.dflow ERROR: Task 15 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 543e14b89f388ed808ff36e721483df2 with result from task 15 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 16 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 16 try 0 failed parsl.dataflow.dflow INFO: Task 16 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/a2d0caf5-fc48-4d71-9e68-62fbc76c14de_calibrate_855242_2.stdout parsl.dataflow.dflow INFO: Standard error for task 16 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855242/a2d0caf5-fc48-4d71-9e68-62fbc76c14de_calibrate_855242_2.stderr parsl.dataflow.dflow INFO: Standard output for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stdout parsl.dataflow.dflow INFO: Standard error for task 15 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855242/b137b096-0665-4f22-aacc-50f63e7957d6_characterizeImage_855242_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 3 try 1 failed parsl.dataflow.dflow ERROR: Task 3 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 8c01d02f1b63178e7b043bc26f1e657d with result from task 3 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stdout parsl.dataflow.dflow INFO: Standard error for task 3 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855232/8431b289-f9a9-4bbc-99eb-756a86d9f419_calibrate_855232_61.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 18 parsl.dataflow.strategy DEBUG: Executor multi has 18 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 0 try 1 failed parsl.dataflow.dflow ERROR: Task 0 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 511835a8acc02d39e3b6d1d2225bb1a4 with result from task 0 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stdout parsl.dataflow.dflow INFO: Standard error for task 0 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/46c0754b-7817-4760-9c47-d4bff5a7fcea_calibrate_855240_61.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 5 try 1 failed parsl.dataflow.dflow ERROR: Task 5 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 35e6072f8e12c92b613d91f28ff6e335 with result from task 5 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stdout parsl.dataflow.dflow INFO: Standard error for task 5 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855233/9cb72634-0004-4cf3-abd5-f5d161501a82_calibrate_855233_61.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 16 parsl.dataflow.strategy DEBUG: Executor multi has 16 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 17 try 1 failed parsl.dataflow.dflow ERROR: Task 17 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 3d34f213e14b3d59740fb65a96697482 with result from task 17 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 18 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 18 try 0 failed parsl.dataflow.dflow INFO: Task 18 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855156/3dc7987c-f84c-48e5-9ebd-39314ab84f67_calibrate_855156_2.stdout parsl.dataflow.dflow INFO: Standard error for task 18 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855156/3dc7987c-f84c-48e5-9ebd-39314ab84f67_calibrate_855156_2.stderr parsl.dataflow.dflow INFO: Standard output for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stdout parsl.dataflow.dflow INFO: Standard error for task 17 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855156/6fa5b72d-f0ed-4553-bdef-d175805bcce4_characterizeImage_855156_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 19 try 1 failed parsl.dataflow.dflow ERROR: Task 19 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry cfb31922038d574e4c7bf5532b3cefff with result from task 19 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 20 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 20 try 0 failed parsl.dataflow.dflow INFO: Task 20 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/4510d27d-362d-4f7a-bec8-e2f7a0945712_calibrate_855239_2.stdout parsl.dataflow.dflow INFO: Standard error for task 20 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/4510d27d-362d-4f7a-bec8-e2f7a0945712_calibrate_855239_2.stderr parsl.dataflow.dflow INFO: Standard output for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stdout parsl.dataflow.dflow INFO: Standard error for task 19 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855239/9aa0bc06-a00f-4190-9d5e-56a0bc89d175_characterizeImage_855239_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 21 try 1 failed parsl.dataflow.dflow ERROR: Task 21 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry e5bff4b603a3fb4276ac94ffaccbec95 with result from task 21 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 22 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 22 try 0 failed parsl.dataflow.dflow INFO: Task 22 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/317988cc-e255-45cd-8f5c-1e35bc40f88e_calibrate_855235_2.stdout parsl.dataflow.dflow INFO: Standard error for task 22 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/317988cc-e255-45cd-8f5c-1e35bc40f88e_calibrate_855235_2.stderr parsl.dataflow.dflow INFO: Standard output for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stdout parsl.dataflow.dflow INFO: Standard error for task 21 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855235/1a05a316-e1c2-484e-9bc6-793456bd113e_characterizeImage_855235_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 8 try 0 failed parsl.dataflow.dflow INFO: Task 8 marked for retry parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stdout parsl.dataflow.dflow INFO: Standard error for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stderr parsl.dataflow.memoization DEBUG: Ignoring these kwargs for checkpointing: ['stderr', 'stdout'] parsl.dataflow.memoization DEBUG: Ignoring kwarg stderr parsl.dataflow.memoization DEBUG: Ignoring kwarg stdout parsl.dataflow.memoization DEBUG: Task 8 has memoization hash aa318d7095aa430df1e119729183fe2f parsl.dataflow.memoization INFO: Task 8 had no result in cache parsl.executors.high_throughput.executor DEBUG: Pushing function .wrapper at 0x14a0a59beb60> to queue with args ("'${CTRL_MPEXEC_DIR}/bin/pipetask --long-log --log-level=VERBOSE run-qbb /mmfs1/home/stevengs/dirac/D...",) parsl.dataflow.dflow INFO: Parsl task 8 try 1 launched on executor multi with executor id 48 parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stdout parsl.dataflow.dflow INFO: Standard error for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stderr 2024-03-13 00:49:04 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:49:04 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro 
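[Note, not part of the captured log] The entries above only record that each failing bash_app (calibrate, characterizeImage) exited with unix code 1 and was given a single retry before being marked failed; the underlying pipetask error has to be read from the per-task .stderr files whose paths the DataFlowKernel prints. Below is a minimal, illustrative Python sketch for triaging those files in one pass. The submit directory and the logs/<task>/<exposure>/<uuid>_<task>_<exposure>_<detector>.stderr layout are taken from the paths printed above; the helper names and the tail length are assumptions, not anything the pipeline itself provides.

#!/usr/bin/env python3
# Illustrative triage helper (not produced by bps or parsl): walk the submit
# directory's logs/ tree and print the tail of every *.stderr file that looks
# like it contains an error, so the real calibrate/characterizeImage failures
# behind "BashExitFailure ... exit code 1" can be read in one place.
from pathlib import Path

# Taken from the stderr/stdout paths logged above; adjust for another run.
SUBMIT_DIR = Path(
    "/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/"
    "science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/"
    "science#step1/20240313T004552Z"
)

def tail(path: Path, n: int = 20) -> str:
    """Return the last n lines of a text file (best effort)."""
    try:
        lines = path.read_text(errors="replace").splitlines()
    except OSError as exc:
        return f"<could not read {path}: {exc}>"
    return "\n".join(lines[-n:])

def main() -> None:
    for path in sorted((SUBMIT_DIR / "logs").rglob("*.stderr")):
        text = tail(path)
        # Only surface files that actually mention an error or traceback.
        if "Error" in text or "Traceback" in text:
            print(f"==> {path.relative_to(SUBMIT_DIR)} <==")
            print(text)
            print()

if __name__ == "__main__":
    main()

[End of note; the captured log continues below.]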
parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:49:05 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:49:05 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:49:06 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state 
JobState.RUNNING 2024-03-13 00:49:06 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 13 parsl.dataflow.strategy DEBUG: Executor multi has 13 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 25 try 1 failed parsl.dataflow.dflow ERROR: Task 25 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry bfdecf62681aa65ffe26b17739ad7ac2 with result from task 25 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 26 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 26 try 0 failed parsl.dataflow.dflow INFO: Task 26 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855237/dde0a230-fd23-40b0-afea-38c251cdbe68_calibrate_855237_2.stdout parsl.dataflow.dflow INFO: Standard error for task 26 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855237/dde0a230-fd23-40b0-afea-38c251cdbe68_calibrate_855237_2.stderr parsl.dataflow.dflow INFO: Standard output for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stdout parsl.dataflow.dflow INFO: Standard error for task 25 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855237/3964ae79-23a7-439d-900c-c00b7d8f7d45_characterizeImage_855237_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 23 try 1 failed parsl.dataflow.dflow ERROR: Task 23 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 06b6c076abfa94b9413c6d6e7e064e18 with result from task 23 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 24 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 24 try 0 failed parsl.dataflow.dflow INFO: Task 24 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855231/cee351f9-b25d-4bdd-8227-4fa3f5d03e80_calibrate_855231_2.stdout parsl.dataflow.dflow INFO: Standard error for task 24 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855231/cee351f9-b25d-4bdd-8227-4fa3f5d03e80_calibrate_855231_2.stderr parsl.dataflow.dflow INFO: Standard output for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stdout parsl.dataflow.dflow INFO: Standard error for task 23 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855231/0006b709-d908-4273-8054-8b0ac74bbaa1_characterizeImage_855231_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 27 try 1 failed parsl.dataflow.dflow ERROR: Task 27 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 44db16090aa4aaf3a35355dc07aebd18 with result from task 27 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. 
Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 28 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 28 try 0 failed parsl.dataflow.dflow INFO: Task 28 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 28 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855234/7875e5ea-eb42-4b03-b3b6-157f98419e3b_calibrate_855234_2.stdout parsl.dataflow.dflow INFO: Standard error for task 28 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855234/7875e5ea-eb42-4b03-b3b6-157f98419e3b_calibrate_855234_2.stderr parsl.dataflow.dflow INFO: Standard output for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stdout parsl.dataflow.dflow INFO: Standard error for task 27 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855234/6020d23d-ca0d-4fc1-919a-75bb54614ea8_characterizeImage_855234_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 10 parsl.dataflow.strategy DEBUG: Executor multi has 10 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 31 try 1 failed parsl.dataflow.dflow ERROR: Task 31 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise 
pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry a067278971b1cbc6f9ee7563d832fe8d with result from task 31 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 32 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 32 try 0 failed parsl.dataflow.dflow INFO: Task 32 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 32 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/f2062f9f-f690-4838-bace-e9fab971105b_calibrate_855238_2.stdout parsl.dataflow.dflow INFO: Standard error for task 32 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/f2062f9f-f690-4838-bace-e9fab971105b_calibrate_855238_2.stderr parsl.dataflow.dflow INFO: Standard output for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stdout parsl.dataflow.dflow INFO: Standard error for task 31 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855238/4605cee3-743a-493d-83f5-fdf545d07467_characterizeImage_855238_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 9 parsl.dataflow.strategy DEBUG: Executor multi has 9 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 3: no changes necessary to current block load parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 29 try 1 failed parsl.dataflow.dflow ERROR: Task 29 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File 
"/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry b84bc45020e63591ded62a69b711a669 with result from task 29 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 30 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 30 try 0 failed parsl.dataflow.dflow INFO: Task 30 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 30 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/e958baae-357f-462b-8662-ebd9dac8f862_calibrate_855240_2.stdout parsl.dataflow.dflow INFO: Standard error for task 30 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855240/e958baae-357f-462b-8662-ebd9dac8f862_calibrate_855240_2.stderr parsl.dataflow.dflow INFO: Standard output for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stdout parsl.dataflow.dflow INFO: Standard error for task 29 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855240/a66aed64-17b6-4dfc-ab12-497daf3266cd_characterizeImage_855240_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 33 try 1 failed parsl.dataflow.dflow ERROR: Task 33 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File 
"/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry c55044e0b9b4ddf47a2be94fdc8cf8ba with result from task 33 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 34 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 34 try 0 failed parsl.dataflow.dflow INFO: Task 34 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 34 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/6082ac95-d408-4089-afe3-79945974ed2a_calibrate_855154_2.stdout parsl.dataflow.dflow INFO: Standard error for task 34 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/6082ac95-d408-4089-afe3-79945974ed2a_calibrate_855154_2.stderr parsl.dataflow.dflow INFO: Standard output for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stdout parsl.dataflow.dflow INFO: Standard error for task 33 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855154/4f4a56ad-0482-49cf-afc8-b2c863ede055_characterizeImage_855154_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 35 try 1 failed parsl.dataflow.dflow ERROR: Task 35 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ 
parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 0bdca18ddf82aa9cb0ccc4e02c2c9e8e with result from task 35 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 36 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 36 try 0 failed parsl.dataflow.dflow INFO: Task 36 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 36 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855241/c6f2a0f6-7676-4f02-bf11-2c3584088aac_calibrate_855241_2.stdout parsl.dataflow.dflow INFO: Standard error for task 36 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855241/c6f2a0f6-7676-4f02-bf11-2c3584088aac_calibrate_855241_2.stderr parsl.dataflow.dflow INFO: Standard output for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stdout parsl.dataflow.dflow INFO: Standard error for task 35 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855241/d8762014-da79-4bd3-963c-08e730b4ba96_characterizeImage_855241_2.stderr parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 37 try 1 failed parsl.dataflow.dflow ERROR: Task 37 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app characterizeImage failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry 3bda4ff7e2ebbb3b265ec953a4b02846 with result from task 37 parsl.dataflow.dflow WARNING: No 
tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Task 38 failed due to dependency failure parsl.dataflow.dflow DEBUG: Task 38 try 0 failed parsl.dataflow.dflow INFO: Task 38 failed due to dependency failure so skipping retries parsl.dataflow.memoization ERROR: Attempting to update app cache entry but hashsum is not a string key parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 38 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/d50282be-f791-4abe-8d8d-446a8f197d35_calibrate_855236_2.stdout parsl.dataflow.dflow INFO: Standard error for task 38 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/d50282be-f791-4abe-8d8d-446a8f197d35_calibrate_855236_2.stderr parsl.dataflow.dflow INFO: Standard output for task 37 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stdout parsl.dataflow.dflow INFO: Standard error for task 37 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/characterizeImage/855236/d0f18884-6a7f-417a-98c5-2ab6d11b2f68_characterizeImage_855236_2.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 5 parsl.dataflow.strategy DEBUG: Executor multi has 5 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.app.errors DEBUG: Reraising exception of type parsl.dataflow.dflow DEBUG: Task 2 try 1 failed parsl.dataflow.dflow ERROR: Task 2 failed after 1 retry attempts Traceback (most recent call last): File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update res = self._unwrap_remote_exception_wrapper(future) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper result.reraise() File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise reraise(t, v, v.__traceback__) File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise raise value File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^ File 
"/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor raise pe.BashExitFailure(func_name, proc.returncode) ^^^^^^^^^^^^^^^^^ parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1 parsl.dataflow.memoization DEBUG: Storing app cache entry e7aa3ba2175fddb779a27d0452be8cc6 with result from task 2 parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stdout parsl.dataflow.dflow INFO: Standard error for task 2 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855236/efd8283a-61fc-45e6-94ab-003aa8739b27_calibrate_855236_61.stderr parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 4 parsl.dataflow.strategy DEBUG: Executor multi has 4 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 4 parsl.dataflow.strategy DEBUG: Executor multi has 4 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 2024-03-13 00:49:34 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:49:34 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:49:34 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: 
found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:49:35 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:49:35 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:49:35 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with 
slurm status R to parsl state JobState.RUNNING
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 4
parsl.dataflow.strategy DEBUG: Executor multi has 4 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 4
parsl.dataflow.strategy DEBUG: Executor multi has 4 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 1 try 1 failed
parsl.dataflow.dflow ERROR: Task 1 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
    ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 12f351a87d7e6897258f8bd74860e3af with result from task 1
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
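Every failed calibrate/characterizeImage task above gets a per-task stdout/stderr pair under the submit directory, and the stderr file is where the underlying pipetask error actually appears. A minimal sketch for skimming the tails of those files, assuming only the submit path printed in the log and the Python standard library (the glob pattern simply mirrors the log paths above):

    # Skim the last lines of every calibrate stderr file from this run.
    # Directory layout (logs/<label>/<group>/<uuid>_<label>_<group>_<index>.stderr)
    # is taken from the paths printed in the log above.
    from pathlib import Path

    SUBMIT = Path(
        "/mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/"
        "submit/DEEP/20190504/A1c/science#step1/20240313T004552Z"
    )

    for stderr_file in sorted(SUBMIT.glob("logs/calibrate/*/*.stderr")):
        print(f"==> {stderr_file.relative_to(SUBMIT)}")
        for line in stderr_file.read_text().splitlines()[-5:]:
            print("   ", line)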
parsl.dataflow.dflow INFO: Standard output for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stdout
parsl.dataflow.dflow INFO: Standard error for task 1 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855238/cfa4072c-df06-4208-872e-34ec571e2d12_calibrate_855238_61.stderr
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 3
parsl.dataflow.strategy DEBUG: Executor multi has 3 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 3
parsl.dataflow.strategy DEBUG: Executor multi has 3 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 3
parsl.dataflow.strategy DEBUG: Executor multi has 3 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 3
parsl.dataflow.strategy DEBUG: Executor multi has 3 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 4 try 1 failed
parsl.dataflow.dflow ERROR: Task 4 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
    ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 7ac17bf061ed83a782dd7ceca00ba027 with result from task 4
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Standard output for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stdout
parsl.dataflow.dflow INFO: Standard error for task 4 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855239/afabbcb2-21b2-42ac-9a80-a7ca4dd3e01f_calibrate_855239_61.stderr
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 7 try 1 failed
parsl.dataflow.dflow ERROR: Task 7 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
    ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry 1311e7b153c8c919a4c655b0827f90d9 with result from task 7
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run.
Please ensure caching is enabled parsl.dataflow.dflow INFO: Standard output for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stdout parsl.dataflow.dflow INFO: Standard error for task 7 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855235/ee291926-868e-4af4-9a7c-688cd03d4a89_calibrate_855235_61.stderr 2024-03-13 00:50:04 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:50:04 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:50:04 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:50:04 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 
00:50:05 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:50:05 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks 
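The repeated "Strategy case 4b" lines are Parsl's simple scaling strategy comparing the number of active tasks against the slots offered by the running blocks; with 6 running blocks and only a handful of live tasks it takes no action, and this strategy never scales in. A toy sketch of that comparison follows; it is an illustration only, not the actual parsl.dataflow.strategy code, and the slot arithmetic is deliberately simplified:

    # Toy version of the decision logged above: with more slots than active
    # tasks, the 'simple' strategy neither requests new blocks nor scales in.
    def simple_strategy(active_tasks: int, running_blocks: int, pending_blocks: int,
                        slots_per_block: int = 1) -> str:
        active_slots = (running_blocks + pending_blocks) * slots_per_block
        if active_tasks > active_slots:
            return "scale out: request more blocks"
        return "case 4b: more slots than tasks -> no action (no scale-in support)"

    # Mirrors the log: 6/0 running/pending blocks, between 1 and 5 active tasks.
    print(simple_strategy(active_tasks=1, running_blocks=6, pending_blocks=0))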
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1 parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks parsl.dataflow.strategy DEBUG: This strategy does not support scaling in parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104 2024-03-13 00:50:34 proc_lsst.multi:146 [INFO] found job 45623 in provider local proc_lsst.multi INFO: found job 45623 in provider local 2024-03-13 00:50:34 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro proc_lsst.multi INFO: found job 17006648 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:50:34 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro proc_lsst.multi INFO: found job 17006649 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R 
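Each status poll above shells out to squeue once for the whole set of block job IDs and maps the short Slurm state code back to a coarse Parsl job state. A rough stand-alone sketch of that poll (not the actual parsl.providers.slurm implementation; the state table covers only common codes):

    # Poll a set of Slurm jobs the way the log shows: one squeue call, then map
    # the two-letter state code to a coarse state name.
    import subprocess

    SLURM_STATES = {"R": "RUNNING", "PD": "PENDING", "CG": "COMPLETING",
                    "CD": "COMPLETED", "F": "FAILED", "CA": "CANCELLED"}

    def poll(job_ids):
        out = subprocess.run(
            ["squeue", "--noheader", "--format=%i %t", "--job", ",".join(job_ids)],
            capture_output=True, text=True, check=True,
        ).stdout
        return {jid: SLURM_STATES.get(code, "UNKNOWN")
                for jid, code in (line.split() for line in out.splitlines())}

    # e.g. poll(["17006648", "17006649", "17006650", "17006651", "17006652"])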
parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:50:34 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro proc_lsst.multi INFO: found job 17006650 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:50:35 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro proc_lsst.multi INFO: found job 17006651 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING 2024-03-13 00:50:35 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro proc_lsst.multi INFO: found job 17006652 in provider astro parsl.providers.slurm.slurm DEBUG: Executing squeue --noheader --format='%i %t' --job '17006648,17006649,17006650,17006651,17006652' parsl.providers.slurm.slurm DEBUG: squeue returned 17006650 R 17006651 R 17006652 R 17006648 R 17006649 R parsl.providers.slurm.slurm DEBUG: Updating job 17006650 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006651 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006652 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006648 with slurm status R to parsl state JobState.RUNNING parsl.providers.slurm.slurm DEBUG: Updating job 17006649 with slurm status R to parsl state JobState.RUNNING parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors parsl.dataflow.strategy DEBUG: Strategizing for executor multi parsl.dataflow.strategy DEBUG: Slot ratio calculation: 
active_slots = 6, active_tasks = 1
parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1
parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.dataflow.strategy DEBUG: general strategy starting with strategy_type simple for 1 executors
parsl.dataflow.strategy DEBUG: Strategizing for executor multi
parsl.dataflow.strategy DEBUG: Slot ratio calculation: active_slots = 6, active_tasks = 1
parsl.dataflow.strategy DEBUG: Executor multi has 1 active tasks, 6/0 running/pending blocks, and 7 connected workers
parsl.dataflow.strategy DEBUG: Strategy case 4b: more slots than tasks
parsl.dataflow.strategy DEBUG: This strategy does not support scaling in
parsl.process_loggers DEBUG: Normal ending for _general_strategy on thread JobStatusPoller-Timer-Thread-22680214849104
parsl.app.errors DEBUG: Reraising exception of type
parsl.dataflow.dflow DEBUG: Task 8 try 1 failed
parsl.dataflow.dflow ERROR: Task 8 failed after 1 retry attempts
Traceback (most recent call last):
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 300, in handle_exec_update
    res = self._unwrap_remote_exception_wrapper(future)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/dataflow/dflow.py", line 566, in _unwrap_remote_exception_wrapper
    result.reraise()
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 123, in reraise
    reraise(t, v, v.__traceback__)
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/six.py", line 719, in reraise
    raise value
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/errors.py", line 146, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^
  File "/mmfs1/gscratch/dirac/shared/opt/conda/envs/lsst-scipipe-8.0.0/lib/python3.11/site-packages/parsl/app/bash.py", line 86, in remote_side_bash_executor
    raise pe.BashExitFailure(func_name, proc.returncode)
    ^^^^^^^^^^^^^^^^^
parsl.app.errors.BashExitFailure: bash_app calibrate failed with unix exit code 1
parsl.dataflow.memoization DEBUG: Storing app cache entry aa318d7095aa430df1e119729183fe2f with result from task 8
parsl.dataflow.dflow INFO: DFK cleanup initiated
parsl.dataflow.dflow INFO: Summary of tasks in DFK:
parsl.dataflow.dflow INFO: Tasks in state States.unsched: 0
parsl.dataflow.dflow INFO: Tasks in state States.pending: 0
parsl.dataflow.dflow INFO: Tasks in state States.running: 0
parsl.dataflow.dflow INFO: Tasks in state States.exec_done: 0
parsl.dataflow.dflow INFO: Tasks in state States.failed: 24
parsl.dataflow.dflow INFO: Tasks in state States.dep_fail: 15
parsl.dataflow.dflow INFO: Tasks in state States.launched: 0
parsl.dataflow.dflow INFO: Tasks in state States.fail_retryable: 0
parsl.dataflow.dflow INFO: Tasks in state States.memo_done: 0
parsl.dataflow.dflow INFO: Tasks in state States.joining: 0
parsl.dataflow.dflow INFO: Tasks in state States.running_ended: 0
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: End of summary
parsl.dataflow.dflow INFO: Standard output for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stdout
parsl.dataflow.dflow INFO: Standard error for task 8 available at /mmfs1/home/stevengs/dirac/DEEP/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/submit/DEEP/20190504/A1c/science#step1/20240313T004552Z/logs/calibrate/855154/cceb2f44-be57-4f20-9192-4ebc5906f4c0_calibrate_855154_61.stderr
parsl.dataflow.dflow WARNING: No tasks checkpointed so far in this run. Please ensure caching is enabled
parsl.dataflow.dflow INFO: Closing job status poller
parsl.dataflow.dflow INFO: Terminated job status poller
parsl.dataflow.dflow INFO: Scaling in and shutting down executors
parsl.dataflow.dflow INFO: Scaling in executor multi
parsl.executors.high_throughput.executor DEBUG: Scale in called, blocks=6, block_ids=[]
parsl.executors.high_throughput.executor DEBUG: Scale in selecting from 6 blocks
parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: 004d5f121c6f
parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: 004d5f121c6f
parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: d4b7eaa637cd
parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: d4b7eaa637cd
parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: eebb9440fbc6
parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: eebb9440fbc6
parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: 33202cc66885
parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: 33202cc66885
parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: 5982611b4e35
parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: 5982611b4e35
parsl.executors.high_throughput.executor DEBUG: Sending hold to manager: dfc1db01270b
parsl.executors.high_throughput.executor DEBUG: Sent hold request to manager: dfc1db01270b
2024-03-13 00:50:47 proc_lsst.multi:146 [INFO] found job 45623 in provider local
proc_lsst.multi INFO: found job 45623 in provider local
2024-03-13 00:50:47 proc_lsst.multi:201 [INFO] cancelling 45623 on provider local
proc_lsst.multi INFO: cancelling 45623 on provider local
parsl.providers.local.local DEBUG: Terminating job/proc_id: 45623
2024-03-13 00:50:47 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro
proc_lsst.multi INFO: found job 17006651 in provider astro
2024-03-13 00:50:47 proc_lsst.multi:201 [INFO] cancelling 17006651 on provider astro
proc_lsst.multi INFO: cancelling 17006651 on provider astro
2024-03-13 00:50:47 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro
proc_lsst.multi INFO: found job 17006648 in provider astro
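The summary accounts for every payload task: 24 failed outright after their single retry (the run's Config has retries=1, matching the "failed after 1 retry attempts" messages) and the other 15 were marked dep_fail because an upstream characterizeImage or calibrate task had already failed, so nothing reached exec_done. That is also consistent with the repeated "No tasks checkpointed" warning, since there were no successful results to checkpoint. If this console output has been saved to a file (the filename below is only a placeholder), the same split can be re-derived with two regexes:

    # Re-derive the failed / dep_fail split from a saved copy of this log.
    # "bps_submit.log" is a placeholder for wherever this output was captured.
    import re

    text = open("bps_submit.log", encoding="utf-8").read()
    failed = set(re.findall(r"Task (\d+) failed after \d+ retry attempts", text))
    dep_fail = set(re.findall(r"Task (\d+) failed due to dependency failure", text))
    print(f"failed: {len(failed)}, dep_fail: {len(dep_fail)}, total: {len(failed | dep_fail)}")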
2024-03-13 00:50:47 proc_lsst.multi:201 [INFO] cancelling 17006648 on provider astro
proc_lsst.multi INFO: cancelling 17006648 on provider astro
2024-03-13 00:50:47 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro
proc_lsst.multi INFO: found job 17006649 in provider astro
2024-03-13 00:50:47 proc_lsst.multi:201 [INFO] cancelling 17006649 on provider astro
proc_lsst.multi INFO: cancelling 17006649 on provider astro
2024-03-13 00:50:47 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro
proc_lsst.multi INFO: found job 17006652 in provider astro
2024-03-13 00:50:47 proc_lsst.multi:201 [INFO] cancelling 17006652 on provider astro
proc_lsst.multi INFO: cancelling 17006652 on provider astro
2024-03-13 00:50:47 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro
proc_lsst.multi INFO: found job 17006650 in provider astro
2024-03-13 00:50:47 proc_lsst.multi:201 [INFO] cancelling 17006650 on provider astro
proc_lsst.multi INFO: cancelling 17006650 on provider astro
parsl.dataflow.dflow INFO: Shutting down executor multi
2024-03-13 00:50:48 proc_lsst.multi:40 [INFO] Cancelling all provider resources
proc_lsst.multi INFO: Cancelling all provider resources
2024-03-13 00:50:48 proc_lsst.multi:47 [INFO] new jobs since last cancel ['17006648', '17006649', '17006650', '17006651', '17006652', '45623']
proc_lsst.multi INFO: new jobs since last cancel ['17006648', '17006649', '17006650', '17006651', '17006652', '45623']
2024-03-13 00:50:48 proc_lsst.multi:146 [INFO] found job 17006648 in provider astro
proc_lsst.multi INFO: found job 17006648 in provider astro
2024-03-13 00:50:48 proc_lsst.multi:201 [INFO] cancelling 17006648 on provider astro
proc_lsst.multi INFO: cancelling 17006648 on provider astro
2024-03-13 00:50:48 proc_lsst.multi:146 [INFO] found job 17006649 in provider astro
proc_lsst.multi INFO: found job 17006649 in provider astro
2024-03-13 00:50:48 proc_lsst.multi:201 [INFO] cancelling 17006649 on provider astro
proc_lsst.multi INFO: cancelling 17006649 on provider astro
2024-03-13 00:50:48 proc_lsst.multi:146 [INFO] found job 17006650 in provider astro
proc_lsst.multi INFO: found job 17006650 in provider astro
2024-03-13 00:50:48 proc_lsst.multi:201 [INFO] cancelling 17006650 on provider astro
proc_lsst.multi INFO: cancelling 17006650 on provider astro
2024-03-13 00:50:48 proc_lsst.multi:146 [INFO] found job 17006651 in provider astro
proc_lsst.multi INFO: found job 17006651 in provider astro
2024-03-13 00:50:48 proc_lsst.multi:201 [INFO] cancelling 17006651 on provider astro
proc_lsst.multi INFO: cancelling 17006651 on provider astro
2024-03-13 00:50:48 proc_lsst.multi:146 [INFO] found job 17006652 in provider astro
proc_lsst.multi INFO: found job 17006652 in provider astro
2024-03-13 00:50:48 proc_lsst.multi:201 [INFO] cancelling 17006652 on provider astro
proc_lsst.multi INFO: cancelling 17006652 on provider astro
2024-03-13 00:50:48 proc_lsst.multi:146 [INFO] found job 45623 in provider local
proc_lsst.multi INFO: found job 45623 in provider local
2024-03-13 00:50:48 proc_lsst.multi:201 [INFO] cancelling 45623 on provider local
proc_lsst.multi INFO: cancelling 45623 on provider local
parsl.providers.local.local DEBUG: Terminating job/proc_id: 45623
parsl.providers.local.local WARNING: Failed to kill PID: 45623 and child processes on local
2024-03-13 00:50:49 proc_lsst.multi:50 [INFO] no new jobs since last cancel, resuming executor shutdown
proc_lsst.multi INFO: no new jobs since last cancel, resuming executor shutdown
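The shutdown above walks both providers: the five Slurm blocks on the astro provider are cancelled and the local block (PID 45623) is terminated, with the local kill reported as failed. Roughly the by-hand equivalent, should a run ever leave blocks behind (job IDs and PID are the ones from this run; this is a sketch, not part of the pipeline):

    # Manual cleanup of the compute blocks from this run: cancel the Slurm jobs
    # and terminate the local block's process if it is still alive.
    import os
    import signal
    import subprocess

    slurm_jobs = ["17006648", "17006649", "17006650", "17006651", "17006652"]
    subprocess.run(["scancel", *slurm_jobs], check=False)

    try:
        os.kill(45623, signal.SIGTERM)   # local provider block; may already be gone
    except ProcessLookupError:
        pass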
parsl.executors.high_throughput.executor INFO: Attempting HighThroughputExecutor shutdown
parsl.executors.high_throughput.executor INFO: Finished HighThroughputExecutor shutdown attempt
parsl.dataflow.dflow INFO: Shut down executor multi
parsl.dataflow.dflow INFO: Shutting down executor _parsl_internal
parsl.executors.threads DEBUG: Shutting down executor, which involves waiting for running tasks to complete
parsl.executors.threads DEBUG: Done with executor shutdown
parsl.dataflow.dflow INFO: Shut down executor _parsl_internal
parsl.dataflow.dflow INFO: Terminated executors
parsl.dataflow.dflow INFO: DFK cleanup complete
parsl.process_loggers DEBUG: Normal ending for cleanup on thread MainThread
lsst.ctrl.bps.submit INFO: Completed submitting to a workflow management system: Took 273.6320 seconds
lsst.ctrl.bps.drivers INFO: Run 'DEEP_20190504_A1c_science#step1_20240313T004552Z' submitted for execution with id 'None'
lsst.ctrl.bps.drivers INFO: Completed submit stage: Took 273.6420 seconds; current memory usage: 0.340 Gibyte, delta: 0.002 Gibyte, peak delta: 0.002 Gibyte
lsst.ctrl.bps.drivers INFO: Completed entire submission process: Took 288.9701 seconds; current memory usage: 0.340 Gibyte, delta: 0.203 Gibyte, peak delta: 0.203 Gibyte
lsst.ctrl.bps.drivers INFO: Peak memory usage for bps process 0.340 Gibyte (main), 0.340 Gibyte (largest child process)
Run Id: None
Run Name: DEEP_20190504_A1c_science#step1_20240313T004552Z
parsl.dataflow.dflow INFO: python process is exiting, but DFK has already been cleaned up

real    4m51.906s
user    0m35.910s
sys     0m8.809s
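Note that the run is reported as "submitted for execution with id 'None'" even though every task has already failed: the entire Parsl execution (including the failures and cleanup above) happened inside the 273.6-second submit stage, which is essentially the whole ~4m52s wall clock. A quick follow-up check against the butler, to confirm that no calibrate outputs landed in the output run collection, might look like the sketch below; the repo path and collection name are the ones used by this submission, and the dataset type checked ("calexp") is an assumption about what step1 was expected to produce here.

    # Sanity check after the failed run: count calexp datasets in the output run.
    from lsst.daf.butler import Butler

    repo = "/mmfs1/home/stevengs/dirac/DEEP/repo"
    run = "DEEP/20190504/A1c/science#step1/20240313T004552Z"
    butler = Butler(repo)
    refs = list(butler.registry.queryDatasets("calexp", collections=run))
    print(f"{len(refs)} calexp dataset(s) in {run}")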