I have some command line jobs that I need progress updates for. Based on the docs I figured out that all it takes is printing Progress: #% to STDOUT, but it doesn't seem to do anything.
Example from the job log:
2018-09-05 05:58:57: 0: STDOUT: Progress: 11%
2018-09-05 05:58:57: 0: STDOUT: Progress: 12%
2018-09-05 05:58:57: 0: STDOUT: Progress: 13%
2018-09-05 05:58:57: 0: STDOUT: Progress: 14%
From my understanding this should update the job progress, but it doesn't. I'm still on Deadline 9.0.2. Am I doing something wrong, or is it a bug?
The command line job actually starts Python with a script and some additional arguments. It's a simple FTP uploader, and the code I use for printing looks like this:
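Roughly like this (a simplified sketch, since the actual uploader code isn't shown here; report_progress is an illustrative name, and the key part is the flushed print):

```python
import sys

def report_progress(done_bytes, total_bytes):
    # Illustrative helper, not the original script: print an integer
    # percentage in the 'Progress: N%' form Deadline scans for, and
    # flush so the line reaches the log immediately.
    percent = int(done_bytes * 100 / total_bytes)
    sys.stdout.write('Progress: %d%%\n' % percent)
    sys.stdout.flush()
```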
Looks like that progress feature landed in 8.1.5, and I think your encoding and format are good here.
Since the Slave only updates its progress every 7 seconds, maybe it's going too fast? Looking at your output, it seems to be flying through that work. Watching the Slave UI directly might show it updating faster.
When I run the script locally I see the progress printed in real time over those 12 minutes or so… but when it runs on the Slave it's different. Is there some in-between process that catches the output?
Did anyone get this to work? It doesn't seem to be a Python buffering issue for me: the progress is written to the log correctly over the course of 7-10 minutes, but the task progress bar still doesn't update.
I just got this working in PowerShell, which probably doesn't help much, but one thing I found was that the percentage needed to be an integer for the progress bar to update:
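The same fix in Python would look something like this (a sketch; format_progress is just an illustrative name):

```python
def format_progress(fraction):
    # Deadline's progress parser reportedly wants a whole number,
    # so round the percentage to an integer before printing it.
    return 'Progress: %d%%' % int(round(fraction * 100))
```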
I've spent at least a day on this problem and I'm beginning to think the failure to report progress is a bug. I'm using Deadline 10.1.20.3.
I wrote a quick python script that prints progress:
import time, sys

num = 10
for _ in range(num):
    out = 'Progress: %s%%' % ((_ + 1) * 10)
    sys.stdout.write(out + '\n')
    sys.stdout.flush()
    time.sleep(5)
Then I submit this Python script as a job to Deadline:
deadlinecommand.exe -SubmitCommandLineJob -executable python.exe -arguments c:/path/to/script/progress.py
The script runs fine but gives no progress feedback. Am I doing something wrong?
If command line progress really does work, can someone from Thinkbox Software please show me with a working example like the one above?
I think I found my own answer: I just need to submit the job as a managed process by using "execute in shell". The giveaway was this line in CommandLine.py:
StdOut/Err will NOT be captured here as unmanaged process.
That makes sense: you won't get any output until the process completes and the caller prints its whole output in one go.
The best solution I found is to use RPC and send messages from the subprocess, which the plugin then prints with self.LogInfo.
The other way would be to connect the subprocess's stdout to the current process's stdout when calling Popen.
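That second option can be sketched like this (an assumption-laden example, not the plugin's actual code; relay and uploader.py are placeholder names):

```python
import subprocess
import sys

def relay(cmd):
    # Start the child and read its stdout line by line as it is
    # flushed, echoing each line to our own stdout immediately, so a
    # wrapping process (e.g. a Deadline plugin) sees 'Progress: N%'
    # lines in real time instead of one blob when the child exits.
    proc = subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        universal_newlines=True,
    )
    for line in proc.stdout:
        sys.stdout.write(line)
        sys.stdout.flush()
    return proc.wait()

# Example: run the uploader unbuffered (-u) so its prints aren't held back.
# relay([sys.executable, '-u', 'uploader.py'])
```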