AWS Thinkbox Discussion Forums

Initial Deadline Setup Issues & Help Required

  1. For already submitted jobs - how do we edit their “Job Info Parameters” as well as their “Plugin Info Parameters”?
    If the property you want to change isn’t exposed in the Job’s Modify Job Properties window, you’ll have to resubmit the job with the desired settings.
  2. How can we edit an already submitted Gaffer job to add the “-threads 16” parameter to it? And how can we get that job’s execution to pick up and respect the flag in the actual command-line execution?
    You’re referring to this Gaffer plugin? It looks like you’d have to either edit the plugin or reach out to the author.
  3. Can the Job’s “Concurrent Tasks” be a dynamic or programmatically calculated value? For example, for the Deadline Gaffer job we want to drive it from the job’s submission parameter “-threads <int>”, and we are hoping to implement it along the lines of the example below:

You can modify the concurrent tasks programmatically; the issue is that it’ll apply to the whole job, not just to tasks picked up by individual Workers. Depending on how high the --threads option can go, you might be able to interrogate the Worker for its CPU thread count and build that argument then. I’m not certain of the nitty-gritty of how to do that, but it should be possible.
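As a sketch of that idea, the Worker-side thread count could be derived with Python’s standard library, clamping whatever --threads value was requested to the local core count. This is illustrative only; how it gets wired into the Gaffer plugin is the part that would need work:

```python
import multiprocessing

def resolve_thread_count(requested_threads):
    """Clamp the requested -threads value to the number of CPU
    threads actually available on the machine running the task."""
    available = multiprocessing.cpu_count()
    return min(requested_threads, available)
```

A reworked plugin could call something like this when building the render command, so a job submitted with -threads 16 would still run sensibly on a 12-core Worker.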

You could also always submit with --threads 16 and set the maximum concurrent tasks for each size of Worker. Then set the job’s concurrent tasks to some high number, and each Worker will bring it down to the local maximum set in its ‘Concurrent Task Limit Override’.
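As an illustrative model of that interaction (this is not Deadline’s actual code, just the arithmetic described above): the Worker effectively runs the smaller of the job’s Concurrent Tasks value and its own override, with 0 meaning no override is set:

```python
def effective_concurrent_tasks(job_concurrent_tasks, worker_limit_override):
    """Model of the Worker-side cap: a Worker never runs more
    concurrent tasks than its 'Concurrent Task Limit Override'
    allows (0 means no override is configured)."""
    if worker_limit_override > 0:
        return min(job_concurrent_tasks, worker_limit_override)
    return job_concurrent_tasks
```

So a job set to 100 concurrent tasks still runs at most 8 at a time on a Worker whose override is 8.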

The deadline.ini didn’t get attached above, so I’m going to assume it’s got an empty SlaveDataRoot= in it, which will mean Deadline uses the default path for the operating system, which should be %PROGRAMDATA%\Thinkbox\Deadline[VERSION]\workers\[WORKERNAME].

But for some reason that output isn’t acceptable, so I’d like to know what’s being used. To that end, run the attached script getslavedataroot.py (464 Bytes) on the fprdsk113 machine. I’d expect to see something odd there that’ll give us some direction.
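For context, the core of that check could be sketched like this. This is only an illustration, not the attached script, and it assumes deadline.ini is a standard INI file with a [Deadline] section:

```python
import configparser
import os

def get_slave_data_root(ini_path):
    """Return SlaveDataRoot from deadline.ini, or fall back to the
    Windows default when the key is empty or missing (sketch only)."""
    config = configparser.ConfigParser()
    config.read(ini_path)
    value = config.get("Deadline", "SlaveDataRoot", fallback="").strip()
    if value:
        return value
    # Default location on Windows: %PROGRAMDATA%\Thinkbox\Deadline10\workers
    program_data = os.environ.get("PROGRAMDATA", r"C:\ProgramData")
    return os.path.join(program_data, "Thinkbox", "Deadline10", "workers")
```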

Sorry about that @Justin_B, I may have missed attaching it, or the upload was rejected with the following error, which I didn’t notice at the time:

Sorry, the file you are trying to upload is not authorized (authorized extensions: jpg, jpeg, png, gif, 7z, tar, targz, zip, exr, py).

Attaching the deadline.ini in a zip format here:
deadline.zip (572 Bytes)

Please find below the output of getslavedataroot.py script:

"%DEADLINE_PATH%"\deadlinecommand -ExecuteScript C:\Users\bsukhadia\Documents\Deadline\forums\from_support\getslavedataroot.py
'C:\Users\bsukhadia\AppData\Local\Thinkbox\Deadline10\pythonAPIs\2022-11-22T212046.0000000Z' already exists. Skipping extraction of PythonSync.
DeadlineClientLocalDataHome = C:\ProgramData\Thinkbox\Deadline10
The path being used as SlaveDataRoot is C:\ProgramData\Thinkbox\Deadline10\workers

Hi @Justin_B

Could you please share a deadlinecommand example snippet that re-uses all the settings of an already submitted Gaffer job, adds or amends -threads n, and resubmits it as a new job?

Yes, that’s the Gaffer plugin from @egmehl we are using, and we’re trying to get -threads n working with it.

Thanks for this suggestion @Justin_B, it sounds like a fair workaround and we will give it a try soon. However, I see one edge case: we won’t be able to prevent a Worker from picking up a Gaffer job when the Worker has fewer CPUs than the job’s requested rendering threads, will we?

For this example case, if the job is submitted with -threads 16 and the Deadline Worker has 12 CPUs/cores, it will still pick up or be assigned this job, right?

Thanks,
Bhavik
Fractal Picture

Unfortunately not. From the Gaffer thread, there isn’t an option to set -threads. Unless there’s a setting that will accept arbitrary flags, the plugin will have to be re-worked to add that option. As-is, there’s no way in Deadline to append to the arguments used for a task without modifying the application plugin.

If that does exist, you can export the job’s submission files with this deadlinecommand flag:

GenerateSubmissionInfoFiles <Job ID> <Job Info File> <Plugin Info File>
  Generates a Job Info file and a Plugin Info file that can be used to submit a
  new Job, based on an existing one.
    Job ID                   The ID of the Job on which to base the
                             Submission Parameters.
    Job Info File            The file to which the Job Submission Info will
                             be output.
    Plugin Info File         The file to which the Plugin Submission Info
                             will be output.

Yep, the Worker doesn’t query the job’s render settings before dequeuing tasks. If you want a breakdown of what is considered, job scheduling is broken down here.
