Hello,
I’ve been wanting this feature for quite some time now, and I believe that, if implemented, it would let the standard submission scripts support a much wider range of pipelines.
The idea is simple: provide callbacks for certain events during job submission, such as when a job is about to be submitted, when a job has finally been submitted, when the submission window/node is created, and so on. This would enable things like setting defaults for the submission window/node based on context or DCC, and would also make it possible to include pipeline-specific configuration for jobs without editing the source code.
Ideally, for DCCs with Python, the implementation could be a set of globals somewhere that you register your callbacks with. For example, the submission folder could contain a SubmissionGlobals.py containing:
import collections
callbacks = collections.defaultdict(list)
And then, in a pipeline, you could just:
import SubmissionGlobals
SubmissionGlobals.callbacks["JobAboutToBeSubmitted"].append(lambda kwargs: print(kwargs))
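To make that concrete, here is a minimal self-contained sketch of how the submitter side might dispatch registered callbacks. The event name and the `fire_callbacks` helper are assumptions for illustration, not part of any existing Deadline API:

```python
import collections

# Stand-in for SubmissionGlobals: maps event name -> list of callables.
# defaultdict(list) means an unknown event simply has no callbacks.
callbacks = collections.defaultdict(list)

def fire_callbacks(event, **kwargs):
    """Hypothetical helper the submission script would call at each event."""
    for cb in callbacks[event]:
        cb(kwargs)

# A pipeline registers a callback, exactly as in the example above:
seen = []
callbacks["JobAboutToBeSubmitted"].append(lambda kwargs: seen.append(kwargs))

# The submitter fires the event just before writing the job files:
fire_callbacks("JobAboutToBeSubmitted",
               files=["/path/job.info", "/path/plugin.info"])
print(seen)
```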
Another possible implementation (for the MEL/Python submission scripts) could be adding a string parameter on the submission node/window called “callbacks” and executing it at those events. For example, the user could put in that parameter:
from myCompany import dl;dl.handle_callback(
<CALLBACK>
,<KWARGS>
)
And in the event of a submission, the <> would be replaced and executed:
from myCompany import dl;dl.handle_callback("JobAboutToBeSubmitted", {"files": ["/path/job.info", "/path/plugin.info"]})
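A rough sketch of how the submission script could perform that substitution and execution follows. The names here are assumptions (a local `handled` list stands in for the user’s own `myCompany.dl` module, which is pipeline code, not something that ships with Deadline):

```python
# The user's "callbacks" parameter is a template with <CALLBACK> and
# <KWARGS> placeholders; here a simple append stands in for the user's code.
user_template = "handled.append((<CALLBACK>, <KWARGS>))"

def run_callback(template, event, kwargs):
    # Substitute the placeholders with Python literals, then execute
    # the resulting snippet (hypothetical mechanism, for illustration).
    code = template.replace("<CALLBACK>", repr(event)) \
                   .replace("<KWARGS>", repr(kwargs))
    exec(code)

handled = []
run_callback(user_template, "JobAboutToBeSubmitted",
             {"files": ["/path/job.info", "/path/plugin.info"]})
print(handled)
```

Since the parameter is executed as arbitrary Python, an implementation would probably want to restrict or sandbox it, but that is a detail for the actual design.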
With callbacks in place, editing the source code for things like injecting the REZ environment into the job, or worrying about users not setting the proper groups/pools for the current job, would no longer be necessary. I currently find myself writing my own submission scripts, with less than half the features of the default ones, just because a few key features are unavailable and editing the source code is unreliable and prone to failure.
Thanks.