Task Dependencies - Unreal/MRQ sequence rendering

I am using Deadline to perform MRQ renders in Unreal using the plugins provided. It’s working fine launching one Job per render :+1:

I have scripts running Pre & Post task:

  • Pre - Perforce-syncs the UE project to the render node
  • Post - runs ffmpeg and relocates the results to a target directory

This means the Perforce sync, rendering, and post-processing are all done in one job.

I would like to split this into 3 jobs.
These jobs would need to run sequentially on the same worker ( [syncJob] > [rendJob] > [postJob] )
:question: How can this dependency/assignment be stipulated in Deadline :question:

That's certainly enough to get me started. However, as a follow-on, I'm intending to split long renders into parts…
In this case I'd want to render the multiple parts (as above) in parallel, but with a 'chaser' job that executes when the last part is complete. The chaser could run on any Worker.

( 1 [syncJob] > [rendJob] > [postJob] ) >
( 2 [syncJob] > [rendJob] > [postJob] ) > [chaserJob]
( 3 [syncJob] > [rendJob] > [postJob] ) >
:question: How can this dependency be achieved in Deadline :question:

Thanks! :pray:

This process is quite feasible - I have been implementing it successfully for the past five years.

Could you provide some details on how you are currently submitting jobs to Deadline? Also, if you are using the latest version of Deadline, it includes direct support for Perforce, which could streamline your workflow.

There are a few methods to achieve this:

  1. Custom Submitter Approach: One effective strategy is to develop a custom submitter. This would configure three interdependent jobs: the first job handles the sync, the second manages the render, and the third creates the QuickTime video. This setup ensures that each stage logically follows the previous one, maintaining workflow coherence (a rough sketch follows after this list).

  2. Manual Submission: Alternatively, you can manually submit the sync job as needed. This method offers more control at each step but requires manual intervention.
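
For what it's worth, here is a rough, untested sketch of what that custom submitter could look like: it shells out to deadlinecommand and chains the jobs with the JobDependencies job info key. The plugin names, paths, and arguments below are placeholders, and note that dependencies only control ordering - they don't force the jobs onto the same Worker.

import re
import subprocess
import tempfile

DEADLINE_COMMAND = "deadlinecommand"  # assumed to be on the PATH

def submit(job_info, plugin_info):
    """Write job/plugin info files, submit them, and return the new job's ID."""
    def write_kv(pairs):
        f = tempfile.NamedTemporaryFile("w", suffix=".job", delete=False)
        f.write("\n".join("%s=%s" % (k, v) for k, v in pairs.items()))
        f.close()
        return f.name

    result = subprocess.run(
        [DEADLINE_COMMAND, write_kv(job_info), write_kv(plugin_info)],
        check=True, capture_output=True, text=True,
    )
    # deadlinecommand echoes the new job's ID as "JobID=..." on success
    match = re.search(r"JobID=(\S+)", result.stdout)
    return match.group(1) if match else None

# 1) Perforce sync job (CommandLine plugin used here as a placeholder).
sync_id = submit(
    {"Plugin": "CommandLine", "Name": "P4 Sync", "Frames": "0"},
    {"Executable": "p4", "Arguments": "sync //MyProject/...#head"},
)

# 2) MRQ render job, dependent on the sync job.
render_id = submit(
    {"Plugin": "UnrealEngine5", "Name": "MRQ Render",
     "Frames": "1-240", "JobDependencies": sync_id},
    {},  # the UE/MRQ plugin info entries would go here
)

# 3) QuickTime/ffmpeg job, dependent on the render job.
submit(
    {"Plugin": "CommandLine", "Name": "FFmpeg QuickTime", "Frames": "0",
     "JobDependencies": render_id},
    {"Executable": "ffmpeg", "Arguments": "-i render.%04d.exr out.mov"},
)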

Additionally, I recommend writing a custom event plugin that triggers specific actions based on the job type:

  • For UE Jobs: The plugin would automatically execute the Perforce sync script at the start of an Unreal Engine job.
  • Post-Processing: It’s straightforward to append a post-job script to generate a QuickTime video, utilizing tools like FFmpeg or Draft.

These approaches not only streamline the workflow but also enhance efficiency and reliability in managing tasks.

Sample code (not tested):

import subprocess

from Deadline.Events import DeadlineEventListener


def GetDeadlineEventListener():
    """Required entry point: Deadline calls this to instantiate the listener."""
    return MyEvent()


def CleanupDeadlineEventListener(eventListener):
    """Required entry point: Deadline calls this when unloading the listener."""
    eventListener.Cleanup()


class MyEvent(DeadlineEventListener):
    """This is the main DeadlineEventListener class for MyEvent"""

    def __init__(self):
        super().__init__()
        # Set up the event callbacks here
        self.OnJobSubmittedCallback += self.OnJobSubmitted
        self.OnJobFinishedCallback += self.OnJobFinished

    def Cleanup(self):
        del self.OnJobSubmittedCallback
        del self.OnJobFinishedCallback

    def OnJobSubmitted(self, job):
        # Only react to Unreal Engine jobs (job.JobPlugin is the job's plugin name)
        if "UnrealEngine5" in job.JobPlugin:
            # Set up and run the Perforce sync command
            command = ["p4", "sync"]
            try:
                subprocess.run(command, check=True)
                self.LogInfo("Perforce sync completed successfully.")
            except subprocess.CalledProcessError as e:
                self.LogInfo("Failed to sync Perforce: %s" % e)

    def OnJobFinished(self, job):
        # Post-processing (e.g. an ffmpeg/Draft QuickTime) could be kicked off here
        pass

Cheers for the response, Derek. I will look into Event plugins for sure.

I am using PreTask & PostTask scripts at the moment. They function fine and execute [syncPreTask] > [render] > [ffmpegPostTask] within a single job.

My natural preference would be to define each step as an individual job (though we'd get multiple jobs in the queue for each render) - perhaps this is the wrong approach for Deadline?
Using Pre/Post Job/Task scripts seems like it's the preferred mechanism for Deadline.

I'm currently testing the difference between Job scripts and Task scripts. TBH I don't really understand the difference between Jobs and Tasks yet.

My original query was really about the dependencies - can I ensure jobB follows jobA on the same render node? It now feels like perhaps I was barking up the wrong tree…

If I keep the 'pre' as is, that's fine - it runs my P4 sync on the machine prior to the render.

Imagine we split a render into thirds, e.g. frames 1-80, 81-160, 161-240. Each of those renders on an arbitrary node, with the resulting frames landing in a central directory. All good so far…
JobA ( 1-80 [syncTask] > [render] )
JobB ( 81-160 [syncTask] > [render] )
JobC ( 161-240 [syncTask] > [render] )

… now we must execute the post operation when all three jobs have finished (and they may finish in any order).
One way to do that is with the same post script (or an event) on the end of each job, but that post script must test that all the job IDs have completed before it continues (roughly as in the sketch below the diagram):

JobA ( 1-80 [syncTask] > [render] > [ if a&b&c complete execute postOps ] )
JobB ( 81-160 [syncTask] > [render] > [ if a&b&c complete execute postOps ] )
JobC ( 161-240 [syncTask] > [render] > [ if a&b&c complete execute postOps ] )
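
For the "test the completion of all job IDs" part, here's one way a post-job script could do it, assuming each job is told its siblings' job IDs at submission time. Untested sketch - the SiblingJobIds extra-info key and the ffmpeg call are placeholders:

import subprocess
from Deadline.Scripting import RepositoryUtils

def __main__(*args):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()

    # Hypothetical extra-info entry holding the other jobs' IDs, e.g. "idA,idB"
    sibling_ids = job.GetJobExtraInfoKeyValue("SiblingJobIds").split(",")

    for job_id in sibling_ids:
        sibling = RepositoryUtils.GetJob(job_id, True)  # True = bypass the cached copy
        if sibling is None or sibling.JobStatus != "Completed":
            deadlinePlugin.LogInfo("Job %s not finished yet - skipping post ops" % job_id)
            return

    # All of the render jobs are done - run the post ops (placeholder ffmpeg call)
    deadlinePlugin.LogInfo("All render jobs complete - running post ops")
    subprocess.run(["ffmpeg", "-i", "render.%04d.exr", "out.mov"], check=True)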

Seems there are several ways to achieve this - just need to pick the 'right' way!

… could you provide some details on how you are currently submitting jobs to Deadline?

I’m using the provided remote executor.
I send some extra info (P4 stream, revision#, etc.).
I use a preTask Python script to sync P4 to the provided revision#.
I use a postTask Python script to run ffmpeg and move the results to a target directory.

Quoting a bunch of your post here to address questions :slight_smile:

For your workflow, where you want all 3 steps to happen on the same machine, you'll be best off using pre/post task scripts. A Worker will dequeue a Task → run the pre-task script → run the task itself → run the post-task script before it moves onto another Task.

Jobs are made up of tasks. So you could have a Job with a single Task, or 5000 Tasks.
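
For reference, both of those points map directly onto job info file entries at submission time. A rough, untested example (the script paths are placeholders): Frames=1-240 with ChunkSize=80 gives one Job containing three Tasks, and the pre/post task scripts are attached with two more keys:

Frames=1-240
ChunkSize=80
PreTaskScript=//server/scripts/p4_sync_pretask.py
PostTaskScript=//server/scripts/ffmpeg_posttask.py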

Given this kind of job dependency I'd have another job auto-submitted that is dependent on the A, B, and C jobs. That assumes it doesn't have to run on the same machine that ran A, B, and C - I don't think that's a requirement here, but otherwise your check should work.
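
For example, the auto-submitted chaser could be a stock Python plugin job whose job info simply lists the three render jobs as dependencies (untested sketch; the IDs are placeholders):

Plugin=Python
Name=PostOps_Chaser
Frames=0
JobDependencies=<jobIdA>,<jobIdB>,<jobIdC>

By default a dependent job is only released once all of its listed dependencies have completed, which handles the "they may finish in any order" part.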

Thanks for your response.

So I'm now using the Python plugin to launch a chaser Python job - it's got dependencies on the render jobs, and it waits until they are finished before launching. This much is successful! :slight_smile:

Now though, I'm struggling with the Python script itself.

I’ve got several parameters in the “JobExtraInfo” fields which the python script needs to access.

Could you point me to a simple example of a Python script which has full access to the job which spawned it?

Note that our installation does not have the REST/web config, so the 'standalone' approach docs aren't working for me. (Besides, I shouldn't need standalone if it's a Python plugin launch, right?)

I just need my Python script to be able to access all of its job parameters - in particular the "JobExtraInfo" fields.

That'll be a little complicated, since a Python job (specifically, a Deadline Job running the Python application plugin) won't have access to the internal scripting API (docs here).

Off the top of my head there’s two things you could do:

  1. Instead of using JobExtraInfo to store the things your Python script needs, use the Environment. Then you can use Python's os.environ to pull the data you need (sketched at the end of this post).
  2. Make your own application plugin that can read in and pass that data to your script as command-line arguments.

#1 would be much quicker, but for #2 you’d just be calling JobExtraInfo as a property of the job object application plugins get. The starter on application plugins is here - Scripting Overview — Deadline 10.3.2.1 documentation
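
A minimal sketch of #1, assuming submission via job info files (the key names and values below are just examples). At submission time, push the data into the job's environment:

EnvironmentKeyValue0=P4_STREAM=//MyProject/main
EnvironmentKeyValue1=P4_REVISION=12345

Then the Python job's script can read it back with os.environ:

import os

p4_stream = os.environ["P4_STREAM"]
p4_revision = os.environ.get("P4_REVISION", "head")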

I went with option #3: "while no one is looking, hastily stuff the relevant ExtraInfo into arg strings, and pick them up as args in the py script"

Not particularly elegant, but did the job (no pun intended) :slight_smile:
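
For anyone following along, that approach looks roughly like this with the stock Python plugin (untested; the argument names and paths are just examples). Plugin info for the chaser job:

ScriptFile=//server/scripts/post_ops.py
Arguments=--stream //MyProject/main --revision 12345

And the script picks them up as ordinary command-line arguments:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--stream")
parser.add_argument("--revision")
args = parser.parse_args()
print("Syncing %s @ %s" % (args.stream, args.revision))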