
Task Dependencies - Unreal/MRQ sequence rendering

I am using Deadline to perform MRQ renders in Unreal using the plugins provided. It’s working fine launching one Job per render :+1:

I have scripts running Pre & Post task:

  • Pre - Perforce-syncs the UE project to the render node
  • Post - runs ffmpeg, and relocates the results to a target directory

This means all pre-synchronisation, rendering, and post-processing is done in one job.

I would like to split this into 3 jobs.
These jobs would need to run sequentially on the same worker ( [syncJob] > [rendJob] > [postJob] )
:question: How can this dependency/assignment be stipulated in Deadline :question:

That's certainly enough to get me started. However, as a follow-on, I'm intending to split long renders into parts…
In this case I'd want to render the multiple parts (as above) in parallel, but with a 'chaser' job that executes when the last part completes. The chaser could run on any Worker.

( 1 [syncJob] > [rendJob] > [postJob] ) >
( 2 [syncJob] > [rendJob] > [postJob] ) > [chaserJob]
( 3 [syncJob] > [rendJob] > [postJob] ) >
:question: How can this dependency be achieved in Deadline :question:

Thanks! :pray:

This process is quite feasible; I have been implementing it successfully for the past five years.

Could you provide some details on how you are currently submitting jobs to Deadline? Also, if you are using the latest version of Deadline, it includes direct support for Perforce, which could streamline your workflow.

There are a few methods to achieve this:

  1. Custom Submitter Approach: One effective strategy is to develop a custom submitter. This would configure three interdependent jobs: the first job handles the sync, the second manages the render, and the third job is responsible for creating the QuickTime video. This setup ensures that each stage logically follows the previous one, maintaining workflow coherence (see the sketch after this list).

  2. Manual Submission: Alternatively, you can manually submit the sync job as needed. This method offers more control at each step but requires manual intervention.
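
For option 1, here is a minimal sketch (untested) of the chaining via deadlinecommand: each job declares its predecessor with the JobDependencies key in its job info file, and the Whitelist key pins a job to a named Worker, which covers your same-worker requirement. The worker name, frame ranges, and CommandLine plugin keys here are assumptions to adapt to your setup.

import re
import subprocess
import tempfile

def write_info_file(pairs):
    """Write key=value pairs to a temp file and return its path."""
    f = tempfile.NamedTemporaryFile("w", suffix=".job", delete=False)
    for key, value in pairs.items():
        f.write("%s=%s\n" % (key, value))
    f.close()
    return f.name

def submit(job_info, plugin_info):
    """Submit one job via deadlinecommand; return its job ID."""
    result = subprocess.run(
        ["deadlinecommand", write_info_file(job_info), write_info_file(plugin_info)],
        capture_output=True, text=True, check=True)
    # Submission output should contain a "JobID=<id>" line
    return re.search(r"JobID=(\S+)", result.stdout).group(1)

worker = "RenderNode01"  # pin the whole chain to one Worker (illustrative name)

sync_id = submit(
    {"Name": "syncJob", "Plugin": "CommandLine", "Frames": "0", "Whitelist": worker},
    {"Executable": "p4", "Arguments": "sync"})

rend_id = submit(
    {"Name": "rendJob", "Plugin": "UnrealEngine5", "Frames": "1-240",
     "Whitelist": worker, "JobDependencies": sync_id},
    {})  # fill in your UE/MRQ plugin info here

post_id = submit(
    {"Name": "postJob", "Plugin": "CommandLine", "Frames": "0",
     "Whitelist": worker, "JobDependencies": rend_id},
    {"Executable": "ffmpeg", "Arguments": "-i frames/%04d.png out.mov"})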

Additionally, I recommend writing a custom event plugin that triggers specific actions based on the job type:

  • For UE Jobs: The plugin would automatically execute the Perforce sync script at the start of an Unreal Engine job.
  • Post-Processing: It’s straightforward to append a post-job script to generate a QuickTime video, utilizing tools like FFmpeg or Draft.

These approaches not only streamline the workflow but also enhance efficiency and reliability in managing tasks.

Sample code (not tested):

from Deadline.Events import DeadlineEventListener
import subprocess


def GetDeadlineEventListener():
    """Deadline calls this to instantiate the event listener."""
    return MyEvent()


def CleanupDeadlineEventListener(eventListener):
    """Deadline calls this when the event plugin is unloaded."""
    eventListener.Cleanup()


class MyEvent(DeadlineEventListener):
    """This is the main DeadlineEventListener class for MyEvent"""

    def __init__(self):
        super().__init__()
        # Set up the event callbacks here
        self.OnJobSubmittedCallback += self.OnJobSubmitted
        self.OnJobFinishedCallback += self.OnJobFinished

    def Cleanup(self):
        del self.OnJobSubmittedCallback
        del self.OnJobFinishedCallback

    def OnJobSubmitted(self, job):
        # TODO: connect to pipeline site to notify it that a job has been submitted
        # Run the Perforce sync for Unreal Engine jobs (JobPlugin holds the plugin name)
        if "UnrealEngine5" in job.JobPlugin:
            command = ["p4", "sync"]
            try:
                subprocess.run(command, check=True)
                print("Perforce sync completed successfully.")
            except subprocess.CalledProcessError as e:
                print("Failed to sync Perforce:", e)

    def OnJobFinished(self, job):
        # TODO: kick off post-processing (ffmpeg/Draft) here
        pass

Cheers for the response Derek. I will look into Event plugins for sure.

I am using PreTask & PostTask scripts at the moment. They function fine and execute the [syncPreTask] > [render] > [ffmpegPostTask] within a single job.

My natural preference would be to define each step as an individual job (though we'd get multiple jobs in the queue for each render) - perhaps this is the wrong approach for Deadline?
Using Pre/Post Job/Task scripts seems like it's the preferred mechanism for Deadline.

I'm currently testing the difference between Job scripts and Task scripts. TBH I don't really understand the difference between Jobs and Tasks yet.

The primary nature of my original query was really about the dependencies - can I ensure jobB follows jobA on the same render node? It now feels like perhaps I was barking up the wrong tree…

If I keep the 'pre' as is, then that's fine - it runs my P4 Sync on the machine prior to the render.

Imagine we split a render into thirds, e.g. frames 1-80, 81-160, 161-240. Each of those renders on an arbitrary node, with the resulting frames in a central dir. All good so far…
JobA ( 1-80 [syncTask] > [render] )
JobB ( 81-160 [syncTask] > [render] )
JobC ( 161-240 [syncTask] > [render] )

… now we must execute the post operation when all three jobs have finished (and they may finish in any order).
So one way to do that is with the same post script (or an event) on the end of each, but that post script must test the completion of all job IDs if it is to continue to execute (sketched below)…

JobA ( 1-80 [syncTask] > [render] > [ if a&b&c complete execute postOps ] )
JobB ( 81-160 [syncTask] > [render] > [ if a&b&c complete execute postOps ] )
JobC ( 161-240 [syncTask] > [render] > [ if a&b&c complete execute postOps ] )
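
Something like this rough sketch (untested) is what I mean by that check - a post-job script, assuming the sibling job IDs were stashed as job ExtraInfo at submission time (the key name is made up):

from Deadline.Scripting import RepositoryUtils

def __main__(*args):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()
    # Comma-separated sibling job IDs stored at submission time (illustrative key)
    sibling_ids = job.GetJobExtraInfoKeyValue("SiblingJobIds").split(",")
    for job_id in sibling_ids:
        sibling = RepositoryUtils.GetJob(job_id, True)  # True = refresh from repo
        if sibling is None or sibling.JobStatus != "Completed":
            deadlinePlugin.LogInfo("Job %s not complete yet; skipping postOps" % job_id)
            return
    deadlinePlugin.LogInfo("All parts complete; running postOps")
    # ... ffmpeg / relocation goes here ...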

Seems there are several ways to achieve this - just need to pick the 'right' way!

… could you provide some details on how you are currently submitting jobs to Deadline?

I'm using the provided remote executor.
I send some extra info (P4 stream, revision#, etc.).
I use a preTask pyscript to sync P4 to the provided revision#.
I use a postTask pyscript to do ffmpeg and move the results to a target directory (sketched below).
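
For reference, the pre-task sync script is roughly along these lines (simplified sketch; the ExtraInfo key name is illustrative):

import subprocess

def __main__(*args):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()
    revision = job.GetJobExtraInfoKeyValue("P4Revision")
    # Workspace/stream selection is assumed to be configured on the node;
    # sync the whole client view to the requested changelist
    subprocess.run(["p4", "sync", "//...@%s" % revision], check=True)
    deadlinePlugin.LogInfo("P4 synced to revision %s" % revision)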
