
How to submit a Python/MEL job to assemble the scene and also render it directly on the worker node

Currently, I have a MEL script that assembles a 3D scene and outputs a .ma file. I submit jobs via a Python script that runs on my computer.

The script does three steps: 1) calls the MEL script to generate a .ma file, 2) generates the JobInfo and PluginInfo, and 3) submits the .ma, JobInfo, and PluginInfo to Deadline via the standalone Python API.
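For context, submitting through the standalone Python API looks roughly like this. It's only a minimal sketch: the host, port, paths, and dictionary contents are placeholders.

from Deadline.DeadlineConnect import DeadlineCon

# Connect to the Deadline Web Service (host and port are placeholders).
connection = DeadlineCon("deadline-webservice-host", 8082)

# Step 2 produces dictionaries like these (contents are placeholders).
job_info = {"Plugin": "MayaBatch", "Name": "Example job"}
plugin_info = {"SceneFile": "C:/path/to/generated_scene.ma"}

# SubmitJob takes the job info and plugin info dictionaries
# (plus an optional list of auxiliary files).
new_job = connection.Jobs.SubmitJob(job_info, plugin_info)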

This is too slow because step 1 runs on my computer for each job submission. It takes ~70s per job, so if I do 1,000 jobs, that's nearly 20 hours just to submit my Deadline jobs!

I want to change my approach so that the MEL script runs as part of the Deadline job: the worker node should run the MEL script to generate the .ma file and then render it.

How can I achieve this?

This sounds like a good use-case for a Pre Job Script.

  1. Create a (MayaBatch) render job, pointing to <public folder>/FileToBeCreated.ma
  2. Attach all the necessary metadata for your generator script
  3. Generator script runs before all render chunks/frames (see the sketch after this list for what such a script might look like)
  4. Frames render as usual, because FileToBeCreated.ma exists and is good to go
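For illustration, a generator pre-job script could look something like the sketch below. Treat it as a rough outline only: the __main__ entry point and the DeadlinePlugin argument follow the pattern from the Deadline job/task script docs, the GetJobExtraInfoKeyValue calls assume you stored that metadata on the job at submission time, and the Maya executable path, flags, and key names are hypothetical placeholders.

import subprocess

def __main__(*args):
    # Deadline hands the DeadlinePlugin object to job/task scripts.
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()

    # Hypothetical metadata attached to the job at submission time.
    mel_script = job.GetJobExtraInfoKeyValue("generatorMelScript")
    output_ma = job.GetJobExtraInfoKeyValue("generatedSceneFile")

    deadlinePlugin.LogInfo("Assembling scene: %s" % output_ma)

    # Run Maya in batch mode so the MEL generator can build and save the scene.
    # Executable path and flags are placeholders; adjust for your install.
    maya_exe = "C:/Program Files/Autodesk/Maya2023/bin/mayabatch.exe"
    subprocess.check_call([maya_exe, "-command", 'source "%s";' % mel_script])

Writing the scene to the same path the job's SceneFile points at is what ties the generator step and the render tasks together.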

Thanks Daniel, I have some follow-up questions if you don’t mind!

Is there anywhere that I can see examples of JobInfo and PluginInfo files for MayaBatch submissions? I can’t find any in the Deadline docs. This is what I’ve come up with so far:

job_info = {
    "Name": "Some Job Name",
    "Plugin": "MayaBatch",
    "OutputDirectory0": "C:/path/on/artist/machine/to/render/job/output"
}

plugin_info = {
    "Renderer": "Vray",
    "SceneFile": "C:/path/on/worker/node/to/FileToBeCreated.ma"
}

Some questions:

  1. How do I specify a pre-job script? Is it a setting in the JobInfo or PluginInfo file?
  2. The PluginInfo “SceneFile” should specify a location on the worker node, correct?
  3. The “SceneFile” will not exist until the Pre Job Script runs on the worker node and generates it, correct?
  4. When you say “generator script”, you’re referring to my Pre Job Script (Python or MEL) which assembles the scene, correct?
  5. What do you mean by “Attach all the necessary metadata”? Is that the JobInfo/PluginInfo settings?

This is my first foray into VFX rendering (I usually do web development), so forgive any naive questions :)

Sure thing! As a general intro to submitting jobs manually, have a look here: https://docs.thinkboxsoftware.com/products/deadline/10.0/1_User%20Manual/manual/manual-submission.html#manual-submission-ref-label
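In short, manual submission boils down to two plain-text key=value files handed to deadlinecommand, something like the following (all paths and values here are placeholders; the full list of recognized keys is in the docs above).

maya_job_info.job:

Plugin=MayaBatch
Name=Example MayaBatch job
Frames=1-100
ChunkSize=10
OutputDirectory0=//server/renders/exampleShot

maya_plugin_info.job:

Version=2023
Renderer=vray
SceneFile=//server/scenes/FileToBeCreated.ma
ProjectPath=//server/projects/exampleShow

Then submit both files together:

deadlinecommand maya_job_info.job maya_plugin_info.job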

To see how your current jobs were submitted, you can check their properties in the Deadline Monitor: double-click the job > Submission Params.

There’s also an export option on that page if you want to save the information out.

As for which settings a plugin actually looks for, that's a trickier one. Usually you'd use the above method to figure out the common ones, but you may need to dig into the plugin itself, or at least the submitter provided by Thinkbox, to find all possible values.

As an example, here’s a standard MayaBatch render from our farm (apologies for the heavy redaction):

To give this a little bit more context, here are the Python dictionaries that were used to generate the job:

maya_info = {
    'Plugin': 'MayaBatch',
    'Name': name,
    'Comment': comment,
    'BatchName': batchName,
    'NotificationTargets': notificationTarget,
    'PostJobScript': postJobScript,
    'PreTaskScript': config.FILE_PRETASKSCRIPT,
    'ExtraInfo0': slackUserID,
    'ExtraInfo1': assetCode,
    'ExtraInfo2': assetName,
    'ExtraInfo3': assetCategory,
    'ExtraInfo4': productionFolder,
    'ExtraInfo5': timecode,
    'ExtraInfo6': show,
    'ExtraInfo7': deleteRenderOutput,
    'ExtraInfo8': ffmpegOutputDir,
    'ExtraInfo9': config.PIPELINEBASE,
    'ExtraInfoKeyValue0=focalLength': focalLength,
    'ExtraInfoKeyValue1=camName': camName,
    'ExtraInfoKeyValue2=imagePlane': imagePlane,
    'ExtraInfoKeyValue3=hash': hash,
    'ExtraInfoKeyValue4=renderFolder': renderFolder,
    'ExtraInfoKeyValue5=renderFileType': renderFileType,
    'ExtraInfoKeyValue6=renderEngine': renderEngine,
    'ExtraInfoKeyValue7=isPartialRender': isPartialRender,
    'ExtraInfoKeyValue8=preRoll': preRoll,
    'ExtraInfoKeyValue9=trc': trc,
    'ExtraInfoKeyValue10=pushDownstream': pushDownstream,
    'ExtraInfoKeyValue11=isRsProxyThumb': isRsProxyThumb,
    'ExtraInfoKeyValue12=extraNotificationTargets': extraNotificationTargets,
    'Pool': pool,
    'SecondaryPool': '',
    'MachineLimit': '0',
    'Priority': priority,
    'OnJobComplete': 'Nothing',
    'ConcurrentTasks': concurrentTasks,
    'Department': '',
    'Group': group,
    'LimitGroups': '',
    'JobDependencies': '',
    'Whitelist': '',
    'InitialStatus': 'Active',
    'OutputFilename0': outputFilename,
    'Frames': frameList,
    'ChunkSize': chunkSize,
    'MinRenderTimeSeconds': 0,
    'TaskTimeoutSeconds': 0,
    'TaskTimeoutMinutes': taskTimeoutMins,
    'StartJobTimeoutSeconds': 0,
    'StartJobTimeoutMinutes': 0,
    'InitializePluginTimeoutSeconds': 0,
    'OnTaskTimeout': 'Error',
    'EnableTimeoutsForScriptTasks': 'False',
    'EnableFrameTimeouts': 'False',
    'EnableAutoTimeout': 'False'
}

maya_plugin = {
    'Animation': '1',
    'Renderer': 'redshift',
    'UsingRenderLayers': '0',
    'LocalRendering': '1',
    'RenderLayer': '',
    'RenderHalfFrames': '0',
    'FrameNumberOffset': '0',
    'StrictErrorChecking': '1',
    'GPUsPerTask': gpusPerTask,
    'GPUsSelectDevices': '',
    'RedshiftVerbose': '2',
    'Version': MAYA_VERSION,
    'UseLegacyRenderLayers': '0',
    'Build': '64bit',
    'ProjectPath': projectPath,
    'StartupScript': startupScript,
    'ImageWidth': imageWidth,
    'ImageHeight': imageHeight,
    'OutputFilePath': outputFilePath,
    'OutputFilePrefix': outputFilePrefix,
    'Camera': camera,
    'CountRenderableCameras': '1',
    'SceneFile': sceneFile,
    'IgnoreError211': '0',
    'UseLocalAssetCaching': '0'
}
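If it helps, dictionaries like these can either be passed straight to the standalone API's SubmitJob call or flattened into the key=value files that deadlinecommand expects. A rough sketch of the latter (the helper and file names are just illustrative):

def write_info_file(path, info):
    # Flatten a submission dictionary into Deadline's key=value format.
    # Keys like 'ExtraInfoKeyValue0=focalLength' come out as
    # ExtraInfoKeyValue0=focalLength=<value>, which is why they're written that way.
    with open(path, "w") as f:
        for key, value in info.items():
            f.write("%s=%s\n" % (key, value))

write_info_file("maya_job_info.job", maya_info)
write_info_file("maya_plugin_info.job", maya_plugin)
# Then: deadlinecommand maya_job_info.job maya_plugin_info.job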

Hopefully that's somewhat useful, even without knowing what most of those variables are.

As for your questions:

  1. You specify a PreJobScript in the Job Info section, in my case maya_info (though I don't have that parameter in the dictionary above, because we're not using it yet). There's a small example after this list.
  2. Correct, but you can use Deadline's path mapping and whatever else you need to make the paths as global as possible. However, if you're strictly rendering locally and the file will only ever be accessed by the worker that created it, you could absolutely use a local drive/path.
  3. Yes, that was my assumption. You kick off a job for a file that doesn't exist yet but will be created by the PreJobScript, which runs before any render tasks start. You would probably also delete the file with a PostJobScript after a successful upload of the output, but I'm just guessing here.
  4. Yes, sorry.
  5. I just mean all the stuff in my/your maya_info object that you will need to build the scene, more specifically the ExtraInfo and ExtraInfoKeyValue fields. Alternatively, if you are using an asset manager of some description, you could just include an ID/hash that you can query your API server with to receive all the relevant data on demand.
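To make point 1 concrete, the only change on the submission side is adding the PreJobScript key to the Job Info, pointing at a script the workers can reach (every path and key name below is a hypothetical placeholder):

job_info = {
    "Name": "Some Job Name",
    "Plugin": "MayaBatch",
    "OutputDirectory0": "//server/renders/exampleShot",
    # Hypothetical network path to the generator script.
    "PreJobScript": "//server/deadline/scripts/assemble_scene.py",
    # Hypothetical metadata the script can read back from the job.
    "ExtraInfoKeyValue0=generatorMelScript": "//server/mel/build_shot.mel",
    "ExtraInfoKeyValue1=generatedSceneFile": "//server/scenes/FileToBeCreated.ma"
}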

Probably a good time to mention that this is just how I do it, and by extension how our studio does it. I'm not sure there is a right or wrong way; just do whatever works for you. As you can see, we are not generating scene files on demand like you're planning to, but we do quite a lot of processing in the PostJobScript (sanity checks, versioning, auto comps, tracking, notifications, etc.).
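And since cleanup came up in point 3 above, a post-job script along the same lines could remove the generated scene once the job has finished. Same caveats as the pre-job sketch: the entry point follows the documented job script pattern and the ExtraInfo key is a hypothetical placeholder.

import os

def __main__(*args):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()

    # Hypothetical key holding the path of the generated scene file.
    scene = job.GetJobExtraInfoKeyValue("generatedSceneFile")
    if scene and os.path.exists(scene):
        os.remove(scene)
        deadlinePlugin.LogInfo("Removed generated scene: %s" % scene)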

Best of luck!
