MayaBatch - per task commands

Hi there,

How could I go about creating a MayaBatch job (mayapy) where each task performs a unique Python command?

A real-life use case: I want to load a Maya scene file and have each task perform a geo-cache export for one asset in the scene.
So I expect the process to go something along these lines:

  • The job command will open the Maya scene file.
  • Each task will cache one asset in the already-loaded scene:
    task1 executes: import cache; cache.export('asset1')
    task2 executes: import cache; cache.export('asset2')
    task3 executes: import cache; cache.export('asset3')

The idea with this approach is to have the scene loaded once (at the job level) and have each task execute its own portion.
I believe that would require a custom MayaBatch plugin, where I rework def RenderTasks(self) to grab the task's command, somehow.

So I guess my question has two parts:

  • Would you recommend this approach in Deadline (I’m fairly new to Deadline)? If not, what approach would/have you taken?
  • Am I on the right path with branching the MayaBatch plugin into a new custom one? Is there a better/easier alternative?

Thank you in advance for any insights,
Asi

So this is totally possible without too many modifications to the base plugin, although you may need to provide some extra plugin attributes or job environment variables, depending on how you like to work.

So say you have three assets.
You would need to build an index for the tasks and store it as a Python list or dict, either as a plugin attribute or as an environment variable:
['asset1', 'asset2', 'asset3']

The Deadline folks make a few convenience MEL global procs available in the interactive session.

So at submission time you supply the Maya file, your custom Python file, and a frame range with one frame per asset (do not use the frames-per-task chunking feature; it should stay at 1), the same count as what you populated your list above with.
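
To make that concrete, here is a minimal sketch of what manual submission files might look like for three assets. The key names SceneFile, ScriptJob, and ScriptFilename are the MayaBatch plugin info keys as I remember them, and MY_ASSET_LIST / export_cache.py are made-up names for this example, so double-check against the MayaBatch docs for your Deadline version. job_info.txt might contain:

    Plugin=MayaBatch
    Name=PerAssetCacheExport
    Frames=1-3
    ChunkSize=1
    EnvironmentKeyValue0=MY_ASSET_LIST=['asset1', 'asset2', 'asset3']

and plugin_info.txt:

    SceneFile=//server/projects/shot010/scene.ma
    ScriptJob=True
    ScriptFilename=export_cache.py

then submit from a shell, sending the script along as an auxiliary file:

    deadlinecommand job_info.txt plugin_info.txt export_cache.py

Frames=1-3 with ChunkSize=1 gives you exactly three single-frame tasks, one per asset in the list.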

In your Python script you want to do something like this:

  • Look up the MEL global proc DeadlineValue("StartFrame"); this will give you the current frame.
  • If you set your asset list as a job environment variable, do ast.literal_eval(os.environ.get("MY_LIST_VAR_NAME")) (note that it is ast.literal_eval; there is no ast.eval).
  • If you set it as a plugin attribute, access the MEL global proc DeadlinePluginInfo("NAME_OF_THE_ATTRIBUTE") and convert it to a list afterwards.

With (current frame - 1) as the index into your list, you get the asset name you want to export, and then you run your export code.
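
Putting those pieces together, a minimal sketch of the per-task script, assuming the env-var approach with a variable named MY_ASSET_LIST and the asker's hypothetical cache.export() function:

    # export_cache.py -- runs once per task inside the already-loaded MayaBatch session
    import ast
    import os
    import maya.mel as mel

    # Task size is 1, so StartFrame uniquely identifies this task.
    frame = int(mel.eval('DeadlineValue("StartFrame")'))

    # Option A: asset list stored as a job environment variable.
    assets = ast.literal_eval(os.environ["MY_ASSET_LIST"])

    # Option B: asset list stored as a plugin info attribute instead.
    # assets = ast.literal_eval(mel.eval('DeadlinePluginInfo("AssetList")'))

    # Frames start at 1 but list indices start at 0.
    asset = assets[frame - 1]

    import cache  # hypothetical export module from the original question
    cache.export(asset)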

This should be a high enough overview of the work you want to do, and it means you will not need to modify the base plugin.
I'm also assuming that you're using one of the recent Deadline 10.x versions.

Edwin, may have extra info to add based on this possible workflow.

Hope this helps.

Cheers
Kym

I think that’s a good idea. I guess some helpful bits, then open-ended questions I’ll need to dig into… :smiley:

Kym meant to make sure the task size is set to “1”. That ensures the StartFrame and EndFrame parameters will always be the same, so you don’t have to add additional looping yourself. Within MayaBatch the scene stays loaded.

Deadline can seem kind of magical at times (maybe it is :stuck_out_tongue:) but if you enable the “log script contents to render log” in “Tools -> Configure Plugins -> MayaBatch”, you’ll be able to see the Melscript the plugin generates and how your changes will affect the process. I’d also make a copy of “[repo]/plugins/MayaBatch” and drop it in “[repo]/custom” so that upgrades to the Repo won’t overwrite your custom code.

The plugin should support throwing your script in at submission time. Here’s some info on that guy:
docs.thinkboxsoftware.com/produ … script-job

Now, the thing I actually don’t know how to do yet is to have your custom code called at each task start. I have a terrible feeling that’s not supported yet and you’ll have to throw some custom Melscript into the plugin to call your script function… I’m going to go off and bug the dev team now.

The execution of the script happens in RenderTasks. I can confirm that the process runs as expected, even when all 3 tasks hit the same slave one after another.

Another thing I forgot from my first post: make sure the plugin attribute “ScriptJob” is set.
The only thing you need to remember with this is that the same Maya file stays loaded even when tasks run on the same slave, so if you are doing something destructive to the Maya file, you may want to add a scene reload to your Python script, as sketched below.
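
Something along these lines at the top of the script would do it, assuming the scene path is still queryable from the open session:

    # Force a clean reload so a repeated task on the same slave
    # starts from the saved file rather than a mutated scene.
    import maya.cmds as cmds

    scene_path = cmds.file(query=True, sceneName=True)
    if scene_path:
        cmds.file(scene_path, open=True, force=True)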

Might be worth making a sample job in the github examples?
