Export Scene and gather/include assets

I’m thinking of developing my own tool, but before I go down that path I want to make sure some functionality isn’t already in Deadline.

I’d like a Deadline submitter that submits the scene along with any needed (local) assets.
I don’t believe this is possible with Deadline yet?

With the amount of remote working today, we do have users (myself included) working on local machines, and/or custom drives. While we can connect to Deadline over VPN and submit jobs, the render nodes naturally cannot access any assets or other files stored locally.

I’m first and foremost just looking for a Houdini export submitter: export render files, and include the necessary assets. Houdini’s Python tools already make it fairly easy to get a list of all scene file references. It looks fairly straightforward to develop something there, but if anything already exists it’ll definitely save me some time :slight_smile:

I’m not aware of such a tool. It would be relatively straightforward with hou.fileReferences.
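A minimal sketch of that idea. Inside Houdini, hou.fileReferences() returns (parm, path) tuples; the filtering is plain Python, so it's shown here as a standalone function (the drive-letter prefixes are just example assumptions):

```python
def local_references(paths, local_prefixes=("C:/", "D:/", "S:/")):
    """Return the subset of paths that start with a local drive prefix.

    Paths are normalized to forward slashes; comparison is case-insensitive.
    """
    prefixes = tuple(p.upper() for p in local_prefixes)
    normalized = [p.replace("\\", "/") for p in paths]
    return [p for p in normalized if p.upper().startswith(prefixes)]

# Inside Houdini you would feed it the reference list like so:
#   refs = [path for parm, path in hou.fileReferences()]
#   needed = local_references(refs)
```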

Are the render nodes in the cloud? There is the Resource Tracker Overview — Deadline documentation, though I have not used it.

Just a small correction - the AWS Resource Tracker is NOT related to assets. It tracks instances launched by the Spot Event Plugin, or the AWS Portal, to ensure they are terminated if connectivity with the Repo/Database is severed for any reason.

Several Deadline submission plugins (Houdini, Maya, 3ds Max) contain a function to introspect the scene being submitted and include metadata with the Job describing all needed assets, in the form AWSAssetFileX= where X is a 0-based index. In Houdini, this is controlled by the parameter shouldprecache. Once the submission is performed, the submitter calls deadlinecommand -AWSPortalPrecacheJob (JobId) to push the files to S3 for AWS Portal to use.

Even if you are not rendering on AWS, you could enable this option to get a list of all required files in your JobInfo submission parameters, then run your own script to do something with them, so there is probably no reason to reimplement this yourself. Or you could just use the same Houdini functionality (as Mois mentioned) to collect the files in your own script.
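To illustrate reading those entries back out: if you parse a JobInfo file into a key/value dict, the asset list can be recovered by walking the 0-based index described above. A hedged sketch (the dict contents are made-up example values, not real job output):

```python
def asset_files(job_info):
    """Collect AWSAssetFile0..N values from a JobInfo key/value dict.

    Stops at the first missing index, matching the 0-based
    AWSAssetFileX= convention described above.
    """
    assets = []
    i = 0
    while "AWSAssetFile%d" % i in job_info:
        assets.append(job_info["AWSAssetFile%d" % i])
        i += 1
    return assets

# Hypothetical example values:
info = {
    "Plugin": "Houdini",
    "AWSAssetFile0": "S:/tex/wood.jpg",
    "AWSAssetFile1": "S:/geo/cache.0001.bgeo.sc",
}
print(asset_files(info))  # ['S:/tex/wood.jpg', 'S:/geo/cache.0001.bgeo.sc']
```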


Thanks! Cloud, yes… I don’t think the resource tracker can help with this though.

hou.fileReferences is a great starting point. That’s where I’m at now.
I already found this repository that seems like a great kickstarter for an asset collector. Looks like there’s some good code in there for sorting out the necessary assets too. Just reading through at this stage.

Update asset paths, export scene, copy everything to a shared file server, then submit the job. Doesn’t sound too tricky in theory. Just have to do it without messing up the original scene.

Ahh, thanks Bobo! You beat me to it! Some good information here! I’ll definitely look more into the precache.

We do render on AWS. I just need to move the assets to the on-prem file server, which sounds easy enough with this information, and update the paths somehow.

As always, things become a bit trickier than expected, e.g. gathering cached file sequences that don’t get collected by hou.fileReferences(). But I believe I’ve got that all working! I plan to share the script on GitHub once I feel it’s release-worthy.
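For anyone hitting the same sequence problem: one common approach (a sketch under my assumptions, not necessarily what the script above does) is to turn Houdini-style frame tokens like $F4 into a glob wildcard and collect the files on disk:

```python
import re

def sequence_glob(path):
    """Replace Houdini-style frame tokens ($F, $F4, ${F4}) with a glob wildcard."""
    return re.sub(r"\$\{?F\d*\}?", "*", path)

# Then expand on disk, e.g.:
#   import glob
#   files = glob.glob(sequence_glob("S:/cache/sim.$F4.bgeo.sc"))
print(sequence_glob("S:/cache/sim.$F4.bgeo.sc"))  # S:/cache/sim.*.bgeo.sc
```

Note that the raw parameter string (unexpanded, via parm.unexpandedString() in Houdini) is what still contains the $F token; the evaluated value only gives you the current frame's file.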

I think I can run this as a pre-submit job, but looking for some advice on the next step.

Is there a way to submit job-specific path-remapping settings when you submit a job?
I don’t want to make changes to the working scene file itself. I’m thinking of just replacing, for example, S:\ with \\ourserver\share\jobsubmission\ - and replicating/copying the file structure and files there.

Thinking ahead, these path remaps might get remapped again on our AWS nodes, making it perhaps even trickier. Any tips?
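The prefix swap described above is simple enough to do in a pre-submit script. A minimal sketch, using the example S:\ to \\ourserver\share\jobsubmission\ mapping from the post (the server path is just the poster's example, not a real setting):

```python
def remap(path, rules):
    """Apply the first matching (old_prefix, new_prefix) rule to a path.

    Paths are normalized to forward slashes and matched case-insensitively,
    so Windows-style inputs like S:\\tex\\wood.jpg are handled.
    """
    norm = path.replace("\\", "/")
    for old, new in rules:
        old_norm = old.replace("\\", "/")
        if norm.lower().startswith(old_norm.lower()):
            return new.replace("\\", "/") + norm[len(old_norm):]
    return norm

rules = [("S:/", "//ourserver/share/jobsubmission/")]
print(remap("S:\\tex\\wood.jpg", rules))  # //ourserver/share/jobsubmission/tex/wood.jpg
```

The second remap on the AWS side could then be expressed as just another rule in the same list, applied in order, rather than a separate mechanism.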


Actually, now I see that submission-time mapping rules do exist, though they are more of a system for replacing tokens during submission.
It’s not quite clear how this can be controlled from a custom plugin.
