AWS Thinkbox Discussion Forums

Export Scene and gather/include assets

I’m thinking of developing my own tool, but before I go down that path I want to make sure some functionality isn’t already in Deadline.

I’d like a Deadline submitter that submits the scene along with any needed (local) assets.
I don’t believe this is possible with Deadline yet?

With the amount of remote working today, we do have users (myself included) working on local machines, and/or custom drives. While we can connect to Deadline over VPN and submit jobs, the render nodes naturally cannot access any assets or other files stored locally.

I’m first and foremost just looking for a Houdini export submitter: export the render files and include the necessary assets. Houdini’s Python tools already make it fairly easy to get a list of all the scene’s file references, so it looks fairly straightforward to develop something there, but if anything already exists it will definitely save me some time :)

I’m not aware of such a tool, but it would be relatively straightforward with hou.fileReferences().
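Something along these lines would be a starting point (untested sketch; the helper name is just for illustration):

```python
# Minimal sketch: list every external file the current Houdini scene references.
# hou.fileReferences() returns (parm, path) tuples; parm is None for references
# that don't come from a parameter (e.g. the HIP file itself).
import hou

def collect_scene_references():
    refs = set()
    for parm, path in hou.fileReferences():
        expanded = hou.expandString(path)  # resolve $HIP, $JOB, etc.
        if expanded:
            refs.add(expanded)
    return sorted(refs)

for ref in collect_scene_references():
    print(ref)
```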

Are the render nodes in the cloud? There is the Resource Tracker (Resource Tracker Overview — Deadline 10.0.29.0 documentation), though I have not used it.

Just a small correction - the AWS Resource Tracker is NOT related to assets. It tracks instances launched by the Spot Event Plugin, or the AWS Portal, to ensure they are terminated if connectivity with the Repo/Database is severed for any reason.

Several Deadline submission plugins (Houdini, Maya, 3ds Max) contain a function to introspect the scene being submitted and include metadata with the Job describing all needed assets, in the form AWSAssetFileX= where X is a 0-based index. In Houdini, this is controlled by the parameter shouldprecache.

Once the submission is performed, the submitter calls deadlinecommand -AWSPortalPrecacheJob (JobId) to push the files to S3 for AWS Portal to use. But even if you are not rendering on AWS, you could enable this option to get a list of all required files in your JobInfo submission parameters, then run your own script to do something with them, so there is probably no reason to reimplement this yourself. Alternatively, you could just use the same Houdini functionality (as Mois mentioned) to collect the files in your own script.
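For illustration, the extra JobInfo entries end up looking something like this (the paths below are made up, but the key names follow the AWSAssetFileX= form described above):

```
AWSAssetFile0=S:/project/tex/wood_diffuse.rat
AWSAssetFile1=S:/project/geo/debris.0001.bgeo.sc
AWSAssetFile2=S:/project/hdri/studio_small.exr
```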


Thanks! Cloud, yes… I don’t think the resource tracker can help with this though.

hou.fileReferences is a great starting point. That’s where I’m at now.
I already found this repository, which seems like a great kick-start for an asset collector. It looks like there’s some good code in there for sorting out the necessary assets too. Just reading through it at this stage.

Update asset paths, export scene, copy everything to a shared file server, then submit the job. Doesn’t sound too tricky in theory. Just have to do it without messing up the original scene.
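Roughly what I have in mind, as a first pass (untested sketch; the share path is made up, and sequences/missing files are ignored for now):

```python
import os
import shutil
import hou

# Hypothetical target location on the shared file server.
SHARE_ROOT = "//ourserver/share/jobsubmission/myjob"

def export_with_assets():
    hip_path = hou.hipFile.path()      # remember the working scene path
    originals = {}                     # parm -> original raw value, to restore later
    try:
        for parm, path in hou.fileReferences():
            if parm is None:
                continue
            src = hou.expandString(path)
            if not src or not os.path.isfile(src):
                continue               # sequences / missing files need extra handling
            dst = os.path.join(SHARE_ROOT, "assets", os.path.basename(src))
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)
            originals[parm] = parm.unexpandedString()
            parm.set(dst)              # point the parm at the copied asset
        # Write the export scene with the remapped paths next to the assets.
        hou.hipFile.save(os.path.join(SHARE_ROOT, "submit.hip"),
                         save_to_recent_files=False)
    finally:
        # Put the working scene back exactly as it was.
        for parm, value in originals.items():
            parm.set(value)
        hou.hipFile.setName(hip_path)  # saving changed the current HIP name; restore it
```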

Ahh, thanks Bobo! You beat me to it! Some good information here! I’ll definitely look more into the precache.

We do render on AWS. I just need to move the assets to the on-prem file server, which sounds easy enough with this information, and update the paths somehow.

As always, things become a bit trickier than expected, e.g. gathering cached file sequences and other files that don’t get collected by hou.fileReferences(). But I believe I’ve got that all working! I plan to share the script on GitHub once I feel it’s release-worthy.
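For anyone hitting the same issue, this is roughly how I’m expanding the frame-varying paths (simplified sketch; it assumes frames only vary through $F-style tokens):

```python
import glob
import re
import hou

def expand_sequence(parm):
    """Return the files on disk for a parm, expanding $F / $F4 style sequences."""
    raw = parm.unexpandedString()
    if "$F" not in raw:
        return [hou.expandString(raw)]
    # Swap the frame token for a wildcard, expand the remaining variables,
    # then let glob pick up whichever frames actually exist on disk.
    pattern = re.sub(r"\$F\d*", "*", raw)
    return sorted(glob.glob(hou.expandString(pattern)))
```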

I think I can run this as a pre-submit job, but I’m looking for some advice on the next step.

Is there a way to submit job-specific path-remapping settings when you submit a job?
I don’t want to make changes to the working scene file itself. I’m thinking of just replacing, for example, S:\ with \\ourserver\share\jobsubmission\, and replicating/copying the file structure and files there.

Thinking ahead, these path remaps might get remapped again on our AWS nodes, which perhaps makes it even trickier. Any tips?


Actually, now I see that submission-time mapping rules do exist, though they are more of a system for replacing tokens during submission:
https://docs.thinkboxsoftware.com/products/deadline/10.0/1_User%20Manual/manual/cross-platform.html#example-2-using-tokens-with-submission-time-rules
It’s not quite clear how this can be controlled in a custom plugin.

Thanks mois! That’s what I’m wondering about too: whether I can just submit remap rules (somehow). But then the AWS render nodes need to remap the job-submission remaps, not the original scene paths. I wonder if the devs could clarify if/how that would work?

As I see it:
A = paths saved in scene
B = remapped paths to new location, stored as job options.
C = AWS remaps

Submitting stores the A → B remaps with the job info.
Then I assume the AWS nodes need to remap B to C, not A to C.
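To make it concrete, here’s the kind of chained prefix replacement I’m picturing (paths are made up; this just illustrates why the second rule would have to map B, not A):

```python
# Remap rules applied in order: A -> B at submission time, B -> C on the AWS node.
REMAPS = [
    ("S:/", "//ourserver/share/jobsubmission/"),  # A -> B
    ("//ourserver/share/", "/mnt/data/share/"),   # B -> C
]

def remap(path):
    for src, dst in REMAPS:
        if path.lower().startswith(src.lower()):
            path = dst + path[len(src):]
    return path

print(remap("S:/textures/wood.rat"))
# -> /mnt/data/share/jobsubmission/textures/wood.rat
```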

If not, I guess the hard(er) way is to loop through the required nodes in the scene, remap, save, submit, and then revert everything back to the original.

Yeah, there is an argument to be made that if you know the mapping and the original paths, you might as well apply it on the client. However, that can take a bit of time, and I’d prefer it to happen on the farm so I don’t have to wait in my DCC.

I think it’s the same whether the mapping is B → C or A → C.
