
Job dependencies, timestamps

Hey,

I’ve been looking into sending jobs remotely, and how to sync the files before rendering. We are using BitTorrent Sync for the transfer, but jobs can be submitted before the assets have finished syncing.

Would it be possible to have a timestamp attribute for the asset dependencies, so we can make sure that the assets are synced correctly?

Hey!

Can you explain how a timestamp attribute would help here? Do you think a file size attribute would work instead?

Note that Deadline itself doesn’t currently sync any files (that’s something we’re looking into for Deadline 8.1). Currently, the asset dependencies are used to check if those files already exist, which is why I’m not sure how the timestamp would help.

Thanks!
Ryan

So the problem occurs when you have been working on a file that has already been synced to the remote hard drive. When submitting the Deadline job, the render starts immediately even if the scene file is listed as a dependency, since an (older) copy of it already exists on disk. The changes I made just before submitting won’t show up in the first couple of renders, because they haven’t been synced over yet.

If the timestamp of the scene file were validated against a timestamp parameter submitted with the job, the job wouldn’t start rendering until the file has been updated/synced and its timestamp matches.

I could do this myself with some custom parameters and a script dependency, but I thought it might be a good addition to Deadline.
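Roughly what I have in mind is a dependency script along these lines. This is only a sketch: the SyncFile/SyncTimestamp extra-info keys are names I’d define myself at submission time, and I’m assuming Deadline’s dependency-script convention of a __main__(jobID) entry point plus the RepositoryUtils/GetJobExtraInfoKeyValue calls from the scripting API (worth verifying against the docs).

```python
# Hypothetical Deadline dependency script (sketch only).
# Assumes the job was submitted with extra-info keys:
#   SyncFile      - path to the scene file as seen by the render nodes
#   SyncTimestamp - the scene file's mtime at submission time
import os

from Deadline.Scripting import RepositoryUtils  # Deadline scripting API


def __main__(jobID, taskIDs=None):
    job = RepositoryUtils.GetJob(jobID, True)

    scene_path = job.GetJobExtraInfoKeyValue("SyncFile")
    submitted_stamp = job.GetJobExtraInfoKeyValue("SyncTimestamp")

    if not scene_path or not submitted_stamp:
        # Nothing to validate against; fall back to a plain existence check.
        return os.path.exists(scene_path) if scene_path else True

    if not os.path.exists(scene_path):
        return False  # file hasn't been synced yet

    # Only release the job once the synced file is at least as new as the
    # file that was on disk when the job was submitted.
    return os.path.getmtime(scene_path) >= float(submitted_stamp)
```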

Instead of overwriting the existing file when making changes, would it be an option to save out a new version of the file (i.e. v001, v002, etc.)? There are a couple of benefits to doing this:

  • You don’t run into the issue you’re currently having.
  • You can re-render old jobs that used the previous version of the file.

As I mentioned, we’re working on a version-aware transfer system for Deadline 8.1, and it is being designed around the assumption that new versions of existing scenes/assets will be created instead of overwriting existing ones. Otherwise, you get into cases where multiple jobs in the queue require different versions of the same asset, which can cause all sorts of problems and unexpected results.

Cheers,
Ryan

That is a good point. The biggest problem with this is obviously the increased transfer size, since you would need to copy/transfer all files referenced in the render. With point caches and textures, that would be a lot of bandwidth.
The second problem is disk space, which could be handled by deleting the job files when the job is cleaned up, similar to auxiliary files.

This should be mitigated if you only re-version files that have been modified. For example, if you have a scene called scene_v001.max that references texture_v001.exr, you would initially transfer both files to the remote location before rendering the job. If you then make changes to the scene only and save it as scene_v002.max, then you only need to transfer scene_v002.max to the remote location, since it is still using texture_v001.exr.
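Purely as an illustration of that idea (not the actual Deadline 8.1 implementation), deciding what needs to be transferred boils down to copying only the referenced files that don’t already exist at the remote location, which works precisely because versions are never overwritten. The paths and function name below are hypothetical.

```python
# Illustrative sketch only -- not Deadline code. Given the files a scene
# version references, transfer only the ones not already present remotely.
import os
import shutil


def transfer_new_versions(referenced_files, remote_root):
    for local_path in referenced_files:
        remote_path = os.path.join(remote_root, os.path.basename(local_path))
        if os.path.exists(remote_path):
            continue  # e.g. texture_v001.exr was already transferred for v001
        shutil.copy2(local_path, remote_path)  # e.g. only scene_v002.max moves


# scene_v002.max still references texture_v001.exr, so only the scene is copied.
transfer_new_versions(
    ["/projects/shot010/scene_v002.max", "/projects/shot010/texture_v001.exr"],
    "/mnt/remote/shot010",
)
```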

The extra bandwidth is a necessary tradeoff, but we feel it’s worth it, knowing that a job won’t accidentally render with the wrong version of an asset. In the system we’re designing, we’re not initially supporting destructive behavior (like asset deletion or overwriting) because the last thing we want is Deadline accidentally wiping out existing assets. :)

This seems quite restrictive on the studio’s versioning system, unless this works in the background so the studio’s version names aren’t changed?

Sorry for redirecting the topic to what you guys are working on :) I think I’ll explore job parameters with a script dependency.
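On the submission side, I imagine appending something like this to the job info file before submitting with deadlinecommand. This is just a sketch under my assumptions: I believe ScriptDependencies and ExtraInfoKeyValue0/1 are the job-info keys for dependency scripts and custom key/value pairs, but I’d double-check the docs, and all paths here are made up for the example.

```python
# Sketch of the submission side: record the scene's mtime and the dependency
# script in the job info file (key names assumed, paths hypothetical).
import os

scene_file = "/projects/shot010/scene.max"        # local path at submission time
synced_path = "/mnt/remote/shot010/scene.max"     # path the render nodes will see
dependency_script = "/repo/custom/sync_check.py"  # the dependency script above

job_info_lines = [
    "ScriptDependencies=%s" % dependency_script,
    "ExtraInfoKeyValue0=SyncFile=%s" % synced_path,
    "ExtraInfoKeyValue1=SyncTimestamp=%f" % os.path.getmtime(scene_file),
]

with open("job_info.job", "a") as f:
    f.write("\n".join(job_info_lines) + "\n")
```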

I was just using the “v001, v002, …” naming as an example. It can be whatever you want. Deadline won’t care how the files are named; it’s just necessary for new versions to have unique names, because Deadline won’t overwrite existing files.

Thanks, Ryan :)

I’ll definitely have a look at this workflow.
