
AWS Submission - Ignore Duplicate Images?

Hi,

Can Deadline be set to ignore duplicate images/paths when submitting to AWS? I find myself having to spend 30 mins or so manually cleaning a scene before it allows me to submit to the repo.

The submission log says this (the file name varies depending on which file is duplicated):

  1. \\piknas003\Projects\2018\R&D\DavidC\ARCHV\TMP\d2e611dca1c6.jpg
    Error: More than one auxiliary file is named “d2e611dca1c6.jpg”, auxiliary file names must be unique. (Deadline.Submission.DeadlineSubmissionException)
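
For what it's worth, here's roughly how to spot the offenders without combing through the scene by hand. This is a stand-alone Python sketch, not part of Deadline: the asset list below is made up, and in practice it would come from whatever collects your scene's bitmaps.

```python
# Hypothetical pre-flight check: group asset paths by file name and report
# any name that shows up under more than one path. Deadline requires every
# auxiliary file name to be unique, so these are the files that will fail.
from collections import defaultdict
from pathlib import PureWindowsPath

def find_duplicate_names(asset_paths):
    """Map each file name to the full paths that use it; keep only clashes."""
    by_name = defaultdict(list)
    for path in asset_paths:
        by_name[PureWindowsPath(path).name.lower()].append(path)
    return {name: paths for name, paths in by_name.items() if len(paths) > 1}

# Made-up example paths; substitute your scene's collected bitmap paths.
assets = [
    r"\\piknas003\Projects\2018\R&D\DavidC\ARCHV\TMP\d2e611dca1c6.jpg",
    r"\\piknas003\Projects\2018\R&D\DavidC\TEX\d2e611dca1c6.jpg",
    r"\\piknas003\Projects\2018\R&D\DavidC\TEX\wood_diffuse.png",
]
for name, paths in find_duplicate_names(assets).items():
    print(f"{name} appears {len(paths)} times:")
    for p in paths:
        print(f"  {p}")
```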

Any ideas? (Apart from archiving the max file and remapping everything to the new archives folder)

Thanks.

Actually, that error is coming up because you're submitting your assets with the job (this is only supported in 3ds Max, via "Submit External Files With Scene"), which defeats almost all of the caching we have built in. For the best performance you'd want to leave the assets where they are so they can be re-used across submissions. Quick details first, then how to force what you're doing:

When accessing anything within the Repository, Deadline has to go through the Remote Connection Server. That server opens an individual TCP session per asset and really doesn't scale to S3 levels. There is some behind-the-scenes caching, but it's designed for small files like scripts, so you're likely transferring every file once per machine that requests it.
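
To put rough numbers on that (every figure below is invented for illustration), the per-machine pattern multiplies the transfer by the worker count, while an upload-once scheme pays the cost a single time per file revision:

```python
# Back-of-the-envelope comparison with assumed numbers.
asset_count = 200        # assumed number of scene assets
avg_size_gb = 0.05       # assumed average asset size (50 MB)
workers = 40             # assumed number of render nodes

per_machine = asset_count * avg_size_gb * workers  # each node pulls every file
upload_once = asset_count * avg_size_gb            # each revision moves once
print(f"per-machine transfer: {per_machine:.0f} GB")  # 400 GB
print(f"upload-once transfer: {upload_once:.0f} GB")  # 10 GB
```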

The AWS Portal, by contrast, has the Asset Transfer System (ATS), which uploads file revisions into S3 and then pulls them down on the render nodes as needed. Transfers are driven by the paths configured in the ATS, so as long as it can figure out where the files live on-prem, it should be significantly faster.
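
The mapping itself is configured in the AWS Portal rather than in code, but the prefix-matching idea behind it looks roughly like this (the roots and paths below are made up):

```python
# Illustrative sketch of ATS-style path mapping: strip a configured
# on-prem root from a path to get the key used to track it in S3.
from pathlib import PureWindowsPath

# Assumed on-prem roots the transfer system has been told about.
ATS_ROOTS = [r"\\piknas003\Projects", r"\\piknas003\Library"]

def to_relative_key(path):
    """Return the path relative to a configured root, or None if unmapped."""
    p = PureWindowsPath(path)
    for root in ATS_ROOTS:
        try:
            return str(p.relative_to(PureWindowsPath(root)))
        except ValueError:
            continue  # not under this root; try the next one
    return None  # the system can't locate this file on-prem

print(to_relative_key(r"\\piknas003\Projects\2018\scene\tex\brick.jpg"))
# prints: 2018\scene\tex\brick.jpg
```

Files that fall outside every configured root come back unmapped, which is exactly the case where the system can't figure out where they exist on-prem.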

Now, all that said: if you want to upload the assets anyway, I think there was a way to rename the files as they were copied into the temporary directory, but I'm not finding it… I'll have to ask.
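
In the meantime, the general shape of that workaround would be something like this. This is a stand-alone sketch under my own assumptions, not a Deadline API: it copies auxiliary files into the submission's temporary directory and de-duplicates clashing names with a short content hash.

```python
# Hypothetical staging helper: copy files into a temp directory, renaming
# any file whose name has already been used so every name is unique.
import hashlib
import shutil
from pathlib import Path

def stage_aux_files(paths, temp_dir):
    """Copy files into temp_dir, renaming clashes with a content-hash suffix."""
    temp = Path(temp_dir)
    temp.mkdir(parents=True, exist_ok=True)
    seen = set()
    staged = []
    for src in map(Path, paths):
        name = src.name
        if name.lower() in seen:
            digest = hashlib.md5(src.read_bytes()).hexdigest()[:8]
            name = f"{src.stem}_{digest}{src.suffix}"
        seen.add(name.lower())
        dst = temp / name
        shutil.copy2(src, dst)
        staged.append(dst)
    return staged

# Usage: stage_aux_files(duplicate_paths, r"C:\Temp\deadline_submit")
```

Keep in mind the scene would still need its bitmap paths remapped to the renamed copies, which is why leaving the assets in place and letting the ATS handle them is usually the better route.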
