We’ve recently started using Deadline as our render farm manager and are very pleased so far. However, there is one thing we miss from BackBurner: the option to compress the 3ds Max file when submitting to the farm, thus reducing network traffic. BB submits a compressed file to the network folder, then instructs the slave to copy the compressed file locally and decompress it before starting the render. Often this reduces the file size and network load by a third or so.
Is this something that could be implemented within the Deadline SMTD? 3ds Max provides the maxzip application, which I assume could be used for this process?
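Something along these lines is what I have in mind (a rough, untested sketch; I’m using Python’s zipfile module rather than maxzip, whose command-line syntax I won’t guess at, and the workers would of course need a matching decompression step, e.g. a custom pre-load script, which is not stock behaviour):

```python
# Rough sketch: zip the scene before handing it to Deadline.
# Assumes deadlinecommand is on the PATH, and that something on the
# worker side unzips the scene before rendering (not stock behaviour).
import subprocess
import zipfile
from pathlib import Path

def submit_compressed(scene_path, job_info, plugin_info):
    scene = Path(scene_path)
    archive = scene.with_suffix(".zip")

    # DEFLATE typically shaves a third or so off an uncompressed .max file.
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(scene, arcname=scene.name)

    # Submit the archive as the job's auxiliary file instead of the scene.
    subprocess.run(
        ["deadlinecommand", job_info, plugin_info, str(archive)],
        check=True,
    )
```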
If we were to support this, we would probably want to make this a general feature of Deadline jobs instead of making it 3dsmax specific. We’ll add it to the wishlist. If it’s straightforward enough to do, we could probably look at it for version 6.1 or soon after.
Is there a reason why your MAX files are not saved with the built-in compression already? (Customize > Preferences > Files > Compress On Save). The file size would still be reduced significantly without an additional compression/decompression step in the job submission and rendering process.
AFAIK, the main reason Backburner uses MaxZip is to pack the scene and (most) external references into one file before submitting. But since most modern file formats (JPG, PNG, EXR etc.) barely compress at all, there is little to be gained beyond slightly better file transfer performance from submitting one large file vs. hundreds of smaller ones. Whether the time spent creating the archive and uncompressing it later is wasted is something I have not benchmarked, as I was blessed with access to Deadline during all my production days and have never had to use Backburner…
Most studios with well-designed pipelines reference external files from dedicated network storage and do not send them with the job. In fact, you have the option not to send the MAX scene with the Deadline job either - you can reuse a network copy of the scene. Or you can copy the file only once and reuse it from multiple jobs, e.g. when rendering dozens of State Sets… At least when using the integrated submission script (SMTD).
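For illustration, a manual submission along those lines could look like this (an untested sketch; the paths, job name and Max version are made up, and SceneFile is the plugin info key that points the 3dsmax plugin at an existing scene):

```python
# Sketch: submit a 3ds Max job that reuses a scene already on the
# network, instead of shipping the .max file as an auxiliary file.
# All paths and names below are hypothetical.
import subprocess
from pathlib import Path

JOB_INFO = """\
Plugin=3dsmax
Name=StateSet_render
Frames=0-100
"""

# SceneFile points the workers at the network copy; nothing is uploaded.
PLUGIN_INFO = """\
SceneFile=//fileserver/projects/shotA/scene_v012.max
Version=2014
"""

def submit(tmp_dir="submit_tmp"):
    tmp = Path(tmp_dir)
    tmp.mkdir(exist_ok=True)
    job_file = tmp / "job_info.txt"
    plugin_file = tmp / "plugin_info.txt"
    job_file.write_text(JOB_INFO)
    plugin_file.write_text(PLUGIN_INFO)
    # No auxiliary file argument, so the scene is never copied around.
    subprocess.run(["deadlinecommand", str(job_file), str(plugin_file)],
                   check=True)
```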
We do indeed reference all files from our network. However, we have a local farm and an off-site farm on the same network, where the render nodes are connected to different file servers that sync files between each other.
We work on poly-heavy scenes that for various reasons cannot use external model referencing and thus end up quite large. The “Compress On Save” option slows saving down considerably while working on a project (since auto-backup also compresses the files). That is why it would have been nice to have Deadline compress the files when submitting the jobs.
We are very pleased with Deadline and due to our particular setup this is really the only tiny thing we are missing.
Hi Peter,
Just thinking of an alternative approach, as compressing the max file on submission to Deadline would of course dramatically increase your submission time as well!
Instead of submitting the max file with the Deadline job, could you reference the scene file from your local file server and use the “RequiredAssets” feature? The job would carry a single required asset: the Max scene file’s location on the destination file server. (After your servers sync, use Path Mapping to swap the source file server path for the destination one.) The job then won’t start until the Max file is in place on your remote file server, and you avoid the overhead of compressing and submitting the max file. A sketch of what I mean follows below.
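Something like this in the job info (RequiredAssets is a standard job info key listing paths the job waits for before starting; the server names and paths below are invented, and the actual swap rule is configured in the Repository under Mapped Paths):

```python
# Sketch: job/plugin info for the RequiredAssets approach. All paths
# are placeholders. SceneFile uses the local server path; a Mapped
# Paths rule in the Repository (//local-server/... -> //remote-server/...)
# rewrites it for the off-site workers, and RequiredAssets holds the
# job until the sync has delivered the scene to the remote server.
JOB_INFO = """\
Plugin=3dsmax
Name=ShotA_remote
Frames=0-100
RequiredAssets=//remote-server/projects/shotA/scene_v012.max
"""

PLUGIN_INFO = """\
SceneFile=//local-server/projects/shotA/scene_v012.max
Version=2014
"""
```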
I’m assuming whatever sync system you are using between your local & remote servers is capable of compression during transit?