I have been testing running Python scripts with Deadline, and so far it is pretty easy. I do have a question, though. When submitting a Python job, I noticed that ‘Submit Script File With Job’ is checked by default. From the logs, I can see that Deadline copied my Python script over to the nodes. Does Deadline erase the script after the job is done?
The script is sent as an auxiliary file, the same way other job-related files (scene files and sometimes external references) can be sent to the Repository’s Job folder and then copied to the render node’s temp folder when a task is dequeued. When the Slave finishes a task, it checks for new jobs. If the same job is still the highest priority for that Slave and the temp folder already contains all the necessary auxiliary files, it simply reuses the data (and in some cases even keeps the rendering application and scene in memory).
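That reuse decision can be sketched roughly like this. To be clear, this is an illustrative Python sketch, not Deadline’s actual code; the function name and the assumption that every file in the Job folder is an auxiliary file are mine:

```python
import shutil
from pathlib import Path

def sync_aux_files(repo_job_dir: Path, temp_dir: Path, force: bool = False) -> bool:
    """Copy a job's auxiliary files from the Repository's Job folder to a
    worker's local temp folder, skipping the copy when every file is
    already present locally (mirroring the reuse behaviour described above).

    Returns True if files were copied, False if the local copy was reused.
    Hypothetical helper for illustration -- not Deadline's implementation.
    """
    aux_files = sorted(p.name for p in repo_job_dir.iterdir() if p.is_file())
    if not force and all((temp_dir / name).exists() for name in aux_files):
        return False  # everything is already local: reuse it, no resync
    temp_dir.mkdir(parents=True, exist_ok=True)
    for name in aux_files:
        # copy2 preserves timestamps, like a straight network copy would
        shutil.copy2(repo_job_dir / name, temp_dir / name)
    return True
```

The `force` flag corresponds to the situation where the worker is told to re-copy regardless of what it already has.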
Even if the Slave stops rendering a task (because you paused the job, for example) and the job is then requeued and lands on the same Slave, the existing content of the temp folder will be reused without resyncing. (In older versions of Deadline, 5.0 and earlier, a checksum of the auxiliary scene files was calculated both in the Repository and in the Slave’s folder, so that a modified .MAX or Maya .MB file could be detected if a technical director changed it in place in the Job’s folder. With files 1 GB and larger, however, this turned out to be too slow, so the check was removed.) Now, in Job Properties > General, there is a checkbox labeled “Re-synchronize Auxiliary Files Between Tasks”. When it is checked, the auxiliary files, including scripts, are copied from the Repository between tasks. This means you can open the Repository’s Job folder, modify the submitted script, and requeue the Job; any Slave holding an older copy of the script in its temp folder will copy the new one over before rendering.
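The trade-off behind the removed checksum can be sketched in Python (illustrative only, not Deadline’s implementation; `file_checksum` and `cheap_fingerprint` are hypothetical names):

```python
import hashlib
import os

def file_checksum(path, chunk_size=1 << 20):
    """Full-content hash: reads every byte of the file, so a 1 GB scene
    file means 1 GB of I/O per comparison -- the cost that made the
    per-task check too slow to keep."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def cheap_fingerprint(path):
    """Stat-based alternative: size plus modification time, constant cost
    regardless of file size, but it can miss an in-place edit that
    happens to preserve both."""
    st = os.stat(path)
    return (st.st_size, st.st_mtime_ns)
```

With no cheap way to be both fast and reliable, making the resync an explicit opt-in checkbox is a reasonable design: you pay the copy cost only when you know you changed something.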
To answer your question: if another Job ends up at the top of the priority list, the old job will be dropped, and the temp folder will be cleaned up before the new job’s auxiliary files are copied over. The original script submitted with the Job, however, remains in the Job’s folder in the Repository until the Job is deleted. Submitting the script with the job ensures that if somebody modifies a network copy of the script while the job is rendering, the changes won’t affect the jobs being processed, and even a year later an archived job can be brought back to life with all the right versions of its auxiliary files.
Hope this helps.
Ah okay. Cool. Good to know.