We have a problem with our 3ds Max users: if they don’t rename their Max files before submitting a job to the queue, their output can end up coming from a previously submitted job.
So, if someone is working on “palace_13.max” and renders their frames, then makes changes to the Max file and resubmits some of the frames, the new frames come out without the changes. Only if they rename the source file to “palace_14.max” will DL recognize the newly submitted job. It’s as if the old job is cached and the new job is unable to overwrite it.
I’m not a Max user at all, so I’m baffled by this - we don’t have any problems like this with our After Effects work.
We’re using DL v5.0.0.44528 and 3ds Max 2012. The problem occurs intermittently (but apparently with increasing frequency), and it happens to all four of our Max users. Jobs are being submitted via SMTD, and the Sanity Check doesn’t offer any helpful warnings or suggestions.
If I knew how to use Max I’d work on troubleshooting this myself, as it seems it must be something quite basic. Any help will be much appreciated.
In SMTD, under the “Options” tab, there is an option called “Submit Scene File With Job”.
It should be checked by default. It causes SMTD to save a temporary copy of the scene during submission, then pass this file with the submission data to be stored in the Repository. Pressing Ctrl+D on a Max job in the Monitor will then reveal the MAX file in the job folder, with the name and content it had at submission time. This is how Deadline has operated for many years, and I have never seen this method cause problems. The drawback of this approach is that if the scene in question is huge (I have seen scenes of 1.5 to 2 GB being sent to Deadline), it both slows down the submission (the file is saved, then copied to the Repository) and can cause a huge hit on the Repository server, which is often not designed to serve gigabyte-sized files to many machines.
So the “Submit Scene File With Job” option was added. When it is unchecked, the scene is NOT submitted to Deadline. Instead, the original source file that the user opened from some network location (designed to host and serve large scene and possibly texture files) is passed to the Deadline job as a path reference. This means that if the user does not perform a File > Save or Ctrl+S on the Max scene after making changes and before submitting the job, the slaves will see the ORIGINAL scene and the changes will not kick in. This might explain why resaving the scene before submitting appears to fix the issue. Another drawback is that if somebody were to make further changes to the network copy referenced by the job while slaves are working on it, you could get different slaves rendering different versions of the scene.
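To make the difference between the two modes concrete, here is a minimal sketch in Python (this is not the actual Deadline source; all function names and paths are illustrative only):

```python
import shutil
import tempfile
from pathlib import Path

def submit_max_job(scene_path: str, repo_jobs_dir: str,
                   submit_scene_with_job: bool) -> str:
    """Hypothetical sketch of the two SMTD behaviors described above."""
    if submit_scene_with_job:
        # Mode 1: snapshot the scene as it exists RIGHT NOW and store it
        # with the job in the Repository. Slaves render this frozen copy,
        # so later edits to the working file can never leak into the job.
        snapshot = Path(tempfile.gettempdir()) / Path(scene_path).name
        shutil.copy2(scene_path, snapshot)      # stands in for the temp save
        job_scene = Path(repo_jobs_dir) / snapshot.name
        shutil.copy2(snapshot, job_scene)       # copied into the job folder
        return str(job_scene)
    else:
        # Mode 2: only the PATH to the original network file is stored.
        # Slaves open whatever is on disk at render time, so an unsaved
        # scene (or one modified mid-render) produces stale or mixed frames.
        return scene_path
```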
This is a relatively new feature and I just discovered it is not even properly documented in the online help. We will address this ASAP.
Also note that the option is sticky between sessions: if it was unchecked in SMTD, it will stay unchecked until it is checked again.
Please see if it is on or off and let us know.
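If you would rather audit all four workstations without opening Max on each one, something like the following might help. I am assuming here that the sticky settings are persisted in an INI-style file; the file location, section, and key names below are guesses you would need to verify against your own installation:

```python
import configparser
from pathlib import Path

# Hypothetical file location and key names; verify against your install.
SETTINGS_FILE = (Path.home() / "AppData" / "Local" / "Thinkbox"
                 / "Deadline" / "SubmitMaxToDeadline.ini")

def scene_submission_enabled(ini_path: Path) -> bool:
    """Report whether the (assumed) sticky setting is currently on."""
    parser = configparser.ConfigParser()
    parser.read(ini_path)  # a missing file simply yields the fallback below
    # Assumed section and key names; the real ones may differ.
    return parser.getboolean("JobSettings", "SubmitSceneFileWithJob",
                             fallback=True)

if __name__ == "__main__":
    print("Submit Scene File With Job:",
          scene_submission_enabled(SETTINGS_FILE))
```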
Your options are to either check it again if it is unchecked (assuming you don’t render gigabyte-sized scenes on 200+ machines, as we did at my last company), or instruct your Max users to always perform a manual save before submitting if they insist on not submitting the scene with the job.
A third option would be to save the scene before submission to a dedicated folder that is used only for network rendering and is never modified after the job starts. This would still allow you to use a fast dedicated server (large companies use Isilon systems or something like that) to host the scene file, while making sure nobody uses that version of the file for actual work. I think we should automate this by allowing the user to specify a special path for job scenes and performing the save and copy there automatically (a kind of mix of the two existing options; see the sketch below). We had a custom solution for this at my last production job, and it would be useful to support it in SMTD, too.
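As a rough sketch of what that automation could look like (the network share and naming scheme below are placeholders, not anything SMTD currently does):

```python
import shutil
import time
from pathlib import Path

# Hypothetical dedicated render share; adjust to your environment.
RENDER_SCENE_ROOT = Path("//fileserver/render_scenes")

def stage_scene_for_render(scene_path: str) -> str:
    """Copy the just-saved scene to a write-once render folder.

    A timestamped name guarantees each submission references a unique,
    immutable file, so later edits to the working copy can never bleed
    into an in-flight job.
    """
    src = Path(scene_path)
    stamp = time.strftime("%Y%m%d_%H%M%S")
    dest = RENDER_SCENE_ROOT / f"{src.stem}_{stamp}{src.suffix}"
    shutil.copy2(src, dest)
    return str(dest)  # pass this path to the job, not the working file
```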
If the above does not solve your problem, please let us know!
Hi Bobo - thanks for the quick response! It looks like the “Submit Scene File With Job” option is ticked, but our 3D lead said he’ll check it specifically the next time the problem arises. Some options do seem to come unstuck sometimes.
Since this problem seems to be getting worse, I was wondering if it’s related to clutter building up in the Repository. We’ve got jobs set to archive after 2 days and to delete 2 days after that, and sometimes I’ll delete things earlier via the Monitor. But looking in the jobs folder on the repo, I noticed there are 550 zero-KB folders with empty task folders inside them. Am I OK to just delete these through the Finder? And are there any other Repository cleaning and maintenance steps you’d recommend to keep things running smoothly?
Btw: our Max jobs all look to be >100 MB, and we’ve got 40-odd 3D nodes and 8 After Effects machines. The repo is hosted on a Linux box w/ RAID-0 SSDs and a 6-port gigabit Ethernet trunk.
Just to confirm: are the jobs resubmitted through SMTD, or are they resubmitted from the job’s right-click menu in the Monitor? I’m assuming it’s the former, but I just want to make sure!
Yes, you should be okay deleting these manually. These are essentially “corrupted” jobs. You can configure Deadline to notify you when a job becomes corrupted so you can stay on top of them, and there is also an option to have Deadline auto-clean corrupted jobs. Both of these options can be configured in the Repository Options in the Monitor, which you can access from the Tools menu while in super user mode. I wouldn’t expect this to cause the problem you’re seeing with the Max renders, but it wouldn’t hurt to clean things up and see if it makes a difference.
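If you would rather script the cleanup than click through the Finder, a rough sketch along these lines would do it. It assumes the layout you described (job folders containing only empty task folders) and deliberately starts in dry-run mode so you can review the list before deleting anything:

```python
import shutil
from pathlib import Path

JOBS_DIR = Path("/mnt/repo/jobs")  # adjust to your repository's jobs path
DRY_RUN = True                     # flip to False once the list looks right

def is_empty_job(job_dir: Path) -> bool:
    """True if the folder holds no files at all, only empty subfolders."""
    return not any(p.is_file() for p in job_dir.rglob("*"))

for job in sorted(JOBS_DIR.iterdir()):
    if job.is_dir() and is_empty_job(job):
        print(f"{'Would remove' if DRY_RUN else 'Removing'}: {job}")
        if not DRY_RUN:
            shutil.rmtree(job)
```

Run it once with DRY_RUN left as True, eyeball the output, and only flip the flag when you are happy with the list.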