@Bobo
I downloaded it and put it in the folder, but the menu is not appearing. Can I get any advice on this?
Assuming you copied the file to the folder and then opened the SMTD UI, the only thing that comes to mind is that your Client might be connecting to a different Repository. In your case, the SMTD title bar shows a cache path because you are connecting through a Remote Connection Server rather than directly to the Repository. So it is hard to tell which Repository you are connecting to, but there is a chance it is not the one shown in the File Explorer.
Is the RCS connecting to the Repository on your C: drive shown in the screenshot? Is the RCS running on that same machine or somewhere else? Can you look at your deadline.ini to check what the RCS settings are?
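In case it helps, here is a rough sketch (not part of SMTD) of how you could ask the local Deadline Client from inside 3ds Max which Repository it actually resolves to, by calling deadlinecommand. It assumes deadlinecommand can be found on the system PATH; it normally lives in the Deadline Client bin folder, so adjust the FileName accordingly:

-- Rough sketch: query the Repository the local Deadline Client is connected to.
-- Assumes deadlinecommand is reachable on the PATH; adjust the FileName otherwise.
fn getRepositoryRoot =
(
    local proc = dotNetObject "System.Diagnostics.Process"
    proc.StartInfo.FileName = "deadlinecommand"
    proc.StartInfo.Arguments = "-GetRepositoryRoot"
    proc.StartInfo.UseShellExecute = false
    proc.StartInfo.RedirectStandardOutput = true
    proc.StartInfo.CreateNoWindow = true
    proc.Start()
    local result = proc.StandardOutput.ReadToEnd()
    proc.WaitForExit()
    trimRight (trimLeft result)
)
format "Repository Root: %\n" (getRepositoryRoot())

Comparing that output to the path in your deadline.ini / File Explorer should show quickly whether the Client is pointed at the Repository you expect.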
Thank you very much @Bobo.
Yes, my Client was connected to a different Repository. Once I copied the file to that Repository, it worked perfectly.
One more thing: it's a very nice tool, but when I create a batch render with it, it renders to 3ds Max's own frame buffer instead of the V-Ray frame buffer. Is it possible to render with the V-Ray frame buffer?
I don’t have access to 3ds Max on my current machine (I am on vacation on a different continent), so I cannot verify what is happening with the Frame Buffer.
The Workflow is supposed to respect the SMTD settings. I would expect a job submitted by SMTD without the MultiCamera Workflow to use the same settings as one submitted by the Workflow…
Hello,
is there any way to prevent Deadline from copying the same assets to the Repository for each camera separately? I have 8 cameras with different textures, and every time I submit a job to a Worker, all the files are copied to the Repository again. Thanks for your help. BTW @Bobo, great multi-camera script. It saves me a lot of time.
The copying of asset files to the Repository is optional (and not recommended, if you ask me). It was added to SMTD because Backburner was doing it, and a bunch of people came over from Backburner and insisted on having that option for when their files were in local folders. If you have good path mapping and central network storage, no files except for the optional scene file would ever be copied to the Repository. But if you rely on copying files to the Repo, there is currently no way to avoid that duplication.
It would be possible to modify SMTD to copy just once though. It would involve a central storage path (similar to the Global or Custom Network Path option we have for scene files). SMTD would then dump every file in a single common folder there, and only copy again if the file is older. This way, the first submission would copy over everything, then the next N submissions would just check the remote folder and skip the operation. Then the jobs would get a Session Path Mapping rule to look for any external files at that network location.
So it is doable, but it does not exist at the moment.
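Just to illustrate the idea (this is only a rough sketch of the logic, not anything that exists in SMTD), the per-file check could look something like this in MAXScript, with the cache folder being a placeholder network path:

-- Hypothetical helper, not part of SMTD: copy an asset to a shared cache folder
-- only if it is missing there or the source has a newer modification time.
fn copyIfNewer srcFile cacheFolder =
(
    local io = dotNetClass "System.IO.File"
    local destFile = cacheFolder + "\\" + (filenameFromPath srcFile)
    makeDir cacheFolder all:true
    local needCopy = not (doesFileExist destFile)
    if not needCopy do
        needCopy = ((io.GetLastWriteTimeUtc srcFile).CompareTo (io.GetLastWriteTimeUtc destFile)) > 0
    if needCopy do copyFile srcFile destFile
    destFile -- the path a Session Path Mapping rule would then point the job to
)
-- Example use with placeholder paths:
copyIfNewer @"C:\project\maps\wood_diffuse.jpg" @"\\server\DeadlineAssetCache\MyJob"

A real implementation would also have to deal with file name collisions and generate the Session Path Mapping rule Bobo mentions above, which is where most of the work would be.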
Thanks for the explanation @Bobo .
I do not see any problem with keeping the materials on the local network so that the Repository pulls the materials from there, but the other colleagues who work with the materials on a daily basis do not want such a solution, so I have to duplicate / send as many jobs as there are cameras in the scene. Maybe such a solution will be implemented in the future, as you wrote above.
It would be perfect if isolating a camera together with its models and materials in the scene reduced the size of the files being sent - just like rendering in Chaos Cloud.
Is there a way for a multi-camera job to always keep one 3ds Max instance rendering?
The current default behavior is that every time a camera shot is rendered, the 3ds Max process shuts down, then restarts and reopens the same scene.
This causes a problem: once the scene file reaches a certain size, 3ds Max becomes very slow to start up, usually taking 2 minutes or more.
Is it possible to shorten the time by avoiding launching and loading the same scene repeatedly?
Opening one 3ds Max instance, loading the scene once, and rendering all cameras in one go would save a lot of time.
Hello
When you submit the job, how many jobs and how many tasks appear in Deadline? I believe it is 1 task per camera?
In our docs we do recommend hooking your own code into MAXScript to be able to handle multiple cameras. Here's more information: 3ds Max — Deadline 10.3.1.4 documentation. Look for the phrase "3ds max cameras" in the FAQs.
I have limited Max scripting skills myself; maybe someone in the community would know more.
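As a starting point for the community, a minimal loop along these lines could render every camera in the already loaded scene within a single 3ds Max session. This is only a rough sketch: the output folder and file naming are placeholders, it renders a single frame per camera, and it ignores per-camera Batch Render overrides:

-- Rough sketch: render every scene camera in one 3ds Max session,
-- instead of restarting 3ds Max and reloading the scene per camera.
(
    local outputFolder = @"\\server\renders\MyScene\" -- placeholder output folder
    makeDir outputFolder all:true
    for cam in cameras where not (isKindOf cam TargetObject) do
    (
        local outFile = outputFolder + cam.name + "_.exr"
        format "Rendering camera % to %\n" cam.name outFile
        render camera:cam outputFile:outFile vfb:false
    )
)

Hooking something like this into a Deadline MAXScript Job, so the scene is loaded once and the loop runs inside that session, is the kind of approach the documentation above refers to.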
Please, please, please implement this feature - this is so important(!)
I have asked about this a couple times (post1, post2) and struggled to put into words exactly what you have just described here.
Our current 3ds Max workflow requires the entire job (scene) to be sent over the network to a remote render farm location; however, sending a 4.5 GB scene from TX to CA for rendering, especially over a WAN, is painfully slow.
In the current batch camera workflow, SMTD uses the data from the Batch Render settings in Max, but it RESUBMITS the entire .max scene file for each camera - this is obviously inefficient compared to sending the job once and reusing the data on the server side.
@Bobo would you consider modifying the batch camera .zip or SMTD to support this type of behavior as you have described it? It would massively optimize our workflow for submitting data over the network for rendering, especially for batched camera scenarios.
Hi, it’s been a couple weeks - could y’all take a look at the request?
It seems like Bobo has outlined a workaround for this; could we see an implementation that meets their specifications for Deadline users in 3ds Max?