
Itoo Forest Camera problem

We used another render manager previously, and now we have this problem with Forest jobs on Deadline:

https://forum.itoosoft.com/forest-pro-(*)/deadline-7-2-and-forestpack-issues/

Basically, Forest can be set to automatically clamp the distributed instances to the visible view cone of the current render camera, sometimes reducing the count of distributed geometry by a large margin and keeping memory consumption and render time at bay.
Deadline seems to disable this automatic camera recognition.

The iToo guys say they can't do anything about it.

We submit a lot of batched jobs all the time, and setting the current camera in all Forest objects in every scene by hand is not manageable and very error-prone.

Is there a solution planned?
A checkbox for Deadline to set the current camera in all Forest objects via the submit script would be perfect.
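
In the meantime, a small MAXScript sketch like the one below could at least reveal which camera-related properties the installed Forest version exposes on its objects, so they could be set by hand or by a pre-submit script (the class name "Forest_Pro" is an assumption and may differ between Forest versions):

	-- Rough sketch: list the properties of every Forest object in the scene so the
	-- camera-related one can be identified and set before submission.
	-- The class name "Forest_Pro" is an assumption; adjust it for your Forest version.
	for o in objects where ((classof o) as string == "Forest_Pro") do
	(
		format "Forest object: %\n" o.name
		showProperties o -- prints the exposed properties to the MAXScript Listener
	)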

I’m not sure I understand the problem - I just finished a job where I had a massive forest, and the camera functions for Forest Pack were in fact working (newest FPP, 3ds Max 2017, and Deadline 9 at the time).

How are you submitting? SMTD within 3ds Max worked great for scene states and the cameras

Looking at that forum post, it says the scene was submitted using the "SAVE and Use Current Scene's ORIGINAL NETWORK PATH" option.

That is a big no-no. It basically means that if 5 jobs with 5 different cameras are submitted from the same scene, each one is saving over the same MAX file. Thus, the MAX file will be in the state saved by the last submission, and the first submission can get wrong data. However, since the Job file contains overrides for most things including the Camera, the job might actually enforce the right camera at render time, but it would be too late for the ForestPro to learn about it, so it would stick to what was saved in the MAX scene.

Any of the other 3 options left in Deadline 10 would work well. We removed the ability to reuse the MAX file without saving over it from Deadline 10, because it was even worse for some special cases.

In general, people should either submit the MAX file with the job, or use the User-Defined or Global Network path to store a unique copy of the MAX file for each submission.

The option to reuse the Network path is there because somebody asked for it, but now I think it shouldn’t have been added to start with. It is basically asking for trouble.

makes sense! I actually remember I had a weird issue when I was submitting from offsite and figured “well, I could save the upload time and just save/submit with original network path” and yeah, it did not like the scene states and multiple cameras I was rendering, just didn’t work out and had to submit it normally. And now I know why, so thanks!

OK, so what about the real problem?
Deadline-submitted jobs won't assign the right camera to the Forest objects?
Can Deadline be updated to somehow force it into the 3ds Max file, by reading the camera from the scene/batch list and setting it explicitly on submit?

If I understand the issue correctly, there shouldn’t be a problem if SMTD is set to save and submit the Max scene with the job.
If you need to enforce a specific camera in a batch-type submission, I would expect both State Sets and the Batch dialog to work correctly. Both would set the active viewport to the current camera, and save the scene to a new MAX file which will go with the job. I would expect Forest to “see” the correct camera in those cases, and others have chimed in confirming it works.
Are you saying you have SMTD set to save and submit the scene with the job, and it is still not working correctly?

Could you please elaborate on this? Which option needs to be changed in order to change the default SMTD configuration to this?
Because we have correct batch lists with cameras set for every batch job, and Forest didn't pick the right camera for any of them. I suspect Forest is picking the camera of the current viewport for all jobs.
We use SMTD in its default configuration with just "Use Data from 3ds Max Batch Render" checked in the "Misc" tab in order to render our enabled batch jobs from the batch list.
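
For what it's worth, a quick way to check which camera the active viewport is showing (and which I suspect Forest's auto mode ends up using) is this one-liner in the MAXScript Listener:

	viewport.getCamera() -- returns the camera node of the active viewport, or undefined if it is not a camera view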

When performing a Batch submission, SMTD sets the active viewport to the camera specified by the Batch dialog. If the MAX scene saving option of SMTD is set to “Save and Submit Current Scene File…”, a unique copy of the scene will be saved for each entry in the Batch dialog, and the result will be submitted with the job. That Max file will be loaded by the Slaves, and rendered. I would assume that if a unique MAX file is saved for each job, and the correct camera is set in the viewport, and in the job file, Forest would pick it up at render time. (I don’t have Forest though, so I have not tested this).

The forum post about Deadline and Camera problems posted earlier mentioned specifically the “Save and Use Current Scene’s ORIGINAL NETWORK PATH” saving option, which I assumed was the cause of the problem. But I could be wrong.

So if your SMTD is set to “Save and Submit Current Scene File with the Job to the REPOSITORY”, and it is still having problems with the camera culling, then there is a real bug somewhere between Forest and SMTD.

UPDATE: I am trying to install Forest Pack Pro here to see what is happening…

Sorry I took so long to answer - just too much work and too many projects.

If you mean this setting, then we are using it by default:

Settings.jpg

And the 3ds Max Forest cameras get messed up over batch.

I meant these controls:

docs.thinkboxsoftware.com/produ … es-rollout

The user posting in the iToo FPP forum mentioned he was using "SAVE and Use Current Scene's ORIGINAL NETWORK PATH", which is a bad idea in this case. What option have you selected?

Ok, sorry, we are using “SAVE and Submit Current Scene File with the Job to the REPOSITORY” by default in the assets tab (never touched it). So by all logic and your words it SHOULD work, but it does not.

OK,
to ease our efforts I've prepared a 3ds Max 2016+ test scene showing the problem.
There are 4 batch jobs, each with a different camera. The job file and the rendered views are included in the RAR.

This is the scene:

The whole ground is theoretically covered with grass, BUT automatic camera selection is enabled in Forest. In the picture, the camera in front of "C" is active, and Forest is building only the instances visible from that camera (plus a small safety margin on both sides).

In real, huge landscape scenes this is a very efficient optimization, saving millions of instances, billions of tris and many GB of RAM at render time.
What should happen is that every batch job tells Forest its own camera. From that camera's point of view, the field then looks completely covered by grass.
Over Deadline, with all settings set correctly (see my last posts), the camera of the last active camera viewport is chosen as THE Forest camera for all batch jobs.

Perhaps it would only be necessary for SMTD to switch a viewport to look through the chosen camera right before saving the job's 3ds Max file,
so the Forest camera-choosing algorithm can do its thing.
Or SMTD could look into every Forest object on submit and set the current camera explicitly, disabling the auto selection.
Or it could figure out what Forest needs in order to automatically select the camera, and fulfill those requirements - a rough sketch of the first idea is below.
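
Something along these lines, as a rough MAXScript sketch (batchRenderMgr and viewport.setCamera are standard 3ds Max calls; the output folder is just a placeholder and this is not the actual SMTD code):

	-- For every enabled Batch Render entry, look through its camera in the active
	-- viewport before saving the per-job scene copy, so Forest's auto-camera mode
	-- sees the right camera in the saved file.
	for i = 1 to batchRenderMgr.numViews do
	(
		local theView = batchRenderMgr.GetView i
		if theView.enabled and (isValidNode theView.camera) do
		(
			viewport.setCamera theView.camera
			forceCompleteRedraw()
			saveMaxFile ("C:\\temp\\batch_" + theView.name + ".max") quiet:true -- placeholder path
		)
	)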

Our problem is that we use batching and Forest a lot. We have scenes with dozens of batch jobs; hand-submitting each one individually is not possible.
DL-Forest-Test.rar (1.66 MB)

Thanks for the detailed info!

Can you try the following changes and see if that fixes the problem for you?

  • Navigate to your Repository
  • Go into \submission\3dsmax\Main subfolder
  • Create a backup copy of the file SubmitMaxToDeadline_Functions.ms
  • Open the SubmitMaxToDeadline_Functions.ms for editing (for example in the MAXScript Editor of 3ds Max)
  • Navigate to the function definition

--SUBMIT BATCH JOBS LOCALLY
fn spawnBatchJobsLocal =
(

  • Scroll down and find the following code:
						local forceCamera = ""
						--Set Camera if specified
						if isValidNode theView.camera then
						(
							forceCamera = theView.camera.name
							if def_Camera == undefined do needMaxReload = true
						)	
						else if def_Camera != undefined do forceCamera = def_Camera.name
  • Modify the code to look like this
						local forceCamera = ""
						--Set Camera if specified
						if isValidNode theView.camera then
						(
							forceCamera = theView.camera.name
							try(viewport.setCamera (getNodeByName forceCamera))catch() --> NEW CODE! Make the active viewport look through the job camera, so Forest's auto-camera mode picks it up when the scene is saved
							if def_Camera == undefined do needMaxReload = true
						)	
						else if def_Camera != undefined do forceCamera = def_Camera.name
  • Save the modified script.
  • Restart SMTD and try a batch submission with Forest.
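
As a quick sanity check after editing the script, the same call can be tried manually in the MAXScript Listener (the camera name below is just a placeholder):

	try(viewport.setCamera (getNodeByName "Camera_C"))catch() -- "Camera_C" is a placeholder; use a camera from your batch list
	viewport.getCamera() -- should now return that camera node, which Forest's auto mode reads when the scene is saved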

So you chose to switch the active viewport to the hero camera before saving out the file. Looks like you nailed the problem.
Thanks a lot!!!
I'm assuming this fix will be included in the upcoming official updates.

Frankly, I was totally convinced that we were already doing that. Upon closer inspection, it turned out we were only restoring the default camera in case it was updated by the Scene State, but we were not setting the camera explicitly in the viewport. We were just passing the camera to the submission function, which added it to the Job file and enforced it in the scene at render time via Lightning.dlx, which lets the Deadline Slave control 3ds Max. But Forest could not detect that.

Sorry it took so long to figure it out!

There is a second place in the code I need to add the same fix - in the function that submits Batch Jobs via a Slave running 3ds Max in Workstation mode on the render farm. Once I have made that change, I will commit the fix to be included in future builds of Deadline 10 (and possibly Deadline 9).

Please put the code line(s) into your actual release script.
The latest 10.0.9 does not have it. I'm inserting them by hand every time, or just skipping updates (which I would prefer not to).

Deadline 10.0.9 was released on December 14th. You can bet it was built days earlier, before going through testing.
As you can see from this thread, I provided the fix on December 12th and you confirmed it on the 13th. There was no way to include my fixes in the same week’s build.
I have created a merge request for my changes in our system, but it has not been accepted yet.
It will make it into the official releases eventually, but I cannot guarantee it will be in the very next one. But we will try.

EDIT: My changes have been merged. Not sure if they will make it into the next build though.
