Modo 501, Deadline 4.1.0.43205, Win7 x64

Hi there,
I’m testing out Deadline 4.1.0.43205 and trying to render a Modo 501 scene on Windows 7 x64 that takes about 1.3 GB of RAM just to load.

Well, first of all I had to add these lines to RenderTasks in Modo.py just to get a scene loaded in Modo 501. Does anyone else use Modo 501 without having to do this?
modo_cl.exe would be looking at a blank scene unless I made it run “scene.open” myself:

def RenderTasks( self ):
    sceneFilename = GetPluginInfoEntryWithDefault( "SceneFile", GetDataFilename() )
    sceneFilename = CheckPathMapping( sceneFilename )
    self.SendCommand( "scene.open {%s}" % sceneFilename, True )

With that modification, a lighter scene (a simple sphere) distributed and rendered fine, but I can’t get the big scene to work.

What happens is the slaves pick up the job, call modo_cl.exe, and start loading the scene (the RAM usage goes up and up), but it errors out
when it hits about 700-750 MB of RAM, and then it keeps retrying until I kill the queue.

It fails with:

0: STDOUT: Command 'scene.open {\\depo\projects\TEST\modo\limit_render.16.lxo}' failed with -2147483648

If I run this manually on the machine itself, though, it loads fine.
Just to understand how all this works, I ran the following and supplied the open command by saving over job0.txt:

"C:\Program Files\Luxology\modo\501\modo_cl.exe" -cmd:"@\\deadline\plugins\Modo\render.pl job0.txt ack0.txt \\depo\projects\TEST\modo\limit_render.16.lxo" @start modo_cl [40017] Luxology LLC Job filename: job0.txt Ack filename: ack0.txt Sending ack: READY Received command: EXECUTE: scene.open {\\depo\projects\TEST\modo\limit_render.16.lxo} Sending ack: SUCCESS

Google says that -2147483648 is “the largest negative number that can be stored in a four-byte long integer”.
Any idea why I get this, and how can I get rid of it?
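For what it’s worth, that explanation checks out in plain Python: the value is what you get when the unsigned 32-bit pattern 0x80000000 (high bit set, which Windows-style failure codes typically have) is read back as a signed four-byte integer. Quick sanity check, nothing Modo- or Deadline-specific:

import struct

# -2147483648 is the unsigned 32-bit value 0x80000000 reinterpreted as a
# signed four-byte integer (INT_MIN). Windows failure codes usually have that
# high bit set, which is why they print as huge negative numbers.
raw = 0x80000000
signed = struct.unpack("<i", struct.pack("<I", raw))[0]
print(hex(raw), signed)  # 0x80000000 -2147483648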

Thanks a bunch,

You shouldn’t have to do this. Based on modo’s documentation, modo will load the scene if you pass it as the last argument in the command line, which we do. We haven’t been able to reproduce this behavior here. Is it possible all the textures/assets the scene requires aren’t available on the render nodes? That could explain why the scene isn’t loaded.
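If you want to check that quickly from one of the render nodes, here’s a rough sketch (plain Python, not part of Deadline) that scans the .lxo for UNC-style path strings and tests whether they’re reachable. The .lxo format is binary, so this is only a heuristic byte scan, not a proper parser, and the extension list is just a guess:

import os
import re
import sys

# Heuristic: pull UNC-looking path strings out of the (binary) .lxo file and
# test whether each one is reachable from this machine.
scene = sys.argv[1] if len(sys.argv) > 1 else r"\\depo\projects\TEST\modo\limit_render.16.lxo"

data = open(scene, "rb").read()
pattern = br"\\\\[\x20-\x7e]{4,260}?\.(?:tga|tif|tiff|png|jpg|jpeg|exr|hdr|psd)"
candidates = set(re.findall(pattern, data, re.IGNORECASE))

for raw in sorted(candidates):
    path = raw.decode("ascii", "replace")
    status = "OK" if os.path.exists(path) else "MISSING"
    print("%-8s %s" % (status, path))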

Is that on the same machine you’re rendering on with Deadline, or is it on the machine you’re submitting the job from (assuming they’re different machines)?

Cheers,

  • Ryan

OK, so in desperation I took this a step further:
I decided to try using a “Command Script Job” to submit my Modo renders, but that attempt failed as well.

This is how I set it up:
I call this command from a DOS Prompt:

"C:\Program Files\Luxology\modo\501\modo_cl.exe" "-cmd:@\\depo\projects\TEST\submissionScripts\modoCmd.746.txt"

And that file contains this:

#LXMacro#
scene.open {\\DEPO\projects\TEST\modo\limit_render.16.lxo}
log.toConsole true
@ChangeRenderFrameRange.pl 746 746
render.animation {*}
app.quit
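(In case anyone wants to script this per frame, here’s a minimal sketch that writes one such macro file per frame, using the same commands as the hand-written modoCmd.746.txt above. The paths are from my setup and the file-naming scheme is just what I happen to use:)

import os

# Write one modo_cl macro file per frame, mirroring modoCmd.746.txt above.
SCENE = r"\\DEPO\projects\TEST\modo\limit_render.16.lxo"
OUT_DIR = r"\\depo\projects\TEST\submissionScripts"

MACRO_TEMPLATE = """#LXMacro#
scene.open {{{scene}}}
log.toConsole true
@ChangeRenderFrameRange.pl {frame} {frame}
render.animation {{*}}
app.quit
"""

def write_macro(frame):
    path = os.path.join(OUT_DIR, "modoCmd.%d.txt" % frame)
    with open(path, "w") as f:
        f.write(MACRO_TEMPLATE.format(scene=SCENE, frame=frame))
    return path

if __name__ == "__main__":
    for frame in range(746, 747):  # single frame here; widen the range as needed
        print(write_macro(frame))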

That modo_cl command runs beautifully from my Command Prompt, but when I submit the exact same command to Deadline, I get the same behaviour:
it starts loading my Modo scene, but the slave restarts the modo_cl.exe process once it hits 600-700 MB of memory.

Feels like Deadline’s process itself has a memory limit. Is this a limitation in the StartMonitoredManagedProcess call?
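One way to compare, if anyone wants to reproduce this: launch the exact same command outside Deadline and poll its memory while the scene loads, then see whether the ~600-700 MB ceiling follows the command itself or only shows up when the slave launches it. Rough sketch only (assumes the third-party psutil module is installed; this is not Deadline code):

import subprocess
import time

import psutil  # third-party; assumed installed

# Launch the same modo_cl command outside of Deadline and track its peak
# working-set size while the scene loads.
cmd = [
    r"C:\Program Files\Luxology\modo\501\modo_cl.exe",
    r"-cmd:@\\depo\projects\TEST\submissionScripts\modoCmd.746.txt",
]

proc = subprocess.Popen(cmd)
peak = 0
while proc.poll() is None:
    try:
        rss = psutil.Process(proc.pid).memory_info().rss
    except psutil.NoSuchProcess:
        break
    peak = max(peak, rss)
    time.sleep(1)

print("exit code:", proc.returncode, "peak RSS: %.0f MB" % (peak / (1024.0 * 1024.0)))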

Looks like, in its current state, Deadline is unusable for heavy Modo renders.

Thanks,

Hey Ryan,
Erm… Seems like a texture issue o_0

Deleting all the textures made the scene load properly, even though I’m setting my slave pool to just my local machine.
I think it’s a full-UNC-path vs. mapped-drive issue, but I’m not exactly sure.
(It’s a weird error to throw, though, and nearly impossible to debug!)
I’ll post the findings for other people that run into this issue.
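In the meantime, here’s the sort of thing I mean by the mapped-drive angle: a small Windows-only sketch that rewrites a drive-letter path to its full UNC form via the Windows WNetGetConnection API (the P: mapping in the example is made up, purely illustrative):

import ctypes
from ctypes import wintypes

def drive_to_unc(path):
    """Rewrite a mapped-drive path (e.g. P:\\foo\\bar.tga) to its full UNC form.
    Returns the path unchanged if the drive letter isn't a network mapping."""
    drive, rest = path[:2], path[2:]
    buf = ctypes.create_unicode_buffer(1024)
    length = wintypes.DWORD(len(buf))
    result = ctypes.windll.mpr.WNetGetConnectionW(drive, buf, ctypes.byref(length))
    if result != 0:  # not a mapped network drive (or another error)
        return path
    return buf.value + rest

if __name__ == "__main__":
    # Hypothetical example: if P: is mapped to \\depo\projects, this prints
    # \\depo\projects\TEST\modo\limit_render.16.lxo
    print(drive_to_unc(r"P:\TEST\modo\limit_render.16.lxo"))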

Thanks!

Do you have the option enabled to submit the modo scene with the job? If so, try disabling that option. This way, Deadline loads the modo scene from its original location, instead of copying it locally to the slave first. It could be that there are relative asset paths in the modo scene, and when Deadline loads the scene from a different location, those paths break.
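To illustrate what I mean about relative paths breaking (plain Python, and the local copy directory below is made up, just for illustration):

import os

# The same relative asset reference resolves to different absolute locations
# depending on which directory the scene file is loaded from.
rel_texture = os.path.join("..", "textures", "wood.tga")  # hypothetical relative reference

network_scene_dir = r"\\depo\projects\TEST\modo"  # original location
local_copy_dir = r"C:\DeadlineJobData\some_job"   # hypothetical local copy on the slave

print(os.path.normpath(os.path.join(network_scene_dir, rel_texture)))
# \\depo\projects\TEST\textures\wood.tga  -> exists on the network share
print(os.path.normpath(os.path.join(local_copy_dir, rel_texture)))
# C:\DeadlineJobData\textures\wood.tga    -> likely doesn't exist on the slave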

Cheers,

  • Ryan