AWS Thinkbox Discussion Forums

Official Houdini Solaris Husk submitter

As there is no license attached, you should conform to the AWS Intellectual Property License and the AWS Customer Agreement (linked from ThinkboxEULA.txt). I haven’t reached a conclusion on what that means here, though:

Neither you nor any End User will use the Services or AWS Content in any manner or for any purpose other than as expressly permitted by this License and the Agreement. Neither you nor any End User will, or will attempt to (a) modify, distribute, alter, tamper with, repair, or otherwise create derivative works of any Content included in the Services or AWS Content (except to the extent such Content is provided to you under a separate license that expressly permits the creation of derivative works), or (b) sublicense the Services or AWS Content. These license restrictions will continue to apply following the termination of this License.

damn! someone who actually read the EULA and checks the T’s & C’s !!

That’s pretty strict considering a lot of the forum posts are modifying, altering, repairing and distributing the content!

Does the ‘Husk’ part of the submitter fall into this category, given that it doesn’t interfere with the program but rather extends its capabilities?

I guess it’s ok here on the forum, as most visitors are customers.
Good question about Husk, I don’t know. IANAL

Hello, I have patches from the forum for 20.0; you wrote that for 20.5 I need to alter some param file. Can you give me a hint which file it is?

You probably mean plugins/Houdini/Houdini.param

Make sure to choose the right one; if you installed patched files, you might want to check /DeadlineRepository10/custom/plugins/Houdini instead.
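
If you are not sure which copy you are actually editing, here is a tiny sketch to compare the two locations (the repository root is an assumption based on the path above; as far as I know the custom copy takes precedence when it exists):

import os

# Candidate plugin locations; adjust the repository root to match your setup.
candidates = [
    "/DeadlineRepository10/plugins/Houdini/Houdini.param",
    "/DeadlineRepository10/custom/plugins/Houdini/Houdini.param",
]

for path in candidates:
    if os.path.isfile( path ):
        print( path, "last modified:", os.path.getmtime( path ) )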

It works :smiley: I edited the files and replaced them… my Deadline now renders OK. Thanks

@zainali @Justin_B

Hey, again, what is the status of either an AWS-supported submitter or the option to put what is being produced here on GitHub for easier maintenance? After 1 1/2 years (!) there must be some sort of conclusion on either one.

Just thought I’d mention that I ran into this same issue again when rendering a standalone Redshift job. The file name ends in _v01, so I went back and made a proper check:

# Only re-pad when the digit run is delimited (e.g. ".0100." or "_0100."), so a trailing "_v01" is left alone
if paddingSize > 0 and re.search(r'[\W_]\d{%d}[\W]' % paddingSize, filename):
    newPadding = StringUtils.ToZeroPaddedString( self.GetStartFrame(), paddingSize, False )
    usdFile = FrameUtils.SubstituteFrameNumber( usdFile, newPadding )

Redshift:

# Same guard in the Redshift plugin; here the substitution goes into filename directly
if paddingSize > 0 and re.search(r'[\W_]\d{%d}[\W]' % paddingSize, filename):
    newPadding = StringUtils.ToZeroPaddedString( self.GetStartFrame(), paddingSize, False )
    filename = FrameUtils.SubstituteFrameNumber( filename, newPadding )

This also allows underscores before the frame number. If you’d like to only allow periods, change [\W_] to [\W].
If you use periods throughout the file name, this could cause false positives with other numbering sequences that aren’t frame-related, but I don’t believe that’s common practice.
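
If anyone wants to sanity-check the pattern before patching their plugin, here is a quick standalone test (the file names are made up, just to show that the _v01 case is skipped):

import re

paddingSize = 4
# [\W_] before the digits accepts "." or "_" as the separator; [\W] after
# requires a non-word character, so a short version tag like "_v01" never matches.
pattern = r'[\W_]\d{%d}[\W]' % paddingSize

for name in ( "shot010_v01.0100.exr", "shot010_0100.exr", "shot010_v01.exr" ):
    print( name, "->", bool( re.search( pattern, name ) ) )
# shot010_v01.0100.exr -> True
# shot010_0100.exr -> True
# shot010_v01.exr -> False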

Ok, so we are trying to switch to 20.5 rendering with Karma XPU. Submission seems to work, but once a task starts, I am getting this:

2024-09-05 10:45:09: 0: STDOUT: 17140: Fatal error: Segmentation fault

Using the exact same command line that Deadline is calling, on the exact same machine, works like a charm.

Rendering in 20.0 also works fine.

Any idea what this might be?

Maybe the environment variables are different, and it’s unable to load some library? Just a guess.
You could try comparing the environment.

How would I see all the environment variables that a Deadline “session” has? I mean, I am loading some variables with the JobPreLoad.py (for husk), and I see that working before it constructs the husk render command. What is really weird is that all this just works in 20.0.

Just add somewhere in the Houdini plugin
print(os.environ["PYTHONPATH"])
and/or
print(sys.path)
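
Or, if you want to capture everything in one go and diff the 20.0 and 20.5 runs, something like this could work (a rough sketch only — the output path is just a placeholder; it could be called from the JobPreLoad.py you already have, or dropped anywhere in the plugin code):

import os

def dump_environment( out_path=r"C:\temp\deadline_env.txt" ):
    # Write every environment variable this Deadline process sees to a file,
    # so it can be diffed against the shell where the same command works.
    with open( out_path, "w" ) as f:
        for key in sorted( os.environ ):
            f.write( "%s=%s\n" % ( key, os.environ[key] ) )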

By the way when you say you execute the command, do you run the full command? Meaning the one with hrender_dl.py in it?

Sorry, I need to clarify: I am rendering through husk (husk standalone), so this is already a husk standalone job. So it’s actually husk that is crashing, not hython.

Oh alright, I overlooked that.
You can print the subprocess’s environment by calling
plugin.GetProcessEnvironmentVariable("PATH")
(it is possibly the library search path, and not PYTHONPATH)
https://docs.thinkboxsoftware.com/products/deadline/10.1/2_Scripting%20Reference/class_deadline_1_1_plugins_1_1_deadline_plugin.html#ac233247e900b1e80fef4f6edfbdf72a3
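
For example, something along these lines inside the Houdini plugin code (a sketch only — self is assumed to be the DeadlinePlugin instance, the key list is just an illustration, and I believe GetProcessEnvironmentVariable mainly reflects variables explicitly set on the render process):

# Log a few variables that commonly differ between Houdini/husk setups.
for key in ( "PATH", "PYTHONPATH", "HOUDINI_PATH" ):
    self.LogInfo( "%s = %s" % ( key, self.GetProcessEnvironmentVariable( key ) ) )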

I don’t see any issues with which env vars are set (they are the same for 20.0 and 20.5). Another thing I do suspect is some sort of access restriction that hasn’t mattered until now, or some Python version issue. But this is what you get from using an unsupported tool like Deadline…

Anyway, just out of curiosity: have you run render jobs (with Karma) from H20.5? And if so, did you have any issues?

@Ronald_Anzenberger we recently started using Karma from H20.5 without any issues so far. Is it failing on just one particular job or all of them?

EDIT: could you provide the whole log with that SEGFAULT?
EDIT2: is it XPU or CPU? Could it be caused by the graphics drivers in the case of XPU?

Every job, every machine. We are trying to render with XPU. XPU in H20.5 itself works; it also works when running husk from within Houdini (hitting render on a USD Render ROP) or when executing the command the submission creates for each task in a command prompt. But once it is submitted and a task is executed (by a Deadline Worker), it hits the seg fault.

I’ll see about providing more of the log, but as far as I can remember, the additional info was rather cryptic.

Are you using the same user? I’m hitting an issue where, if Deadline runs as ‘root’, Redshift bombs out (there’s another thread on this forum). Someone mentioned this could be related to the logs being generated.

If Deadline is running as the same user, or you’re running Deadline directly on the machine you’re submitting from, this points the issue more at Deadline.

When you say “Deadline runs as root”, you mean the Worker, right?

As we are on Windows, there is no root user, so the Worker runs as the machine’s user if it’s a workstation and, in the case of the render farm, as a special render farm user. But no matter the combination, it does not work. And again, none of this is a problem with 20.0. I can literally submit a job out of 20.0, which works, and submit the same thing out of 20.5 (without changing anything in Deadline) and it will crash husk. The only difference is which Houdini/husk version is used.

I do in fact suspect something being wrong with Deadline, and I am glad that people seem to have it working with 20.5. But with the current evidence I have, it’s hard to track down what is wrong.
