I've been back and forth with SideFX several times already on this (your support never replied to me…), and the conclusion was that this is probably an issue with Deadline, as SideFX couldn't reproduce it in HQueue or by running the command manually.
Houdini tasks are cooking parts of the network which should not be cooked when the scene file is loaded (hou.hipfile.load() called in hrender_dl.py). This includes OBJ nodes which have their Display Flag off and no dependents, and DOPnets which are not referenced or required by the ROP submitted to Deadline.
This is particularly bad if a scene is submitted while the timeline is not set to frame 1. In that case it seems any DOPnets in the scene are simulated up to e.g. frame 100, even though there is a cache loaded further down the network tree. This leads to hip file load times of 10-20 minutes instead of 10-20 seconds.
I’ve worked around it by modifying the submitter to force the timeline back to $FSTART. But there is clearly something odd going on.
I wasn’t able to find a ticket in our system for the e-mail you have registered here. Feel free to reach out to me (either via DM or e-mail to amsler@amazon.com) with that so I can dig in deeper on that front.
You're going to have to forgive me because I'm no Houdini expert here. My level of understanding has me guessing that this is a simulation and we're re-computing some work that we shouldn't be because we're on the wrong frame. Can you provide something we could use to test with on our end, and the steps required to use it? Ideally something with no external assets.
I'm experiencing similar issues. I can render my shots, no problem, in the Houdini GUI, same with hython / hrender. But via Deadline, it seems to recompute a lot of unnecessary stuff. The process chews up all the RAM, all the swap space, and brings everything down.
I ran into this issue pre-Houdini 18.5 (I forget if it was 18 or 17.5). I noticed it when moving from 10.1.7.1 to 10.1.10.6. I ended up diffing the Houdini scripts and saw that quite a few blocks of code had been added, and something in there was the culprit. Ended up just rolling back to 10.1.7.1. My temporary fix was to put a null node in my DOPnets and set the cook flag on that… obviously not a great long-term fix.
I managed to recover an hrender.py file from 10.0.26.0 and swapped it in for the broken one in the current version. And voilà, the scene renders without recooking random things I never asked it to, using all my RAM, and ultimately crashing.
When I diff my working hrender_dl.py against the most recent one, the changed code lies in these lines:
```python
def pathmap_envs(tempdir, envs):
    """
    Performs path mapping on a list of DLPathmapEnvs
    :param tempdir: The temporary directory that we will use as part of pathmapping
    :param envs: A list of envs that we will be performing pathmapping
    :return: None
    """
    # Write out a temporary file with each path on a separate line
    pathmap_file_name = os.path.join(tempdir, "env_pathmap.txt")
    with open(pathmap_file_name, 'w') as pathmap_handle:
        pathmap_handle.write(
            "\n".join(env.val for env in envs)
        )
    # Perform pathmapping on the file, so each line is swapped to the new mapped location.
    pathmap_file(pathmap_file_name, pathmap_file_name)
    # Read back in the mapped file and update all parms to the new values
    with open(pathmap_file_name, 'r') as pathmap_handle:
        for orig, line in zip(envs, pathmap_handle.readlines()):
            val = line.strip()
            if val != orig.val:
                print('Setting variable "{name}" to {val}'.format(name=orig.name, val=val))
                hou.putenv(orig.name, val)
    # Varchange updates all nodes in the scene to use the latest versions of vals
    hou.hscript('varchange')
```
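For anyone following along: that function dumps every variable's value to a temp file, runs Deadline's path mapping over the file, then reads the remapped lines back and pushes them into the Houdini session. Here is a minimal, self-contained sketch of that round trip; the `fake_pathmap_file` function below is a hypothetical stand-in for Deadline's `pathmap_file` (which rewrites paths according to the repository's path-mapping rules), and the `hou` calls are dropped so it runs outside Houdini:

```python
import os
import tempfile
from collections import namedtuple

EnvToMap = namedtuple('EnvToMap', ['name', 'val'])

def fake_pathmap_file(in_path, out_path):
    # Stand-in for Deadline's pathmap_file: here we just swap a
    # Windows-style prefix for a Linux-style one on every line.
    with open(in_path) as f:
        lines = [line.rstrip('\n').replace('X:/projects', '/mnt/projects')
                 for line in f]
    with open(out_path, 'w') as f:
        f.write('\n'.join(lines))

def map_envs(envs):
    # Same write / map / read-back shape as pathmap_envs above,
    # minus the hou.putenv / varchange side effects.
    mapped = []
    with tempfile.TemporaryDirectory() as tempdir:
        path = os.path.join(tempdir, 'env_pathmap.txt')
        with open(path, 'w') as f:
            f.write('\n'.join(e.val for e in envs))  # one value per line
        fake_pathmap_file(path, path)
        with open(path) as f:
            for orig, line in zip(envs, f.readlines()):
                mapped.append(EnvToMap(orig.name, line.strip()))
    return mapped

envs = [EnvToMap('JOB', 'X:/projects/shot010'),
        EnvToMap('HIP', 'X:/projects/shot010/hip')]
print(map_envs(envs))
```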
```python
def gather_envs_to_map():
    """
    Builds up a list of all variables in the houdini env that need to be pathmapped
    and puts them into a list
    :return: list of EnvToMap that may need pathmapping
    """
    env_to_map = namedtuple('EnvToMap', ['name', 'val'])
    found_envs = []
    # gather all globally set houdini variables
    # this will return everything in 2 strings, the first string of the format:
    # ENV\t= VAL\n
    # while the second is the errors from the command
    set_envs = hou.hscript('setenv')
    for line in set_envs[0].split('\n'):
        if line.strip():
            env, val = line.split('\t= ', 1)
            found_envs.append(env_to_map(env.strip(), val.strip()))
    return found_envs
```
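The `split('\t= ', 1)` above is what parses the hscript `setenv` output into name/value pairs. A self-contained sketch of just that parsing step, using a hard-coded sample string in the `ENV\t= VAL\n` shape the comment describes instead of a live `hou.hscript('setenv')` call (the variable names and paths below are made up for illustration):

```python
from collections import namedtuple

EnvToMap = namedtuple('EnvToMap', ['name', 'val'])

# Hard-coded sample standing in for hou.hscript('setenv')[0]
sample = 'HIP\t= /jobs/shot010\nJOB\t= /jobs/shot010\nF\t= 1001\n'

found = []
for line in sample.split('\n'):
    if line.strip():  # skip the trailing blank line
        # maxsplit=1 keeps any '\t= ' that happens to appear in the value
        env, val = line.split('\t= ', 1)
        found.append(EnvToMap(env.strip(), val.strip()))

print(found)
```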
This sounds like another case of hou.fileReferences() running amok. Type that in the python window on your .hip file and see how long it takes to execute. DL calls it to get all the paths to map.
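If you want to quantify that, a small timer around the call will show it. Sketched here with a hypothetical stand-in callable, since `hou` only exists inside a Houdini session; inside Houdini you would pass `hou.fileReferences` instead of `fake_file_references`:

```python
import time

def time_call(fn, *args):
    """Return (result, seconds elapsed) for a single call to fn."""
    t0 = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - t0

# Stand-in for hou.fileReferences, which returns (parm, path) pairs;
# swap in the real function when running this inside Houdini.
def fake_file_references():
    return [('parm%d' % i, '/path/%d.bgeo' % i) for i in range(1000)]

refs, seconds = time_call(fake_file_references)
print('%d references in %.3fs' % (len(refs), seconds))
```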