Having some issues with the dependency chaining when using ROP subnets.
I’m currently developing pipeline tools for our studio, and the intent is to wrap a Deadline submission node inside a custom pipeline cache HDA.
The issue I seem to have come across is that Deadline doesn’t support ROP subnets in the recursive input lookup inside SubmitDeadlineROP.py?
Can anyone confirm that subnets are not supported in the Deadline dependency submission chain? Or maybe someone has a better workaround than having to pull things out of the subnet using a Fetch node…
It doesn’t seem like too hard a problem to resolve, but maybe it bumps up against some core fundamentals on Deadline’s side.
I’m not finding any other requests about this in our ticket system - would the Deadline ROP be placed inside the subnet, or would the whole subnet be upstream of the Deadline ROP?
I am attaching an example hip file in case it helps.
Subnet ROPs do not have an actual wire output, so I think that when walking upstream through the ROP network, the Deadline script should have a specific behaviour on finding a subnet ROP: dive inside, grab the node with the render flag ON (or the last node of the chain), and start walking upstream from that, as in the sketch below.
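Something along these lines is what I have in mind - a rough sketch only, not tested against the shipped Deadline script, assuming plain hou and that ROP subnets report their type name as "subnet":

```python
# Rough sketch only (not the shipped Deadline code), assuming plain hou.
# Walk a ROP chain upstream; on hitting a subnet, dive inside and continue
# from the child with the render flag set, or from the last node in the chain.
import hou

def resolve_subnet_target(subnet):
    """Pick the node inside a ROP subnet to continue the walk from."""
    rop_children = [c for c in subnet.children() if isinstance(c, hou.RopNode)]
    for child in rop_children:
        # Not every node type exposes the render flag, so guard the check.
        try:
            if child.isGenericFlagSet(hou.nodeFlag.Render):
                return child
        except hou.OperationFailed:
            pass
    # Fall back to a leaf node (no outputs), i.e. the end of the chain.
    leaves = [c for c in rop_children if not c.outputs()]
    return leaves[-1] if leaves else None

def walk_upstream(node, visited=None):
    """Yield upstream ROP dependencies, recursing through subnet ROPs."""
    if visited is None:
        visited = set()
    if node is None or node.path() in visited:
        return
    visited.add(node.path())
    if node.type().name() == "subnet":
        inner = resolve_subnet_target(node)
        if inner is not None:
            for dep in walk_upstream(inner, visited):
                yield dep
    else:
        yield node
    for inp in node.inputs():
        # Skip indirect subnet inputs and unconnected slots.
        if isinstance(inp, hou.Node):
            for dep in walk_upstream(inp, visited):
                yield dep
```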
One of the issues is that many standard nodes are subnets themselves.
We tried doing this in a generalized way but hit too many problems, so instead we added specific logic for different node types.
Perhaps we missed some possibility, though.
Looking at the Deadline ROP code (DeadlineRepository10\submission\Houdini\Main\SubmitDeadlineROP.py), I think our issue is that we only dive into the dependencies of Fetch nodes, and we treat ROP subnets as plain ROPs since callable(curNode.render) returns True.
If you’ve got the Python + Houdini chops, it might be a simple change to get our ROP properly diving into those subnets, but in the meantime I’ll have to put this in as a feature request.
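For reference, the kind of change I mean would look roughly like this - purely an illustration, not the actual SubmitDeadlineROP.py source, and assuming the Fetch ROP's path parameter is named "source":

```python
# Illustration only -- not the actual SubmitDeadlineROP.py code.
# Idea: before treating a node as a plain ROP because callable(node.render)
# is True, special-case subnets the same way Fetch nodes are special-cased.
import hou

def dependency_target(cur_node):
    """Return the node whose input chain should be walked for dependencies."""
    type_name = cur_node.type().name()
    if type_name == "fetch":
        # Existing behaviour: follow the ROP the Fetch node points to
        # (assumes the Fetch ROP's path parameter is called "source").
        fetched = hou.node(cur_node.parm("source").eval())
        return fetched if fetched is not None else cur_node
    if type_name == "subnet":
        # Proposed behaviour: dive into the subnet instead of treating it as a
        # renderable leaf just because callable(cur_node.render) is True.
        # resolve_subnet_target is a subnet-diving helper like the one
        # sketched earlier in the thread.
        inner = resolve_subnet_target(cur_node)
        return inner if inner is not None else cur_node
    return cur_node
```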
I believe one of the problems was that Wedge nodes are also subnets, and their submission got completely broken. I don’t have the full context at hand right now.