
Shotgun tk Write nodes and Deadline, how?

Hi,

This might be more of a Shotgun question, but I figure there must be someone here who’s got it working with Deadline, so it’s worth asking.

I’ve been reading this support article on farm integration and SG write nodes. First I tried to get SG to load so that the render farm can actually open the ShotgunWrite node, but that fails because there’s no user logged in on the farm machines, and apparently I have to add code to programmatically add a user at runtime. All of this seems way too cumbersome in a rendering context, to be honest.

The other way to do it is to convert SG writes to normal ones, but I can’t figure out where to put that code. I don’t want to save converted nodes in the Nuke script, so it has to happen on the fly at render time. But running that script at render time would again require loading the SG API on the farm, presumably requiring a user again, or can I load that part of the API without a user? If so, where should this code go?

How does everyone do it? Skip the SG write entirely? Convert all nodes before submitting? Thankful for any insight you might have!

Hi,
You can configure Deadline to run as a user at render time:
docs.thinkboxsoftware.com/produ … job-s-user

What version and platform of Deadline are you using, and are you running Linux? If so, you will need the newer UCS4 build to successfully “import sgtk”.

You can use a JobPreLoad.py script to control the “NUKE_PATH” env var and force a custom “init.py” file to be loaded, which in turn would import tank: docs.thinkboxsoftware.com/produ … preload-py
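A minimal JobPreLoad.py sketch along those lines (the pipeline path is a placeholder, and the env var change only applies to the Nuke process Deadline spawns):

```python
# JobPreLoad.py -- minimal sketch, not an official Thinkbox example.
# Deadline calls __main__ with the plugin object before launching Nuke,
# which makes it a convenient place to adjust the render process environment.
import os

def __main__(deadlinePlugin):
    # Hypothetical folder containing a custom init.py that imports tank/sgtk.
    custom_nuke_path = r"\\server\pipeline\nuke"

    existing = os.environ.get("NUKE_PATH", "")
    if existing:
        custom_nuke_path = custom_nuke_path + os.pathsep + existing

    # Only affects the spawned Nuke process, not the whole Worker.
    deadlinePlugin.SetProcessEnvironmentVariable("NUKE_PATH", custom_nuke_path)
```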

Thanks, we’re running Windows for the most part, repo version is 10.0.9.4.

I had a look at the first link, but I don’t think that covers it. Loading SGTK seems to require an actual Shotgun user account, logged in to the process and linked to your SG site. It could be created on the fly through the SG API, but I don’t want to go that route if I can avoid it.

I had it working with an init.py file, but once I ran into the issue of requiring an SG user to be logged in, I abandoned that route.

The best option we currently have is to skip SG write nodes. The only other way I can imagine is to have Deadline run a small script that converts SG writes to normal ones when the job is submitted, without altering the original script it was submitted from. Is that possible?

I believe this is the best and recommended way to do it. I’m pretty sure it’s how other customers do it:
support.shotgunsoftware.com/hc/ … m-scripts-

Section called: “Non-user-facing scripts”

You could pull in the Shotgun ScriptName and API application key from the Deadline Shotgun event plugin:
docs.thinkboxsoftware.com/produ … connection
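A minimal sketch of what that script-user authentication could look like with the sgtk authentication module (the site URL, script name, and key below are placeholders you’d pull from the event plugin settings):

```python
# Sketch of authenticating sgtk with a script (non-user-facing) account.
# The site URL, script name, and key are placeholders; in practice you could
# read them from the Deadline Shotgun event plugin's configuration.
import sgtk
from sgtk.authentication import ShotgunAuthenticator

def authenticate_script_user():
    authenticator = ShotgunAuthenticator()
    user = authenticator.create_script_user(
        api_script="deadline_render",                 # hypothetical script name
        api_key="0123456789abcdef",                   # hypothetical application key
        host="https://yourstudio.shotgunstudio.com",  # placeholder site URL
    )
    # Make this the user for all subsequent Toolkit calls in this process.
    sgtk.set_authenticated_user(user)
    return user
```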

What “user” does it need to be? I mean, if you’re on Windows, are you wanting to pass the current “Windows” username (running as a “service”) to sgtk? The “username” of the current Deadline job? A hard-coded “username” in your pipeline?

The DEADLINE_USER env var will give you the current Deadline job user: docs.thinkboxsoftware.com/produ … job-s-user

Deadline’s in-app Nuke submitter is all written in Python, so you could fork it into “…/repo/custom/submission/” and adapt it as you see fit for custom workflows at submission time.

UPDATE: I had a good chat with one of the ADSK SGTK street team developers at Siggraph and suggested a potentially really nice way that they could implement a context-aware Nuke2Deadline submitter in SGTK. Hopefully, they will have time to take a look. :wink:

Hey Mike,

Wanted to know if your chat with Shotgun got any traction.

Thanks!

It did not, it would seem. I told them what they needed to do: “bake the tk-write nodes under the current context in Nuke, and then, after submission exits, revert the context back to its pre-submission state in Nuke.” We should probably try to reboot the conversation. Could you submit a ticket explaining what you’re after to both the Thinkbox and Shotgun official private ticketing systems, and then I’ll try to escalate? I’m assuming you were a fan of the “bake” option rather than having a Shotgun user account logged into the process?
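For what it’s worth, a rough sketch of how that bake/revert could be wired up using the convert methods the tk-nuke-writenode app exposes (assuming a tk-nuke engine is running; submit_to_deadline is just a stand-in for whatever submitter you call):

```python
# Rough sketch of "bake, submit, revert" around a Deadline submission.
# Assumes a tk-nuke engine is running and tk-nuke-writenode is loaded;
# submit_to_deadline() is a placeholder for your actual submitter call.
import sgtk

def bake_submit_revert(submit_to_deadline):
    engine = sgtk.platform.current_engine()
    write_app = engine.apps.get("tk-nuke-writenode")
    if write_app is None:
        raise RuntimeError("tk-nuke-writenode is not loaded in this engine")

    # Bake the SG write nodes down to plain Nuke Write nodes.
    write_app.convert_to_write_nodes()
    try:
        submit_to_deadline()
    finally:
        # Restore the SG write nodes so the artist's script is left untouched.
        write_app.convert_from_write_nodes()
```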

That’s really interesting; we had internal discussions along similar lines: integrate slightly more on the Shotgun side of things, so that the submitter can hand off a largely self-contained scene to Deadline.

For the time being, with Nuke, we still use tk-nuke-writenodes. In JobPreLoad.py and one other place, we do some initialization to bring up a tk-nuke engine with the correct context, etc.
Having to authenticate a script user makes sense - it is a security feature :slight_smile:
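A heavily simplified sketch of that kind of bootstrap (placeholder values throughout; the real thing depends on your config, and it assumes a script user has already been authenticated as discussed above):

```python
# Heavily simplified sketch of bringing up tk-nuke with the right context on a
# render node. Assumes sg_user is an already-authenticated script user; the
# plugin id and entity below are placeholders.
import sgtk

def start_tk_nuke(sg_user, project_id):
    mgr = sgtk.bootstrap.ToolkitManager(sg_user=sg_user)
    mgr.plugin_id = "basic.nuke"  # placeholder plugin id

    # Bootstrapping against the project entity gives the engine its context.
    engine = mgr.bootstrap_engine(
        "tk-nuke", entity={"type": "Project", "id": project_id}
    )
    return engine
```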

Hi all,

My issue is actually with Houdini IFDs and Mantra nodes. Simply converting the Mantra nodes and also having to render out IFDs does not work. Nuke has never been an issue for me when working with Shotgun; I’ve narrowed that process down to either converting the node before submitting to the farm or putting the SGTK Nuke write node gizmo in my NUKE_PATH. That process does not work the same way for Houdini. I’ve been trying to find a workaround for the SGTK Mantra node but keep hitting roadblock after roadblock.

The only thing that node does is version up the Mantra node’s outputs and create the online IFD and render directories.

If that’s the case, I’d love to know if anyone has added this logic to a job preload script already.

Thanks,

Dan

Hi all,

Just an update on the Houdini/Mantra submission issue. It turns out that the only good results I’m getting are when I have “Export Mantra Locally” enabled or when I generate the IFDs prior to submission via the Mantra node’s “Render to Disk”. Not sure what’s happening here.

Thinkbox?

The image below shows what my Mantra node looks like after I perform an SGTK Mantra to native Mantra conversion before the render submission happens.


With distributed configs, your Nuke sgtk node is likely cached locally, so you can’t just add that path to NUKE_PATH.
To avoid bootstrapping, another option is to grab that gizmo at submission time and submit it as an auxiliary file, and in JobPreLoad add the auxiliary (job folder) path to NUKE_PATH, et voilà.
SG are heavily against bootstrapping in JobPreLoad due to denial-of-service-type issues when submitting jobs to massive farms… that said, I’m also using JobPreLoad bootstrapping for certain jobs (e.g. Deadline SG publishing).
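A rough sketch of the JobPreLoad half of that (I believe GetJobsDataDirectory() returns the local folder the job’s auxiliary files get copied into, but double-check the method against your Deadline version):

```python
# JobPreLoad.py sketch for the auxiliary-file approach described above.
# GetJobsDataDirectory() should be the local folder Deadline copies a job's
# auxiliary files into on the Worker -- worth verifying on your version.
import os

def __main__(deadlinePlugin):
    aux_dir = deadlinePlugin.GetJobsDataDirectory()

    nuke_path = os.environ.get("NUKE_PATH", "")
    if nuke_path:
        nuke_path = aux_dir + os.pathsep + nuke_path
    else:
        nuke_path = aux_dir

    # The gizmo submitted with the job is now on NUKE_PATH for this render.
    deadlinePlugin.SetProcessEnvironmentVariable("NUKE_PATH", nuke_path)
```

On the submission side, extra file paths passed to deadlinecommand after the plugin info file get picked up as auxiliary files, so appending the gizmo path there should be enough.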

Hi Patrick,

To date I haven’t had any issues with SGTK Nuke on the farm in either scenario [converting the node or having the gizmo in my NUKE_PATH]. Is there something under the hood with Deadline that I’m missing that could be an issue down the road?

In terms of the Houdini Mantra node, I’m just about ready to switch to native Mantra nodes, because submitting Houdini IFD generation plus the dependent image sequence rendering via Mantra does not work. On top of that, Shotgun has butchered the Mantra node: they’ve hidden or left out a few default items on that node, including Cryptomatte support. I had to specifically ask to have that added to a Mantra version they updated for me.

For all the time invested in getting this to work, it’s probably easier to write my own modifiers of the native nodes. I mean what do these nodes really buy you anyway?

Thanks for the reply!

It’s true, they buy you pathing.
I’ve previously implemented a really simple alternative by sticking an expression in the file attribute that builds an SG-template-compliant path based on the current scene name and version. It seemed to do the trick.
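Something along those lines as a Python parameter expression on the Mantra output picture parm might look like this (the folder layout and version token are made up; it should mirror whatever your SG template actually expects):

```python
# Hypothetical Python parameter expression for a Mantra output picture parm.
# It builds a versioned path from the current hip file name; the folder layout
# is invented and should match your SG path template.
# Note: 'return' is valid here because Houdini wraps parameter expressions
# in a function when it evaluates them.
import os
import re
import hou

hip_name = os.path.splitext(hou.hipFile.basename())[0]  # e.g. shot010_lgt_v012
match = re.search(r"_v(\d+)$", hip_name)
version = match.group(1) if match else "001"
base = hip_name[:match.start()] if match else hip_name

return "{hip}/renders/{base}/v{ver}/{base}_v{ver}.{frame:04d}.exr".format(
    hip=hou.getenv("HIP"),
    base=base,
    ver=version,
    frame=int(hou.frame()),
)
```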
