Running a bash script when the worker starts

I have a centralized script that sets environment variables, located at:

/path/to/my/environment/scripts/on/a/server.sh

This works well on our Linux workstations via profile.d, and we plan to migrate our Windows workstations onto it later this year so that everything is under one umbrella. However, I’m having difficulty getting these scripts to stick on a Linux render node running Deadline as a daemon: since I’m not logging in as a user, profile.d is never evaluated.

I’m looking for a way to run the command:

source /path/to/my/environment/scripts/on/a/server.sh

in the same session as my worker, so that I can set a number of environment variables like $HOUDINI_PACKAGE_DIR or $MAYA_MODULE_PATH. I tried running it as a one-time systemd service, but I think it ran in a different session from the Deadline service. Modifying the Deadline unit is an option, but I’d prefer to avoid it.

Currently, I’m thinking of running the script from a Deadline job script using subprocess. However, this would run every time a job is submitted, and I’m wondering if there’s a better way to do this. Ideally, I’d like to run the script once and in the same session as the Deadline daemon, so that all the variables are available to the Deadline worker.

Hello,

I don’t know what your systemd service unit file looks like, but you could change the ExecStart to something like:
ExecStart=/usr/bin/bash -l -c "run this thing"
which runs bash as a (non-interactive) login shell via --login, so it should read /etc/profile and pick up your scripts in /etc/profile.d/
or
ExecStart=/usr/bin/bash -c '. /path/to/my/environment/scripts/on/a/server.sh && run this thing'

if you want to keep using that server.sh file.
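
If you don’t want to touch the unit that Deadline ships, you could also put that ExecStart in a drop-in override instead. A rough sketch (I’m assuming the service is called deadline10launcher.service, so check what yours is actually named, and substitute your real launch command for the placeholder):

# /etc/systemd/system/deadline10launcher.service.d/override.conf (or create it with systemctl edit)
[Service]
ExecStart=
ExecStart=/usr/bin/bash -l -c "run this thing"

The empty ExecStart= clears the original command before the override is applied, and you’ll need a systemctl daemon-reload and a restart of the service for it to take effect.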

Otherwise, you could just write a JobPreLoad.py for whichever plugin you’re using (e.g. Redshift, MayaBatch, etc.) and set the environment variables there (see the snippet below). I wouldn’t use subprocess unless you have a specific need for it (my opinion).

import platform

# JobPreLoad.py entry point; Deadline passes the plugin object in
def __main__(deadlinePlugin):
    if platform.system() == 'Linux':
        rs_coredata = '/opt/redshift/3508'
        deadlinePlugin.SetProcessEnvironmentVariable('REDSHIFT_COREDATAPATH', rs_coredata)
        rs_localdata = '/home/ec2-user/redshift'
        deadlinePlugin.SetProcessEnvironmentVariable('REDSHIFT_LOCALDATAPATH', rs_localdata)

Hi @jarak, thanks for your reply.

I did end up going with a “pipeline bootstrap” approach using GlobalJobPreLoad.py, which I knew nothing about until now. Pretty slick.
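
In case it helps anyone else, here’s a rough sketch of the idea (not our exact script; I’m assuming GNU env with -0 and the usual <repository>/custom/plugins/GlobalJobPreLoad.py location): source the server script in a shell, dump the resulting environment, and hand each variable to the render process.

import subprocess

# GlobalJobPreLoad.py - Deadline calls this before every job, for every plugin
def __main__(deadlinePlugin):
    script = '/path/to/my/environment/scripts/on/a/server.sh'
    # Source the pipeline script in a clean bash and print the resulting
    # environment NUL-separated, so multi-line values survive the round trip
    output = subprocess.check_output(
        ['/usr/bin/bash', '-c', 'source "{}" && env -0'.format(script)])
    for entry in output.split(b'\0'):
        if not entry:
            continue
        name, _, value = entry.decode().partition('=')
        # Expose the variable to the rendering process the worker spawns
        deadlinePlugin.SetProcessEnvironmentVariable(name, value)

It still runs at the start of every job rather than once per worker session, but it’s cheap and keeps everything pointed at the one server.sh.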