I have a centralized script that sets environment variables, located at:
/path/to/my/environment/scripts/on/a/server.sh
This works well on our Linux workstations via /etc/profile.d, and we plan to migrate our Windows workstations to it later this year so that everything is under one umbrella. However, I’m having difficulty getting the script to stick on a Linux render node that runs Deadline as a daemon: nobody logs in as a user there, so profile.d is never evaluated.
I’m looking for a way to run the command:
source /path/to/my/environment/scripts/on/a/server.sh
in the same session as my worker, so that I can set a number of environment variables such as $HOUDINI_PACKAGE_DIR and $MAYA_MODULE_PATH. I tried running it as a one-shot systemd service, but since each systemd service gets its own environment, the variables it exported never reached the Deadline service. Modifying the Deadline unit itself is an option, but I’d prefer to avoid it.
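For reference, the kind of unit change I’d rather avoid would be a drop-in along these lines. The unit name (deadline10launcher.service) and the launcher path and flags are guesses on my part and would need to match the actual install:

```ini
# Hypothetical drop-in: /etc/systemd/system/deadline10launcher.service.d/env.conf
# Unit name, launcher path, and flags are assumptions; adjust to the real setup.
[Service]
# systemd requires an empty ExecStart= first to clear the inherited value.
ExecStart=
# Source the shared environment, then exec the original launcher command
# so the worker inherits every exported variable.
ExecStart=/bin/bash -c 'source /path/to/my/environment/scripts/on/a/server.sh && exec /opt/Thinkbox/Deadline10/bin/deadlinelauncher -daemon'
```

followed by a systemctl daemon-reload and a restart of the service.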
Currently, I’m thinking of running the script from a Deadline job script using subprocess. However, that would run every time a job is submitted, and I’m wondering if there’s a better way. Ideally, I’d like to run the script once, in the same session as the Deadline daemon, so that all of the variables are available to the Deadline worker.
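If I do go the job-script route, the sketch below is what I have in mind. Since a subprocess can’t modify its parent’s environment, just calling source through subprocess wouldn’t change anything by itself; instead, the idea is to source the script in a child bash shell, dump the resulting environment, and copy it back into the Python process the Deadline script runs in. The function name is mine, and this assumes Python 3.7+ for capture_output:

```python
import os
import subprocess

ENV_SCRIPT = "/path/to/my/environment/scripts/on/a/server.sh"

def load_server_environment(script_path=ENV_SCRIPT):
    """Source the shared environment script and copy the resulting
    variables into this Python process."""
    # A child process can't modify its parent's environment, so have bash
    # print the post-source environment instead (NUL-separated via `env -0`
    # so values containing newlines survive) and parse it here.
    result = subprocess.run(
        ["bash", "-c", "source '{}' && env -0".format(script_path)],
        capture_output=True,
        check=True,
    )
    for entry in result.stdout.split(b"\x00"):
        if not entry:
            continue
        key, _, value = entry.partition(b"=")
        os.environ[key.decode()] = value.decode()

# Example: call once at the top of a Deadline job/event script; the
# variables are then visible to anything this process spawns.
# load_server_environment()
# print(os.environ.get("HOUDINI_PACKAGE_DIR"))
```

Even then, this only affects the process that runs the script, which is why I’d still prefer a once-per-daemon solution.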