Some of our users work remotely (via Remote Connection), and we have submission scripts for the Monitor that depend on a fairly substantial stack of pipeline Python code. As remote users, they may not always have full access to all of that code. Even when we try to put everything we need into the Deadline Repository tree, not all of the required files seem to get cached, so the scripts fail.
For instance, I've got a small library in a folder under "Custom". I added it to the repo's Python paths, and it works fine as long as the user has access to the repo; if not, importing anything from that library fails. It makes sense that Deadline wouldn't know which files to bring into the cache. But here's the thing: if I look in the Deadline AppData cache, I find that some other entire directories (not even ones containing Python) have been cached. So it would be nice to know how the caching of scripts actually behaves, so we can figure out how to get things running without a connection to the repository directory.
Thanks!