
Feature request: Scripts and plugins from outside the repo

Hi,
maybe worth some thought: many folks are coding their own stuff, and I think it would be best practice to separate it in some way from the repository. Maybe a dedicated folder there, or better, include paths via environment variables. These could be cached in the repo if you want (like OFX plugins). That way users would not have to change things inside the repository directly, and it would be easier to upgrade an existing repo.

What do you think of

DL6_USER_PLUGIN_PATH
DL6_USER_SCRIPT_PATH
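
Purely to illustrate the idea (nothing like this exists in Deadline today), the clients could pick up those variables along these lines; the splitting and validation logic here is just an assumption about how it might work:

[code]# Sketch only: Deadline does not currently read these variables.
# The idea is that user plugin/script folders live outside the repository
# and get appended to whatever search paths the client already scans.
import os

def get_user_search_paths():
    """Collect extra plugin/script folders from the proposed env vars."""
    paths = []
    for var in ("DL6_USER_PLUGIN_PATH", "DL6_USER_SCRIPT_PATH"):
        value = os.environ.get(var, "")
        # Allow several folders per variable, separated like PATH entries.
        paths.extend(p for p in value.split(os.pathsep) if os.path.isdir(p))
    return paths
[/code]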

Cheers,
Daniel

I would be very interested in something like this as well. However, I would go one step further and suggest giving the repository itself an awareness of an external (but still central) scripts/plugins tree that it wouldn't blow away or stomp on every time Deadline was updated.

We can add this to the wishlist, but it’s probably not something we’ll have time to visit for 6.0.

I should note that the Repository installer only updates the plugins/scripts that ship with it out of the box. Any other plugins or scripts will be untouched. So for example, if you have a plugin called MyPlugin in the Deadline plugin folder, it will not be wiped out by the installer.

Cheers,

Ryan

Zipping the bin folder kind of kills the ability to do partial merges for client sync, though, which is one of a dozen reasons the new zipped system is annoying.

Out of curiosity, when are you doing partial merges? I know there was the case when we sent you that updated deadlineslave.exe file, but generally we don’t expect our users to do partial updates, especially outside of the beta. Also, the only difference between how updates for 5 and 6 work is that the files are zipped. If you are doing partial updates, just zip up the contents of your bin folder.
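
If it helps, re-zipping a locally modified bin folder can be scripted; here is a rough sketch (the folder and archive names are assumptions about your repository layout, not documented paths):

[code]# Sketch: rebuild bin.zip from an unpacked bin folder after swapping files.
# bin_dir and zip_path are placeholders; point them at your actual repository.
import os
import zipfile

def rezip_bin(bin_dir, zip_path):
    """Zip the contents of bin_dir into zip_path, preserving relative paths."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(bin_dir):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, bin_dir))

# Example (hypothetical paths):
# rezip_bin("bin_unpacked", "bin.zip")
[/code]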

The new system is much faster and more reliable, and that has to take precedence. If you could let us know the other reasons why this zipped system is annoying, maybe we can come up with a compromise.

Cheers,

Ryan

I was ‘abusing’ the bin folder to push out files to all the render nodes for plugin syncing.

While completely unsupported, that's still doable. :)

  1. Unzip the bin.zip file to a temp location, drop in your new plugins, zip it back up, and then copy it back to the repository.
  2. Modify the Version file and you're set.

I believe (1) can even be simplified by dragging the new files and dropping them on the zip file.
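
For anyone who would rather script step (1), something along these lines should work; the zip path and file names are assumptions, and the Version file (step 2) still needs to be edited by hand:

[code]# Sketch of step (1): append extra files to an existing bin.zip so they get
# synced out with the next client update. Paths below are placeholders.
import os
import zipfile

def add_files_to_bin_zip(zip_path, extra_files):
    """Append extra files to bin.zip, keeping only their base names."""
    with zipfile.ZipFile(zip_path, "a", zipfile.ZIP_DEFLATED) as zf:
        for src in extra_files:
            zf.write(src, arcname=os.path.basename(src))

# Example (hypothetical file names):
# add_files_to_bin_zip("bin.zip", ["my_sync_tool.py", "site_settings.ini"])
[/code]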

Yeah, but the next patch will then overwrite the zip, and I have to remember to do that every time. They used to just chill out in the bin folder from version to version, unmolested.

Maybe part of the auto-update could be to re-zip things automatically after the version changes, while keeping another, unzipped copy in the repository?

In case it’s of interest to anyone, the way I will likely be approaching this for the time being is by defining the plugin files in the Deadline repository as stubs that import an external (version-controlled) module containing the actual plugin classes. In other words, the files in the Deadline repository will basically look like:

[code]# Stub that lives in the Deadline repository; the real plugin class is
# maintained in an external, version-controlled package.
import luma.render

def GetDeadlinePlugin():
    return luma.render.PythonCallable()
[/code]

This way, I can maintain multiple parallel versions of any given plugin, and switch them based on which path gets appended to sys.path in PluginPreLoad.py.
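
For context, the PluginPreLoad.py side of that could look roughly like this; the environment variable, entry-point shape, and directory layout are illustrative assumptions rather than anything Deadline mandates:

[code]# Illustrative PluginPreLoad.py sketch: choose which version of the external
# package ends up on sys.path before the stub imports it. The variable name
# and folder layout are assumptions specific to this example.
import os
import sys

def __main__(*args):
    version = os.environ.get("LUMA_RENDER_VERSION", "stable")
    package_root = os.path.join("/tools/python/luma.render", version)
    if package_root not in sys.path:
        sys.path.insert(0, package_root)
[/code]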

Unfortunately this doesn’t address the problem of the Deadline installer re-writing plugin scripts (and submission GUIs) when updates are installed, if you are repurposing a name that Deadline already ships a plugin for. The simplest solution I can think of is making an hg/git repo out of the Deadline repository and excluding the binaries, but that doesn’t exactly make me feel warm and fuzzy inside…
