Hi, just a question,
what exactly do I need to reference (I assume a .dll) to have full access to Deadline’s classes etc… so I can, for example, query what’s being submitted outside of Deadline (say in my own Python tools)?
Thank you in advance!
You can reference any of the *.dll files in the Deadline bin folder. However, none of the classes are (or will be) documented or supported, and there is no guarantee that things won’t change.
In Deadline 5.0, we’ll be adding some classes like Job and SlaveInfo to the Script API (which will be documented and supported).
Cheers,
So, basically, there is no documentation whatsoever?
Too bad, I was thinking of integrating some of Deadline’s functionality into a custom IronPython-based tool.
Unfortunately, that’s correct. We prefer to expose APIs to allow users to customize certain areas of Deadline, and you could certainly write an IronPython script that is launched through “deadlinecommand.exe -executescript” so that you have access to this API. Since you have access to the 5.0 beta, I would suggest giving the Scripting section of the beta docs a read over, and if there is specific functionality that you are looking for that isn’t supported, let us know and we can consider exposing that functionality in a future release.
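To make that concrete: an external tool would typically just shell out to deadlinecommand with the script to run. A minimal sketch follows; the executable name and the -executescript flag come from Ryan's post above, but the install path and the example script/arguments are assumptions you'd adjust for your setup.

```python
import subprocess
from pathlib import Path

def build_executescript_args(deadline_bin, script_path, *script_args):
    """Build the argument list for launching a script through
    deadlinecommand, as described above. deadline_bin is your
    Deadline bin folder (an assumption -- adjust to your install)."""
    exe = str(Path(deadline_bin) / "deadlinecommand.exe")
    return [exe, "-executescript", str(script_path), *script_args]

def run_executescript(deadline_bin, script_path, *script_args):
    """Untested sketch: actually launch the script and capture output."""
    return subprocess.run(
        build_executescript_args(deadline_bin, script_path, *script_args),
        capture_output=True, text=True)
```

The wrapped script itself then has access to the Deadline Script API, since it runs inside deadlinecommand rather than a plain Python interpreter.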
Cheers,
So, basically, the suggested way of doing this would be writing an IPy script that gets launched via deadlinecommand.exe instead of the regular IPy.exe (or internally via IronPython Engine via .NET)?
By running it through deadlinecommand, you can access Deadline API, so in this case yes, that would be the recommended approach.
Theoretically, it should be possible to expose this API without having to use deadlinecommand in the future. You would just have to import deadline.dll and then call an Initialize function to set the environment. In its current state though, the Deadline API requires that any standalone scripts referencing it be run through deadlinecommand.
I just took a look at the Deadlinecommand options and I can’t, for example, find a way to get the mapped paths settings in the repository.
Is this something possible to get from outside tools (except for directly parsing the networkSettings.xml file)?
Hi,
Interesting. Possible Options:
(1) walk the xml file. Urgh.
(2) How about checking the system instead of checking Deadline? Deadline may even fail to make the mapping, so check after the fact to be sure. msdn.microsoft.com/en-us/library … .name.aspx
(3) Persuade Ryan to add Repository object to scripting api:
from Deadline.Repository import *
Mike
Hey Mike,
thanks for the suggestions.
Ad 1) Yes, that’s the “dirtiest” option, but unfortunately the most likely one I’ll go for.
Ad 2) Honestly, I didn’t get this. I just need to fetch settings in the Deadline Repository config, which has nothing to do with the OS or paths in general.
Ad 3) That’d be cool! But even cooler would be a simple Python API which we could import to our 3rd party tools and have access to the Repository, jobs, machines etc… Similarly like we do with Tactic or Shotgun.
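For the XML-walking route in (1), a rough standalone sketch is below. The element names are pure guesses, since the actual networkSettings.xml schema isn’t documented; inspect the file in your repository and adjust accordingly.

```python
import xml.etree.ElementTree as ET

# Hypothetical sample -- the real networkSettings.xml in the repository
# should be inspected first; the element names here are assumptions.
SAMPLE = """<NetworkSettings>
  <MappedPaths>
    <MappedPath OS="Windows">
      <Replace>Z:/projects</Replace>
      <With>//server/projects</With>
    </MappedPath>
  </MappedPaths>
</NetworkSettings>"""

def read_mapped_paths(xml_text):
    """Walk the XML and return (replace, with) pairs per mapping."""
    root = ET.fromstring(xml_text)
    return [(node.findtext("Replace"), node.findtext("With"))
            for node in root.iter("MappedPath")]

print(read_mapped_paths(SAMPLE))  # [('Z:/projects', '//server/projects')]
```

Fragile, since the schema can change between Deadline versions with no warning, which is exactly why it's the "dirty" option.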
Hi,
(2) I assumed that you want to add/remove network paths on a machine, and that you can’t access the repository options and hence the network paths that will automatically be made. Assuming you know what those network paths are on your system, you can assume they have been made on all slaves for all jobs. If that’s the case, you can have preLoad.py scripts on a Deadline job that alter/update/delete any network path. This can then be cross-referenced by scanning the actual network paths that have been set on a slave AFTER the preLoad.py script has been executed. This could be done via a preJob or preTask py script using the link I sent earlier: msdn.microsoft.com/en-us/library … .name.aspx
(3) Yeah, nice idea, but I guess, as Ryan said, things are changing at such a pace that it would break too often to be reliable?
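The AFTER-the-event cross-check in (2) could be sketched in plain Python as below; which network paths to expect is entirely up to your pipeline, so the list here is just the shape of the check, not real paths.

```python
import os

def missing_paths(expected_paths):
    """Return the subset of expected network paths that are NOT
    visible on this machine, so a preJob/preTask script can warn
    or bail out early. The expected list comes from your pipeline."""
    return [p for p in expected_paths if not os.path.isdir(p)]
```

A preJob/preTask script would call this right after the preLoad.py mappings have run and fail the task (or re-map) if anything comes back.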
HTH,
Mike
Exposing the functions that perform the path mapping shouldn’t be a problem at all. That way, you don’t even have to worry about handling the paths yourself.
Cheers,
Ryan, great news! That’d be very handy!
Mike, I want to write a simple MAXScript that takes all of Deadline’s mapped path settings and re-maps all the paths in the scene file, as a PostLoad script that I recently added as an option to the Deadline 3ds Max submitter. This way we get functionality similar to Nuke or Maya’s .ma files, where the scene’s resources can use local paths but get automatically re-mapped on the farm according to the mapped path settings in the Deadline repository options.
Nothing fancy, just a bit more automation I need for my pipeline here. And since I’d like to release the scripts to the public, I want to keep them generic and flexible, without hard-coding.
Ah, OK. Have you checked out Session path files? *.mxp in the maxscript.chm?
Deadline already has the ability to load a session path file per job and this will then look for maps exactly where you tell it to look after submission.
All you need to do is strip all the paths from all assets during submission of the max file. Bobo already implements this approach in the SMTD functions - somewhere? STOP PRESS, found it - it’s the option for sending “all local asset files” to the repository, under the render tab, first drop-down list in the SMTD.
Of course if you’re not using SMTD and submitting via the monitor script, then this option won’t be possible, without first creating the *.mxp. You could write a custom *.mxp file which doesn’t contain the normal 3dsMax project directory paths BUT only your custom network path for your textures. Untested, but this should then force 3dsMax to look there for all its textures.
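Generating such a stripped-down *.mxp could be scripted; a minimal sketch follows. The [BitmapDirs]/DirN layout is an assumption about the session path file format that you should verify against the maxscript.chm before relying on it.

```python
def write_mxp(path, texture_dirs):
    """Write a minimal *.mxp session path file that lists ONLY the
    custom network texture paths, not the normal 3dsMax project
    directories. The [BitmapDirs]/DirN section layout is assumed --
    check it against the maxscript.chm documentation."""
    lines = ["[BitmapDirs]"]
    for i, d in enumerate(texture_dirs, 1):
        lines.append(f"Dir{i}={d}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines
```

Untested, as noted above, but the idea is that pointing the job at this file should force 3dsMax to search your network locations for its textures.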
Mike
Sure, the .mxps are one way, but not really reliable.
Besides, I need to set paths for things that aren’t being tracked in the Asset Manager (GI caches, shadow maps etc…) for 3rd party plugins, so doing it in MAXScript is a 100% sure way for me.
And, it all works just fine with my modifications.
One last note, I submit all my jobs via the Monitor, nothing gets submitted via the SMTD, because I usually work remotely from the server and render farm.