AWS Thinkbox Discussion Forums

Python API Package

So… I added the Deadline API to my sys.path:

import sys
sys.path.append(r'\\sfs-file\deadlinerepository6\api\python')
import Deadline

However, I don't think it's being imported as a package. If I then try:

Deadline.DeadlineConnect()

I get:

Traceback (most recent call last):
  File "<pyshell#44>", line 1, in <module>
    Deadline.DeadlineConnect
AttributeError: 'module' object has no attribute 'DeadlineConnect'

Ok I tweaked it:

import sys
sys.path.append(r'\\sfs-file\deadlinerepository6\api\python\Deadline')
import DeadlineConnect

Shouldn't the contents of DeadlineConnect.py be the __init__.py?
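i.e. something like this in Deadline/__init__.py? (Just a guess at what I'd expect to find there, not what actually ships in the repository.)

[code]# Hypothetical Deadline/__init__.py -- a sketch only.  Importing the submodule
# here binds it as an attribute of the package, so after `import Deadline` you
# could write:
#
#     con = Deadline.DeadlineConnect.DeadlineCon('server', 6428)
#
# (DeadlineConnect is still a module, not a callable, so
# Deadline.DeadlineConnect() would raise a TypeError either way.)
import DeadlineConnect  # implicit relative import; fine on Python 2.x[/code]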

Anyway… back to DeadlineConnect. I ran:

db = DeadlineConnect.DeadlineCon('server', 6428)
db.Jobs.GetJobs()

And got:

Traceback (most recent call last):
  File "<pyshell#57>", line 1, in <module>
    db.Jobs.GetJobs()
  File "\\sfs-file\deadlinerepository6\api\python\Deadline\Jobs.py", line 45, in GetJobs
    return self.__get__(script)
  File "\\sfs-file\deadlinerepository6\api\python\Deadline\Jobs.py", line 16, in __get__
    return DeadlineSend.send(self.address, commandString, "GET")
  File "\\sfs-file\deadlinerepository6\api\python\Deadline\DeadlineSend.py", line 12, in send
    conn.request(requestType, message)
  File "C:\Python26\lib\httplib.py", line 874, in request
    self._send_request(method, url, body, headers)
  File "C:\Python26\lib\httplib.py", line 911, in _send_request
    self.endheaders()
  File "C:\Python26\lib\httplib.py", line 868, in endheaders
    self._send_output()
  File "C:\Python26\lib\httplib.py", line 740, in _send_output
    self.send(msg)
  File "C:\Python26\lib\httplib.py", line 699, in send
    self.connect()
  File "C:\Python26\lib\httplib.py", line 683, in connect
    self.timeout)
  File "C:\Python26\lib\socket.py", line 512, in create_connection
    raise error, msg
error: [Errno 10061] No connection could be made because the target machine actively refused it

Is 6428 the correct port?

OK, so I fed it the port for our Pulse web service, and now it just hangs indefinitely. Which I suppose is progress. But is db.Jobs.GetJobs() not what I want to do?

OK, instead of GetJobs(), which seems to be borked, I used .GetJobIds() and then .GetJob("id").

But GetJob doesn't return a dict or a list; it just returns a string of a dict.

OK, I tweaked the API for GetJob(self, id):

[code]job = self.get("/api/jobs?JobID="+jobId)
job = job.replace('null', 'None')
job = job.replace('true', 'True')
job = job.replace('false', 'False')
job = eval(job)

return job[/code]
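In hindsight, a safer version of the same idea would be to leave the API alone and decode the string on my end, assuming it really is JSON (which the null/true/false literals suggest):

[code]import json

# db is the DeadlineCon from earlier.  GetJob's raw return value is a JSON
# string, so json.loads maps null/true/false to None/True/False without any
# string replacement or eval.
job = json.loads(db.Jobs.GetJob("id"))[/code]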

The API returns strings containing the objects you are requesting in JSON format. You shouldn't need to tweak the API in any way to use the objects returned.

If you call GetJobs(), you will receive all the data for every single job in your repository. Depending on how many jobs you have, this can potentially take some time. If you are encountering an issue trying to get all the jobs in one request, iterating through the ids may be your best bet. Note that for GetJobs() it is possible to specify a list of job ids you wish to retrieve job info for.

Both GetJob and GetJobs return a JSON string that encodes a list of dicts, each one corresponding to a job. Here's an example of getting all the jobs and printing their ids, assuming the Pulse web service is listening on port 6428 and the Pulse instance is local:

import Deadline.DeadlineConnect as Connect
import json

con = Connect.DeadlineCon('localhost', 6428)

jobs = con.Jobs.GetJobs()
jobs = json.loads(jobs)

for job in jobs:
    print job['_id']
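If fetching everything in one request turns out to be too heavy, iterating through the ids as mentioned above could look something like this (a sketch, assuming GetJobIds also returns a JSON-encoded list of id strings):

[code]import json

import Deadline.DeadlineConnect as Connect

con = Connect.DeadlineCon('localhost', 6428)

# Fetch the id list first, then pull each job's details one request at a time
# instead of asking for every job in a single large response.
for job_id in json.loads(con.Jobs.GetJobIds()):
    job = json.loads(con.Jobs.GetJob(job_id))   # JSON string -> list of dicts
    print job_id, sorted(job[0].keys())[/code]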

Hope this information is helpful! :slight_smile:

This could be a very ignorant question, but… why would I want the data in JSON format? This is a Python API, so why make users jump through hoops to convert it from raw data? Presumably, if they are using the Python API, they want Python objects in a Pythonic fashion. :stuck_out_tongue:

In fact, I would go a step further: I would argue you should create a job class and return job info not as a dict but as a "job", and a slave as a "slave" class.
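To illustrate, something like this rough sketch is what I have in mind (the class and its behaviour are made up; nothing like it exists in the current API):

[code]import json


class Job(object):
    """Hypothetical convenience wrapper around one job dict from the web service."""

    def __init__(self, data):
        self._data = data  # the decoded JSON dict for this job

    def __getattr__(self, name):
        # Expose the dict keys as attributes, e.g. job._id or job.Name,
        # depending on what the web service actually returns.
        try:
            return self._data[name]
        except KeyError:
            raise AttributeError(name)


# e.g. jobs = [Job(d) for d in json.loads(con.Jobs.GetJobs())][/code]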


Also, this is probably going to be subject to debate, but I would override the Get functionality for Jobs so that you could do:

con.Jobs['_ID'], and con.Jobs on its own would return con.Jobs.GetJobIds().
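Roughly along these lines (again, just a sketch; none of this exists in the current wrapper, and the names are made up):

[code]import json


class JobsProxy(object):
    """Hypothetical indexable wrapper around the existing con.Jobs object."""

    def __init__(self, jobs_requests):
        self._jobs = jobs_requests  # e.g. con.Jobs

    def __getitem__(self, job_id):
        # Every lookup hits the web service, which is part of the debate below.
        return json.loads(self._jobs.GetJob(job_id))

    def __iter__(self):
        return iter(json.loads(self._jobs.GetJobIds()))


# usage sketch: jobs = JobsProxy(con.Jobs); job = jobs['some-job-id'][/code]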

Remember that the Python API is just a wrapper around Deadline’s RESTful HTTP API in Pulse’s web service. All data sent to and from this web service is JSON, which is a common choice. The Python API simply wraps this process, and allows you to work with dictionaries instead of plain text JSON. It’s essentially how the Shotgun API works as well.
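For context, the wrapper is doing little more than this under the hood (a sketch only; the host and port are placeholders, and the /api/jobs path is taken from the GetJob tweak earlier in the thread):

[code]import httplib
import json

# Roughly what DeadlineSend.send() does for con.Jobs.GetJobs(): a plain HTTP
# request to the Pulse web service, with JSON text coming back.
conn = httplib.HTTPConnection('localhost', 6428)
conn.request('GET', '/api/jobs')
response = conn.getresponse()
jobs = json.loads(response.read())
conn.close()
print len(jobs)[/code]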

Maybe in the future we could expand the wrapper scripts to actually have proper Job and Slave classes, but that will have to be a future release.

Hmm, I don’t know about that. That call would have to hit the database, which probably isn’t expected behavior. When I’m indexing an array, dictionary, etc, I already expect that data to be in memory. Also, when you can index an object like that, I would expect things like len() to return the number of jobs, and while I guess that could be overridden as well, it still seems weird.

Well, that's kind of my point. It's a "wrapper" but it's not really wrapping anything. The Shotgun API returns Python objects, not JSON strings.

I talked this over with the other Ryan, and we'll be changing the functions to return the decoded JSON as dictionaries. That was actually the intended behavior, but it was accidentally overlooked.

Thanks for bringing this to our attention!

How about the original package question? Am I importing it properly, or are you intending DeadlineConnect to be the real package?

bump on the importing question.

I'll jump on this thread, if that's cool?

Was wondering about submitting jobs, and it seems to me that you still need to manually make the job and plugin info files?

To import correctly, you need to import DeadlineConnect; from there you can do all communication through the DeadlineCon object. So:

from Deadline import DeadlineConnect

connectionObject = DeadlineConnect.DeadlineCon(hostname, portnumber)

You do need to have plugin info and job info files, which are just key-value pairs stored in a file. We will add another function to allow dictionaries to be passed in instead.
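As a rough sketch of what those files contain, and of writing them from dictionaries in the meantime (the keys and values shown are illustrative examples only, not a complete or required set):

[code]# Job info and plugin info files are plain Key=Value lines, one per line.
def write_info_file(path, pairs):
    with open(path, 'w') as f:
        for key, value in pairs.items():
            f.write('%s=%s\n' % (key, value))


job_info = {
    'Name': 'My test job',
    'Plugin': 'MayaBatch',   # example plugin
    'Frames': '1-10',
}
plugin_info = {
    # Plugin-specific keys vary per plugin; these are just examples.
    'Version': '2014',
    'SceneFile': r'\\server\projects\shot\scene.mb',
}

write_info_file('job_info.job', job_info)
write_info_file('plugin_info.job', plugin_info)[/code]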

Passing dictionaries into a function directly would be awesome. I would also suggest possibly having an object that has some basic requirements for a plugin and job info to work, and some default settings.

Not sure that having an object with defaults is really necessary; by default, if you do not specify some job attribute, Deadline will use the default value for that attribute. The plugin information varies depending on which plugin is specified. The defaults for job attributes can be found here: thinkboxsoftware.com/deadlin … _Info_File

The best way to determine which attributes need to be set for a particular plugin is to right-click on a job that uses the same plugin in the Monitor and go to Modify Job Properties->Submission Params. The key-value pairs on the left show the properties that are not set to default for the job info, and the key-value pairs on the right show what the plugin info file contained for that job's submission.

Submitting jobs by passing dictionaries should be available in the next Beta.

Hope this helps! :slight_smile:

Did not know about the default settings, cheers for that :)
