However I don’t think it’s being imported as a package. If I then try:
Deadline.DeadlineConnect() I get:
Traceback (most recent call last):
File "<pyshell#44>", line 1, in <module>
Deadline.DeadlineConnect
AttributeError: 'module' object has no attribute 'DeadlineConnect'
Shouldn’t the contents of DeadlineConnect.py be in the __init__.py?
Anyway… back to DeadlineConnect. I ran:
db = DeadlineConnect.DeadlineCon('server',6428)
db.Jobs.GetJobs()
And got:
Traceback (most recent call last):
File "<pyshell#57>", line 1, in <module>
db.Jobs.GetJobs()
File "\\sfs-file\deadlinerepository6\api\python\Deadline\Jobs.py", line 45, in GetJobs
return self.__get__(script)
File "\\sfs-file\deadlinerepository6\api\python\Deadline\Jobs.py", line 16, in __get__
return DeadlineSend.send(self.address,commandString, "GET")
File "\\sfs-file\deadlinerepository6\api\python\Deadline\DeadlineSend.py", line 12, in send
conn.request(requestType, message)
File "C:\Python26\lib\httplib.py", line 874, in request
self._send_request(method, url, body, headers)
File "C:\Python26\lib\httplib.py", line 911, in _send_request
self.endheaders()
File "C:\Python26\lib\httplib.py", line 868, in endheaders
self._send_output()
File "C:\Python26\lib\httplib.py", line 740, in _send_output
self.send(msg)
File "C:\Python26\lib\httplib.py", line 699, in send
self.connect()
File "C:\Python26\lib\httplib.py", line 683, in connect
self.timeout)
File "C:\Python26\lib\socket.py", line 512, in create_connection
raise error, msg
error: [Errno 10061] No connection could be made because the target machine actively refused it
Ok, so I fed it the port for our Pulse web service, and now it just hangs indefinitely, which I suppose is progress. But is db.Jobs.GetJobs() not what I want to do?
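As a side note, errno 10061 ("actively refused") means nothing was listening on the port you gave it. A quick, hedged way to sanity-check a host/port before pointing the API at it (the host name and port below are placeholders for your own Pulse machine):

```python
import socket

def port_open(host, port, timeout=2.0):
    # Return True if something is accepting TCP connections on host:port.
    try:
        sock = socket.create_connection((host, port), timeout)
    except (socket.error, OSError):
        return False
    sock.close()
    return True

# 'pulse-host' and 6428 are placeholders for your own setup.
if not port_open('pulse-host', 6428):
    print('Nothing is listening on pulse-host:6428 -- check that the '
          'Pulse web service is enabled and that the port matches.')
```

If this returns False you will get the 10061-style error from the API too, so it is worth ruling out first.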
The API returns strings containing the objects you are requesting, in JSON format. You shouldn’t need to tweak the API in any way to use the objects returned. If you call GetJobs(), you will receive all the data for every single job in your repository; depending on how many jobs you have, this can potentially take some time. If you are encountering an issue trying to get all the jobs in one request, iterating through the ids may be your best bet. Note that for GetJobs() it is possible to specify a list of job ids you wish to retrieve job info for. Both GetJob() and GetJobs() return a JSON string that encodes a list of dicts, each one corresponding to a job. Here’s an example of getting all the jobs and printing their ids, assuming the Pulse web service is listening on port 6428 and the Pulse instance is local:
import Deadline.DeadlineConnect as Connect
import json
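The example above breaks off after the imports. Here is a self-contained sketch of the decoding step it was presumably leading to, with a hardcoded sample response standing in for a live DeadlineCon (the `_id` and `Props` field names, and the values, are illustrative assumptions, so check the actual strings your Pulse returns):

```python
import json

# GetJobs() returns a plain JSON string encoding a list of dicts, one per
# job. A hardcoded sample response stands in for a live connection here.
sample_response = '[{"_id": "52a8f6c1", "Props": {"Name": "shot010_beauty"}}]'

jobs = json.loads(sample_response)  # JSON text -> list of Python dicts
for job in jobs:
    print(job['_id'])
```

With a real connection, `sample_response` would be the string returned by `con.Jobs.GetJobs()`.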
This could be a very ignorant question but… why would I want the data in JSON format? This is a python api, so why make users jump through hoops to convert it from raw data? Presumably if they are using the Python API they want Python objects in a pythonic fashion.
In fact I would go a step further: I would argue you should create a job class and return job info not as a dict but as a “Job”, and likewise return slaves as a “Slave” class.
Remember that the Python API is just a wrapper around Deadline’s RESTful HTTP API in Pulse’s web service. All data sent to and from this web service is JSON, which is a common choice. The Python API simply wraps this process, and allows you to work with dictionaries instead of plain text JSON. It’s essentially how the Shotgun API works as well.
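As an illustration of that wrapper pattern (this is a sketch of the idea, not the actual DeadlineSend code), the whole round trip is roughly an HTTP GET followed by json.loads; the /api/jobs path below is a placeholder:

```python
import json
try:
    from http.client import HTTPConnection   # Python 3
except ImportError:
    from httplib import HTTPConnection       # Python 2, as in the traceback above

def get_json(host, port, path):
    # Sketch of what a REST wrapper does: GET a URL, decode the JSON body.
    conn = HTTPConnection(host, port, timeout=10)
    conn.request("GET", path)
    body = conn.getresponse().read()
    conn.close()
    return json.loads(body)   # plain-text JSON -> Python objects

# e.g. get_json('pulse-host', 6428, '/api/jobs')  # host/path are placeholders
```

Everything the Python API adds on top of this is convenience: building the command strings and hiding the HTTP plumbing.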
Maybe in the future we could expand the wrapper scripts to actually have proper Job and Slave classes, but that will have to be a future release.
Hmm, I don’t know about that. That call would have to hit the database, which probably isn’t expected behavior. When I’m indexing an array, dictionary, etc, I already expect that data to be in memory. Also, when you can index an object like that, I would expect things like len() to return the number of jobs, and while I guess that could be overridden as well, it still seems weird.
I talked this over with the other Ryan and we’ll be changing the functions to return JSON-decoded dictionaries. That was actually the intended behavior, but it was accidentally overlooked.
You do need to have plugin info and job info files, which are just key-value pairs stored in a file. We will add another function to allow dictionaries to be passed in instead.
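Those info files are plain text, one Key=Value pair per line. A minimal sketch of writing such a pair of files follows; the keys shown (Plugin, Name, Frames, etc.) are illustrative examples, and which plugin-info keys are required depends entirely on the plugin being submitted to:

```python
# Job info and plugin info files are plain key=value pairs, one per line.
# All keys and values below are illustrative placeholders.
job_info = {
    "Plugin": "MayaBatch",
    "Name": "shot010_beauty",
    "Frames": "1-100",
}
plugin_info = {
    "SceneFile": "//server/projects/shot010.mb",  # path is a placeholder
    "Version": "2014",
}

def write_info_file(path, pairs):
    with open(path, "w") as f:
        for key, value in pairs.items():
            f.write("%s=%s\n" % (key, value))

write_info_file("job_info.job", job_info)
write_info_file("plugin_info.job", plugin_info)
```

A dictionary-based submission function would presumably just take these same dicts directly instead of going through files.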
Passing dictionaries into a function directly would be awesome. I would also suggest possibly having an object that has some basic requirements for the plugin and job info to work, plus some default settings.
Not sure that having an object with defaults is really necessary; if you do not specify some job attribute, Deadline will use the default value for that attribute. The plugin information varies depending on which plugin is specified. The defaults for job attributes can be found here: thinkboxsoftware.com/deadlin … _Info_File
The best way to determine which attributes need to be set for a particular plugin is to right-click in the Monitor on a job that uses the same plugin and go to Modify Job Properties->Submission Params. The key-value pairs on the left show the properties that were not left at their defaults in the job info, and the key-value pairs on the right show what the plugin info file contained for that job’s submission.
Submitting jobs by passing dictionaries should be available in the next Beta.