While browsing the docs (Deadline Scripting Reference: Welcome), I was unable to find any helpful information on finding batches, and I can't come up with a solution that doesn't involve processing every job in the repo. Is there a solution that I'm missing?
Yeah, unfortunately I don’t think you can search by batch. It is just a string label in the job properties…
I would have liked a function to query jobs by different criteria. Since Deadline uses pydotnet, they could put together some crazy LINQ-style thing.
Bummer! Thank you so much though. I didn’t know about pydotnet and Deadline’s use of it until now. Super helpful to know! I’ll post my solution here when I get around to implementing it.
Just wanted to give back to this community, so here's a quick solution for those wondering how to grab the jobs in a batch:
def getJobBatch(batchName, states=None, checkAll=False, invalidateCache=False):
    """Gets the jobs belonging to a batch.

    Args:
        batchName (str): Name of the batch.
        states (list): Optional job states to restrict the search to.
        checkAll (bool): Search every job in the repository, ignoring states.
        invalidateCache (bool): Force a fresh fetch when checkAll is set.

    Returns:
        list: Jobs whose JobBatchName matches batchName.
    """
    if checkAll:
        searchJobs = RepositoryUtils.GetJobs(invalidateCache)
    elif states:
        searchJobs = RepositoryUtils.GetJobsInState(states)  # no cache parameter?
    else:
        searchJobs = RepositoryUtils.GetJobsInState(['Pending', 'Active'])
    # List comprehension instead of filter(), which is lazy in Python 3
    jobBatch = [job for job in searchJobs if job.JobBatchName == batchName]
    return jobBatch
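RepositoryUtils only exists inside a Deadline session, but the client-side filter pattern itself can be sketched on its own with stand-in job objects (the Job tuple and sample names below are made up for illustration):

```python
from collections import namedtuple

# Stand-in for a Deadline job; only the attribute the filter reads is modeled.
Job = namedtuple("Job", ["JobName", "JobBatchName"])

def filter_batch(batch_name, jobs):
    """Client-side filter: keep jobs whose JobBatchName matches."""
    return [job for job in jobs if job.JobBatchName == batch_name]

jobs = [
    Job("render_seq010", "ShotA"),
    Job("comp_seq010", "ShotA"),
    Job("render_seq020", "ShotB"),
]
print([j.JobName for j in filter_batch("ShotA", jobs)])
# ['render_seq010', 'comp_seq010']
```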
Looks like you’ve arrived at the only solution I’ve ever found to this as well: query all jobs & filter client-side.
Yeah, I guess we can query by job state, but whether that is also just a thin wrapper around client-side filtering remains to be seen.
The job query system is horribly inefficient imho, so some of our guys resort to direct MongoDB queries. Ideally that particular route is the wrong way to go, but the performance numbers speak for themselves, and if you're just reading data it's a low-risk adventure. Be careful writing data directly to the database, though; that hasn't always gone so well for us in the past.
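For the read-only MongoDB route, most of the work is just building a filter document to hand to the driver. The field names below ("Props.Batch" for the batch label, "Stat" for the job state) and the database/collection names in the comment are assumptions; inspect your own repository's job collection before relying on them:

```python
def batch_query(batch_name, states=None):
    """Build a MongoDB filter document for jobs in a batch.

    Field names ("Props.Batch", "Stat") are assumptions about Deadline's
    schema; confirm them against your own job collection before querying.
    """
    query = {"Props.Batch": batch_name}
    if states:
        query["Stat"] = {"$in": states}
    return query

# With pymongo you would then run something like (db/collection names assumed):
#   client["deadline10db"]["Jobs"].find(batch_query("ShotA"))
print(batch_query("ShotA", states=["Active"]))
```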