AWS Thinkbox Discussion Forums

Standalone API - SubmitJob - Aux Files

Hello,

I’m trying to create a simple job using the Python API. Here is what I do:

plugInfo = {
	"StartupDirectory":"c:/"
}

jobInfo = {
	"Plugin":"CommandLine",
	"Frames":"0",
	"Name" : "Test API standalone",
	"UserName" : "ft"
}

auxFiles = [
	"c:\command.txt"
]

job = conn.Jobs.SubmitJob(info=jobInfo,plugin=plugInfo,aux=auxFiles)
print(job)

Here is what I get in the console:

Error: Could not find file 'c:\command.txt'. (System.IO.FileNotFoundException) "Error: Could not find file 'c:\\command.txt'. (System.IO.FileNotFoundException)"
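(A side note on the path literal, not the cause of this particular error: `"c:\command.txt"` happens to survive because `\c` is not an escape sequence in Python, but many other Windows paths would be silently corrupted. A minimal illustration:)

```python
# Backslashes in Python string literals can be escape sequences:
# "\t" is a tab and "\n" is a newline, so this path is mangled.
broken = "c:\temp\new.txt"    # actually contains a tab and a newline
safe = r"c:\temp\new.txt"     # raw string keeps the backslashes intact
also_ok = "c:/temp/new.txt"   # forward slashes also work on Windows

assert "\t" in broken         # the "\t" became a tab character
assert "\t" not in safe       # the raw string is unchanged
```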

I tried moving the file to d: as follows:

plugInfo = {
	"StartupDirectory":"c:/"
}

jobInfo = {
	"Plugin":"CommandLine",
	"Frames":"0",
	"Name" : "Test API standalone",
	"UserName" : "francois.tarlier"
}

auxFiles = [
	"d:\command.txt"
]

job = conn.Jobs.SubmitJob(info=jobInfo,plugin=plugInfo,aux=auxFiles)
print(job)

and this is what I get:

Error: Le périphérique n’est pas prêt.
  (System.IO.IOException)
"Error: Le p\u00e9riph\u00e9rique n\u2019est pas pr\u00eat.\r  (System.IO.IOException)"

Basically, in the second case it says “the device is not ready”.

In both cases the result seems odd to me. What have I done wrong?

Thanks

F.

Hello Francois,

So the machine actually running this job is the Pulse machine, which means the file likely didn’t exist there, or the Pulse application didn’t have permission to access it in either of those locations. Based on the error message, the D: drive on that machine is likely a CD drive. Can you try placing the file on a network location that is mapped consistently across all machines? Thanks.

Oh ok, that makes sense then.
It’s too bad I can’t create a file locally and send it to Pulse via the API.
Is there any mechanism to store it in some kind of temp folder in the repository?
Will the file always be copied into the repository, or will the Slave also need to have access to the file?

thanks

F.

I would second this request. It makes more sense to me to do a three-stage submission through JSON: submit the file names along with the job info through JSON/the Python API, use Python to copy the files from the submission machine directly to the repository (we sometimes submit local files too), and then finish the submission through Pulse, either by completing the database entries for the aux files or by deleting the job if the copy failed.

As far as I am aware, there is no functionality to copy over assets for an API submission, which is why we advise making sure they are in a network location, but I will check with the devs asap to see how that could be done.

After talking to the devs, they advised that currently my suggestion is the only approach that will ensure your Pulse machine has access to the file, unless you want to store the assets on the Pulse machine directly. You should be able to add a copy command to your Python script to perform the copy for you.
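The copy-then-submit workaround described above could be sketched like this. This is a hypothetical helper, not part of the standalone API: `conn` is assumed to be a `DeadlineCon` instance, and `network_dir` a share that the Pulse machine can reach.

```python
import os
import shutil


def submit_with_network_copy(conn, local_path, network_dir, job_info, plugin_info):
    """Copy a local aux file to a location Pulse can reach, then submit.

    Hypothetical helper: 'conn' is assumed to be a DeadlineCon from the
    standalone Python API, and 'network_dir' a path visible to Pulse.
    """
    # The manual copy step the devs suggested adding to the script.
    dest = os.path.join(network_dir, os.path.basename(local_path))
    shutil.copy(local_path, dest)
    # Submit referencing the network copy instead of the local file.
    return conn.Jobs.SubmitJob(info=job_info, plugin=plugin_info, aux=[dest])
```

For example, `submit_with_network_copy(conn, r"c:\command.txt", r"\\fileserver\deadline_aux", jobInfo, plugInfo)` would first copy `command.txt` to the share and then submit the job with the copied path in its aux file list.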
