
Copy Files Script

I’m currently on a project that is outputting stereo frames of 25 MB+ each, and I need to push them to a remote location. I’d like to create a script that transfers the files after each task is completed instead of copying 20 GB of files at the end of the job. I don’t want it to be part of the task itself because the upload time would take away from rendering the next frame. This sounds like a job for a Pulse script, since Pulse is evaluated periodically, independently of the slaves, and it looks to be the only part of the scripting that has events relating to tasks inside of jobs. Am I on the right track here?

Pulse doesn’t actually run any scripts on a regular basis. The only scripts that Pulse runs are the WebService scripts, and those are only run when it receives a request to do so. There are also no task-based events for the Event plugins, just job-based ones, so that won’t work for you either.

How are you currently transferring the files? Is it a manual process, or do you have some sort of custom file transfer job in Deadline? Maybe the transfer could be a dependent job? I guess that still takes away from actual rendering time.

Maybe you could use a post-task script that launches a background process to perform the copy. That way, the script returns immediately, and a render can continue while the transfer occurs in the background. However, if your renders finish faster than the transfers, that won’t be good because you could hit bandwidth problems…
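Something like this is roughly what I have in mind; a minimal sketch of a post-task script that hands the copy off to a detached background process. The entry point signature and the job property names are my reading of the Deadline scripting API (verify against the docs for your version), and the helper script path and destination are placeholders:

```python
# Post-task script sketch: hand the copy off to a detached background process
# so the slave can move straight on to the next task. The Deadline calls
# (args[0] as the plugin object, GetJob(), JobOutputDirectories) are
# assumptions -- check the scripting reference for your Deadline version.
import subprocess

def __main__(*args):
    deadlinePlugin = args[0]
    job = deadlinePlugin.GetJob()

    # Assumed: the job's first output directory holds the rendered frames.
    out_dir = job.JobOutputDirectories[0]

    # Hypothetical helper script and destination -- replace with your own.
    copier = r"\\server\scripts\copy_new_frames.py"
    dest = r"D:\Dropbox\renders"

    # Popen returns immediately, so the task is marked complete while the
    # copy carries on in the background.
    subprocess.Popen(["python", copier, out_dir, dest])
```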

Just tossing out ideas here. They all seem to have their pros and cons…

Currently I have no automation. However, I only need to duplicate the files to a Dropbox folder, and Dropbox takes care of the syncing. So I guess I just need the script to copy files from one folder to another, which should take far less time than rendering the frame, so I think the post-task script sounds like a winner. Thanks for the reply.
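For reference, the copy itself only needs a trivial helper on the other end of that background launch; a minimal sketch, assuming the destination is just a local Dropbox-synced folder:

```python
# copy_new_frames.py -- trivial standalone copier launched by the post-task
# script. Copies any frames not already in the Dropbox-synced folder, so
# frames finished by other slaves get picked up too.
# Usage: python copy_new_frames.py <render_output_dir> <dropbox_dir>
import os
import shutil
import sys

def main():
    src_dir, dest_dir = sys.argv[1], sys.argv[2]
    if not os.path.isdir(dest_dir):
        os.makedirs(dest_dir)

    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        dest = os.path.join(dest_dir, name)
        # Skip anything already copied; Dropbox handles the actual upload.
        if os.path.isfile(src) and not os.path.exists(dest):
            shutil.copy2(src, dest)

if __name__ == "__main__":
    main()
```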

I was looking for something similar: something I can run after a job has finished (a trigger/event plugin could also work) that calls a system command (or a batch script) with one item, the output filename, as the parameter. I couldn’t find it yet, but if someone has any ideas, feel free to share.

I think that an event plugin with an OnJobComplete callback should be able to handle what you want to achieve…
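Rough shape of what that event plugin could look like; a sketch only, since the callback name and the job output properties can vary between Deadline versions, and transfer.bat is a stand-in for whatever command you want to run:

```python
# Event plugin sketch: run a system command (or batch script) with the job's
# output path once the whole job has finished. Callback and property names
# are a best guess at the event plugin API for recent Deadline versions.
import os
import subprocess

from Deadline.Events import DeadlineEventListener

def GetDeadlineEventListener():
    return TransferOnCompleteListener()

def CleanupDeadlineEventListener(eventListener):
    eventListener.Cleanup()

class TransferOnCompleteListener(DeadlineEventListener):
    def __init__(self):
        self.OnJobFinishedCallback += self.OnJobFinished

    def Cleanup(self):
        del self.OnJobFinishedCallback

    def OnJobFinished(self, job):
        # Assumed: parallel lists of output directories and filenames.
        for out_dir, out_file in zip(job.JobOutputDirectories, job.JobOutputFileNames):
            output_path = os.path.join(out_dir, out_file)
            # Hypothetical batch script that takes the output path as its argument.
            subprocess.call([r"\\server\scripts\transfer.bat", output_path])
```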

My issue is that I was uploading to a remote location with an abysmal upload speed. This meant that an OnJobComplete event would queue a 4 GB upload that would take days, whereas uploading on a per-task basis would have slowly uploaded each frame instead of one lump sum at the end.
