Submit Job from home office

Hi,
Is it possible to submit Deadline jobs remotely (from my home office) to a render farm at another office location?
I could use tools like AnyDesk to work around this, but it would be really nice to submit directly via the submitter script.
Thank you!

If you had a VPN connection to the office you could do this, but you’d either have to work from the files in the office, or transfer them and all the assets when you submit a render.

If you have all assets mapped to the same locations on both ends, that would save you the transfer step.


Without a VPN, the Deadline Remote Connection Server could be configured to allow secure access to the Repository over public internet. However, the same issue with assets remains - how does the job deal with the mapping or transfer of assets between the home office and the renderfarm?


Thank you anthonygelatka & Bobo!
Just to give a better overview: we are working with Maya/Redshift. If I need to render on the render farm, I pack everything with the Maya archive tool and transfer it to our server (Google Drive at the moment). After that I use AnyDesk to grab it from Google Drive and put it on the local drive of the render farm. Finally, I unpack the scene and submit it with the Deadline submitter.

My goal is to skip some of these steps and submit directly from my open Maya scene.
So if I have a VPN connection to the render farm and a specific project path with all the linked assets (textures, maybe referenced geometry, etc.), can I submit from my home office?

Or is there a way to automatically grab all the local assets, transfer them into a specific folder structure on the render farm machine, and then submit the render job? I guess this is a scripting job, right?
Thank you

In Maya we already have a feature designed to be used with AWS Portal (when rendering in the cloud) that collects the paths to all external dependencies to include in the job’s metadata. These include external references, textures, caches, and the Maya scene itself. In the AWS Portal case, that data is then handed over to the AWS Portal Asset Server to upload to the cloud storage. The entries are in the Plugin info file and are named AWSAssetFileX= where X is a 0-based index.
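
To illustrate what that metadata looks like, here is a minimal sketch that reads the AWSAssetFileX entries back out of a job’s Plugin info file. Plugin info files are plain key=value text; the file name used here is just a placeholder for wherever your submitter wrote it.

```python
# Minimal sketch: read the AWSAssetFileX entries back out of a Plugin info file.
# "plugin_info.job" is a placeholder path for the example.

def read_aws_asset_files(plugin_info_path):
    """Return the asset paths stored as AWSAssetFile0, AWSAssetFile1, ..."""
    assets = {}
    with open(plugin_info_path, "r") as f:
        for line in f:
            line = line.strip()
            if not line or "=" not in line:
                continue
            key, value = line.split("=", 1)
            if key.startswith("AWSAssetFile"):
                assets[int(key[len("AWSAssetFile"):])] = value
    # The entries are 0-based, so return them in index order.
    return [assets[i] for i in sorted(assets)]

if __name__ == "__main__":
    for path in read_aws_asset_files("plugin_info.job"):
        print(path)
```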

Since you don’t have the cloud component, you cannot simply call deadlinecommand to copy the data; however, you could write a script to read the job metadata and perform the copying over VPN to a remote share. Then you could define a path mapping rule in Deadline to convert the local paths to the correct root of the remote share. (On Windows, you could also map drives with the same letter for the local and remote shares to avoid the remapping.) Not sure if this is helpful, but it might save you from reinventing the wheel when implementing an asset introspection script.
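
As a starting point for such a copy script, here is a rough sketch that mirrors the collected assets onto a VPN-mounted share while preserving the relative folder structure. The local project root and the remote share path are assumptions for the example; with this layout, a single Deadline path mapping rule (or identical drive letters on Windows) would take care of the remapping.

```python
import os
import shutil

# Hypothetical roots for the example; adjust to your own project layout.
LOCAL_ROOT = r"D:\Projects\MyShow"                # where the assets live at home
REMOTE_ROOT = r"\\farm-server\Projects\MyShow"    # VPN-mounted office share

def push_assets(asset_paths):
    """Copy each asset to the same relative location under the remote share."""
    local_root = os.path.normcase(os.path.normpath(LOCAL_ROOT))
    for src in asset_paths:
        # Only remap files that actually live under the local project root.
        if not os.path.normcase(os.path.normpath(src)).startswith(local_root):
            print("Skipping asset outside the project root:", src)
            continue
        rel = os.path.relpath(src, LOCAL_ROOT)
        dst = os.path.join(REMOTE_ROOT, rel)
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.copy2(src, dst)
        print("Copied", src, "->", dst)
```

Feeding it the list from the previous sketch (or read directly from the job metadata) would give you the push step of the submission.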

For the transfer step, you could approach this in several ways:

  • You could extend the Maya integrated submitter to call an external python script and pass it the list of files to copy.
  • You could extend the Maya integrated submitter to submit a second job that runs on your local Deadline Worker and performs the transfer.
  • You could write an OnJobSubmitted Event script that spawns the transfer job automatically (see the sketch after this list). In either of these last two cases, the actual render job could be made dependent on the transfer job, so it won’t start rendering until the transfer is done.
  • You could create a Job Utility script for the Deadline Monitor that lets you select a job, right-click to select the script, and perform the transfer manually. Not ideal, as it requires manual intervention. But you could have it as a plan B in addition to one of the automated approaches.
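
To make the event-script route a bit more concrete, here is a rough sketch of an OnJobSubmitted event plugin that submits a one-task CommandLine transfer job and makes the render job depend on it. The plugin name, the transfer_assets.py script, and the executable path are placeholders, and the scripting calls should be verified against your Deadline version’s documentation (an event plugin also needs the usual .param configuration file next to it).

```python
# Rough sketch of an event plugin, e.g. custom/events/AssetTransfer/AssetTransfer.py
# (an AssetTransfer.param file with the plugin's settings would sit next to it).

import os
import re
import tempfile

from Deadline.Events import DeadlineEventListener
from Deadline.Scripting import ClientUtils, RepositoryUtils


def GetDeadlineEventListener():
    return AssetTransferListener()


def CleanupDeadlineEventListener(eventListener):
    eventListener.Cleanup()


class AssetTransferListener(DeadlineEventListener):
    def __init__(self):
        self.OnJobSubmittedCallback += self.OnJobSubmitted

    def Cleanup(self):
        del self.OnJobSubmittedCallback

    def OnJobSubmitted(self, job):
        # Only react to Maya render jobs that carry the collected asset list
        # (this also keeps the spawned transfer job from re-triggering the event).
        if job.JobPlugin not in ("MayaBatch", "MayaCmd"):
            return
        asset_keys = [k for k in job.GetJobPluginInfoKeys() if k.startswith("AWSAssetFile")]
        if not asset_keys:
            return

        # Dump the asset paths to a text file the transfer script can read.
        temp_dir = tempfile.mkdtemp()
        asset_list = os.path.join(temp_dir, "assets.txt")
        with open(asset_list, "w") as f:
            f.write("\n".join(job.GetJobPluginInfoKeyValue(k) for k in asset_keys))

        # Write job info / plugin info files for a one-task CommandLine job
        # that runs a (hypothetical) transfer_assets.py with that list.
        job_info = os.path.join(temp_dir, "transfer_job_info.job")
        with open(job_info, "w") as f:
            f.write("Plugin=CommandLine\n")
            f.write("Name=Asset transfer for %s\n" % job.JobName)
            f.write("Pool=%s\n" % job.JobPool)

        plugin_info = os.path.join(temp_dir, "transfer_plugin_info.job")
        with open(plugin_info, "w") as f:
            f.write("Executable=C:\\Python39\\python.exe\n")  # placeholder path
            f.write("Arguments=D:\\pipeline\\transfer_assets.py %s\n" % asset_list)  # placeholder script

        # Submitting via deadlinecommand returns output that includes the new JobID.
        output = ClientUtils.ExecuteCommandAndGetOutput([job_info, plugin_info])
        match = re.search(r"JobID=(\S+)", output)
        if not match:
            self.LogWarning("Could not determine the transfer job ID:\n" + output)
            return

        # Make the freshly submitted render job wait for the transfer job.
        job.SetJobDependencyIDs([match.group(1)])
        RepositoryUtils.SaveJob(job)
        RepositoryUtils.PendJob(job)
```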

I don’t recommend using a VPN connection to access files directly at render time, or a pre-render script that copies the files over the VPN, as that could cause dozens or hundreds of render nodes to do the same thing at the same time over a congested connection. It is a good idea to PUSH the data during the submission, or alternatively to PUSH or PULL the data via a dedicated Deadline job that does it only once per job, before all render nodes jump on its tasks.

Of course, you could use a similar approach to automate the upload and download through a cloud storage service like Google Drive in place of a VPN connection. In that case, your local submission script could collect, compress, and upload the files to GD, then submit the job to the remote Repository over an RCS connection, and produce a Deadline Job that runs a python script to download data from the cloud storage.
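
As a very rough sketch of the packaging side of that workflow, here is one way the collection and compression step could look. The helper names and paths are hypothetical, and the upload itself is left as a stub, since it depends on how you talk to Google Drive.

```python
import os
import zipfile

def upload_to_drive(archive_path):
    """Placeholder: plug in whatever Google Drive mechanism you already use
    (a Drive for Desktop sync folder, rclone, or the Drive API)."""
    raise NotImplementedError("hook up your own upload here")

def package_assets(scene_path, asset_paths, package_path):
    """Zip the scene and its collected assets with project-relative paths."""
    everything = [scene_path] + list(asset_paths)
    # Use the deepest common folder as the archive root so the download job
    # can unpack straight into the farm's project folder structure.
    root = os.path.commonpath([os.path.dirname(p) for p in everything])
    with zipfile.ZipFile(package_path, "w", zipfile.ZIP_DEFLATED) as archive:
        for src in everything:
            archive.write(src, arcname=os.path.relpath(src, root))
    return package_path

if __name__ == "__main__":
    # Hypothetical example paths.
    package = package_assets(
        r"D:\Projects\MyShow\scenes\shot010.mb",
        [r"D:\Projects\MyShow\sourceimages\wall_diffuse.exr"],
        r"D:\temp\shot010_package.zip",
    )
    upload_to_drive(package)
```

The Deadline job on the farm side would then just reverse the process: download the archive and unpack it into the project root before the render job starts.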

Please let us know what you think about it.


Hi Bobo!

Could you, or others, elaborate on how to allow access to the repo over public internet, please? Does that mean getting a static IP from my ISP, setting the repo server to use that IP and then repointing clients to the new IP address? I am already using the RemoteClient.pfx file and a passphrase, so I believe I’m already using TLS. I may just be missing some basic networking knowledge here.

I have users at home who want to connect to the farm, and all of our files are already cloud-based, so asking them to use a VPN just for the farm would be a drag.

Thanks!

Actually, I made a brand new post to get more visibility. Thanks

Hello, did you manage to achieve this? I’m dealing with the same situation right now.