AWS Thinkbox Discussion Forums

Amazon (or other cloud provider) storage question

Hi, I just want to ask: how does Deadline deal with bandwidth bottlenecks? All of our files are stored on our local file server, but when we need cloud rendering on, say, Amazon's cloud servers, how does the data get transferred?

Is there any built-in mechanism that would take the necessary project data, upload it to a specific Amazon storage volume, and render from there in the cloud, so that we wouldn't have to upload tons of textures and caches for each frame of each task being rendered?

Or is there any other way?

I’m actually thinking of getting rid of our hardware completely, except for the file server, and moving entirely to the cloud, but I don’t want to constantly upload tons of data for each task.

Thanks in advance!

Hey Lukas,

At this stage, Deadline doesn’t deal with any of this, but in the future we want to explore ways to make managing the data between the local and cloud machines easier.

Currently, you’ll have to upload all of your assets manually. If you can keep your local and cloud storage paths identical, then no additional work is required once your assets are uploaded. One of the first things we want to do, though, is make the path mapping feature more flexible so that it’s not just based on the operating system. Mappings would instead be applied to specific groups of machines, so you could configure your local slaves to skip path mapping, and configure your cloud slaves to map paths to the cloud storage. This might be a 6.1 feature.
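To make the idea concrete, here is a minimal sketch of per-group path mapping. This is purely illustrative: Deadline's actual path mapping is configured through the Repository Options, not through code like this, and the group names and paths below are made-up placeholders.

```python
# Hypothetical per-group path mappings: each group gets a list of
# (local_prefix, remote_prefix) pairs. All names/paths here are invented.
PATH_MAPPINGS = {
    "cloud": [("/mnt/fileserver/projects", "/mnt/cloud-storage/projects")],
    "local": [],  # local slaves need no remapping
}

def map_path(path, machine_group):
    """Rewrite a path for the given machine group, if a mapping applies."""
    for local_prefix, remote_prefix in PATH_MAPPINGS.get(machine_group, []):
        if path.startswith(local_prefix):
            return remote_prefix + path[len(local_prefix):]
    return path  # no mapping matched; leave the path untouched
```

If local and cloud paths are kept identical (as suggested above), the "cloud" entry would simply be empty and every path passes through unchanged.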

We also want to explore creating plugins for Aspera or other file transfer tools so that you could just have Deadline upload the files for you.
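In the meantime, pre-staging assets yourself is straightforward to script. The sketch below uses boto3 to mirror a local project tree into an S3 bucket; the bucket name and paths are placeholders, and Deadline itself does not perform this upload for you.

```python
import os

def s3_key_for(local_path, project_root):
    """Mirror the local directory layout under the bucket, so the
    cloud-side paths stay predictable relative to the project root."""
    rel = os.path.relpath(local_path, project_root)
    return rel.replace(os.sep, "/")

def upload_assets(paths, project_root, bucket):
    """Upload each local asset to S3, preserving the folder structure."""
    import boto3  # assumed available (pip install boto3)
    s3 = boto3.client("s3")
    for p in paths:
        s3.upload_file(p, bucket, s3_key_for(p, project_root))
```

Because the S3 keys mirror the local layout, textures and caches only need to be uploaded once per change, not once per task.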

In 6.0, we just wanted to get some basic controls in there, and then build on it incrementally in future releases.

Cheers,

  • Ryan

Ok, thanks for the explanation.

So, a few other questions:

  • has the process of building plugins changed for Deadline 6?

  • does Deadline deal with waking up the Amazon cloud machines itself, or are 3rd-party scripts necessary?

Thank you!

This post describes the changes: viewtopic.php?f=156&t=8898&p=37699&hilit=deprecated+global#p37699

Not automatically. You can only manually control existing instances from the Cloud panel in the Monitor.

Cheers,

  • Ryan

Thank you.

So is it possible to script this in Deadline (presumably via IronPython and the EC2 .NET SDK)? Or does Deadline not offer any such feature?

AWS’s recent monitoring API library update has some features that should now make this possible.
However, every time I look at it, it’s changing; it’s too alpha to touch at the moment. Same deal with Google Compute. In fact, their previous standalone Python API just got dumped for a more generic, consolidated one! I’m not wasting time on it until it calms down.

Well, that’s all reasonable, but I don’t want to pay for uptime on machines I’m not using; that’s why I’m asking about these automatic power-up and shut-down features. Shutting down should be easy, but powering up requires some scripting. :slight_smile:

Currently, there aren’t any hooks in our scripting API to control the cloud instances. I think this is something we should look at for 6.1 as well. In the meantime, though, you should be able to use 3rd-party libraries.
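As an example of the 3rd-party route, the sketch below drives EC2 directly with boto3, outside of Deadline entirely. The instance IDs and the tasks-per-instance figure are placeholders, and the scaling heuristic is just one simple possibility.

```python
def instances_to_start(idle_count, queued_tasks, tasks_per_instance=10):
    """Decide how many stopped instances to power up for the queued work.
    A deliberately simple heuristic: one instance per batch of tasks,
    minus machines that are already idle and available."""
    needed = -(-queued_tasks // tasks_per_instance)  # ceiling division
    return max(needed - idle_count, 0)

def set_render_nodes_running(instance_ids, running, region="us-east-1"):
    """Start or stop the given EC2 instances (placeholder IDs)."""
    import boto3  # assumed available (pip install boto3)
    ec2 = boto3.client("ec2", region_name=region)
    if running:
        ec2.start_instances(InstanceIds=instance_ids)
    else:
        ec2.stop_instances(InstanceIds=instance_ids)
```

A cron job or watchdog script could poll the queue, call `instances_to_start`, and start or stop nodes accordingly, so you only pay for uptime while there is work.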

Thanks for the replies, I appreciate it. I’ll see what we can do.
