Deadline output looks very different from Maya framebuffer

Hi there,

I’m having problems with the render node output. What I render in the Maya viewport looks very different from what comes out of Deadline.
I’ve got color management off in the Render Globals; the viewport is set to 32-bit float HDR with sRGB on both input and output.
I’ve tried different file formats and different .EXR compressions. I loaded the output in both AE and Nuke with the same color space, 32-bit, etc. I even tried different colorspaces just to see if they would make any difference.

The scene itself is very simple: a bird emitting fluid-driven particles, lit with just one spotlight behind it. Motion blur is turned off for this example.
It’s also worth noting that this is not the only scene going wrong; I’ve got about 8 other shots that are giving me the same result.

My only solution right now is to use a viewport renderer, which gives me exactly what I see when I do test renders in the Maya framebuffer.

How it should look: 7Sy5JUq#0
What comes out of Deadline batch: 7Sy5JUq#1

In case you want to take a look at the raw .exr output, here it is:

I don’t think the .exr output has the same color information as the one produced in the Maya framebuffer; I could not grade or color correct it in any way to get the same look.
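As an aside, one way to sanity-check the "no grade can match it" observation numerically is to test whether one image can be mapped onto the other by a single exposure + gamma adjustment. This is just a sketch, not anything Krakatoa-specific; the random arrays below stand in for flattened pixel values you would load from the two EXRs:

```python
import numpy as np

def matches_under_exposure_gamma(a, b, tol=1e-3):
    """Return True if b ~= gain * a**gamma for some scalar gain and gamma,
    i.e. if a plain exposure + gamma grade could map image a onto image b."""
    mask = (a > 1e-6) & (b > 1e-6)       # the log-log fit needs positive samples
    # In log space the model is linear: log b = gamma * log a + log gain
    gamma, log_gain = np.polyfit(np.log(a[mask]), np.log(b[mask]), 1)
    fitted = np.exp(log_gain) * a[mask] ** gamma
    return bool(np.max(np.abs(fitted - b[mask])) < tol)

rng = np.random.default_rng(0)
pix = rng.uniform(0.01, 1.0, 1000)       # stand-in for one image's pixel values
print(matches_under_exposure_gamma(pix, 2.0 * pix ** 0.8))  # True: just a grade
print(matches_under_exposure_gamma(pix, rng.uniform(0.01, 1.0, 1000)))  # False
```

If no (gain, gamma) pair fits, the two files genuinely carry different color data rather than the same data viewed through different transforms.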

Both the workstation and the render nodes have been updated to the latest Krakatoa client (2.30-something, with the official Magma release).

Please let me know what I’m missing here; I just want the batch output to accurately match what I’m seeing in the Maya framebuffer. I’d appreciate any help on this, thanks in advance :].


Hi Ray,

To exclude the EXR from the equation, can you please create a poly sphere, turn it into a PRT Volume, and render that with the same lighting as the bird setup, both locally and on the network? If you get the same discrepancy, we will look deeper. If it renders the same, we will have to examine the bird trail particles more closely.

To my eye, the particle count / density of the two images you posted does not appear to be the same (which would of course affect both light attenuation and scattering), but I could be wrong. Can you describe your particle sources in more detail? Are you rendering one or more PRT file sequences via a PRT Loader, or are these dynamic Maya particles (cached or not)? Does it render the same in Batch mode on the local workstation compared to rendering a single frame interactively?

This is totally unexpected, but in order to debug it, we will have to exclude various factors. So let’s start with the simple sphere PRT Volume test which does not depend on external data, and go on from there…

Hi Bobo,

I’ll do the PRT Volume test as soon as possible; I’m pretty flat out at work at the moment.

The bird geo is Alembic cached. It is emitting fluid, which is then driving the particles. The fluid is cached first, then I create multiple partitions using PRT SVR.
I’m rendering multiple sequences via a PRT Loader. I have not tried a local batch render; I will definitely give that a go.
I usually start a fresh scene before using a PRT Loader, so it’s usually just the PRT Loader, a bunch of spotlights and some Alembic matte objects.

I’ve actually found a scene where the viewport render pretty much matches the batch render output. The only difference I can see so far is that it has zero render layers, meaning I’m working on the master layer.
However, I’m not certain that’s the problem yet. I will post some more screengrabs, and possibly scene files if they’re not too large (most things have caches), when I get the chance.

So I did a local batch render and got a very interesting result: the local batch render seems to give the correct result, as I would see it in the Maya framebuffer.

I made a fresh scene with the same lighting and render setup and submitted it to local and network at the same time; here’s a screenshot:

Note that I had to crank up the midtones of the network render for it to show anything, just for demonstration purposes. It was extremely dark, with a bright rim light that doesn’t even represent the lighting setup in the scene.
I have triple checked that it is indeed the same scene being rendered. Let me know if you want some of the PRT cache frames and/or scene file.

Right now I’m trying a local batch render on one of the network render nodes, just as a test.
I will try a PRT Volume in the same scene soon.


Alright, I did a PRT Volume test on a bunch of spheres, using the same scene as the bird, same lighting, and 3 different render layers (I also have multiple lights set up for different layers). Rendering them both locally and over the network gives the same result as the bird scene.
Here’s a screenshot :

The only thing I have not tried yet is a fresh scene with no render layers… As I mentioned a couple of posts above, I did have a scene where both the Maya framebuffer and the network output looked very similar, with some subtle differences that could perhaps come from colorspace?
In that scene there are 2 spotlights, zero render layers (just the master layer) and the PRT Loader node itself.
Screenshot :

Something worth noting: I’m submitting the job through the ‘Submit Job to Deadline’ script, with ‘Submit Render Layers As Separate Jobs’ ticked. I need to do a test with this off when I have multiple layers in the scene; I will also try the bird scene with no extra layers.

Hope this gets sorted soon!

I tried stripping the bird scene right down to one master layer, 2 spotlights and an Alembic-cached geo matte, and rendered it on the network. No dice; still the same incorrect result.

I guess it’s a bit more complex than I thought. I hope these tests helped isolate some of the problems, though.

Hi Ray,

My initial guess is that there is a colorspace mismatch between the Maya viewer and the EXR viewer.

Can you run this simple test using Maya and Nuke (see attached image):
a) Render a Krakatoa image in Maya.
b) In Maya’s “Render View” window, select “Display”, then “Color Management”.
c) Question: What are the “Image Color Profile” and “Display Color Profile” settings? (see attached image)
d) Open Nuke, create a “Read” node with the resulting EXR file.
e) Question: What is the Read node’s “Colorspace” setting, and what is Nuke’s viewer’s colorspace setting? (see attached image)

The settings in Maya and Nuke must match, otherwise the image will be interpreted differently. Is there any difference between the images when the colorspace settings match?
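To illustrate why the settings must match, here is a minimal sketch using the standard sRGB transfer functions (nothing Maya- or Nuke-specific): the same stored pixel value lands in a very different place depending on whether the viewer treats it as linear light or as sRGB-encoded.

```python
# Piecewise sRGB transfer functions per the sRGB specification.
def linear_to_srgb(x):
    """Encode a linear-light value for display on an sRGB monitor."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def srgb_to_linear(v):
    """Decode an sRGB-encoded value back to linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

mid = 0.18  # linear-light "middle grey"
print(round(linear_to_srgb(mid), 3))  # 0.461: value shown by a managed viewer
print(round(srgb_to_linear(mid), 3))  # 0.027: same data wrongly decoded again
```

A linear EXR wrongly decoded as if it were already sRGB comes out crushed and dark with blown highlights surviving, which is consistent with the "extremely dark with a bright rim light" network renders described above.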

Hi Conrad,

Here’s a screenshot of the test :

The RGB does not seem to match the alpha.

You can have a look at the .EXR output files :
brokenbird.rar (6 MB)

brokenBird.EXR is the one output through the Deadline render nodes.
correctBird.EXR is the image saved to disk automatically when rendering in the framebuffer.

I posted the example .EXRs at the wrong resolution; if you haven’t downloaded them yet, take a look at this one instead:

The PRT Vol scene file is uploaded here :
Basically it’s the exact scene I used to render the bird. This scene produced the same discrepancy over the networked render.

Thanks for double-checking the colorspace settings for me. I ask because 9 times out of 10 it is a colorspace issue.

I keep getting timeout errors when trying to download this file from FileDropper. Do you mind uploading it to the forum? There is an “Upload attachment” button in the reply window. You can zip up the files and upload them here.

I wasn’t able to reproduce the problem here when rendering on Deadline vs. locally, so I would like to take a look at the scene file you’ve sent.

I still wasn’t able to download the file… Is there another way you could help me reproduce this? I’d like to see if I can find the problem.
