AWS Thinkbox Discussion Forums

Quick run-through of how Draft treats colour spaces through a chain of processes

Heya Folks!

As per a quick question with Julie over the twitters, I wanted to get a clear understanding of how Draft is reading / interpreting data as it's going through its processes. The last place I was in didn't have a colour or pipeline person, and often when clients supplied LUTs, they didn't really perform as advertised; the fallback was faking something in Nuke, or using the LUT and adding extra CCs to try to match the target look of a final movie. That's generally kind of unsatisfying to me, so I wanted some clarity on what's going on under the hood in Draft and how to get predictable results. So say, for example, I have a DPX sequence that's in either REDlog or Alexa log, and I want to apply a LUT to that footage and save it out as a QuickTime.

In Nuke I'd load the sequence and set the colourspace dropdown to whatever the correct one is for the footage, which should in theory linearize the data. I'd then use a Vectorfield node to load in my LUT to apply the film's look and render the result out as an sRGB QuickTime, let's say Photo JPEG or some other high-quality codec. I normally get a pretty good match when I place my QT over the Nuke viewer and compare results.

In draft, if I try to do the same thing so I can batch quicktimes I’d do the following:

  1. Load the DPX sequence in. At this point, where we've only done a read and nothing else, what does Draft treat the footage as in terms of colour space? Is it assuming it's always being fed linear data, or is it autodetecting log versus lin based on header data / file extension?

  2. Strip the log curve, aka convert to linear. As far as I know, I have to remove the log curve, so here I create a Cineon / Alexa log / REDlog LUT, invert it (since we're stripping the curve, not applying it), and then apply it to each image. The images are linear after this step.

  3. Apply the LUT. Normally I think we were using the OCIO operator for this, so we load our LUT and apply it. Again, does OCIO typically assume it's being fed linear? Excuse my ignorance, but are there ever cases where people would wrap the linearizing of a plate and a grade / look into a single OCIO operation, so we'd skip the linearizing step above? I'll assume that it normally acts on linear and that our LUT is only a standard CC, not a major swing in colour space.

  4. Convert to sRGB for QuickTime. Since we're off to screen space, we need a linear-to-sRGB op to convert into gamma space. I've read the links on the forum about the various issues with QuickTime gamma; often I've had to add a slight gamma / colour correction to counteract whatever darkening or washing out the QT does, depending on the codec.

  5. Save out to quicktime - hurray!

Steps 1 and 3 are mainly what I wanted to confirm, in terms of how Draft is loading things!
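For concreteness, here's the per-pixel math I believe steps 2 and 4 boil down to. The constants are ARRI's published Alexa V3 Log C (EI 800) decode and the standard sRGB encode, written as plain Python rather than anything Draft-specific:

```python
# Alexa V3 Log C (EI 800) decode constants, from ARRI's published formula.
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F, CUT = 5.367655, 0.092809, 0.010591

def logc_to_linear(t):
    """Step 2: strip the Log C curve from a normalized [0,1] code value."""
    if t > E * CUT + F:
        return (10.0 ** ((t - D) / C) - B) / A
    return (t - F) / E

def linear_to_srgb(lin):
    """Step 4: encode scene-linear into sRGB display space."""
    if lin <= 0.0031308:
        return 12.92 * lin
    return 1.055 * lin ** (1.0 / 2.4) - 0.055
```

As a sanity check, Log C mid-grey (code value ~0.391) should come back as linear 0.18, which then encodes to roughly 0.46 in sRGB.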

Cheers,

John

Hi John!

Thanks for all the details you put in your post. This is very helpful and clear! I will try to do the same…

To answer your first question: Draft always assumes it's being fed linear data on read. Basically, Draft users are fully responsible for handling their entire color pipeline, from reading their image sequence right up to encoding their QuickTime. On the other hand, Draft offers many options that allow you to get exactly the results you want.

To create LUTs, your first (and very simple) option is to use Draft's built-in functions to create basic LUTs such as linear-to-Rec.709, linear-to-sRGB, linear-to-Cineon and linear-to-ALEXA V3 Log C, along with their inverses (see docs.thinkboxsoftware.com/produc … sform.html for a code snippet).
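If it helps, here's a rough plain-Python sketch of what "creating" one of those basic LUTs amounts to: baking a transfer curve (the Rec.709 one here) into a sampled table and applying it with linear interpolation. None of this is actual Draft API, just the underlying idea:

```python
def rec709_oetf(lin):
    """Linear -> Rec.709 transfer curve (the standard OETF)."""
    if lin < 0.018:
        return 4.5 * lin
    return 1.099 * lin ** 0.45 - 0.099

def bake_lut(curve, size=1024):
    """Sample a 1D curve over [0,1] into a table, as a LUT builder might."""
    return [curve(i / float(size - 1)) for i in range(size)]

def apply_lut(table, value):
    """Look up a [0,1] value in the baked table with linear interpolation."""
    x = max(0.0, min(1.0, value)) * (len(table) - 1)
    i = min(int(x), len(table) - 2)
    frac = x - i
    return table[i] * (1.0 - frac) + table[i + 1] * frac

lut709 = bake_lut(rec709_oetf)
```

Inverting a monotonic curve like this is just baking the table the other way around, which is why the built-in LUTs can come in forward and inverse flavours.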

If you need something a bit more sophisticated, you can bake a LUT with Draft through OpenColorIO. As you described in your post, you can load a LUT from file (see opencolorio.org/FAQ.html#what-lu … -supported for the supported formats) and apply it to your image sequence. To answer your second question: I would say that, in this case, OCIO assumes it's being fed linear data.
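To make the "load a LUT from file" step concrete, here's a toy plain-Python reader for the 1D .cube case only. OCIO of course handles the real formats (including 3D LUTs, domain keywords and so on); this is just to show what "apply" means:

```python
def parse_cube_1d(text):
    """Toy parser for a 1D .cube LUT: LUT_1D_SIZE plus rows of r g b values.
    Ignores comments and other keywords; no 3D or DOMAIN support."""
    size, rows = None, []
    for line in text.splitlines():
        line = line.split('#')[0].strip()   # strip comments and whitespace
        if not line:
            continue
        if line.upper().startswith('LUT_1D_SIZE'):
            size = int(line.split()[1])
        elif line[0].isdigit() or line[0] in '+-.':
            rows.append([float(x) for x in line.split()])
    if size is None or len(rows) != size:
        raise ValueError('malformed 1D .cube')
    return rows

def apply_1d(rows, rgb):
    """Apply the 1D LUT per channel with linear interpolation."""
    out = []
    for ch, v in enumerate(rgb):
        x = max(0.0, min(1.0, v)) * (len(rows) - 1)
        i = min(int(x), len(rows) - 2)
        t = x - i
        out.append(rows[i][ch] * (1.0 - t) + rows[i + 1][ch] * t)
    return out
```

A two-row identity cube (0 0 0 / 1 1 1) should pass any colour through unchanged, which is a handy sanity check on whatever applies your real LUT.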

Alternatively, you can create your LUT from an OCIO config file (and then the fun begins…). You can create your own config file by defining custom colorspaces, or you can use a config file that has already been created (a reference will follow). Then you can create LUTs based on the colorspaces defined in the chosen config file, specifying a colorSpaceIn (the colorspace the input images are in) and a colorSpaceOut (the desired colorspace for the processed images). If you decide to create your own config file, you can define your colorspaces using a FileTransform, an ExponentTransform, a LogTransform or a MatrixTransform, or combine those into a ColorSpaceTransform or even a GroupTransform… There are more options, but these are the basics. So you have all the flexibility you need, and to answer your third question, I think that combining steps 2 and 3 is totally feasible (although I have never tried it myself).
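For illustration only, a minimal config file might look like the sketch below. The colorspace names and the LUT file path are made up, and the sRGB curve is approximated with a plain 2.2 exponent; the userguide sample linked in the next paragraph is a real, working file.

```yaml
# Minimal config.ocio sketch -- names and paths are invented for illustration.
ocio_profile_version: 1

roles:
  scene_linear: linear

colorspaces:
  - !<ColorSpace>
    name: linear
    bitdepth: 32f

  - !<ColorSpace>
    name: alexa_logc
    bitdepth: 10ui
    # How to get from this space back to the reference (linear) space.
    to_reference: !<FileTransform> {src: logc_to_linear.spi1d, interpolation: linear}

  - !<ColorSpace>
    name: srgb
    bitdepth: 8ui
    # From reference (linear) to display, approximated as a 2.2 gamma here.
    from_reference: !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1], direction: inverse}
```

With colorspaces set up like this, baking a LUT with colorSpaceIn set to alexa_logc and colorSpaceOut set to srgb would fold the linearize and display steps into one operation; adding the look as another transform in the chain is what combining steps 2 and 3 would amount to.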

You can consult opencolorio.org/userguide/contex … te-example for a sample config.ocio file. You can also look in your Draft folder under ocio-configs for many other examples, and the sample script encode_with_ocio_lut.py under Samples/ColorConversion will help you put everything together.

That's a lot of info… Actually, this should be in our Draft documentation (in a new section called LUT under Concepts). I will certainly put it there relatively soon.

Please feel free to ask any other questions. If you want closer assistance, you can also open a ticket (support.thinkboxsoftware.com) and I will be happy to help craft a LUT that is right for you.

Kind regards,

Julie

Perfect, cheers for that. So if I want to take something that's Alexa log and save out a QuickTime with a LUT burned in, I'd have to go:

  1. Read the DPX file.
  2. Apply an inverse Alexa log LUT to convert the DPX data to linear.
  3. Apply the film LUT using OCIO.
  4. Apply linear-to-sRGB so we get roughly the 2.2 gamma for QuickTime.
  5. Save to QuickTime.
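Sketching that order of operations as composable per-pixel steps (with deliberately fake stand-in curves rather than real Alexa / sRGB math, and no Draft calls), just to pin down the chaining:

```python
def make_pipeline(strip_log, film_lut, to_display):
    """Compose steps 2-4 into one per-pixel function, in that order."""
    def per_pixel(v):
        return to_display(film_lut(strip_log(v)))
    return per_pixel

# Placeholder curves purely to exercise the composition order:
pipe = make_pipeline(
    strip_log=lambda v: max(v, 0.0) ** 2.0,     # fake log->lin stand-in
    film_lut=lambda v: v,                        # identity "look"
    to_display=lambda v: v ** (1.0 / 2.2),       # sRGB approximated as gamma 2.2
)
```

In the real batch, step 1 reads each frame, `pipe` is what steps 2-4 bake down to per pixel (in practice as LUTs, not Python callables), and step 5 hands the result to the encoder.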

Thanks for the clarifications!

Hi there,

Those operations seem fine to me!

Just as I described in my last post, I can help you craft any LUT you need, and help with your script to apply each of your color transforms one after the other, as long as those transforms are well defined. The big question for me is how the LUT file in step 3 is defined; since I'm not a color specialist, I can't really help you much with that part. I can confirm that you'll have linear data after step 2, and that step 4 assumes linear data as input.

Sorry I can’t help you more than that!

Julie

Hi there,

I want to thank John for his timely (for us) question on DPX, LUTs and Draft, and Julie for such a nicely detailed walkthrough. This topic has saved me so much time I feel I owe you both a coffee :smiley:

Many thanks

I’m very glad it was helpful!

This info (and more) will definitely be included in our next Draft documentation.

Have a great day!

Julie
