AWS Thinkbox Discussion Forums

Stereoscopic side-by-sides?

Hey Drafters

Downloading Draft beta 10 now. Thanks for releasing new updates so often!

I’d like to have a side-by-side movie option for the stereo shows in-house. They’re almost always HD, albeit from different sources (often 2K S35). Before I start hacking away like a scared tourist who crash-landed in the deepest Andes forests, I’m hoping someone already has an example script? Maybe something for the cookbook?

Please advise, and thanks!

Bill

I’ll start a cookbook entry. First, I have a couple questions:

  1. How do you change the filename for the left and right eyes?
  2. Let’s say the left and right eyes are each 1920x1080. Is the combined image also 1920x1080?

Hi Paul!

Re: naming convention for left vs right: we use the Nuke nomenclature of %V to sub out what Nuke calls “Views.” In Maya, the eyes are defined by the camera that created them. In either case, you could have multiple Views or cameras, so you might want to think about that – i.e. instead of just “left” and “right,” it might additionally be “left_wide” and “right_wide.” Both applications are agnostic about what you name those passes; “left” and “right” are just what most people use for stereo – or “l” and “r”, or “L” and “R”, etc.

Our files are saved as follows:

[value root.vendor_path]/[value root.show]/[value root.areaPath]/[value root.shot]/[value root.descriptor]/[value root.show][value root.shot][value root.version][value root.initials]%V/[value root.show][value root.shot][value root.version][value root.initials]%V.%04d.exr

which, in Nuke, reads as:

/pfcluster/Brek/DESIGN/NUKE/REN/6-2A/stable/Brek_6-2A_lookdev_stable_v01_bg_%V/Brek_6-2A_lookdev_stable_v01_bg_%V.%04d.exr

or

/pfcluster/Brek/DESIGN/NUKE/REN/6-2A/stable/Brek_6-2A_lookdev_stable_v01_bg_left/Brek_6-2A_lookdev_stable_v01_bg_left.%04d.exr
/pfcluster/Brek/DESIGN/NUKE/REN/6-2A/stable/Brek_6-2A_lookdev_stable_v01_bg_right/Brek_6-2A_lookdev_stable_v01_bg_right.%04d.exr

Note that %V is used in two places: in the directory name as well as in the filename.

Re: resolution, we resize to 1920x1080 for the single-eye reference movie, and the side-by-side would be the same resolution: each eye is scaled in X by 0.5, so two 960x1080 halves tile back into one 1920x1080 frame.

Thank you for your detailed description! I didn’t know where that %v came from.

Is there a particular script you’d like me to base the example on? If not, I’ll just use something simple like Watermark_Demo_Template.py.

Just seeing this now! I’m attaching a script we’re using that resizes from 2K 16:9 to HD.
IS_1_HD.py.zip (2.58 KB)

+1 for just drafting (no pun intended) off of Nuke’s naming conventions. Except for their switch to #### from %04d, The Foundry almost always does it right.

The first step is to handle the view names. Eventually I’d like to do this automatically, but for now we’ll hard code them. We build a leftInFilePattern and a rightInFilePattern based on the inFilePattern:

[code]leftInFilePattern = inFilePattern.replace( '%v', 'left' )
rightInFilePattern = inFilePattern.replace( '%v', 'right' )[/code]
This replaces Nuke’s ‘%v’ with ‘left’ and ‘right’. Please change ‘left’ and ‘right’ to whatever naming convention your facility uses.

Your script has a few lines where you determine inFile based on inFilePattern. We’ll change this to determine leftInFile and rightInFile based on leftInFilePattern and rightInFilePattern. For example, change:

[code]inFile = ReplaceFilenameHashesWithNumber(inFilePattern, frameNumber)[/code]

to:

[code]leftInFile = ReplaceFilenameHashesWithNumber(leftInFilePattern, frameNumber)
rightInFile = ReplaceFilenameHashesWithNumber(rightInFilePattern, frameNumber)[/code]

Similarly, anywhere you call Image.ReadFromFile() should be changed to read an image for both the left and right eyes. For example, change:

[code]inFrame = Image.ReadFromFile(inFile)[/code]

to:

[code]leftInFrame = Image.ReadFromFile(leftInFile)
rightInFrame = Image.ReadFromFile(rightInFile)[/code]

Next we need some way to resize the image. I think we want a ‘fit’ resize while halving the aspect ratio. Here’s a function that will do that for us:

[code]import copy

def ResizeWithPixelAspectScale( self, width, height, pixelAspectScale ):
    if width <= 0:
        raise RuntimeError( 'width must be a positive number' )
    if height <= 0:
        raise RuntimeError( 'height must be a positive number' )
    if pixelAspectScale <= 0:
        raise RuntimeError( 'pixelAspectScale must be a positive number' )

    # Compare the pixel-aspect-adjusted source aspect ratio to the destination's.
    sourceAR = float(pixelAspectScale) * self.width / self.height
    destAR = float(width) / height

    if sourceAR == destAR:
        self.Resize( width, height, 'distort' )
    else:
        # 'Fit' resize: scale a copy to fit inside the destination, then
        # composite it centered over a transparent canvas of the full size.
        image = copy.deepcopy( self )
        self.Resize( width, height )
        self.SetToColor( ColorRGBA( 0, 0, 0, 0 ) )
        if sourceAR > destAR:
            image.Resize( width, int(round(width/sourceAR)), 'distort' )
        else:
            image.Resize( int(round(height*sourceAR)), height, 'distort' )
        self.CompositeWithGravity( image, PositionalGravity.CenterGravity, CompositeOperator.CopyCompositeOp )

# Attach the helper as a method on Draft's Image class.
Image.ResizeWithPixelAspectScale = ResizeWithPixelAspectScale[/code]

We should probably add an option like this to Image.Resize().

Finally, we resize the left and right images and composite them onto the output frame:

[code]leftInFrame.ResizeWithPixelAspectScale(proxyWidth/2, proxyHeight, 0.5)
rightInFrame.ResizeWithPixelAspectScale(proxyWidth/2, proxyHeight, 0.5)

bgFrame = Image.CreateImage(proxyWidth, proxyHeight)
bgFrame.CompositeWithPositionAndGravity(leftInFrame, 0.25, 0.5, PositionalGravity.CenterGravity, CompositeOperator.CopyCompositeOp)
bgFrame.CompositeWithPositionAndGravity(rightInFrame, 0.75, 0.5, PositionalGravity.CenterGravity, CompositeOperator.CopyCompositeOp)[/code]

Please find attached a copy of your script that includes these changes.

Please let me know if you have any questions, or if there’s anything we should change.
IS_1_HD_stereo.zip (2.77 KB)

Ug… my machine just crashed so I lost the draft of my reply I’ve been working on since yesterday. D’oh! I’ll rewrite it and simplify here (sorry if it feels rushed and terse, my previous email was warm, flowing and probably too long):

The script is great, you guys are amazing. I’m hoping to add a few things:

  1. We still need a flat version of the left and right eye for reference. Usually just one of the eyes is enough. Is there a way to do multiple videoEncoder commands within that one loop or is a new loop needed for each movie we want to output?

  2. We need to be able to easily indicate which are stereo and which aren’t. The %V could be used as the flag. I’m imagining this could be a bigger issue since %V could be used for multiple passes, but in this case it seems like left/right is the use case 99% of the time.

Also, it seems like the %V substitution isn’t case sensitive, but in that case, can we just use %V (instead of %v) to conform to Nuke standards?

In my haste I forgot to mention that the stereo doesn’t actually seem to be working… it’s just duping the left frame.

Thank you for your kind words!

Is it possible that the inFile name is missing a “%v”?

Yes, you can encode multiple videos inside the one loop. Just make sure that you give the VideoEncoders different names. For example, the script now has “videoEncoder = VideoEncoder( … )”. The next video encoder must have a different name such as “leftVideoEncoder = VideoEncoder( … )”.
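
Roughly, it could look like this (a sketch only – the VideoEncoder arguments and the stereoMovFile / leftMovFile / frames names are placeholders, not the ones from the attached script; EncodeNextFrame() and FinalizeEncoding() are Draft’s usual per-frame and finalize calls):

[code]# Sketch: two encoders fed from the same frame loop. The VideoEncoder()
# arguments are placeholders standing in for whatever the existing
# "videoEncoder = VideoEncoder( ... )" call already passes.
stereoEncoder = VideoEncoder( stereoMovFile, 24, proxyWidth, proxyHeight )
leftEncoder = VideoEncoder( leftMovFile, 24, proxyWidth, proxyHeight )

for frameNumber in frames:
    # ... read leftInFrame / rightInFrame as before ...

    # Flat left-eye reference: resize a copy before the side-by-side squeeze.
    leftFlatFrame = copy.deepcopy( leftInFrame )
    leftFlatFrame.ResizeWithPixelAspectScale( proxyWidth, proxyHeight, 1.0 )
    leftEncoder.EncodeNextFrame( leftFlatFrame )

    # ... squeeze both eyes and composite them into bgFrame as before ...
    stereoEncoder.EncodeNextFrame( bgFrame )

stereoEncoder.FinalizeEncoding()
leftEncoder.FinalizeEncoding()[/code]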

For now I will assume that you can detect this based on the presence of %V:

[code]isStereo = inFilePattern.lower().find('%v') >= 0[/code]

After this line, isStereo is True if the inFilePattern contains a %v, and False otherwise. The lower() makes the string lower-case (to handle the case issue you mentioned). find() searches for ‘%v’ in the lower-case string. If %v is found, the return value is >= 0. If it is not found, the return value is -1.

Yes, you could change it to %V instead. In the attached script, I changed it to a case-insensitive search using:

[code]leftInFilePattern = re.sub( '(?i)%v', 'left', inFilePattern )[/code]

re.sub() performs a string substitution. The (?i) indicates a case-insensitive search.

Please find attached an updated version of the script:

  • Now does a case-insensitive substitution for %v or %V.
  • If the inFile contains a %v, then it creates a stereoscopic side-by-side. Otherwise it creates a regular movie.
  • Now saves a left or right flat movie alongside the stereoscopic side-by-side. You can control which eyes are saved as flat movies by changing the saveFlatLeftEye = False and saveFlatRightEye = True variables in the script.

Edit: I just remembered that you were resizing with ‘width’ instead of ‘fit’. Please find attached an updated version of the script that resizes with ‘width’.
IS_1_HD_stereo_and_flat_resize_width.zip (3.11 KB)
IS_1_HD_stereo_and_flat.zip (3.1 KB)

Hi Paul

I’ve FINALLY been able to take a moment to test the new script you sent. Thank you, by the way, for all the hard work. What probably took you hours would’ve taken me days, if not weeks.

I noticed that the stereo videoEncoder output seems to have the same name as the left output? Here’s the log file, note that the outFile variable print command returns sq190_s160_lookdev_v14_left_bg, not sq190_s160_lookdev_v14_stereo_bg as would be expected.

=======================================================
Log Message

0: Task timeout is disabled.
0: Loaded job: jack_sq190_s160_lookdev_v14a_bg.nk [DRAFT] (00j_100_005_450e7616)
0: INFO: StartJob: initializing script plugin Draft
0: INFO: Found Draft python module at: ‘/Local/Farm11/Draft/Draft.so’
0: INFO: About: Draft Plugin for Deadline
0: Plugin rendering frame(s): 0
0: INFO: Draft job starting…
0: INFO: Stdout Handling Enabled: False
0: INFO: Popup Handling Enabled: False
0: INFO: Using Process Tree: True
0: INFO: Hiding DOS Window: True
0: INFO: Creating New Console: False
0: INFO: Looking for bundled python at: ‘/Applications/Deadline/Resources/python/2.6.7/Python.framework/Versions/2.6/Resources/Python.app/Contents/MacOS/Python’
0: INFO: Render Executable: “/Applications/Deadline/Resources/python/2.6.7/Python.framework/Versions/2.6/Resources/Python.app/Contents/MacOS/Python”
0: INFO: Render Argument: -u “/Local/Farm11/jobsData/Jack_stereo_HD.py” username="" entity="" version="" width=2148 height=1152 frameList=1-175 startFrame=1 endFrame=175 inFile="/pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/left/sq190_s160_lookdev_v14_left_bg.####.exr" outFile="/pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/left/sq190_s160_lookdev_v14_left_bg.mov" deadlineJobID=00f_070_009_16f58902
0: INFO: Startup Directory: “/Local/Farm11/Draft”
0: INFO: Process Priority: BelowNormal
0: INFO: Process Affinity: default
0: INFO: Process is now running
0: STDOUT: Draft Beta 0.10.0.47119
0: STDOUT: Draft Beta v0.10.0.47119
0: STDOUT: Draft provides basic compositing functionality for Python scripts, and is designed for tight integration with the Deadline Render Management System.
0: STDOUT: © Thinkbox Software 2010-2012
0: STDOUT: Developed by: Mike Yurick, Jon Gaudet, Ian Fraser
0: STDOUT: This software uses libraries from the FFmpeg project (ffmpeg.org) under the LGPLv2.1
0: STDOUT: This software also uses libraries from the ImageMagick project, attributed to ImageMagick Studio LLC (imagemagick.org)
0: STDOUT: 0.10.0.47119
0: STDOUT: Command line args:
0: STDOUT: username=
0: STDOUT: entity=
0: STDOUT: version=
0: STDOUT: width=2148
0: STDOUT: height=1152
0: STDOUT: frameList=1-175
0: STDOUT: startFrame=1
0: STDOUT: endFrame=175
0: STDOUT: inFile=/pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/left/sq190_s160_lookdev_v14_left_bg.####.exr
0: STDOUT: outFile=/pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/left/sq190_s160_lookdev_v14_left_bg.mov
0: STDOUT: deadlineJobID=00f_070_009_16f58902
0: STDOUT: width: 2148
0: STDOUT: height: 1152
0: STDOUT: aspectRatio: 1.86458333333
0: STDOUT: proxyWidth: 1920
0: STDOUT: proxyHeight: 1080
0: STDOUT: outBase: /pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/left/sq190_s160_lookdev_v14_left_bg
0: STDOUT: outBaseFilename: sq190_s160_lookdev_v14_left_bg
0: STDOUT: movPattern: /pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/mov/sq190_s160_lookdev_v14_left_bg.mov
0: STDOUT: movDir: /pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/mov
0: STDOUT: [prores @ 0x10a002200] encoding with ProRes standard (apcn) profile
0: STDOUT: Output #0, mov, to ‘/pfcluster/Jack/PRODUCTION/shots/sq190/sq190_s160/renders_comp/sq190_s160_lookdev_v14_bg/mov/sq190_s160_lookdev_v14_left_bg.mov’:
0: STDOUT: Metadata:
0: STDOUT: encoder : Lavf53.32.100
0: STDOUT: Stream #0:0: Video: prores (apcn) (apcn / 0x6E637061), yuv422p10le, 1920x1080, q=2-31, 149299 kb/s, 24 tbn, 24 tbc
0: STDOUT: Processing frame: 1
[… “Processing frame” lines for frames 2 through 174 trimmed …]
0: STDOUT: Processing frame: 175
0: INFO: Process exit code: 0
0: INFO: Draft job complete!

=======================================================
Log Details

Log Date/Time = May 18/12 15:00:46
Frames = 0-0

Slave Machine = Farm11
Slave Version = v5.1.0.46398 R

Plugin Name = Draft

Thank you for your report!

It seems that there’s a problem in how I handled the eyes. I assumed that a literal ‘%v’ would appear in the inFile. I take it I should change this to something else, for example, check if the inFile is in a subfolder named ‘left’ or ‘right’?

Hey Paul et al

One more thing: it would be super rad awesome if the script also created an anaglyph version of the shot. This can be used to judge stereo depth with red/blue glasses. I’m attaching a Nuke script that shows what needs to happen to get an anaglyph image, but the process is basically that it desaturates both eyes, and then takes the red channel from the right eye and the green and blue channels from the left eye. I included a gizmo example in the script that covers many of the desired variables to modify convergence and swap eyes, then a breakdown of that gizmo, and then just a super simple version of what’s happening with a basic anaglyph conversion. If you don’t know Nuke, I can walk you through it further, but really it’s the above description that’s important.
anaglyphExample.20120518.zip (3.31 KB)
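
To be concrete about the channel math, here’s an illustration in plain Python/numpy (not Draft code – just to show the operation, using standard Rec.709 luma weights for the desaturation):

[code]# Illustration only (plain numpy, not Draft): basic anaglyph channel math.
import numpy as np

def simple_anaglyph( left_rgb, right_rgb ):
    # left_rgb / right_rgb: float arrays of shape (height, width, 3)
    weights = np.array( [0.2126, 0.7152, 0.0722] )        # Rec.709 luma weights
    left_luma = ( left_rgb * weights ).sum( axis=-1 )     # desaturated left eye
    right_luma = ( right_rgb * weights ).sum( axis=-1 )   # desaturated right eye

    out = np.zeros_like( left_rgb )
    out[..., 0] = right_luma   # red channel from the right eye
    out[..., 1] = left_luma    # green channel from the left eye
    out[..., 2] = left_luma    # blue channel from the left eye
    return out[/code]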

Hi Paul

Yes, the %V would get replaced with each “view,” as Nuke calls them (i.e. ‘left,’ ‘right’) in the inFile. But it must be doing that or else it wouldn’t find any input frames, right? For output, the processed outFile names would have ‘right,’ ‘left,’ ‘stereo’ and ‘anaglyph’ in place of the %V.

Rock on

paul or jon can jump in here with the commands, but we support anaglyph in draft already!

cb

I’m thinking Draft only gets ‘left’ – look around line 18 and line 40 in the log you posted. I guess the substitution was already done somewhere else? (This is different from the logs acrylic.cow posted, which actually had a %v in the inFile parameter.) Since there is no %v in the filename, the replacement doesn’t make a difference. I think this is why you saw the left eye repeated twice. But I don’t think this is too much of a problem – I’ll update the script to replace ‘left’ or ‘right’ instead of ‘%v’.
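
Something along these lines, perhaps (a sketch only – it assumes the eye name shows up as a parent folder and as a token in the filename, as in your log; the variable names are the ones from earlier in this thread):

[code]# Sketch: detect stereo from a 'left'/'right' parent folder instead of '%v',
# then build the pattern for each eye by swapping that token.
import os
import re

parentFolder = os.path.basename( os.path.dirname( inFilePattern ) ).lower()
isStereo = parentFolder in ( 'left', 'right' )

if isStereo:
    # Swap every 'left'/'right' occurrence (folder and filename) to get both eyes.
    leftInFilePattern = re.sub( '(?i)right', 'left', inFilePattern )
    rightInFilePattern = re.sub( '(?i)left', 'right', inFilePattern )[/code]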

For sure! As Chris mentioned, Draft includes a command to do this:

anaglyphImage = Draft.Image.Anaglyph( leftEye, rightEye, "PS" )

But I don’t think it does any desaturation. We could add a saturation command if you need it; please let us know.

Draft also supports an “LSA” anaglyph mode, which is supposed to be better than “PS”. However, it currently does not work. It will be fixed in the next build.

I’ll include an anaglyph option when I update the script for ‘left’ / ‘right’.
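
Inside the existing frame loop, that would look roughly like this (a sketch – anaglyphEncoder is a hypothetical extra VideoEncoder set up like the others, and leftInFrame / rightInFrame are the full frames read before the side-by-side squeeze):

[code]# Sketch: add an anaglyph encode to the per-frame loop. anaglyphEncoder is a
# hypothetical third VideoEncoder created alongside the others.
anaglyphFrame = Draft.Image.Anaglyph( leftInFrame, rightInFrame, "PS" )
anaglyphFrame.ResizeWithPixelAspectScale( proxyWidth, proxyHeight, 1.0 )
anaglyphEncoder.EncodeNextFrame( anaglyphFrame )[/code]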

Here’s an updated version of the script. It includes the following changes:

  • Handles ‘left’ and ‘right’ parent folders, in addition to the old handling for ‘%v’.
  • Now saves an anaglyph movie in addition to the side-by-side movie. The anaglyph movie can be disabled by changing saveAnaglyph = True (line 97 of the script) to False.
  • Now adds a ‘_stereo’ suffix to stereoscopic side-by-side movies.

IS_1_HD_stereo_and_flat_resize_width2.zip (3.36 KB)
