AWS Thinkbox Discussion Forums

Post Job Script fails

Hi there,
I am testing a Post Job script that calls oiiotool.exe to compress the rendered images with different settings based on the filename.
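
Each call is basically something like this (the compression string here is just an example; the real one varies per filename):

oiiotool input.exr --compression dwaa:45 -o output.exr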

oiiotool first writes the data to a TMP file and then tries to rename it to the correct filename. The TMP file is created correctly but the renaming fails with an “Access is denied” error message.
The user account of the Deadline service does have sufficient access rights to the relevant folders.

Is there anything on Deadline's side that prevents the script from running with sufficient rights?
Or maybe anything else that I should look into to resolve this?

Windows 10
Deadline 10.0.29.0

Cheers, Florian

Hey Florian,

Something to check - are you using ‘run as user’? If you are, the Deadline service will be running as one account while the sandbox it starts to do the actual work will be running as that user account. Maybe the user doesn’t have access?

Hi Justin,
We are using "run as user", but I already checked that the user has sufficient rights.
The TMP files are created correctly; just the renaming to the original name fails.
Does Deadline maybe lock the files somehow?

I’m doing lots of this type of operation in post-scripts, and haven’t come across anything weird permission wise. Have you excluded the possibility of concurrent operations and file system locks, etc?

Hi Daniel,
I am testing my script on a test render, so unless Deadline is locking the files while the script is running, there should be no file locks. I can also rename and delete the files manually.

I thought that maybe Deadline prevents some operations like calling an .exe file from a script.

Hmm, OK, well that rules out one of the only explanations I can think of then. I was only barking up the file-lock tree because we sometimes get nonsense like that from our NAS, but if you can rename manually, then that shouldn't be the problem.

As for not being able to execute .exe files, I regularly run ffmpeg.exe so I don’t believe this is a blanket rule.

What happens if you run the script manually in a terminal (some rewriting needed, since you won't have access to the Deadline* imports), or as a Deadline script job? Can you verify that, in an ideal test case, the script does indeed work as expected?

If I run the script manually, it fails as well. (“oiiotool ERROR: -o : oiiotool ERROR: could not move temp file”).
I am using subprocess.run at the moment.
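
Roughly like this (the paths and the compression string are just placeholders; the real ones come from the job's output filenames):

import subprocess

src = r"\\SERVER\share\render\frame.0001.exr"  # placeholder path to a rendered frame
result = subprocess.run(
    [r"C:\tools\oiiotool.exe", src, "--compression", "dwaa:45", "-o", src],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(result.returncode)
print(result.stderr)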

What command do you use to launch ffmpeg.exe?

If you do want access to all the Deadline* imports, you can run your script with deadlinecommand.exe -ExecuteScript, which will run it just like Deadline would.
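
For example (placeholder path):

deadlinecommand.exe -ExecuteScript "X:\path\to\your_post_job_script.py"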

Now, Deadline-wise we don't do anything fancy with the contents of your scripts, so I'm thinking it's not a Deadline issue.

I'm not familiar enough with oiiotool, though; it might be that subprocess.run isn't waiting for oiiotool to finish before moving on to the next command?

At least subprocess.check_call waits for the command to finish, but I get the same error.
It does seem to be a problem with oiiotool.exe, though; I tried a different CLI program and didn't get any error there.

Interesting.

My ffmpeg wrapper looks like this (loosely based on something from Thinkbox actually):

import os
import subprocess

# FFMPEGEXE (the full path to ffmpeg.exe) is defined elsewhere in the module.
def _process(command, returnStream=1):
    environment = None
    startupinfo = None
    hideWindow  = True
    success     = False
    output      = (None, None)

    try:
        if os.name == 'nt':
            environment = {}
            for key in os.environ.keys():
                environment[key] = str(os.environ[key])
            environment['PATH'] = str(os.path.dirname(FFMPEGEXE) + ";" + os.environ['PATH'])
        
        if hideWindow and os.name == 'nt':
            # Python 2.6 has subprocess.STARTF_USESHOWWINDOW, and Python 2.7 has subprocess._subprocess.STARTF_USESHOWWINDOW, so check for both.
            if hasattr( subprocess, '_subprocess' ) and hasattr( subprocess._subprocess, 'STARTF_USESHOWWINDOW' ):
                startupinfo = subprocess.STARTUPINFO()
                startupinfo.dwFlags |= subprocess._subprocess.STARTF_USESHOWWINDOW
            elif hasattr( subprocess, 'STARTF_USESHOWWINDOW' ):
                startupinfo = subprocess.STARTUPINFO()
                startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
    
        proc = subprocess.Popen(command, bufsize=4096, stdout=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo=startupinfo, env=environment)
        output = proc.communicate()
        success = True if proc.returncode == 0 else False
    except Exception as e:
        print('Error while processing ffmpeg job: {}'.format(e))

    return (success, output[returnStream])

For most apps you would want to grab returnStream=0, which will be stdOut. It's just that FFmpeg is a bit annoying and returns its debug/frame data on stdErr, which makes me default to this pipe.
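
A call ends up looking something like this (the arguments are just placeholders for whatever your job needs):

success, log = _process([FFMPEGEXE, '-i', 'frames.%04d.png', '-c:v', 'libx264', 'preview.mov'], returnStream=1)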

Maybe mess around with this and see if it produces a different result to your current run / check_call method?


Here’s another wrapper. This is how I’m calling deadlinecommand.exe.

As you can see I’ve messed around a whole lot in order to figure out why we’re getting weird sporadic ~10min freezes while waiting for this process, but the end result is manually reading the stdOut instead of using communicate().

Didn’t make much of a difference to me, but leaving it here in case it helps (comments and all):

import datetime
import os
import subprocess

# 'common' and getDeadlineExe() are in-house helpers (see the PS below).
def callDeadlineCommand(arguments, hideWindow=True, readStdout=True):
    '''Wrapper for Deadline system commands (borrowed from Thinkbox)
    
    Args:
        arguments (list): Arguments to send to the deadlinecommand.exe process
        hideWindow (bool, optional): Whether or not to hide the subprocess window
        readStdout (bool, optional): Whether or not to read and return the console output from the command

    Returns:
        string|json|mixed: Response from deadlinecommand.exe as expected based on the input arguments
    '''

    start = datetime.datetime.now()
    common.printDebug('Starting callDeadlineCommand method')

    environment = None
    startupinfo = None
    stdoutPipe  = None
    output      = ''
    deadlineCommand = getDeadlineExe()

    if hideWindow and os.name == 'nt':
        # Python 2.6 has subprocess.STARTF_USESHOWWINDOW, and Python 2.7 has subprocess._subprocess.STARTF_USESHOWWINDOW, so check for both.
        if hasattr(subprocess, '_subprocess') and hasattr(subprocess._subprocess, 'STARTF_USESHOWWINDOW'):
            startupinfo = subprocess.STARTUPINFO()
            startupinfo.dwFlags |= subprocess._subprocess.STARTF_USESHOWWINDOW
        elif hasattr(subprocess, 'STARTF_USESHOWWINDOW'):
            startupinfo = subprocess.STARTUPINFO()
            startupinfo.dwFlags |= subprocess.STARTF_USESHOWWINDOW
    
    if readStdout:
        stdoutPipe=subprocess.PIPE
    
    if isinstance(arguments, list):
        arguments.insert(0, deadlineCommand)
    else:
        arguments = '"{}" {}'.format(deadlineCommand, arguments)

    common.printDebug('Deadline command: {}'.format(arguments))
    common.printDebug('Waiting for reply...')

    try:
        #proc = subprocess.Popen(arguments, bufsize=4096, stdout=stdoutPipe, startupinfo=startupinfo, env=environment)
        proc = subprocess.Popen(arguments, bufsize=1, stdout=stdoutPipe, stdin=subprocess.PIPE, stderr=subprocess.PIPE, startupinfo=startupinfo, env=environment)
        proc.stdin.close()
        proc.stderr.close()
    except Exception as e:
        common.consoleLog('Deadline command failed to execute. System says: {}'.format(e))
        return 'Error: {}'.format(e) # This will trigger caller to break

    common.printDebug('Deadlinecommand.exe returned') # This happens more or less instantly

    if readStdout:
        common.printDebug('Reading output')
        #print proc.communicate()
        output = proc.stdout.read() # This is what freezes on occasion
    
    common.printDebug('Finished callDeadlineCommand method')
    finish = datetime.datetime.now()
    diff = finish - start
    mins, secs = divmod(diff.days * 86400 + diff.seconds, 60)
    common.printDebug('callDeadlineCommand total processing time was: {} minutes, {} seconds'.format(mins, secs))

    return output

PS: There are a few dependencies for this method that you won't have on your end, so just rewrite accordingly - like the common module and the getDeadlineExe() method.
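
If it helps, a minimal stand-in for getDeadlineExe() could just look up the DEADLINE_PATH environment variable that the Deadline client installer normally sets (rough sketch, adjust to your setup):

import os

def getDeadlineExe():
    # DEADLINE_PATH normally points at the Deadline bin directory
    # that contains deadlinecommand(.exe).
    deadlineBin = os.environ.get('DEADLINE_PATH', '')
    exeName = 'deadlinecommand.exe' if os.name == 'nt' else 'deadlinecommand'
    return os.path.join(deadlineBin, exeName)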


Thanks for the code examples. I’ll have a look.

Meanwhile I am trying to use the OpenImageIO Python bindings instead of oiiotool.exe.
Installing and importing the OIIO bindings works so far, but the job fails after converting 12 images (just reading the EXR and writing it back out again).

Sandbox process exited unexpectedly while waiting for response from Plugin.

Converting from EXR to JPG works flawlessly.
Any idea why this might happen?
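
For reference, the per-image conversion is basically just this (simplified sketch; the real compression string depends on the filename):

import OpenImageIO as oiio

inPath  = r"\\SERVER\share\render\frame.0001.exr"      # placeholder paths
outPath = r"\\SERVER\share\render\frame.0001_new.exr"

buf = oiio.ImageBuf(inPath)                      # read the rendered EXR
buf.specmod().attribute("compression", "zips")   # request a different compression on output
if not buf.write(outPath):                       # write it back out
    print(buf.geterror())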

Looking at the code responsible for that error message, it’s just what it says on the tin.

The sandbox - which is an application introduced in Deadline 8.0 that allows plugins and event plugins to run in a separate environment - has stopped without us telling it to.

Is that the first error you get? Or is there more to that log?

As for why - maybe by the 12th image it hangs or crashes once it runs out of memory?

Here is the whole log. Max RAM usage was only 15%, so I don't think that this is the problem.

RenderPluginException

=======================================================
Stack Trace
=======================================================
   at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage azg)
   at Deadline.Plugins.SandboxedPlugin.RenderScript(String scriptFilename, Boolean preJobScript)
   at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter adq)

=======================================================
Log
=======================================================
2020-02-11 16:39:54:  0: Loading Job's Plugin timeout is Disabled
2020-02-11 16:39:56:  0: Executing plugin command of type 'Sync Files for Job'
2020-02-11 16:39:56:  0: All job files are already synchronized
2020-02-11 16:39:56:  0: Synchronizing Plugin Vray_2 from \\SERVER\DeadlineRepository10\custom\plugins\Vray_2 took: 0 seconds
2020-02-11 16:39:56:  0: Done executing plugin command of type 'Sync Files for Job'
2020-02-11 16:39:56:  0: Executing plugin command of type 'Initialize Plugin'
2020-02-11 16:39:57:  0: INFO: Executing plugin script 'C:\DeadlineTMP\hp20\plugins\5e3cfb60b20fd94874d3f882\Vray_2.py'
2020-02-11 16:39:57:  0: INFO: About: V-Ray Standalone Plugin for Deadline
2020-02-11 16:39:57:  0: INFO: Render Job As User disabled, running as current user 'Florianbehr'
2020-02-11 16:39:57:  0: INFO: The job's environment will be merged with the current environment before rendering
2020-02-11 16:39:57:  0: Done executing plugin command of type 'Initialize Plugin'
2020-02-11 16:39:57:  0: Start Job timeout is disabled.
2020-02-11 16:39:57:  0: Task timeout is being ignored because this is a Post Job Script Task
2020-02-11 16:39:57:  0: Loaded job: Testing_exrCompression_0001 (5e3cfb60b20fd94874d3f882)
2020-02-11 16:39:57:  0: Plugin executing post job script
2020-02-11 16:39:57:  0: Executing Post Job Script: "//SERVER/DeadlineRepository10/custom/scripts/Jobs/convert_exr_deadline.py"
2020-02-11 16:39:57:  0: Executing plugin command of type 'Execute Script'
2020-02-11 16:39:57:  0: INFO: Executing post job script '//SERVER/DeadlineRepository10/custom/scripts/Jobs/convert_exr_deadline.py'
2020-02-11 16:39:58:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0004.exr
2020-02-11 16:39:58:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0005.exr
2020-02-11 16:39:58:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0006.exr
2020-02-11 16:39:58:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0007.exr
2020-02-11 16:39:58:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0008.exr
2020-02-11 16:39:59:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0009.exr
2020-02-11 16:39:59:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0010.exr
2020-02-11 16:39:59:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0011.exr
2020-02-11 16:39:59:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0012.exr
2020-02-11 16:39:59:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0013.exr
2020-02-11 16:39:59:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0014.exr
2020-02-11 16:40:00:  0: INFO: Output file: \\SERVER\FOLDER\3D\maya\Renderings\Testing\Testing_exrCompression_0001\Testing_exrCompression_0001.0016.exr

=======================================================
Details
=======================================================
Date: 02/11/2020 16:40:29
Frames: 0
Elapsed Time: 00:00:00:36
Job Submit Date: 02/07/2020 06:53:36
Job User: florianbehr
Average RAM Usage: 10880399360 (8%)
Peak RAM Usage: 10880778240 (8%)
Average CPU Usage: 15%
Peak CPU Usage: 78%
Used CPU Clocks (x10^6 cycles): 575706
Total CPU Clocks (x10^6 cycles): 3838036

=======================================================
Slave Information
=======================================================
Slave Name: hp20
Version: v10.0.29.0 Release (019d7d1e7)
Operating System: Windows 10 Pro
Running As Service: No
Machine User: Florianbehr
IP Address: 192.168.1.82
MAC Address: 64:51:06:42:2F:F0
CPU Architecture: x64
CPUs: 48
CPU Usage: 0%
Memory Usage: 10.1 GB / 127.9 GB (7%)
Free Disk Space: 190.155 GB (16.963 GB on C:\, 1.679 GB on D:\, 171.513 GB on S:\)
Video Card: NVIDIA GeForce GTX 1080 Ti