
AWS Houdini / Redshift spot fleet

I'm having a little trouble with the AWS Portal stuff. I'm currently using the default AMI to launch Houdini Engine and Redshift instances. I have UBL enabled and have added some credits to my account. The infrastructure runs fine: the Portal is connected and the Asset Server is working (as far as I can tell). I start my spot fleet, use the GPU button to select GPU instances, and click on the base AMI for Houdini and Redshift. It starts up, connects, starts the render, syncs the files, and then I get a non-zero failure:

2019-12-12 17:58:01: Port Forwarder (houdini:1715): Client connected to port forwarder.
2019-12-12 17:58:01: 1: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2019-12-12 17:58:01: 1: DEBUG: Request:
2019-12-12 17:58:01: 1: DEBUG: JobId: 5df27ecd08eaee072ef6193d
2019-12-12 17:58:01: 0: INFO: Process exit code: 1
2019-12-12 17:58:01: 1: Done executing plugin command of type ‘Render Task’
2019-12-12 17:58:01: 0: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2019-12-12 17:58:01: 0: DEBUG: Request:
2019-12-12 17:58:01: 0: DEBUG: JobId: 5df27ecd08eaee072ef6193d
2019-12-12 17:58:02: 1: Executing plugin command of type ‘End Job’
2019-12-12 17:58:02: 1: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2019-12-12 17:58:02: 1: DEBUG: Request:
2019-12-12 17:58:02: 1: DEBUG: JobId: 5df27ecd08eaee072ef6193d
2019-12-12 17:58:02: 1: Done executing plugin command of type ‘End Job’
2019-12-12 17:58:02: 0: Done executing plugin command of type ‘Render Task’
2019-12-12 17:58:02: 0: Executing plugin command of type ‘End Job’
2019-12-12 17:58:02: 0: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2019-12-12 17:58:02: 0: DEBUG: Request:
2019-12-12 17:58:02: 0: DEBUG: JobId: 5df27ecd08eaee072ef6193d
2019-12-12 17:58:02: 0: Done executing plugin command of type ‘End Job’
2019-12-12 17:58:02: Sending kill command to process deadlinesandbox with id: 6055
2019-12-12 17:58:02: Sending kill command to process deadlinesandbox with id: 6040
2019-12-12 17:58:03: Scheduler Thread - Render Thread 0 threw a major error:
2019-12-12 17:58:03: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019-12-12 17:58:03: Exception Details
2019-12-12 17:58:03: RenderPluginException – Error: Renderer returned non-zero error code, 1. Check the log for more information.
2019-12-12 17:58:03: at Deadline.Plugins.PluginWrapper.RenderTasks(String taskId, Int32 startFrame, Int32 endFrame, String& outMessage, AbortLevel& abortLevel)
2019-12-12 17:58:03: RenderPluginException.Cause: JobError (2)
2019-12-12 17:58:03: RenderPluginException.Level: Major (1)
2019-12-12 17:58:03: RenderPluginException.HasSlaveLog: True
2019-12-12 17:58:03: RenderPluginException.SlaveLogFileName: /var/log/Thinkbox/Deadline10/deadlineslave_renderthread_0-ip-10-128-25-99-0000.log
2019-12-12 17:58:03: Exception.Data: ( )
2019-12-12 17:58:03: Exception.TargetSite: Deadline.Slaves.Messaging.PluginResponseMemento d(Deadline.Net.DeadlineMessage)
2019-12-12 17:58:03: Exception.Source: deadline
2019-12-12 17:58:03: Exception.HResult: -2146233088
2019-12-12 17:58:03: Exception.StackTrace:
2019-12-12 17:58:03: at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bbd
2019-12-12 17:58:03: at Deadline.Plugins.SandboxedPlugin.RenderTask(String taskId, Int32 startFrame, Int32 endFrame
2019-12-12 17:58:03: at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter aep)
2019-12-12 17:58:03: <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019-12-12 17:58:04: Scheduler Thread - Render Thread 1 threw a major error:
2019-12-12 17:58:04: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019-12-12 17:58:04: Exception Details
2019-12-12 17:58:04: RenderPluginException – Error: Renderer returned non-zero error code, 1. Check the log for more information.
2019-12-12 17:58:04: at Deadline.Plugins.PluginWrapper.RenderTasks(String taskId, Int32 startFrame, Int32 endFrame, String& outMessage, AbortLevel& abortLevel)
2019-12-12 17:58:04: RenderPluginException.Cause: JobError (2)
2019-12-12 17:58:04: RenderPluginException.Level: Major (1)
2019-12-12 17:58:04: RenderPluginException.HasSlaveLog: True
2019-12-12 17:58:04: RenderPluginException.SlaveLogFileName: /var/log/Thinkbox/Deadline10/deadlineslave_renderthread_1-ip-10-128-25-99-0000.log
2019-12-12 17:58:04: Exception.Data: ( )
2019-12-12 17:58:04: Exception.TargetSite: Deadline.Slaves.Messaging.PluginResponseMemento d(Deadline.Net.DeadlineMessage)
2019-12-12 17:58:04: Exception.Source: deadline
2019-12-12 17:58:04: Exception.HResult: -2146233088
2019-12-12 17:58:04: Exception.StackTrace:
2019-12-12 17:58:04: at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bbd
2019-12-12 17:58:04: at Deadline.Plugins.SandboxedPlugin.RenderTask(String taskId, Int32 startFrame, Int32 endFrame
2019-12-12 17:58:04: at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter aep)
2019-12-12 17:58:04: <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019-12-12 17:58:06: Message From License Forwarder: Success: Machine registered (ip-10-128-25-99/::ffff:10.128.25.99).
2019-12-12 17:58:06: Message From License Forwarder: Success: Machine registered (ip-10-128-25-99/::ffff:10.128.25.99).

And then I get:

2019-12-12 17:58:18: Port Forwarder (houdini:1715): Client connected to port forwarder.
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Redshift for Houdini plugin version 2.6.48 (Sep 20 2019 13:55:52)
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Plugin compile time HDK version: 17.5.360
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Houdini host version: 17.5.360
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Plugin dso/dll and config path: /usr/redshift/redshift4houdini/17.5.360/dso
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Core data path: /usr/redshift
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Local data path: /home/ec2-user/redshift
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Procedurals path: /usr/redshift/procedurals
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Preferences file path: /home/ec2-user/redshift/preferences.xml
2019-12-12 17:58:19: 1: STDOUT: [Redshift] License path: /home/ec2-user/redshift
2019-12-12 17:58:19: Port Forwarder (houdini:1715): Client connected to port forwarder.
2019-12-12 17:58:19: 1: STDOUT: Detected Houdini version: (17, 5, 360)
2019-12-12 17:58:19: 1: STDOUT: [’/home/ec2-user/Thinkbox/Deadline10/slave/ip-10-128-25-99/plugins/5df27ecd08eaee072ef6193d/hrender_dl.py’, ‘-f’, ‘1021’, ‘1040’, ‘1’, ‘-g’, ‘-d’, '/out/REDACTED, ‘-gpu’, ‘1’, '/mnt/Data/mntjobs628230096908fa9a6a6f41c056cfb0cc/REDACTED
2019-12-12 17:58:19: 1: STDOUT: Start: 1021
2019-12-12 17:58:19: 1: STDOUT: End: 1040
2019-12-12 17:58:19: 1: STDOUT: Increment: 1
2019-12-12 17:58:19: 1: STDOUT: Ignore Inputs: True
2019-12-12 17:58:19: 1: STDOUT: No output specified. Output will be handled by the driver
2019-12-12 17:58:19: 1: STDOUT: GPUs: 1
2019-12-12 17:58:19: 1: STDOUT: Driver: /out/*REDACTED
2019-12-12 17:58:19: 1: STDOUT: Input File: /mnt/Data/mntjobs628230096908fa9a6a6f41c056cfb0cc/REDACTED
2019-12-12 17:58:19: 1: STDOUT: Traceback (most recent call last):
2019-12-12 17:58:19: 1: STDOUT: File “/home/ec2-user/Thinkbox/Deadline10/slave/ip-10-128-25-99/plugins/5df27ecd08eaee072ef6193d/hrender_dl.py”, line 149, in
2019-12-12 17:58:19: 1: STDOUT: hou.hipFile.load( inputFile )
2019-12-12 17:58:19: 1: STDOUT: File “/opt/hfs17.5/houdini/python2.7libs/hou.py”, line 35874, in load
2019-12-12 17:58:19: 1: STDOUT: return _hou.hipFile_load(*args, **kwargs)
2019-12-12 17:58:19: 1: STDOUT: hou.OperationFailed: The attempted operation failed.
2019-12-12 17:58:19: 1: STDOUT: Unable to open file: /mnt/Data/mntjobs628230096908fa9a6a6f41c056cfb0cc?REMOVED THIS TO PROTECT NDA’d CLIENT
2019-12-12 17:58:19: 1: STDOUT: [Redshift] Closing the RS instance. End of the plugin log system.
2019-12-12 17:58:19: Port Forwarder (houdini:1715): Client connected to port forwarder.
2019-12-12 17:58:20: 1: INFO: Process exit code: 1
2019-12-12 17:58:20: 1: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2019-12-12 17:58:20: 1: DEBUG: Request:
2019-12-12 17:58:20: 1: DEBUG: JobId: 5df27ecd08eaee072ef6193d
2019-12-12 17:58:20: 1: Done executing plugin command of type ‘Render Task’
2019-12-12 17:58:20: 1: In the process of canceling current task: ignoring exception thrown by PluginLoader
2019-12-12 17:58:20: 1: Executing plugin command of type ‘End Job’
2019-12-12 17:58:20: 1: Done executing plugin command of type ‘End Job’
2019-12-12 17:58:20: Scheduler Thread - Unexpected Error Occurred
2019-12-12 17:58:20: >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2019-12-12 17:58:20: Exception Details
2019-12-12 17:58:20: PythonNetException – Exception : Unable to restore previous environment. Could not find path: ‘/var/lib/Thinkbox/Deadline10/hserverAddress’
2019-12-12 17:58:20: Exception.Data: ( )
2019-12-12 17:58:20: Exception.TargetSite: Void c(System.Exception)
2019-12-12 17:58:20: Exception.Source: franticx
2019-12-12 17:58:20: Exception.HResult: -2146233088
2019-12-12 17:58:20: Exception.StackTrace:
2019-12-12 17:58:20: File “none”, line 27, in main
2019-12-12 17:58:20: <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2019-12-12 17:58:20: Sending kill command to process deadlinesandbox with id: 6473
2019-12-12 17:58:21: Message From License Forwarder: Success: Machine unregistered (ip-10-128-25-99/::ffff:10.128.25.99).
2019-12-12 17:58:21: Port Forwarder (redshift:5054): Port Forwarder shutting down.
2019-12-12 17:58:21: Port Forwarder (redshift:7054): Port Forwarder shutting down.
2019-12-12 17:58:21: Message From License Forwarder: Success: Machine unregistered (ip-10-128-25-99/::ffff:10.128.25.99).
2019-12-12 17:58:21: Port Forwarder (houdini:1715): Port Forwarder shutting down.
2019-12-12 17:58:22: 1: WARNING - Render thread is unresponsive shutting down job “UnknownJobID” that uses plugin “UnknownPlugin”. Cannot proceed until it exits.
2019-12-12 17:58:24: Message From License Forwarder: Success: Machine registered (ip-10-128-25-99/::ffff:10.128.25.99).
2019-12-12 17:58:25: Port Forwarder (redshift:5054): Port forwarder created.
2019-12-12 17:58:25: Port Forwarder (redshift:7054): Port forwarder created.
2019-12-12 17:58:25: Message From License Forwarder: Success: Machine registered (ip-10-128-25-99/::ffff:10.128.25.99).
2019-12-12 17:58:25: Scheduler Thread - Job’s Limit Groups: houdini, redshift
2019-12-12 17:58:26: 0: Loading Job’s Plugin timeout is Disabled
2019-12-12 17:58:28: 0: Executing plugin command of type ‘Sync Files for Job’
2019-12-12 17:58:28: 0: All job files are already synchronized
2019-12-12 17:58:28: 0: Plugin Houdini was already synchronized.
2019-12-12 17:58:28: 0: Done executing plugin command of type ‘Sync Files for Job’
2019-12-12 17:58:28: 0: Executing plugin command of type ‘Initialize Plugin’
2019-12-12 17:58:29: 0: INFO: Executing plugin script ‘/home/ec2-user/Thinkbox/Deadline10/slave/ip-10-128-25-99/plugins/5df27ecd08eaee072ef6193d/Houdini.py’
2019-12-12 17:58:29: 0: INFO: About: Houdini Plugin for Deadline
2019-12-12 17:58:29: 0: INFO: Render Job As User disabled, running as current user ‘ec2-user’
2019-12-12 17:58:29: 0: INFO: The job’s environment will be merged with the current environment before rendering
2019-12-12 17:58:29: 0: Done executing plugin command of type ‘Initialize Plugin’

Any thoughts?

Hello!

Based on the line:

2019-12-12 17:58:19: 1: STDOUT: Unable to open file: /mnt/Data/mntjobs628230096908fa9a6a6f41c056cfb0cc?REMOVED THIS TO PROTECT NDA’d CLIENT

It could be that that file wasn't uploaded to S3. You'll need to check the AWS Portal Asset Server logs, which live in either %PROGRAMDATA%\Thinkbox\AWSPortalAssetServer\logs\ (Windows) or /var/log/Thinkbox/AWSPortalAssetServer/ (Linux).

Look for the name of that file; ideally the log will have some extra info about whether the upload failed or succeeded.

Take a look and let us know what you find!
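
If it's easier than searching by hand, a small script along these lines can scan that log folder for the file name (a minimal sketch assuming the Linux log path; the file name is a placeholder):

# Scan the AWS Portal Asset Server logs for mentions of a particular file.
# LOG_DIR assumes the Linux location; on Windows point it at
# %PROGRAMDATA%\Thinkbox\AWSPortalAssetServer\logs instead.
import glob
import io
import os

LOG_DIR = "/var/log/Thinkbox/AWSPortalAssetServer"
needle = "my_scene.hip"  # placeholder: the file named in the "Unable to open file" error

for log_path in sorted(glob.glob(os.path.join(LOG_DIR, "*"))):
    if not os.path.isfile(log_path):
        continue
    with io.open(log_path, errors="replace") as handle:
        for line_number, line in enumerate(handle, start=1):
            if needle in line:
                print("%s:%d: %s" % (os.path.basename(log_path), line_number, line.rstrip()))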

Thanks for the info. The log doesn't say much, just:

1576191660.759743 2019-12-12 23:01:00,759 [/opt/Thinkbox/AWSPortalAssetServer/awsportalassetserverlib/share_util.py:refresh_shares:86] [root] [1231] [MainThread] [INFO] Refreshing shares list.
1576191660.762903 2019-12-12 23:01:00,762 [/opt/Thinkbox/AWSPortalAssetServer/awsportalassetserverlib/share_util.py:refresh_shares:93] [root] [1231] [MainThread] [INFO] Share: Path: /mnt/library/ Id: mntlibraryc8252bef688a51738232a26cbec60465
1576191660.763902 2019-12-12 23:01:00,763 [/opt/Thinkbox/AWSPortalAssetServer/awsportalassetserverlib/share_util.py:refresh_shares:93] [root] [1231] [MainThread] [INFO] Share: Path: /mnt/cache01/ Id: mntcache01ba2adf9cde6e0d3d9671eac70e2a2a43
1576191660.764945 2019-12-12 23:01:00,764 [/opt/Thinkbox/AWSPortalAssetServer/awsportalassetserverlib/share_util.py:refresh_shares:93] [root] [1231] [MainThread] [INFO] Share: Path: /mnt/cache03/ Id: mntcache03ed541b216f58930e02fcd7d6c8d0f6d2
1576191660.765879 2019-12-12 23:01:00,765 [/opt/Thinkbox/AWSPortalAssetServer/awsportalassetserverlib/share_util.py:refresh_shares:93] [root] [1231] [MainThread] [INFO] Share: Path: /mnt/jobs/ Id: mntjobs628230096908fa9a6a6f41c056cfb0cc
1576191663.273913 2019-12-12 23:01:03,273 [/opt/Thinkbox/AWSPortalAssetServer/awsportalassetserver.py:get_and_set_ip_address:87] [root] [1231] [MainThread] [INFO] IPAddress set to 192.168.1.149

Then it repeats that every 2 minutes.
This was taken while trying to run a job. Here's what I got from the machine itself:

Connecting to ip-10-128-55-199…
Success: Connected to Pulse, attempting to redirect command to target…
2019-12-12 22:58:36: Purging old logs and temp files
2019-12-12 22:58:39: Port Forwarder (houdini:1715): Port forwarder created.
2019-12-12 22:58:39: PYTHON: Settings Address. Prev Address is
2019-12-12 22:58:39: Port Forwarder (houdini:1715): Client connected to port forwarder.
2019-12-12 22:58:39: Worker - Confirmed Credit Usage for “houdini”.
2019-12-12 22:58:40: Message From License Forwarder: Success: Machine registered (ip-10-128-55-199/::ffff:10.128.55.199).
2019-12-12 22:58:41: Port Forwarder (redshift:5054): Port forwarder created.
2019-12-12 22:58:41: Port Forwarder (redshift:7054): Port forwarder created.
2019-12-12 22:58:41: Message From License Forwarder: Success: Machine registered (ip-10-128-55-199/::ffff:10.128.55.199).
2019-12-12 22:58:41: Scheduler Thread - Job’s Limit Groups: houdini, redshift
2019-12-12 22:58:42: 0: Loading Job’s Plugin timeout is Disabled
2019-12-12 22:58:44: 0: Executing plugin command of type ‘Sync Files for Job’
2019-12-12 22:58:44: 0: All job files are already synchronized
2019-12-12 22:58:44: 0: Synchronizing Plugin Houdini from /home/ec2-user/Thinkbox/Deadline10/cache/AboLy9iNSNbvdsCWV7J4G0fx0/custom/plugins/Houdini took: 0 seconds
2019-12-12 22:58:44: 0: Done executing plugin command of type ‘Sync Files for Job’
2019-12-12 22:58:44: 0: Executing plugin command of type ‘Initialize Plugin’
2019-12-12 22:58:45: 0: INFO: Executing plugin script ‘/home/ec2-user/Thinkbox/Deadline10/slave/ip-10-128-55-199/plugins/5df2c0d57b181820bd632b83/Houdini.py’
2019-12-12 22:58:45: 0: INFO: About: Houdini Plugin for Deadline
2019-12-12 22:58:45: 0: INFO: Render Job As User disabled, running as current user ‘ec2-user’
2019-12-12 22:58:45: 0: INFO: The job’s environment will be merged with the current environment before rendering
2019-12-12 22:58:45: 0: Done executing plugin command of type ‘Initialize Plugin’
2019-12-12 22:58:45: 0: Start Job timeout is disabled.
2019-12-12 22:58:45: 0: Task timeout is disabled.
2019-12-12 22:58:45: 0: Loaded job: *REDACTED
2019-12-12 22:58:45: 0: Executing plugin command of type ‘Start Job’
2019-12-12 22:58:46: 0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2019-12-12 22:58:46: 0: DEBUG: Request:
2019-12-12 22:58:46: 0: DEBUG: JobId: 5df2c0d57b181820bd632b83
2019-12-12 22:58:46: 0: DEBUG: JobUploadWhitelist: REDACTED.####.exr
2019-12-12 22:58:46: 0: DEBUG: JobUploadWhitelistRe: ^.+.abc$, ^.+.avi$, ^.+.bmp$, ^.+.bw$, ^.+.cin$, ^.+.cjp$, ^.+.cjpg$, ^.+.cxr$, ^.+.dds$, ^.+.dpx$, ^.+.dwf$, ^.+.dwfx$, ^.+.dwg$, ^.+.dxf$, ^.+.dxx$, ^.+.eps$, ^.+.exr$, ^.+.fbx$, ^.+.fxr$, ^.+.hdr$, ^.+.icb$, ^.+.iff$, ^.+.iges$, ^.+.igs$, ^.+.int$, ^.+.inta$, ^.+.iris$, ^.+.jpe$, ^.+.jpeg$, ^.+.jpg$, ^.+.jp2$, ^.+.mcc$, ^.+.mcx$, ^.+.mov$, ^.+.mxi$, ^.+.pdf$, ^.+.pic$, ^.+.png$, ^.+.prt$, ^.+.ps$, ^.+.psd$, ^.+.rgb$, ^.+.rgba$, ^.+.rla$, ^.+.rpf$, ^.+.sat$, ^.+.sgi$, ^.+.stl$, ^.+.sxr$, ^.+.targa$, ^.+.tga$, ^.+.tif$, ^.+.tiff$, ^.+.tim$, ^.+.vda$, ^.+.vrimg$, ^.+.vrmesh$, ^.+.vrsm$, ^.+.vrst$, ^.+.vst$, ^.+.wmf$, ^.+.ass$, ^.+.gz$, ^.+.ifd$, ^.+.mi$, ^.+.mi2$, ^.+.mxi$, ^.+.rib$, ^.+.rs$, ^.+.vrscene$
2019-12-12 22:58:46: 0: DEBUG: S3BackedCache Client Returned Sequence: 2
2019-12-12 22:58:46: 0: INFO: Executing global asset transfer preload script ‘/home/ec2-user/Thinkbox/Deadline10/slave/ip-10-128-55-199/plugins/5df2c0d57b181820bd632b83/GlobalAssetTransferPreLoad.py’
2019-12-12 22:58:46: 0: INFO: Looking for AWS Portal File Transfer…
2019-12-12 22:58:46: 0: INFO: Looking for File Transfer controller in /opt/Thinkbox/S3BackedCache/bin/task.py…
2019-12-12 22:58:46: 0: INFO: Could not find AWS Portal File Transfer.
2019-12-12 22:58:46: 0: INFO: AWS Portal File Transfer is not installed on the system.
2019-12-12 22:58:46: 0: Done executing plugin command of type ‘Start Job’
2019-12-12 22:58:46: 0: Plugin rendering frame(s): 1021-1040
2019-12-12 22:58:46: 0: Executing plugin command of type ‘Render Task’
2019-12-12 22:58:46: 0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2019-12-12 22:58:46: 0: DEBUG: Request:
2019-12-12 22:58:46: 0: DEBUG: JobId: 5df2c0d57b181820bd632b83
2019-12-12 22:58:46: 0: DEBUG: JobUploadWhitelist: REDACTED.####.exr
2019-12-12 22:58:46: 0: DEBUG: JobUploadWhitelistRe: ^.+.abc$, ^.+.avi$, ^.+.bmp$, ^.+.bw$, ^.+.cin$, ^.+.cjp$, ^.+.cjpg$, ^.+.cxr$, ^.+.dds$, ^.+.dpx$, ^.+.dwf$, ^.+.dwfx$, ^.+.dwg$, ^.+.dxf$, ^.+.dxx$, ^.+.eps$, ^.+.exr$, ^.+.fbx$, ^.+.fxr$, ^.+.hdr$, ^.+.icb$, ^.+.iff$, ^.+.iges$, ^.+.igs$, ^.+.int$, ^.+.inta$, ^.+.iris$, ^.+.jpe$, ^.+.jpeg$, ^.+.jpg$, ^.+.jp2$, ^.+.mcc$, ^.+.mcx$, ^.+.mov$, ^.+.mxi$, ^.+.pdf$, ^.+.pic$, ^.+.png$, ^.+.prt$, ^.+.ps$, ^.+.psd$, ^.+.rgb$, ^.+.rgba$, ^.+.rla$, ^.+.rpf$, ^.+.sat$, ^.+.sgi$, ^.+.stl$, ^.+.sxr$, ^.+.targa$, ^.+.tga$, ^.+.tif$, ^.+.tiff$, ^.+.tim$, ^.+.vda$, ^.+.vrimg$, ^.+.vrmesh$, ^.+.vrsm$, ^.+.vrst$, ^.+.vst$, ^.+.wmf$, ^.+.ass$, ^.+.gz$, ^.+.ifd$, ^.+.mi$, ^.+.mi2$, ^.+.mxi$, ^.+.rib$, ^.+.rs$, ^.+.vrscene$
2019-12-12 22:58:46: 0: DEBUG: S3BackedCache Client Returned Sequence: 2
2019-12-12 22:58:46: 0: INFO: Starting Houdini Job
2019-12-12 22:58:46: 0: INFO: Stdout Redirection Enabled: True
2019-12-12 22:58:46: 0: INFO: Asynchronous Stdout Enabled: False
2019-12-12 22:58:46: 0: INFO: Stdout Handling Enabled: True
2019-12-12 22:58:46: 0: INFO: Popup Handling Enabled: True
2019-12-12 22:58:46: 0: INFO: QT Popup Handling Enabled: False
2019-12-12 22:58:46: 0: INFO: WindowsForms10.Window.8.app. Popup Handling Enabled: False
2019-12-12 22:58:46: 0: INFO: Using Process Tree: True
2019-12-12 22:58:46: 0: INFO: Hiding DOS Window: True
2019-12-12 22:58:46: 0: INFO: Creating New Console: False
2019-12-12 22:58:46: 0: INFO: Running as user: ec2-user
2019-12-12 22:58:46: 0: INFO: Executable: “/opt/hfs17.5/bin/hython”
2019-12-12 22:58:46: 0: CheckPathMapping: Swapped “/mnt/jobs/*REDACTED” with "/mnt/Data/mntjobs628230096908fa9a6a6f41c056cfb0cc/*REDACTED
2019-12-12 22:58:46: 0: INFO: Argument: “/home/ec2-user/Thinkbox/Deadline10/slave/ip-10-128-55-199/plugins/5df2c0d57b181820bd632b83/hrender_dl.py” -f 1021 1040 1 -g -d /out/snowFRONT/snowFRONT -gpu 0 "/mnt/Data/mntjobs628230096908fa9a6a6f41c056cfb0cc/*REDACTED
2019-12-12 22:58:46: 0: INFO: Full Command: “/opt/hfs17.5/bin/hython” “/home/ec2-user/Thinkbox/Deadline10/slave/ip-10-128-55-199/plugins/5df2c0d57b181820bd632b83/hrender_dl.py” -f 1021 1040 1 -g -d /out/*REDACTED -gpu 0 "/mnt/Data/mntjobs628230096908fa9a6a6f41c056cfb0cc/*REDACTED
2019-12-12 22:58:46: 0: INFO: Startup Directory: “/opt/hfs17.5/bin”
2019-12-12 22:58:46: 0: INFO: Process Priority: BelowNormal
2019-12-12 22:58:46: 0: INFO: Process Affinity: default
2019-12-12 22:58:46: 0: INFO: Process is now running
Success
2019-12-12 22:59:12: 0: STDOUT: ls: cannot access /dev/disk/by-id/: No such file or directory
2019-12-12 22:59:12: 0: STDOUT: ls: cannot access /dev/disk/by-id/: No such file or directory
2019-12-12 22:59:12: 0: STDOUT: cat: /sys/devices/virtual/dmi/id/board_vendor: No such file or directory
2019-12-12 22:59:12: 0: STDOUT: cat: /sys/devices/virtual/dmi/id/board_name: No such file or directory
2019-12-12 22:59:12: 0: STDOUT: cat: /sys/devices/virtual/dmi/id/board_version: No such file or directory
2019-12-12 22:59:12: 0: STDOUT: sh: lsb_release: command not found
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Redshift for Houdini plugin version 2.6.48 (Sep 20 2019 13:55:52)
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Plugin compile time HDK version: 17.5.360
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Houdini host version: 17.5.360
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Plugin dso/dll and config path: /usr/redshift/redshift4houdini/17.5.360/dso
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Core data path: /usr/redshift
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Local data path: /home/ec2-user/redshift
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Procedurals path: /usr/redshift/procedurals
2019-12-12 22:59:24: 0: STDOUT: [Redshift] Preferences file path: /home/ec2-user/redshift/preferences.xml
2019-12-12 22:59:24: 0: STDOUT: [Redshift] License path: /home/ec2-user/redshift

Then it sat there for 15 minutes without any update. Is that normal? It looks like something different is happening on this machine than on the last one. I can't tell whether I got farther than before or not as far.

Looks like what I posted last was just a fluke? Today I'm getting an upload error message:

An error occurred (AccessDenied) when calling the CreateMultipartUpload operation: Access Denied

I've tried clicking the Fix Policies button, but I still get the same error. Is there anything else I can try?
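
In case it helps to narrow things down, the same permission can be tested outside of Deadline with a direct boto3 call (a rough sketch; the bucket and key names below are placeholders for the AWS Portal bucket):

# Reproduce the CreateMultipartUpload permission check outside of Deadline.
# This only confirms whether the current credentials are allowed to start a
# multipart upload; bucket and key are placeholders.
import boto3
from botocore.exceptions import ClientError

bucket = "my-awsportal-bucket"          # placeholder
key = "permission-test/upload-check"    # placeholder

s3 = boto3.client("s3")
try:
    upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
except ClientError as error:
    print("CreateMultipartUpload failed: " + error.response["Error"]["Code"])
else:
    # Abort right away so no incomplete upload is left lying around.
    s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload["UploadId"])
    print("CreateMultipartUpload is allowed on " + bucket)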

I just had a phone call with Deadline support, and the issue was resolved by changing a blank IP in the IAM policy that Deadline creates.


It would've been nice to mention which policy that is. I'm struggling with a similar issue.

This was a while ago and there have been some changes to the IAM policies since, but I believe at the time it was in the AWSPortal IAM policy. Just look through the JSON for an IP allow section.
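
If it helps, something like this can dump any IP allow conditions from the customer-managed policies so a blank entry is easy to spot (a rough sketch; filtering on "AWSPortal" in the policy name is an assumption, so check the actual policy names in your IAM console):

# List IpAddress/aws:SourceIp conditions from customer-managed IAM policies
# whose name contains "AWSPortal" (name filter is an assumption).
import json
import boto3

iam = boto3.client("iam")

for page in iam.get_paginator("list_policies").paginate(Scope="Local"):
    for policy in page["Policies"]:
        if "AWSPortal" not in policy["PolicyName"]:
            continue
        version = iam.get_policy_version(
            PolicyArn=policy["Arn"], VersionId=policy["DefaultVersionId"]
        )
        statements = version["PolicyVersion"]["Document"]["Statement"]
        if isinstance(statements, dict):
            statements = [statements]
        for statement in statements:
            source_ips = statement.get("Condition", {}).get("IpAddress", {}).get("aws:SourceIp")
            if source_ips is not None:
                # An empty or blank value here was the problem in my case.
                print(policy["PolicyName"] + " -> " + json.dumps(source_ips))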

We solved it too. I think it was a reference to a bucket in the wrong region, left over from before. After cleaning up and recreating everything, it worked.
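
If you'd rather check that before tearing everything down, comparing the bucket's actual region with your configured region is quick (a rough sketch; the bucket name is a placeholder):

# Compare the bucket's actual region with the region the session is using.
# The bucket name is a placeholder for the AWS Portal bucket.
import boto3

bucket = "my-awsportal-bucket"  # placeholder

session = boto3.session.Session()
s3 = session.client("s3")

location = s3.get_bucket_location(Bucket=bucket)["LocationConstraint"]
bucket_region = location or "us-east-1"  # get_bucket_location returns None for us-east-1

print("bucket region:  " + bucket_region)
print("session region: " + str(session.region_name))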
