AWS Thinkbox Discussion Forums

I'm trying to use Deadline on AWS but can't get it working

Hi there,

I'm trying to stand up a Worker on AWS for testing via AWS Portal.
The Worker instance has Redshift and Houdini installed.
I submitted a render job, but Deadline Monitor displays the error below.

=======================================================
Log
=======================================================
2024-01-17 09:15:08:  0: Loading Job's Plugin timeout is Disabled
2024-01-17 09:15:08:  0: SandboxedPlugin: Render Job As User disabled, running as current user 'ec2-user'
2024-01-17 09:15:11:  0: Executing plugin command of type 'Initialize Plugin'
2024-01-17 09:15:11:  0: INFO: Executing plugin script '/var/lib/Thinkbox/Deadline10/workers/ip-10-128-62-150/plugins/65a79a3b44157858675ef55b/Houdini.py'
2024-01-17 09:15:11:  0: INFO: Plugin execution sandbox using Python version 3
2024-01-17 09:15:11:  0: Encountered an error while executing plugin command of type 'Initialize Plugin'
2024-01-17 09:15:11:  0: Unhandled exception. System.NullReferenceException: Object reference not set to an instance of an object.
2024-01-17 09:15:11:  0:    at Deadline.Plugins.DeadlinePlugin.CancelTask()
2024-01-17 09:15:11:  0:    at Deadline.Plugins.PluginWrapper.CancelTask()
2024-01-17 09:15:11:  0:    at Deadline.Slaves.CommandListener.e(String air)
2024-01-17 09:15:11:  0:    at Deadline.Slaves.CommandListener.d(Object aiq)
2024-01-17 09:15:11:  0:    at Deadline.Slaves.CommandListener.b(DeadlineMessage aio, Exception aip)
2024-01-17 09:15:11:  0:    at Deadline.Net.DeadlineMessageUtils.a.a(IAsyncResult bhs)
2024-01-17 09:15:11:  0:    at System.Threading.Tasks.TaskToApm.TaskAsyncResult.InvokeCallback()
2024-01-17 09:15:11:  0:    at System.Threading.Tasks.AwaitTaskContinuation.<>c.<.cctor>b__17_0(Object state)
2024-01-17 09:15:11:  0:    at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
2024-01-17 09:15:11:  0: --- End of stack trace from previous location ---
2024-01-17 09:15:11:  0:    at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
2024-01-17 09:15:11:  0:    at System.Threading.Tasks.AwaitTaskContinuation.RunCallback(ContextCallback callback, Object state, Task& currentTask)
2024-01-17 09:15:11:  0: --- End of stack trace from previous location ---
2024-01-17 09:15:11:  0:    at System.Threading.Tasks.Task.<>c.<ThrowAsync>b__128_1(Object state)
2024-01-17 09:15:11:  0:    at System.Threading.QueueUserWorkItemCallbackDefaultContext.Execute()
2024-01-17 09:15:11:  0:    at System.Threading.ThreadPoolWorkQueue.Dispatch()
2024-01-17 09:15:11:  0:    at System.Threading.PortableThreadPool.WorkerThread.WorkerThreadStart()
2024-01-17 09:15:11:  0:    at System.Threading.Thread.StartCallback()

What should I try to do?

Best Regards
Taka

It looks like the plugin sandbox is failing to start up. Which version of Deadline is this, and is it using one of our AMIs or one you've created?

The trouble with these sandbox startup issues is that the more useful data is in the Worker log, which doesn't get streamed back automatically. Could you right-click the Worker that's having trouble rendering, choose 'Connect to Worker Log', and grab the 100 lines around the message below when you see it?

2024-01-17 09:15:11:  0: Encountered an error while executing plugin command of type 'Initialize Plugin'
2024-01-17 09:15:11:  0: Unhandled exception. System.NullReferenceException: Object reference not set to an instance of an object.
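If it's easier, you can also save the Worker log to a file and pull the context out of the saved copy. A minimal sketch, assuming a locally saved log (the path is a placeholder):

# Print roughly 100 lines of context around the first occurrence of the error.
# LOG_PATH is a placeholder; point it at a saved copy of the Worker log.
LOG_PATH = "/tmp/worker-ip-10-128-62-150.log"
NEEDLE = "Encountered an error while executing plugin command of type 'Initialize Plugin'"

with open(LOG_PATH, encoding="utf-8", errors="replace") as log_file:
    lines = log_file.readlines()

for index, line in enumerate(lines):
    if NEEDLE in line:
        start, end = max(0, index - 50), min(len(lines), index + 50)
        print("".join(lines[start:end]))
        break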

Thank you for your reply.

My Deadline Repository version is 10.2.
My Deadline Client version is 10.3.
In this case, I used the AMI created by Thinkbox, not one I made myself.

Sorry, I can't do that quickly. I'm not the operator, and the operator is taking some time off today.
When the operator is back, I'll share the 100 lines.

I should update the Deadline Repository version, right?

Yep! Get your Repository and Clients all on the same version. My guess would be that the 10.3 Client is failing to run a script from the 10.2 Repository.

In 10.3 we dropped support for Python 2 and moved to Python 3.10, so I'm expecting a Python mismatch to be the cause.
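If you want a rough sanity check before upgrading, you can try byte-compiling a plugin script from the repository with a Python 3 interpreter; Python-2-only syntax will raise a SyntaxError. A minimal sketch, assuming you have a copy of the script on disk (the path is a placeholder):

import py_compile
import sys

print(sys.version)  # shows which interpreter the check is running under

# Placeholder path to a plugin script copied out of the repository.
# Python-2-only syntax (e.g. print 'x') fails to compile under Python 3.
py_compile.compile("/path/to/repository/plugins/Houdini/Houdini.py", doraise=True)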

Thank you for your reply.
Thanks to your advice, that error disappeared.
But now I'm struggling with the new error below.
Please advise us.

=======================================================
Error
=======================================================
Error: Deadline was unable to find Houdini 19_5 on the machine ip-10-128-55-213.

It tried and failed to find the render executable at these configured locations:
C:\Program Files\Side Effects Software\Houdini 19.5.640\bin\hython.exe
/Applications/Houdini/Houdini19.5.640/Frameworks/Houdini.framework/Versions/19.5.640/Resources/bin/hython
/opt/hfs19.5/bin/hython

Deadline checks each of the above paths and will attempt to run only the first program or script that is present on the file system. We attempt to provide the typical defaults for you when possible.

For local render nodes, you will need to ensure the particular software is installed and ensure list of globally configured locations is accurate for your installation.
To configure a custom location, in the Monitor while in Super User mode select Tools -> Configure Plugins -> Houdini, then provide the path to the executable you would like Deadline to run for this plugin.

If this is a Thinkbox provided AMI for AWS Portal, it may be that the AMI in use does not support this particular application. Please ensure the image has the correct software and version available.
   at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)

=======================================================
Type
=======================================================
RenderPluginException

=======================================================
Stack Trace
=======================================================
   at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bgt, CancellationToken bgu)
   at Deadline.Plugins.SandboxedPlugin.RenderTask(Task task, CancellationToken cancellationToken)
   at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter ajy, CancellationToken ajz)

=======================================================
Log
=======================================================
2024-01-24 09:39:06:  0: Loading Job's Plugin timeout is Disabled
2024-01-24 09:39:06:  0: SandboxedPlugin: Render Job As User disabled, running as current user 'ec2-user'
2024-01-24 09:39:08:  0: Executing plugin command of type 'Initialize Plugin'
2024-01-24 09:39:08:  0: INFO: Executing plugin script '/var/lib/Thinkbox/Deadline10/workers/ip-10-128-55-213/plugins/65b0da8db5232bd1f0c079e7/Houdini.py'
2024-01-24 09:39:08:  0: INFO: Plugin execution sandbox using Python version 3
2024-01-24 09:39:08:  0: INFO: About: Houdini Plugin for Deadline
2024-01-24 09:39:08:  0: INFO: The job's environment will be merged with the current environment before rendering
2024-01-24 09:39:08:  0: Done executing plugin command of type 'Initialize Plugin'
2024-01-24 09:39:08:  0: Start Job timeout is disabled.
2024-01-24 09:39:08:  0: Task timeout is disabled.
2024-01-24 09:39:08:  0: Loaded job: rs_dl_render_test - /out/Redshift_ROP1 (65b0da8db5232bd1f0c079e7)
2024-01-24 09:39:08:  0: Executing plugin command of type 'Start Job'
2024-01-24 09:39:08:  0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2024-01-24 09:39:08:  0: DEBUG: Request:
2024-01-24 09:39:08:  0: DEBUG: 	JobId: 65b0da8db5232bd1f0c079e7
2024-01-24 09:39:08:  0: DEBUG: 	JobUploadWhitelist: rs_dl_render_test.Redshift_ROP1.####.exr
2024-01-24 09:39:08:  0: DEBUG: 	JobUploadWhitelistRe: ^.+\.abc$, ^.+\.avi$, ^.+\.bmp$, ^.+\.bw$, ^.+\.cin$, ^.+\.cjp$, ^.+\.cjpg$, ^.+\.cxr$, ^.+\.dds$, ^.+\.dpx$, ^.+\.dwf$, ^.+\.dwfx$, ^.+\.dwg$, ^.+\.dxf$, ^.+\.dxx$, ^.+\.eps$, ^.+\.exr$, ^.+\.fbx$, ^.+\.fxr$, ^.+\.hdr$, ^.+\.icb$, ^.+\.iff$, ^.+\.iges$, ^.+\.igs$, ^.+\.int$, ^.+\.inta$, ^.+\.iris$, ^.+\.jpe$, ^.+\.jpeg$, ^.+\.jpg$, ^.+\.jp2$, ^.+\.mcc$, ^.+\.mcx$, ^.+\.mov$, ^.+\.mxi$, ^.+\.pdf$, ^.+\.pic$, ^.+\.png$, ^.+\.prt$, ^.+\.ps$, ^.+\.psd$, ^.+\.rgb$, ^.+\.rgba$, ^.+\.rla$, ^.+\.rpf$, ^.+\.sat$, ^.+\.sgi$, ^.+\.stl$, ^.+\.sxr$, ^.+\.targa$, ^.+\.tga$, ^.+\.tif$, ^.+\.tiff$, ^.+\.tim$, ^.+\.vda$, ^.+\.vrimg$, ^.+\.vrmesh$, ^.+\.vrsm$, ^.+\.vrst$, ^.+\.vst$, ^.+\.wmf$, ^.+\.ass$, ^.+\.gz$, ^.+\.ifd$, ^.+\.mi$, ^.+\.mi2$, ^.+\.mxi$, ^.+\.rib$, ^.+\.rs$, ^.+\.vrscene$
2024-01-24 09:39:08:  0: DEBUG: S3BackedCache Client Returned Sequence: 7
2024-01-24 09:39:08:  0: INFO: Executing global asset transfer preload script '/var/lib/Thinkbox/Deadline10/workers/ip-10-128-55-213/plugins/65b0da8db5232bd1f0c079e7/GlobalAssetTransferPreLoad.py'
2024-01-24 09:39:08:  0: INFO: Looking for legacy (pre-10.0.26) AWS Portal File Transfer...
2024-01-24 09:39:08:  0: INFO: Looking for legacy (pre-10.0.26) File Transfer controller in /opt/Thinkbox/S3BackedCache/bin/task.py...
2024-01-24 09:39:08:  0: INFO: Could not find legacy (pre-10.0.26) AWS Portal File Transfer.
2024-01-24 09:39:08:  0: INFO: Legacy (pre-10.0.26) AWS Portal File Transfer is not installed on the system.
2024-01-24 09:39:08:  0: Done executing plugin command of type 'Start Job'
2024-01-24 09:39:08:  0: Plugin rendering frame(s): 16
2024-01-24 09:39:08:  0: Executing plugin command of type 'Render Task'
2024-01-24 09:39:08:  0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2024-01-24 09:39:08:  0: DEBUG: Request:
2024-01-24 09:39:08:  0: DEBUG: 	JobId: 65b0da8db5232bd1f0c079e7
2024-01-24 09:39:08:  0: DEBUG: 	JobUploadWhitelist: rs_dl_render_test.Redshift_ROP1.####.exr
2024-01-24 09:39:08:  0: DEBUG: 	JobUploadWhitelistRe: ^.+\.abc$, ^.+\.avi$, ^.+\.bmp$, ^.+\.bw$, ^.+\.cin$, ^.+\.cjp$, ^.+\.cjpg$, ^.+\.cxr$, ^.+\.dds$, ^.+\.dpx$, ^.+\.dwf$, ^.+\.dwfx$, ^.+\.dwg$, ^.+\.dxf$, ^.+\.dxx$, ^.+\.eps$, ^.+\.exr$, ^.+\.fbx$, ^.+\.fxr$, ^.+\.hdr$, ^.+\.icb$, ^.+\.iff$, ^.+\.iges$, ^.+\.igs$, ^.+\.int$, ^.+\.inta$, ^.+\.iris$, ^.+\.jpe$, ^.+\.jpeg$, ^.+\.jpg$, ^.+\.jp2$, ^.+\.mcc$, ^.+\.mcx$, ^.+\.mov$, ^.+\.mxi$, ^.+\.pdf$, ^.+\.pic$, ^.+\.png$, ^.+\.prt$, ^.+\.ps$, ^.+\.psd$, ^.+\.rgb$, ^.+\.rgba$, ^.+\.rla$, ^.+\.rpf$, ^.+\.sat$, ^.+\.sgi$, ^.+\.stl$, ^.+\.sxr$, ^.+\.targa$, ^.+\.tga$, ^.+\.tif$, ^.+\.tiff$, ^.+\.tim$, ^.+\.vda$, ^.+\.vrimg$, ^.+\.vrmesh$, ^.+\.vrsm$, ^.+\.vrst$, ^.+\.vst$, ^.+\.wmf$, ^.+\.ass$, ^.+\.gz$, ^.+\.ifd$, ^.+\.mi$, ^.+\.mi2$, ^.+\.mxi$, ^.+\.rib$, ^.+\.rs$, ^.+\.vrscene$
2024-01-24 09:39:08:  0: DEBUG: S3BackedCache Client Returned Sequence: 7
2024-01-24 09:39:08:  0: INFO: Set HOUDINI_PATHMAP to {"/mnt/cgsv2021/":"/mnt/Data/mntcgsv2021386d7e3204ea8557cff0f48192a0df69/"}
2024-01-24 09:39:08:  0: INFO: Redshift Path Mapping...
2024-01-24 09:39:08:  0: INFO: source: "/mnt/cgsv2021/" dest: "/mnt/Data/mntcgsv2021386d7e3204ea8557cff0f48192a0df69/"
2024-01-24 09:39:08:  0: INFO: [REDSHIFT_PATHOVERRIDE_FILE] now set to: "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-55-213/jobsData/65b0da8db5232bd1f0c079e7/RSMapping_tempBa1170/RSMapping.txt"
2024-01-24 09:39:08:  0: INFO: Starting Houdini Job
2024-01-24 09:39:08:  0: INFO: Stdout Redirection Enabled: True
2024-01-24 09:39:08:  0: INFO: Asynchronous Stdout Enabled: False
2024-01-24 09:39:08:  0: INFO: Stdout Handling Enabled: True
2024-01-24 09:39:08:  0: INFO: Popup Handling Enabled: True
2024-01-24 09:39:08:  0: INFO: QT Popup Handling Enabled: False
2024-01-24 09:39:08:  0: INFO: WindowsForms10.Window.8.app.* Popup Handling Enabled: False
2024-01-24 09:39:08:  0: INFO: Using Process Tree: True
2024-01-24 09:39:08:  0: INFO: Hiding DOS Window: True
2024-01-24 09:39:08:  0: INFO: Creating New Console: False
2024-01-24 09:39:08:  0: INFO: Running as user: ec2-user
2024-01-24 09:39:08:  0: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2024-01-24 09:39:08:  0: DEBUG: Request:
2024-01-24 09:39:08:  0: DEBUG: 	JobId: 65b0da8db5232bd1f0c079e7
2024-01-24 09:39:08:  0: Done executing plugin command of type 'Render Task'

=======================================================
Details
=======================================================
Date: 01/24/2024 09:39:11
Frames: 16
Elapsed Time: 00:00:00:06
Job Submit Date: 01/24/2024 09:38:20
Job User: iei.shinji
Average RAM Usage: 787706304 (5%)
Peak RAM Usage: 793763840 (5%)
Average CPU Usage: 7%
Peak CPU Usage: 15%
Used CPU Clocks (x10^6 cycles): 5512
Total CPU Clocks (x10^6 cycles): 78740

=======================================================
Worker Information
=======================================================
Worker Name: ip-10-128-55-213
Version: v10.3.0.15 Release (76d003b0a)
Operating System: Linux
Machine User: ec2-user
IP Address: 10.128.55.213
MAC Address: 0E:B4:F5:D4:39:E5
CPU Architecture: x86_64
CPUs: 8
CPU Usage: 3%
Memory Usage: 759.4 MB / 15.2 GB (4%)
Free Disk Space: 13.023 GB 
Video Card: Amazon.com, Inc. Device 1111

=======================================================
AWS Information
=======================================================
Instance ID: i-08d2acfb1cd50a0ff
Instance Type: c5.2xlarge
Image ID: ami-040b65e832d45f9dc
Region: ap-northeast-1
Architecture: x86_64
Availability Zone: ap-northeast-1d

You need to set the Houdini 19_5 executable path under Tools → Configure Plugins.
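For context, the plugin simply walks the configured executable paths in order and launches the first one that exists on disk, which is why the error lists the three locations above. A simplified sketch of that selection logic (the paths are the defaults quoted in the error, not anything pulled from your farm):

import os

# Default Houdini 19.5 locations, as listed in the error report above.
candidates = [
    r"C:\Program Files\Side Effects Software\Houdini 19.5.640\bin\hython.exe",
    "/Applications/Houdini/Houdini19.5.640/Frameworks/Houdini.framework/Versions/19.5.640/Resources/bin/hython",
    "/opt/hfs19.5/bin/hython",
]

# Pick the first path that is actually present on this machine.
hython = next((path for path in candidates if os.path.isfile(path)), None)
if hython is None:
    raise RuntimeError("No Houdini 19.5 hython executable found in any configured location")
print("Would launch:", hython)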

I realize that I chose this AMI:

Deadline Worker Base Image Linux 2 10.3.0.15 with Houdini 17.5.360 and Redshift 2.6.48 2023-10-28T002832Z

I have already set the Houdini 19_5 executable path under Tools.
Should I make an AMI that has Houdini 19.5.640?

Yeah, it doesn't take long. Copy the installer through the gateway to the Worker, install it, then go to EC2 and create a new AMI from the instance.
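If you prefer to script that last step, a sketch using boto3 (the instance ID, names, and region are placeholders):

import boto3

ec2 = boto3.client("ec2", region_name="ap-northeast-1")

# Placeholder instance ID: the Worker instance you installed Houdini 19.5.640 on.
response = ec2.create_image(
    InstanceId="i-0123456789abcdef0",
    Name="deadline-worker-houdini-19.5.640-redshift",
    Description="Worker base image with Houdini 19.5.640 and Redshift installed",
    NoReboot=False,  # allow a reboot so the filesystem is captured in a consistent state
)
print("New AMI:", response["ImageId"])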

Deadline has a predefined AMI image for Houdini 19.5.303 with Mantra 19.5.303.

If you want to use a renderer other than Mantra, you can take our predefined AMI image with Houdini 19.5 and install your choice of renderer on it. It comes with everything pre-configured for Houdini to work with Deadline, and the Deadline Client installed and set up. Follow this documentation on creating a custom AMI image.

Thank you, everyone, for your replies.
Thanks to your advice, I was able to resolve that problem.
But a new problem has come up.

My Worker has Houdini and Redshift installed.


=======================================================
Error
=======================================================
FailRenderException : No licenses could be found to run this application
   at Deadline.Plugins.DeadlinePlugin.FailRender(String message) (Python.Runtime.PythonException)
  File "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/plugins/65b73325b5232bd1f0c087d3/Houdini.py", line 418, in HandleStdoutLicense
    self.FailRender(self.GetRegexMatch(1))
   at Python.Runtime.Dispatcher.Dispatch(ArrayList args)
   at __FranticX_Processes_ManagedProcess_StdoutHandlerDelegateDispatcher.Invoke()
   at FranticX.Processes.ManagedProcess.RegexHandlerCallback.CallFunction()
   at FranticX.Processes.ManagedProcess.e(String cj, Boolean ck)
   at FranticX.Processes.ManagedProcess.Execute(Boolean waitForExit)
   at Deadline.Plugins.DeadlinePlugin.DoRenderTasks()
   at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)
   at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)

=======================================================
Type
=======================================================
RenderPluginException

=======================================================
Stack Trace
=======================================================
   at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bgm, CancellationToken bgn)
   at Deadline.Plugins.SandboxedPlugin.RenderTask(Task task, CancellationToken cancellationToken)
   at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter ajt, CancellationToken aju)

=======================================================
Log
=======================================================
2024-01-29 05:20:47:  0: Loading Job's Plugin timeout is Disabled
2024-01-29 05:20:47:  0: SandboxedPlugin: Render Job As User disabled, running as current user 'ec2-user'
2024-01-29 05:20:49:  0: Executing plugin command of type 'Initialize Plugin'
2024-01-29 05:20:49:  0: INFO: Executing plugin script '/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/plugins/65b73325b5232bd1f0c087d3/Houdini.py'
2024-01-29 05:20:49:  0: INFO: Plugin execution sandbox using Python version 3
2024-01-29 05:20:49:  0: INFO: About: Houdini Plugin for Deadline
2024-01-29 05:20:49:  0: INFO: The job's environment will be merged with the current environment before rendering
2024-01-29 05:20:49:  0: Done executing plugin command of type 'Initialize Plugin'
2024-01-29 05:20:49:  0: Start Job timeout is disabled.
2024-01-29 05:20:49:  0: Task timeout is disabled.
2024-01-29 05:20:49:  0: Loaded job: rs_dl_render_test - /obj/ground_explosion/filecache1/render (65b73325b5232bd1f0c087d3)
2024-01-29 05:20:49:  0: Executing plugin command of type 'Start Job'
2024-01-29 05:20:49:  0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2024-01-29 05:20:49:  0: DEBUG: Request:
2024-01-29 05:20:49:  0: DEBUG: 	JobId: 65b73325b5232bd1f0c087d3
2024-01-29 05:20:49:  0: DEBUG: 	JobUploadWhitelist: 
2024-01-29 05:20:49:  0: DEBUG: 	JobUploadWhitelistRe: ^.+\.abc$, ^.+\.avi$, ^.+\.bmp$, ^.+\.bw$, ^.+\.cin$, ^.+\.cjp$, ^.+\.cjpg$, ^.+\.cxr$, ^.+\.dds$, ^.+\.dpx$, ^.+\.dwf$, ^.+\.dwfx$, ^.+\.dwg$, ^.+\.dxf$, ^.+\.dxx$, ^.+\.eps$, ^.+\.exr$, ^.+\.fbx$, ^.+\.fxr$, ^.+\.hdr$, ^.+\.icb$, ^.+\.iff$, ^.+\.iges$, ^.+\.igs$, ^.+\.int$, ^.+\.inta$, ^.+\.iris$, ^.+\.jpe$, ^.+\.jpeg$, ^.+\.jpg$, ^.+\.jp2$, ^.+\.mcc$, ^.+\.mcx$, ^.+\.mov$, ^.+\.mxi$, ^.+\.pdf$, ^.+\.pic$, ^.+\.png$, ^.+\.prt$, ^.+\.ps$, ^.+\.psd$, ^.+\.rgb$, ^.+\.rgba$, ^.+\.rla$, ^.+\.rpf$, ^.+\.sat$, ^.+\.sgi$, ^.+\.stl$, ^.+\.sxr$, ^.+\.targa$, ^.+\.tga$, ^.+\.tif$, ^.+\.tiff$, ^.+\.tim$, ^.+\.vda$, ^.+\.vrimg$, ^.+\.vrmesh$, ^.+\.vrsm$, ^.+\.vrst$, ^.+\.vst$, ^.+\.wmf$, ^.+\.ass$, ^.+\.gz$, ^.+\.ifd$, ^.+\.mi$, ^.+\.mi2$, ^.+\.mxi$, ^.+\.rib$, ^.+\.rs$, ^.+\.vrscene$
2024-01-29 05:20:49:  0: DEBUG: S3BackedCache Client Returned Sequence: 1
2024-01-29 05:20:49:  0: INFO: Executing global asset transfer preload script '/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/plugins/65b73325b5232bd1f0c087d3/GlobalAssetTransferPreLoad.py'
2024-01-29 05:20:49:  0: INFO: Looking for legacy (pre-10.0.26) AWS Portal File Transfer...
2024-01-29 05:20:49:  0: INFO: Looking for legacy (pre-10.0.26) File Transfer controller in /opt/Thinkbox/S3BackedCache/bin/task.py...
2024-01-29 05:20:49:  0: INFO: Could not find legacy (pre-10.0.26) AWS Portal File Transfer.
2024-01-29 05:20:49:  0: INFO: Legacy (pre-10.0.26) AWS Portal File Transfer is not installed on the system.
2024-01-29 05:20:49:  0: Done executing plugin command of type 'Start Job'
2024-01-29 05:20:49:  0: Plugin rendering frame(s): 1
2024-01-29 05:20:50:  0: Executing plugin command of type 'Render Task'
2024-01-29 05:20:50:  0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2024-01-29 05:20:50:  0: DEBUG: Request:
2024-01-29 05:20:50:  0: DEBUG: 	JobId: 65b73325b5232bd1f0c087d3
2024-01-29 05:20:50:  0: DEBUG: 	JobUploadWhitelist: 
2024-01-29 05:20:50:  0: DEBUG: 	JobUploadWhitelistRe: ^.+\.abc$, ^.+\.avi$, ^.+\.bmp$, ^.+\.bw$, ^.+\.cin$, ^.+\.cjp$, ^.+\.cjpg$, ^.+\.cxr$, ^.+\.dds$, ^.+\.dpx$, ^.+\.dwf$, ^.+\.dwfx$, ^.+\.dwg$, ^.+\.dxf$, ^.+\.dxx$, ^.+\.eps$, ^.+\.exr$, ^.+\.fbx$, ^.+\.fxr$, ^.+\.hdr$, ^.+\.icb$, ^.+\.iff$, ^.+\.iges$, ^.+\.igs$, ^.+\.int$, ^.+\.inta$, ^.+\.iris$, ^.+\.jpe$, ^.+\.jpeg$, ^.+\.jpg$, ^.+\.jp2$, ^.+\.mcc$, ^.+\.mcx$, ^.+\.mov$, ^.+\.mxi$, ^.+\.pdf$, ^.+\.pic$, ^.+\.png$, ^.+\.prt$, ^.+\.ps$, ^.+\.psd$, ^.+\.rgb$, ^.+\.rgba$, ^.+\.rla$, ^.+\.rpf$, ^.+\.sat$, ^.+\.sgi$, ^.+\.stl$, ^.+\.sxr$, ^.+\.targa$, ^.+\.tga$, ^.+\.tif$, ^.+\.tiff$, ^.+\.tim$, ^.+\.vda$, ^.+\.vrimg$, ^.+\.vrmesh$, ^.+\.vrsm$, ^.+\.vrst$, ^.+\.vst$, ^.+\.wmf$, ^.+\.ass$, ^.+\.gz$, ^.+\.ifd$, ^.+\.mi$, ^.+\.mi2$, ^.+\.mxi$, ^.+\.rib$, ^.+\.rs$, ^.+\.vrscene$
2024-01-29 05:20:50:  0: DEBUG: S3BackedCache Client Returned Sequence: 1
2024-01-29 05:20:50:  0: INFO: Set HOUDINI_PATHMAP to {"/mnt/cgsv2021/":"/mnt/Data/mntcgsv2021386d7e3204ea8557cff0f48192a0df69/"}
2024-01-29 05:20:50:  0: INFO: Redshift Path Mapping...
2024-01-29 05:20:50:  0: INFO: source: "/mnt/cgsv2021/" dest: "/mnt/Data/mntcgsv2021386d7e3204ea8557cff0f48192a0df69/"
2024-01-29 05:20:50:  0: INFO: [REDSHIFT_PATHOVERRIDE_FILE] now set to: "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/jobsData/65b73325b5232bd1f0c087d3/RSMapping_tempf1zIb0/RSMapping.txt"
2024-01-29 05:20:50:  0: INFO: Starting Houdini Job
2024-01-29 05:20:50:  0: INFO: Stdout Redirection Enabled: True
2024-01-29 05:20:50:  0: INFO: Asynchronous Stdout Enabled: False
2024-01-29 05:20:50:  0: INFO: Stdout Handling Enabled: True
2024-01-29 05:20:50:  0: INFO: Popup Handling Enabled: True
2024-01-29 05:20:50:  0: INFO: QT Popup Handling Enabled: False
2024-01-29 05:20:50:  0: INFO: WindowsForms10.Window.8.app.* Popup Handling Enabled: False
2024-01-29 05:20:50:  0: INFO: Using Process Tree: True
2024-01-29 05:20:50:  0: INFO: Hiding DOS Window: True
2024-01-29 05:20:50:  0: INFO: Creating New Console: False
2024-01-29 05:20:50:  0: INFO: Running as user: ec2-user
2024-01-29 05:20:50:  0: INFO: Executable: "/opt/hfs19.5/bin/hython"
2024-01-29 05:20:50:  0: INFO: Argument: "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/plugins/65b73325b5232bd1f0c087d3/hrender_dl.py" -f 1 1 1 -d /obj/ground_explosion/filecache1/render -tempdir "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/jobsData/65b73325b5232bd1f0c087d3/0_tempI6zHb0" -arnoldAbortOnLicenseFail 1 "/cgsv2021/users/suzuki.keisuke/temp/RS_deadline_test/rs_dl_render_test.hip"
2024-01-29 05:20:50:  0: INFO: Full Command: "/opt/hfs19.5/bin/hython" "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/plugins/65b73325b5232bd1f0c087d3/hrender_dl.py" -f 1 1 1 -d /obj/ground_explosion/filecache1/render -tempdir "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-49-171/jobsData/65b73325b5232bd1f0c087d3/0_tempI6zHb0" -arnoldAbortOnLicenseFail 1 "/cgsv2021/users/suzuki.keisuke/temp/RS_deadline_test/rs_dl_render_test.hip"
2024-01-29 05:20:50:  0: INFO: Startup Directory: "/opt/hfs19.5/bin"
2024-01-29 05:20:50:  0: INFO: Process Priority: BelowNormal
2024-01-29 05:20:50:  0: INFO: Process Affinity: default
2024-01-29 05:20:50:  0: INFO: Process is now running
2024-01-29 05:20:51:  0: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2024-01-29 05:20:51:  0: DEBUG: Request:
2024-01-29 05:20:51:  0: DEBUG: 	JobId: 65b73325b5232bd1f0c087d3
2024-01-29 05:20:51:  0: Done executing plugin command of type 'Render Task'

=======================================================
Details
=======================================================
Date: 01/29/2024 05:20:54
Frames: 1
Elapsed Time: 00:00:00:08
Job Submit Date: 01/29/2024 05:09:56
Job User: iei.shinji
Average RAM Usage: 844939264 (6%)
Peak RAM Usage: 908918784 (6%)
Average CPU Usage: 8%
Peak CPU Usage: 20%
Used CPU Clocks (x10^6 cycles): 9882
Total CPU Clocks (x10^6 cycles): 123522

=======================================================
Worker Information
=======================================================
Worker Name: ip-10-128-49-171
Version: v10.1.21.3 Release (2efbdb379)
Operating System: Linux
Machine User: ec2-user
IP Address: 10.128.49.171
MAC Address: 0E:AF:8F:CD:EB:E7
CPU Architecture: x86_64
CPUs: 8
CPU Usage: 0%
Memory Usage: 725.0 MB / 15.2 GB (4%)
Free Disk Space: 3.022 GB 
Video Card: Amazon.com, Inc. Device 1111

=======================================================
AWS Information
=======================================================
Instance ID: i-00939e3a90dc5c427
Instance Type: c5.2xlarge
Image ID: ami-0b7699e588458b810
Region: ap-northeast-1
Architecture: x86_64
Availability Zone: ap-northeast-1d

I bought a Redshift UBL license, but I didn't buy a Houdini UBL license.
Do I need a Houdini UBL license? And do I need to install a Houdini license server?

Yep, you’ll need a Houdini license to run Houdini here. You won’t need a Houdini license server on the machine.
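For reference, that error is raised by the Houdini plugin itself: it watches hython's stdout, and when a license-failure line appears it calls FailRender with the matched text (that's the HandleStdoutLicense frame in your stack trace). An illustrative, self-contained sketch of that flow, with the Deadline callbacks stubbed out so it can run outside the sandbox:

import re

class FailRenderException(Exception):
    pass

class MiniHoudiniPlugin:
    # Illustrative regex; the real pattern lives in the repository's Houdini.py.
    LICENSE_RE = re.compile(r"(No licenses could be found to run this application)")

    def handle_stdout_line(self, line):
        match = self.LICENSE_RE.search(line)
        if match:
            # Mirrors HandleStdoutLicense: FailRender(GetRegexMatch(1))
            self.fail_render(match.group(1))

    def fail_render(self, message):
        raise FailRenderException(message)

try:
    MiniHoudiniPlugin().handle_stdout_line("No licenses could be found to run this application")
except FailRenderException as error:
    print("FailRenderException :", error)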

I then bought a Houdini UBL license and a Deadline UBL license and downloaded the certificates.
But this error still happens. What should I do? Please advise.

Make sure you’ve set up a license limit for your Houdini UBL and applied it to the job.

Use the steps here to set up usage-based licensing: Usage-Based Licensing — Deadline 10.3.1.4 documentation

Please check our settings.

◆UBL settings


If I want to give priority to UBL, should the Standard Threshold be greater than 0 Machines for the UBL limit, and for the Workers as well?

◆Limit settings


This is the Houdini license limit. Is that right? Redshift also has the same settings.

Best Regards
Taka

That all looks good, as long as both the Redshift and Houdini limits are set on the job. This page describes how; I would have shared it earlier, but I just noticed you're running AWS Portal.
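For reference, if you ever submit outside the Monitor, the limits are attached via the job info file. A hedged sketch (the limit names are placeholders for whatever you called your Houdini and Redshift UBL limits):

# Append the LimitGroups key to a job info file before submitting it.
# "houdini-ubl" and "redshift-ubl" are placeholder limit names.
with open("job_info.txt", "a", encoding="utf-8") as job_info:
    job_info.write("LimitGroups=houdini-ubl,redshift-ubl\n")

In the Monitor, the same thing can be done from the job's properties by adding both limits to the job.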

Thinking of that, what version of Deadline are you running? On Jan 21st you said it was a mix of 10.2 and 10.3, but the task report you shared on Jan 28th is running 10.1. If you're setting up a new farm, I recommend running 10.3 everywhere, as mixing versions can be problematic, especially when using AWS Portal.

Thank you for your response.
I updated AWS Portal to the latest version today,
but I still have the same error.
So I will check the infrastructure, the Worker, and the S3 bucket.
I have an SSH connection to the infrastructure, and from there I tried to connect to the Worker over SSH.
I can't connect to the Worker, and I don't know why. This is the error:

[ec2-user@ip-10-128-2-4 Deadline10]$ ssh ec2-user@10.128.38.43
Permission denied (publickey,gssapi-keyex,gssapi-with-mic).

◆Infrastructure
What specific files and status should the infrastructure have?
What should I check?

◆Worker
What specific files should the Worker have?
What should I check?

◆S3


Is this right?

Where are houdini.pfx, redshift.pfx, and deadline.pfx (the UBL certificates)?

Either you need to enable key forwarding, or the key is not installed on the Worker for some reason.
Could you check whether the infrastructure has the key under ~/.ssh/? I guess it is called Dash or something.
Nowadays you can probably open a shell to a host via the browser and avoid all the key issues; I haven't tried it.

Glad you’re on the latest now, that should make life a little better. :slight_smile:

To connect to a Worker over SSH you'll need agent forwarding, or, more simply, you can set up Session Manager. If you'd rather use SSH, you can follow the steps starting here.

The UBL certificates should be in the bucket there. UBL certificates can fail to upload if you haven't set up the UBL Certificate Directory, or if the machine you are starting the infrastructure from doesn't have access to the Usage-Based Licensing Certificate Directory that you specified in the AWS Portal Settings.

I’d also check the other troubleshooting options on this page - Job Report Says it Failed to Checkout Third Party License — Deadline 10.3.1.4 documentation
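If you want to confirm the certificates actually landed in the bucket, a quick sketch with boto3 (the bucket name is a placeholder; use the AWS Portal bucket from your S3 console):

import boto3

s3 = boto3.client("s3")

# Placeholder bucket name for the AWS Portal S3 bucket.
BUCKET = "your-aws-portal-bucket"

# List every .pfx object so you can see whether the UBL certificates uploaded.
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        if obj["Key"].endswith(".pfx"):
            print(obj["Key"])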

Thank you for the advice.

I changed the UBL certificate directory, so the certificates now go into the bucket.
That error is gone.
But the render job doesn't progress; its status stays Queued.
I think the asset server setting is wrong, because there are no assets in the S3 bucket.

When the client (Windows) submits a job (path: \cgsv2021\project01\test~~~~), are my asset server (Linux) and path mapping settings right?


What other settings should I configure?

Best Regards
Taka

Please help me understand the scenario.

  1. Your assets live here: /mnt/cgsv/
  2. You are submitting from Windows
  3. The asset paths in the scene belong to which OS?
  4. And which OS are you running the asset server on?