AWS Thinkbox Discussion Forums

【FailRenderException: No licenses could be found to run this application】 What should I do?

Hi all,

I just tried to submit a job from an on-premises client to a Linux worker in the cloud that AWS Portal built.
Judging from the error below, I think the worker can't get a Houdini license.

=======================================================
Error
=======================================================
FailRenderException : No licenses could be found to run this application
  at Deadline.Plugins.DeadlinePlugin.FailRender(String message) (Python.Runtime.PythonException)
 File "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/plugins/65cc9f93b5232bd1f0c0bfb2/Houdini.py", line 418, in HandleStdoutLicense
   self.FailRender(self.GetRegexMatch(1))
  at Python.Runtime.Dispatcher.Dispatch(ArrayList args)
  at __FranticX_Processes_ManagedProcess_StdoutHandlerDelegateDispatcher.Invoke()
  at FranticX.Processes.ManagedProcess.RegexHandlerCallback.CallFunction()
  at FranticX.Processes.ManagedProcess.e(String cj, Boolean ck)
  at FranticX.Processes.ManagedProcess.Execute(Boolean waitForExit)
  at Deadline.Plugins.DeadlinePlugin.DoRenderTasks()
  at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)
  at Deadline.Plugins.PluginWrapper.RenderTasks(Task task, String& outMessage, AbortLevel& abortLevel)
=======================================================
Type
=======================================================
RenderPluginException
=======================================================
Stack Trace
=======================================================
  at Deadline.Plugins.SandboxedPlugin.d(DeadlineMessage bgm, CancellationToken bgn)
  at Deadline.Plugins.SandboxedPlugin.RenderTask(Task task, CancellationToken cancellationToken)
  at Deadline.Slaves.SlaveRenderThread.c(TaskLogWriter ajt, CancellationToken aju)
=======================================================
Log
=======================================================
2024-02-14 11:11:57:  0: Loading Job's Plugin timeout is Disabled
2024-02-14 11:11:57:  0: SandboxedPlugin: Render Job As User disabled, running as current user 'ec2-user'
2024-02-14 11:11:59:  0: Executing plugin command of type 'Initialize Plugin'
2024-02-14 11:11:59:  0: INFO: Executing plugin script '/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/plugins/65cc9f93b5232bd1f0c0bfb2/Houdini.py'
2024-02-14 11:12:00:  0: INFO: Plugin execution sandbox using Python version 3
2024-02-14 11:12:00:  0: INFO: About: Houdini Plugin for Deadline
2024-02-14 11:12:00:  0: INFO: The job's environment will be merged with the current environment before rendering
2024-02-14 11:12:00:  0: Done executing plugin command of type 'Initialize Plugin'
2024-02-14 11:12:00:  0: Start Job timeout is disabled.
2024-02-14 11:12:00:  0: Task timeout is disabled.
2024-02-14 11:12:00:  0: Loaded job: rs_dl_render_test - /out/Redshift_ROP1 (65cc9f93b5232bd1f0c0bfb2)
2024-02-14 11:12:00:  0: Executing plugin command of type 'Start Job'
2024-02-14 11:12:00:  0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2024-02-14 11:12:00:  0: DEBUG: Request:
2024-02-14 11:12:00:  0: DEBUG: 	JobId: 65cc9f93b5232bd1f0c0bfb2
2024-02-14 11:12:00:  0: DEBUG: 	JobUploadWhitelist: rs_dl_render_test.Redshift_ROP1.####.exr
2024-02-14 11:12:00:  0: DEBUG: 	JobUploadWhitelistRe: ^.+\.abc$, ^.+\.avi$, ^.+\.bmp$, ^.+\.bw$, ^.+\.cin$, ^.+\.cjp$, ^.+\.cjpg$, ^.+\.cxr$, ^.+\.dds$, ^.+\.dpx$, ^.+\.dwf$, ^.+\.dwfx$, ^.+\.dwg$, ^.+\.dxf$, ^.+\.dxx$, ^.+\.eps$, ^.+\.exr$, ^.+\.fbx$, ^.+\.fxr$, ^.+\.hdr$, ^.+\.icb$, ^.+\.iff$, ^.+\.iges$, ^.+\.igs$, ^.+\.int$, ^.+\.inta$, ^.+\.iris$, ^.+\.jpe$, ^.+\.jpeg$, ^.+\.jpg$, ^.+\.jp2$, ^.+\.mcc$, ^.+\.mcx$, ^.+\.mov$, ^.+\.mxi$, ^.+\.pdf$, ^.+\.pic$, ^.+\.png$, ^.+\.prt$, ^.+\.ps$, ^.+\.psd$, ^.+\.rgb$, ^.+\.rgba$, ^.+\.rla$, ^.+\.rpf$, ^.+\.sat$, ^.+\.sgi$, ^.+\.stl$, ^.+\.sxr$, ^.+\.targa$, ^.+\.tga$, ^.+\.tif$, ^.+\.tiff$, ^.+\.tim$, ^.+\.vda$, ^.+\.vrimg$, ^.+\.vrmesh$, ^.+\.vrsm$, ^.+\.vrst$, ^.+\.vst$, ^.+\.wmf$, ^.+\.ass$, ^.+\.gz$, ^.+\.ifd$, ^.+\.mi$, ^.+\.mi2$, ^.+\.mxi$, ^.+\.rib$, ^.+\.rs$, ^.+\.vrscene$
2024-02-14 11:12:00:  0: DEBUG: S3BackedCache Client Returned Sequence: 2
2024-02-14 11:12:00:  0: INFO: Executing global asset transfer preload script '/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/plugins/65cc9f93b5232bd1f0c0bfb2/GlobalAssetTransferPreLoad.py'
2024-02-14 11:12:00:  0: INFO: Looking for legacy (pre-10.0.26) AWS Portal File Transfer...
2024-02-14 11:12:00:  0: INFO: Looking for legacy (pre-10.0.26) File Transfer controller in /opt/Thinkbox/S3BackedCache/bin/task.py...
2024-02-14 11:12:00:  0: INFO: Could not find legacy (pre-10.0.26) AWS Portal File Transfer.
2024-02-14 11:12:00:  0: INFO: Legacy (pre-10.0.26) AWS Portal File Transfer is not installed on the system.
2024-02-14 11:12:00:  0: Done executing plugin command of type 'Start Job'
2024-02-14 11:12:00:  0: Plugin rendering frame(s): 1
2024-02-14 11:12:00:  0: Executing plugin command of type 'Render Task'
2024-02-14 11:12:00:  0: INFO: Sending StartTaskRequest to S3BackedCacheClient.
2024-02-14 11:12:00:  0: DEBUG: Request:
2024-02-14 11:12:00:  0: DEBUG: 	JobId: 65cc9f93b5232bd1f0c0bfb2
2024-02-14 11:12:00:  0: DEBUG: 	JobUploadWhitelist: rs_dl_render_test.Redshift_ROP1.####.exr
2024-02-14 11:12:00:  0: DEBUG: 	JobUploadWhitelistRe: ^.+\.abc$, ^.+\.avi$, ^.+\.bmp$, ^.+\.bw$, ^.+\.cin$, ^.+\.cjp$, ^.+\.cjpg$, ^.+\.cxr$, ^.+\.dds$, ^.+\.dpx$, ^.+\.dwf$, ^.+\.dwfx$, ^.+\.dwg$, ^.+\.dxf$, ^.+\.dxx$, ^.+\.eps$, ^.+\.exr$, ^.+\.fbx$, ^.+\.fxr$, ^.+\.hdr$, ^.+\.icb$, ^.+\.iff$, ^.+\.iges$, ^.+\.igs$, ^.+\.int$, ^.+\.inta$, ^.+\.iris$, ^.+\.jpe$, ^.+\.jpeg$, ^.+\.jpg$, ^.+\.jp2$, ^.+\.mcc$, ^.+\.mcx$, ^.+\.mov$, ^.+\.mxi$, ^.+\.pdf$, ^.+\.pic$, ^.+\.png$, ^.+\.prt$, ^.+\.ps$, ^.+\.psd$, ^.+\.rgb$, ^.+\.rgba$, ^.+\.rla$, ^.+\.rpf$, ^.+\.sat$, ^.+\.sgi$, ^.+\.stl$, ^.+\.sxr$, ^.+\.targa$, ^.+\.tga$, ^.+\.tif$, ^.+\.tiff$, ^.+\.tim$, ^.+\.vda$, ^.+\.vrimg$, ^.+\.vrmesh$, ^.+\.vrsm$, ^.+\.vrst$, ^.+\.vst$, ^.+\.wmf$, ^.+\.ass$, ^.+\.gz$, ^.+\.ifd$, ^.+\.mi$, ^.+\.mi2$, ^.+\.mxi$, ^.+\.rib$, ^.+\.rs$, ^.+\.vrscene$
2024-02-14 11:12:00:  0: DEBUG: S3BackedCache Client Returned Sequence: 2
2024-02-14 11:12:00:  0: INFO: Set HOUDINI_PATHMAP to {"/mnt/cgsv2021/":"/mnt/Data/mntcgsv2021386d7e3204ea8557cff0f48192a0df69/"}
2024-02-14 11:12:00:  0: INFO: Redshift Path Mapping...
2024-02-14 11:12:00:  0: INFO: source: "/mnt/cgsv2021/" dest: "/mnt/Data/mntcgsv2021386d7e3204ea8557cff0f48192a0df69/"
2024-02-14 11:12:00:  0: INFO: [REDSHIFT_PATHOVERRIDE_FILE] now set to: "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/jobsData/65cc9f93b5232bd1f0c0bfb2/RSMapping_tempmCX2P0/RSMapping.txt"
2024-02-14 11:12:00:  0: INFO: Starting Houdini Job
2024-02-14 11:12:00:  0: INFO: Stdout Redirection Enabled: True
2024-02-14 11:12:00:  0: INFO: Asynchronous Stdout Enabled: False
2024-02-14 11:12:00:  0: INFO: Stdout Handling Enabled: True
2024-02-14 11:12:00:  0: INFO: Popup Handling Enabled: True
2024-02-14 11:12:00:  0: INFO: QT Popup Handling Enabled: False
2024-02-14 11:12:00:  0: INFO: WindowsForms10.Window.8.app.* Popup Handling Enabled: False
2024-02-14 11:12:00:  0: INFO: Using Process Tree: True
2024-02-14 11:12:00:  0: INFO: Hiding DOS Window: True
2024-02-14 11:12:00:  0: INFO: Creating New Console: False
2024-02-14 11:12:00:  0: INFO: Running as user: ec2-user
2024-02-14 11:12:00:  0: INFO: Executable: "/opt/hfs19.5/bin/hython"
2024-02-14 11:12:00:  0: INFO: Argument: "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/plugins/65cc9f93b5232bd1f0c0bfb2/hrender_dl.py" -f 1 1 1 -o "$HIP/render/$HIPNAME.$OS.$F4.exr" -d /out/Redshift_ROP1 -tempdir "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/jobsData/65cc9f93b5232bd1f0c0bfb2/0_tempaUA4L0" -arnoldAbortOnLicenseFail 1 "/cgsv2021/users/suzuki.keisuke/temp/RS_deadline_test/rs_dl_render_test.hip"
2024-02-14 11:12:00:  0: INFO: Full Command: "/opt/hfs19.5/bin/hython" "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/plugins/65cc9f93b5232bd1f0c0bfb2/hrender_dl.py" -f 1 1 1 -o "$HIP/render/$HIPNAME.$OS.$F4.exr" -d /out/Redshift_ROP1 -tempdir "/var/lib/Thinkbox/Deadline10/workers/ip-10-128-25-93/jobsData/65cc9f93b5232bd1f0c0bfb2/0_tempaUA4L0" -arnoldAbortOnLicenseFail 1 "/cgsv2021/users/suzuki.keisuke/temp/RS_deadline_test/rs_dl_render_test.hip"
2024-02-14 11:12:00:  0: INFO: Startup Directory: "/opt/hfs19.5/bin"
2024-02-14 11:12:00:  0: INFO: Process Priority: BelowNormal
2024-02-14 11:12:00:  0: INFO: Process Affinity: default
2024-02-14 11:12:00:  0: INFO: Process is now running
2024-02-14 11:12:02:  0: Sending kill command to process hython-bin with id: 9596
2024-02-14 11:12:02:  0: INFO: Sending EndTaskRequest to S3BackedCacheClient.
2024-02-14 11:12:02:  0: DEBUG: Request:
2024-02-14 11:12:02:  0: DEBUG: 	JobId: 65cc9f93b5232bd1f0c0bfb2
2024-02-14 11:12:02:  0: Done executing plugin command of type 'Render Task'
=======================================================
Details
=======================================================
Date: 02/14/2024 11:12:05
Frames: 1
Elapsed Time: 00:00:00:09
Job Submit Date: 02/14/2024 11:10:11
Job User: suenaga.daishi
Average RAM Usage: 827300992 (3%)
Peak RAM Usage: 886923264 (3%)
Average CPU Usage: 9%
Peak CPU Usage: 16%
Used CPU Clocks (x10^6 cycles): 8175
Total CPU Clocks (x10^6 cycles): 90826
=======================================================
Worker Information
=======================================================
Worker Name: ip-10-128-25-93
Version: v10.1.21.3 Release (2efbdb379)
Operating System: Linux
Machine User: ec2-user
IP Address: 10.128.25.93
MAC Address: 06:8C:D5:D5:7D:CD
CPU Architecture: x86_64
CPUs: 8
CPU Usage: 13%
Memory Usage: 954.0 MB / 31.4 GB (2%)
Free Disk Space: 2.274 GB
Video Card: Cirrus Logic GD 5446
=======================================================
AWS Information
=======================================================
Instance ID: i-0cfc29035e69e5ac9
Instance Type: m4.2xlarge
Image ID: ami-08602f40cdbc18dfa
Region: ap-northeast-1
Architecture: x86_64
Availability Zone: ap-northeast-1a

◆ License server status
(attached image: "license")

What setting do you think is missing?

Did you add a 'limit' to the job and set the license limit to use Houdini & Redshift UBL?

https://docs.thinkboxsoftware.com/products/deadline/10.3/1_User%20Manual/manual/licensing-usage-based.html#third-party-usage-based-licensing


Thank you for the reply.

I have this set up in the same way for Houdini.

What's wrong?
Please advise me.

You are running a Houdini & Redshift job, so you need both a Houdini Engine and a Redshift license. To use UBL, you'll need a limit for both Houdini and Redshift so the job can pick up a license of each.

If you can export to Redshift Standalone you’ll save the cost of Houdini Engine licenses.

Thank you for your response.
I want to resolve the error.
If the limit setting is not the problem, what else should I do?

Check that the ports for UBL access are open.

I'd also try to narrow down the issue by submitting a simple standalone Redshift job and a simple Houdini job, to see which one is having trouble getting a license.

You are using Deadline 10.1.21.3. Deadline became free to use with version 10.1.23.6, but I'm pretty sure it's always been free on AWS instances.

Thank you for the advice.

I have a question.
Which machines need to access the UBL license server URL?

  • Client(Job submitter)
  • Deadline repository
  • AWS Portal server

All of them, the client only, or just two of them?

Since it's an AWS Portal-created machine that's having the issue, you don't need to worry about communication between the EC2 instance and the UBL URL, assuming you haven't added anything to the security groups.

Work through the steps that start here (some of which you've already done) to double-check your setup.

My guess would be that you didn’t add the Houdini UBL limit you created to the job, but those steps should help confirm what’s going on.

I’d also consider upgrading to Deadline 10.3, as your version of Deadline pre-dates a re-working of AWS Portal that improved its logging.

Thank you for your advice.

My environment is a bit complicated.
The machines below can only access whitelisted URLs, via a proxy server.

  • Client(Job submitter)
  • Deadline repository

So I want to know whether those machines need to connect to the UBL server URL, and which port numbers are used.

Hello @hasekura.takamitsu

If the render node is unable to reach the URL, UBL won't work. Do you have any firewall rules that prevent the UBL traffic? The UBL port for Houdini is 1715, and for Redshift it is 5054 and 7054.
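As a quick sanity check, you can test TCP reachability of those UBL ports from any machine in question. This is a minimal sketch, not part of Deadline itself; the hostname `ubl.example.com` is a placeholder for your actual UBL/License Forwarder endpoint, and the port numbers are the ones listed above:

```python
import socket

# Ports mentioned above: Houdini UBL uses 1715; Redshift uses 5054 and 7054.
UBL_PORTS = {"houdini": [1715], "redshift": [5054, 7054]}

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failures, refused connections, and timeouts.
        return False

def check_ubl_ports(host: str) -> dict:
    """Check every UBL port for the given host and report open/blocked."""
    return {
        (product, port): port_reachable(host, port)
        for product, ports in UBL_PORTS.items()
        for port in ports
    }
```

Usage would be something like `check_ubl_ports("ubl.example.com")` run from the render node (or, behind a proxy, from whichever machine you expect to carry the traffic); any `False` result points at a firewall or routing problem for that product's port.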

Thank you for the reply.

Do you have any firewall rules which prevent the UBL traffic?

No. The render node was started in AWS via AWS Portal. So if I don't do anything special, I don't need to be aware of any firewall rules, right?

I just want to know whether the client and the Deadline repository need to access the UBL server URL.

So if I don't do anything special, I don't need to be aware of any firewall rules, right?
Correct, you do not need anything special in the firewall rules.

No, the on-premises clients do not need to access the UBL servers.

I will need to look at the verbose Worker and License Forwarder (LF) logs from the time of the issue to troubleshoot this further.

  • Enable verbose logging for the LF/Worker: in Deadline Monitor > Tools > Configure Repository Options > Application Data (left pane), check the options for LF/Worker verbose logging (bottom-right pane), then restart the LF/Worker.

  • Restart the Worker: from Deadline Monitor's Workers panel > right-click the Worker > Remote Control > Worker Commands > Restart Worker.

  • Restart the License Forwarder (LF): from Deadline Monitor > View > New Panel > License Forwarder > right-click > Remote Control LF > LF Commands > Restart LF.

  • Reproduce the issue and share the LF logs: from Deadline Monitor > View > New Panel > License Forwarder > right-click > Connect to LF Logs.

  • Reproduce the issue and share the Worker logs: from Deadline Monitor > Workers panel > right-click > Connect to Worker Logs.

Thank you all.

I hadn't attached the limit when I submitted the job.
Once I attached the limit, the license error disappeared.
Thanks to your advice, this problem is resolved.

But the job status on the worker is still "Queued".
I will create a new case.

Best Regards
Taka
