
single task failure, 7 error logs

Hi there,

We get some crashes that trigger 6+ error reports in Deadline… is that normal?

I see a task crash report, then a missing-sandbox error for the same task, then four further sandbox errors, and finally a stalled slave report :slight_smile:
Pretty excessive… Here are some of the actual errors:

=======================================================
Error
=======================================================
Sandbox Process is not available, may be disposed already.

=======================================================
Type
=======================================================
InvalidProgramException

=======================================================
Stack Trace
=======================================================
   at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
   at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
   at Deadline.Slaves.SlaveSchedulerThread.a(Int32 A_0, TimeSpan A_1, Exception A_2, AbortLevel A_3)
=======================================================
Error
=======================================================
Sandbox Process is not available, may be disposed already.

=======================================================
Type
=======================================================
InvalidProgramException

=======================================================
Stack Trace
=======================================================
   at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
   at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
   at Deadline.Slaves.SlaveSchedulerThread.a(Int32 A_0, TimeSpan A_1, Exception A_2, AbortLevel A_3)
   at Deadline.Slaves.SlaveSchedulerThread.b(Int32 A_0, Task A_1, TimeSpan A_2)
   at Deadline.Slaves.SlaveSchedulerThread.f()
   at Deadline.Slaves.SlaveSchedulerThread.h()
=======================================================
Error
=======================================================
Sandbox Process is not available, may be disposed already.

=======================================================
Type
=======================================================
InvalidProgramException

=======================================================
Stack Trace
=======================================================
   at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
   at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
   at Deadline.Slaves.SlaveSchedulerThread.a(Exception A_0)
=======================================================
Error
=======================================================
Sandbox Process is not available, may be disposed already.

=======================================================
Type
=======================================================
InvalidProgramException

=======================================================
Stack Trace
=======================================================
   at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
   at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
   at Deadline.Slaves.SlaveSchedulerThread.a(Exception A_0)
   at Deadline.Slaves.SlaveSchedulerThread.h()
   
=======================================================
Error
=======================================================
Sandbox Process is not available, may be disposed already.

=======================================================
Type
=======================================================
InvalidProgramException

=======================================================
Stack Trace
=======================================================
   at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
   at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
   at Deadline.Slaves.SlaveSchedulerThread.a(Exception A_0)

This was brought up here again because it is very confusing. Previously we had 1 error = 1 report, so when artists see 3-7 error reports for each task failure, they think their job is failing all over the place. Do these additional reports add to the error count, or is this counted as a single error for the task and job?
Could there be some way to filter or group error reports that belong to the same task failure?
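
In the meantime, the kind of grouping I mean could look something like this when scanning reports ourselves. This is only a sketch, not the Deadline API: it assumes the reports have already been reduced to (job, task, timestamp) records, and it simply merges reports for the same task that arrive within a short window:

```python
# Sketch only: collapse multiple error reports that belong to the same
# task failure into one logical "failure" record. Assumes each report has
# already been reduced to (job, task, time) fields -- this is NOT the
# Deadline API, just an illustration of the grouping idea.
from datetime import datetime, timedelta

WINDOW = timedelta(seconds=60)  # reports closer together than this are merged

def group_reports(reports):
    """reports: list of dicts with 'job', 'task', 'time' (datetime).
    Returns a list of failure groups, each a list of reports."""
    groups = []
    # Sort by job, task, then time so related reports end up adjacent.
    for r in sorted(reports, key=lambda r: (r["job"], r["task"], r["time"])):
        last = groups[-1] if groups else None
        if (last
                and last[-1]["job"] == r["job"]
                and last[-1]["task"] == r["task"]
                and r["time"] - last[-1]["time"] <= WINDOW):
            last.append(r)       # same task failure, just another report
        else:
            groups.append([r])   # new logical failure
    return groups

if __name__ == "__main__":
    t0 = datetime(2016, 9, 22, 18, 5, 16)
    sample = [
        {"job": "job_a", "task": 12, "time": t0},
        {"job": "job_a", "task": 12, "time": t0 + timedelta(seconds=1)},
        {"job": "job_a", "task": 12, "time": t0 + timedelta(seconds=1)},
        {"job": "job_a", "task": 30, "time": t0 + timedelta(minutes=5)},
    ]
    for g in group_reports(sample):
        print("task %s: %d reports -> 1 failure" % (g[0]["task"], len(g)))
```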

Hey Laszlo,

We’ll look into this; it definitely shouldn’t spam job error reports like that when the sandbox goes down.

We are getting quite a few of these crashes, by the way… I think we have fairly widespread slave crashes, but it’s not very visible because of a robust slave-restarting system we have in place…
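
For reference, a slave-restart watchdog doesn’t need to be complicated; a minimal sketch is below. The executable path and check interval are placeholders rather than our actual setup, and a real version needs more care around multiple slave instances and logging:

```python
# Sketch of a minimal slave-restart watchdog (illustrative only).
# The deadlineslave executable path below is a placeholder -- adjust it
# for the actual Deadline install; a production restart system will differ.
import subprocess
import time

import psutil  # third-party: pip install psutil

SLAVE_EXE = r"C:\Program Files\Thinkbox\Deadline8\bin\deadlineslave.exe"  # placeholder path
CHECK_INTERVAL = 30  # seconds between liveness checks

def slave_running():
    """Return True if any running process looks like the Deadline slave."""
    for p in psutil.process_iter(attrs=["name"]):
        name = (p.info["name"] or "").lower()
        if name.startswith("deadlineslave"):
            return True
    return False

def main():
    while True:
        if not slave_running():
            print("slave process not found, relaunching")
            subprocess.Popen([SLAVE_EXE])  # fire and forget; output not captured here
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    main()
```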

Another example (v8.0.5.1):

2016-09-22 18:05:16:  2016/09/22 16:16:35 DBG: [15788] [19644] [Flowline-DEBUG] ImplicitElement_Close|ImplicitSurface : Block: KDTree 12 Processing 93.750000 Percent of current Cache. Totally Processed 22 Buckets: of 32, Skipped: 9,Processed Elements: 2398742 of 2398750
2016-09-22 18:05:16:  2016/09/22 16:16:35 ERR: [15788] [19644] [Flowline-TRACE] ImplicitElement_Close|ImplicitSurface : BeginAccess: Successfully Locked HyperGrid from Block 10, Processsing Thread 12, Total Processed 19 of 19
2016-09-22 18:05:16:  VRAY LOG's last 2048 characters:
2016-09-22 18:05:16:  ighArmor_001", RGB: -0.0134657 -0.00592822 -0.00698195)
2016-09-22 18:05:16:  [2016/Sep/22|16:17:31]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_beltLogo_001", RGB: -0.0111547 0.000337806 0.00215418)
2016-09-22 18:05:16:  [2016/Sep/22|16:18:14]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_beltStrap_002", RGB: -0.0120526 -0.000669219 0.0008055)
2016-09-22 18:05:16:  [2016/Sep/22|16:22:52]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_underPant_001", RGB: -0.0110411 -0.00098986 -0.00192043)
2016-09-22 18:05:16:  [2016/Sep/22|16:36:28]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_thighArmorDetails_002", RGB: -0.00978682 0.00469476 0.00407592)
2016-09-22 18:05:16:  [2016/Sep/22|16:40:32]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_thighArmor_003", RGB: -0.0104973 0.000672316 -0.00202669)
2016-09-22 18:05:16:  [2016/Sep/22|16:41:58]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_thighArmor_002", RGB: -0.00800016 0.00424465 0.0016039)
2016-09-22 18:05:16:  [2016/Sep/22|16:47:01]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_thighArmorDetails_001", RGB: -0.0110159 0.00195243 0.00225154)
2016-09-22 18:05:16:  [2016/Sep/22|16:55:15]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_r_high_bootArmor_002", RGB: -0.00815276 0.00483654 0.00310183)
2016-09-22 18:05:16:  [2016/Sep/22|17:19:57]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_bolt_001", RGB: -0.00842732 0.00338297 0.0030548)
2016-09-22 18:05:16:  [2016/Sep/22|17:34:51]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_aquaTongue_001", RGB: -0.000521041 0.00242578 -0.00207054)
2016-09-22 18:05:16:  [2016/Sep/22|17:45:23]         warning: Material returned overbright or invalid color (object "chr_AquaManA-00_mesh_m_high_aquaTeethGumDown_001", RGB: -0.000422536 0.00258225 -0.00283242)
2016-09-22 18:05:16:  )
2016-09-22 18:05:16:     at Deadline.Plugins.PluginWrapper.RenderTasks(String taskId, Int32 startFrame, Int32 endFrame, String& outMessage, AbortLevel& abortLevel)
2016-09-22 18:05:16:  RenderPluginException.Cause: JobError (2)
2016-09-22 18:05:16:  RenderPluginException.Level: Major (1)
2016-09-22 18:05:16:  RenderPluginException.HasSlaveLog: True
2016-09-22 18:05:16:  RenderPluginException.SlaveLogFileName: C:\ProgramData\Thinkbox\Deadline8\logs\deadlineslave_renderthread_0-LAPRO0629-0000.log
2016-09-22 18:05:16:  Exception.Data: ( )
2016-09-22 18:05:16:  Exception.TargetSite: Void RenderTask(System.String, Int32, Int32)
2016-09-22 18:05:16:  Exception.Source: deadline
2016-09-22 18:05:16:  Exception.HResult: -2146233088
2016-09-22 18:05:16:    Exception.StackTrace: 
2016-09-22 18:05:16:     at Deadline.Plugins.Plugin.RenderTask(String taskId, Int32 startFrame, Int32 endFrame)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveRenderThread.a(TaskLogWriter A_0)
2016-09-22 18:05:16:  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2016-09-22 18:05:16:  Scheduler Thread - Unexpected Error Occurred While Handling Exception
2016-09-22 18:05:16:  >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2016-09-22 18:05:16:  Exception Details
2016-09-22 18:05:16:  InvalidProgramException -- Sandbox Process is not available, may be disposed already.
2016-09-22 18:05:16:  Exception.Data: ( )
2016-09-22 18:05:16:  Exception.TargetSite: Void OnJobError(Deadline.Jobs.Job, System.String[], Deadline.Jobs.Task, Deadline.Reports.Report, Deadline.Controllers.DataController)
2016-09-22 18:05:16:  Exception.Source: deadline
2016-09-22 18:05:16:  Exception.HResult: -2146233030
2016-09-22 18:05:16:    Exception.StackTrace: 
2016-09-22 18:05:16:     at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
2016-09-22 18:05:16:     at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.a(Int32 A_0, TimeSpan A_1, Exception A_2, AbortLevel A_3)
2016-09-22 18:05:16:  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2016-09-22 18:05:16:  Scheduler Thread - Unexpected Error Occurred
2016-09-22 18:05:16:  >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2016-09-22 18:05:16:  Exception Details
2016-09-22 18:05:16:  InvalidProgramException -- Sandbox Process is not available, may be disposed already.
2016-09-22 18:05:16:  Exception.Data: ( )
2016-09-22 18:05:16:  Exception.TargetSite: Void OnJobError(Deadline.Jobs.Job, System.String[], Deadline.Jobs.Task, Deadline.Reports.Report, Deadline.Controllers.DataController)
2016-09-22 18:05:16:  Exception.Source: deadline
2016-09-22 18:05:16:  Exception.HResult: -2146233030
2016-09-22 18:05:16:    Exception.StackTrace: 
2016-09-22 18:05:16:     at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
2016-09-22 18:05:16:     at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.a(Int32 A_0, TimeSpan A_1, Exception A_2, AbortLevel A_3)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.b(Int32 A_0, Task A_1, TimeSpan A_2)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.f()
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.h()
2016-09-22 18:05:16:  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2016-09-22 18:05:16:  Scheduler Thread - Unexpected Error Occurred While Handling Exception (task requeue)
2016-09-22 18:05:16:  >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2016-09-22 18:05:16:  Exception Details
2016-09-22 18:05:16:  InvalidProgramException -- Sandbox Process is not available, may be disposed already.
2016-09-22 18:05:16:  Exception.Data: ( )
2016-09-22 18:05:16:  Exception.TargetSite: Void OnJobError(Deadline.Jobs.Job, System.String[], Deadline.Jobs.Task, Deadline.Reports.Report, Deadline.Controllers.DataController)
2016-09-22 18:05:16:  Exception.Source: deadline
2016-09-22 18:05:16:  Exception.HResult: -2146233030
2016-09-22 18:05:16:    Exception.StackTrace: 
2016-09-22 18:05:16:     at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
2016-09-22 18:05:16:     at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.a(Exception A_0)
2016-09-22 18:05:16:  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2016-09-22 18:05:16:  Scheduler Thread - exception occurred:
2016-09-22 18:05:16:  Scheduler Thread - Unexpected Error Occurred
2016-09-22 18:05:16:  >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2016-09-22 18:05:16:  Exception Details
2016-09-22 18:05:16:  InvalidProgramException -- Sandbox Process is not available, may be disposed already.
2016-09-22 18:05:16:  Exception.Data: ( )
2016-09-22 18:05:16:  Exception.TargetSite: Void OnJobError(Deadline.Jobs.Job, System.String[], Deadline.Jobs.Task, Deadline.Reports.Report, Deadline.Controllers.DataController)
2016-09-22 18:05:16:  Exception.Source: deadline
2016-09-22 18:05:16:  Exception.HResult: -2146233030
2016-09-22 18:05:16:    Exception.StackTrace: 
2016-09-22 18:05:16:     at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
2016-09-22 18:05:16:     at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.a(Exception A_0)
2016-09-22 18:05:16:     at Deadline.Slaves.SlaveSchedulerThread.h()
2016-09-22 18:05:16:  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
2016-09-22 18:05:17:  Scheduler Thread - Unexpected Error Occurred While Handling Exception (task requeue)
2016-09-22 18:05:17:  >>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
2016-09-22 18:05:17:  Exception Details
2016-09-22 18:05:17:  InvalidProgramException -- Sandbox Process is not available, may be disposed already.
2016-09-22 18:05:17:  Exception.Data: ( )
2016-09-22 18:05:17:  Exception.TargetSite: Void OnJobError(Deadline.Jobs.Job, System.String[], Deadline.Jobs.Task, Deadline.Reports.Report, Deadline.Controllers.DataController)
2016-09-22 18:05:17:  Exception.Source: deadline
2016-09-22 18:05:17:  Exception.HResult: -2146233030
2016-09-22 18:05:17:    Exception.StackTrace: 
2016-09-22 18:05:17:     at Deadline.Events.SandboxedEventManager.OnJobError(Job job, String[] auxiliaryFilenames, Task task, Report errorReport, DataController dataController)
2016-09-22 18:05:17:     at Deadline.Controllers.DataController.ReportError(Exception e, Slave slave, IPlugin plugin, Job job, Task task, TimeSpan taskTimeElapsed)
2016-09-22 18:05:17:     at Deadline.Slaves.SlaveSchedulerThread.a(Exception A_0)
2016-09-22 18:05:17:  <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

Based on this log, I can’t tell what caused the crash… but the slave went down, and 3ds Max went down with it.
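
When this happens, the most useful clue we get is usually the tail of the render-thread log the report points at. A quick sketch for grabbing it (the path is the one from the report above; adjust per machine):

```python
# Sketch: dump the last lines of the render-thread log referenced in the
# error report above, to see what the process was doing right before it died.
import collections

LOG_PATH = r"C:\ProgramData\Thinkbox\Deadline8\logs\deadlineslave_renderthread_0-LAPRO0629-0000.log"
TAIL_LINES = 200  # how many trailing lines to keep

with open(LOG_PATH, "r", errors="replace") as f:
    tail = collections.deque(f, maxlen=TAIL_LINES)  # keeps only the last N lines

for line in tail:
    print(line.rstrip())
```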

As you can see here, this caused quite a few crashes on this one job alone (not all of the errors fit into the image; there were about 30% more):
[attachment=0]Capture.PNG[/attachment]
