AWS Thinkbox Discussion Forums

Locked file stopped pending jobs from being queued

I'm having a problem with pending jobs not becoming active when a previous job is locked (a completed job that can't be deleted). All jobs that were pending and had completed dependencies were never made active while the completed job was locked. I also submitted new jobs after the file was locked, and all of the new ones rendered and completed as usual, but the previously pending jobs never became active. When I got into the office this morning, I manually deleted the locked completed job from the repository job folder. Once this was done, all the pending jobs were made active and were picked up. Is this the first time this problem has come up?
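For what it's worth, the manual cleanup I did was roughly equivalent to the sketch below; the repository path, the job ID, and the "jobs" subfolder name are just placeholders/assumptions about our setup, not an official tool.

```python
# Rough sketch of the manual cleanup described above (not an official tool).
# REPO_ROOT, JOB_ID and the "jobs" subfolder are placeholders/assumptions.
import os
import shutil
import stat

REPO_ROOT = r"\\server\DeadlineRepository"   # hypothetical repository path
JOB_ID = "stuck_completed_job_id"            # hypothetical job ID

job_folder = os.path.join(REPO_ROOT, "jobs", JOB_ID)

if os.path.isdir(job_folder):
    # Clear read-only flags first so the locked job's files can be deleted.
    for root, _dirs, files in os.walk(job_folder):
        for name in files:
            os.chmod(os.path.join(root, name), stat.S_IWRITE)
    shutil.rmtree(job_folder)
    print("Removed stuck job folder:", job_folder)
else:
    print("Job folder not found:", job_folder)
```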

Thanks in advance.

Hi Ken,



I believe this is a first time problem, and I'm actually a little surprised that a locked completed job would prevent dependent jobs from resuming, especially when newly submitted dependent jobs work fine. I've logged this as a bug, and the information you've provided should help us reproduce the problem.

Thanks,

Hi Ryan,

If it happens again, is there anything you would like from me?

I think the information you provided in your first post should be enough, but if we have any questions, we'll let you know. Also, if you find that this is a frequently occurring problem (as opposed to a one-off random issue), let us know.

Thanks,
Ryan


Will do, Ryan. Thank you.

Hi Ken,



I tested this out with the beta version of Deadline 3.0 and this doesn't appear to be a problem with this version. I submitted 2 jobs in the suspended state, made one dependent on the other, and resumed the dependent job so that it was in the pending state. I then used the task menu to mark the first job as complete, and used a tool of ours to lock the job. Later on, the pending job resumed without any issues, with the first job still locked.
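If you wanted to script a similar test instead of clicking through the Monitor, it might look roughly like the sketch below. The deadlinecommand options shown (ResumeJob, CompleteJob) are assumptions based on the current command line documentation and may be spelled differently in older releases, the job IDs are placeholders, and the locking step is omitted because it was done with an internal tool.

```python
# Rough, unofficial sketch of scripting the dependency test described above.
# Assumes deadlinecommand is on the PATH and that ResumeJob/CompleteJob exist
# in your version; older releases may differ.
import subprocess

FIRST_JOB = "first_job_id_here"    # placeholder: the job to complete and lock
SECOND_JOB = "second_job_id_here"  # placeholder: the job that depends on the first

def deadline(*args):
    # Run deadlinecommand and echo whatever it prints.
    result = subprocess.run(["deadlinecommand", *args],
                            capture_output=True, text=True, check=True)
    print(result.stdout.strip())

# Resume the dependent job so it sits in the pending state, waiting on the first job.
deadline("ResumeJob", SECOND_JOB)

# Mark the first job as complete. The locking step is left out here, since it
# was done with an internal tool in the test above.
deadline("CompleteJob", FIRST_JOB)

# The dependent job should now leave the pending state on its own,
# even with the first job still locked.
```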



It would appear that this issue has been resolved in 3.0, but we will keep an eye on it when this version is released.



Cheers,


Sorry about that, I replied through your email rather than the thread.

Thank you for checking this out. Since we are using Deadline 2.7, could I possibly recreate this problem on my end, just to verify whether it was a one-time thing or not?

Hi Ken,



One way to simulate a job being locked is to open up the job directory (select the job in the monitor and hit ctrl-d), then set the permissions on the *.lock file to disable reading/writing permissions. If you do this, and then try to do anything with the job in the Monitor, it should say it's locked. Re-enabling the permissions on the *.lock file should then unlock the job.
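As a rough sketch, toggling those permissions from a script could look something like the following. The repository path and job ID are placeholders, and note that on Windows os.chmod only toggles the read-only flag, so you may need to change the NTFS permissions on the file directly instead (the idea is the same either way).

```python
# Rough sketch of simulating a locked job by flipping permissions on the
# job's *.lock file(s). Paths and the job ID are placeholders.
import glob
import os
import stat

REPO_ROOT = r"\\server\DeadlineRepository"   # hypothetical repository path
JOB_ID = "some_job_id_here"                  # hypothetical job ID

job_folder = os.path.join(REPO_ROOT, "jobs", JOB_ID)

def set_locked(locked):
    # Flip permissions on every *.lock file in the job directory.
    for lock_file in glob.glob(os.path.join(job_folder, "*.lock")):
        if locked:
            os.chmod(lock_file, 0)  # drop read/write access (read-only on Windows)
        else:
            os.chmod(lock_file, stat.S_IREAD | stat.S_IWRITE)  # restore access
        print(("Locked " if locked else "Unlocked ") + lock_file)

set_locked(True)    # the Monitor should now report the job as locked
# ... try to do something with the job in the Monitor ...
set_locked(False)   # the job should be unlocked again
```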



Cheers,

I have several files in the queue right now. When they are done, I will give it a try.

Thanks

