I have a custom event plugin that submits follow-up jobs from incoming jobs using the OnJobSubmitted() callback.
Pseudo-code:
def OnJobSubmitted(self, job):
    if job.JobPlugin == "Houdini":
        print("This is a Houdini job")
        self.submitNewNukeJob()
    elif job.JobPlugin == "Nuke":
        print("This is a Nuke job")
        self.submitNewFFmpegJob()
    elif job.JobPlugin == "FFmpeg":
        print("This is an FFmpeg job")
Generally it seems to work like a charm, except that each job's report contains the print() logs of all the subsequent jobs.
So, to debug, in each plugin's logic I added print("Current job plugin:", job.JobPlugin)
This is my first job's report (Houdini):
=======================================================
Log
=======================================================
2024-07-15 13:45:21: MyEventPlugin: Current job plugin: Houdini
2024-07-15 13:45:22: MyEventPlugin: Current job plugin: Nuke
2024-07-15 13:45:22: MyEventPlugin: Current job plugin: FFmpeg
This is the Nuke job report (created from the Houdini job):
=======================================================
Log
=======================================================
2024-07-15 13:45:22: MyEventPlugin: Current job plugin: Nuke
2024-07-15 13:45:22: MyEventPlugin: Current job plugin: FFmpeg
And this is the FFmpeg job report (created from the Nuke job):
=======================================================
Log
=======================================================
2024-07-15 13:45:22: MyEventPlugin: Current job plugin: FFmpeg
I wanted to ask: what do you think about this, and what could I try in order to fix it? Cheers
As I'm reading it, you're submitting a Houdini job that submits a Nuke job that submits an FFMPEG job. So there's an OnJobSubmitted event in an OnJobSubmitted event in an OnJobSubmitted event.
I think the trouble is the nested calls to OnJobSubmitted, and that you're seeing the results of your stack unwinding in the logs.
I'd try splitting your job-creating OnJobSubmitted events into separate event plugins instead of doing it all in the same file. That should separate the event plugin names, and thus the logging, into their own report files.
Hi Justin, I submit these jobs by creating jobinfo and plugininfo files and sending them with ClientUtils.ExecuteCommandAndGetOutput(arguments).
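Roughly like this (the file names, frame range, and values here are just illustrative placeholders, not my real submission code):

    import os
    import tempfile

    from Deadline.Scripting import ClientUtils

    # In the real plugin this is a method on the event listener class.
    def submitNewNukeJob(self):
        temp_dir = tempfile.gettempdir()

        # Write the job info file
        job_info_path = os.path.join(temp_dir, "nuke_job_info.job")
        with open(job_info_path, "w") as f:
            f.write("Plugin=Nuke\n")
            f.write("Name=Follow-up Nuke job\n")
            f.write("Frames=1-100\n")

        # Write the plugin info file
        plugin_info_path = os.path.join(temp_dir, "nuke_plugin_info.job")
        with open(plugin_info_path, "w") as f:
            f.write("SceneFile=/path/to/comp.nk\n")

        # Hand both files to deadlinecommand
        arguments = [job_info_path, plugin_info_path]
        output = ClientUtils.ExecuteCommandAndGetOutput(arguments)
        print(output)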
I’m not sure what you mean by separating events.
My original assumption was that it works like this:
1. Event plugin sees the new job.
2. Event plugin executes its code.
3. Event plugin ends.
So it happens like a domino effect. Does this fit the definition of being nested? Each consecutive event is executed because there is a new job submitted, and not because of Python code from the previous event (at least not directly). But the evidence clearly shows this theory is flawed somewhere.
Just in case this is what you meant: my files are separated. I dynamically import each plugin's logic depending on what the current job.JobPlugin is (using Python's importlib).
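Roughly like this (the module naming scheme is made up for the example):

    import importlib

    def OnJobSubmitted(self, job):
        # Pick the logic module based on the submitted job's plugin,
        # e.g. "logic_houdini", "logic_nuke", "logic_ffmpeg".
        # (The event plugin's folder is assumed to already be on sys.path.)
        module_name = "logic_" + job.JobPlugin.lower()
        try:
            logic = importlib.import_module(module_name)
        except ImportError:
            return  # no follow-up logic for this job type
        logic.run(self, job)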
I mean, have separate event plugins; that should get the logs separated. So instead of having your current CreateJobs event (to make up a name), you have CreateHoudini, CreateNuke, and CreateFFMPEG.
So instead of CreateJobs running an instance of CreateJobs within itself, a different event plugin is run. My thinking is that the different events should result in separate log files, since it looks like all the CreateJobs logging is grouped together and getting included on jobs that shouldn't have it.
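To illustrate, each of those event plugins would only react to the one job type it cares about, something like this for the CreateNuke one (just a sketch using the made-up names from above):

    # CreateNuke.py -- only reacts to Houdini jobs and submits the
    # follow-up Nuke job; every other job type is ignored.
    def OnJobSubmitted(self, job):
        if job.JobPlugin != "Houdini":
            return
        self.submitNewNukeJob()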
That's a very creative approach, I have to say. Way too much effort for extremely little gain, so I'm probably not going to use it, but I really appreciate the idea; it sounds like it would totally work. Thanks!