Deadline Logs to Kibana/Logstash

Does anyone have any info on importing Deadline job logs into Kibana/Logstash/Elasticsearch?

Interested in the response to this, and/or in helping to explore it.

So do you have any info on this? I have set up my Kibana server and just need help figuring out how to get Deadline logs into it.

The only way I can think of making this work would be to create a custom Logstash filter that is distributed to each machine/slave.

I've made an assumption in the conf file that it's a Linux machine; otherwise you can change the input file path:

input {
  file {
    path => "/var/log/Thinkbox/Deadline10/deadlineslave-*.log"
  }
}
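One tweak worth knowing for testing (an assumption about what you want; adjust to taste): the file input only tails new lines by default, so anything already in the logs is skipped. The start_position option tells Logstash to read existing files from the top on the first run:

input {
  file {
    path => "/var/log/Thinkbox/Deadline10/deadlineslave-*.log"
    # Read existing files from the top on first run,
    # instead of only tailing new lines (the default).
    start_position => "beginning"
  }
}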

filter {
  grok {
    # Split each line into a date, a time, and the rest of the message.
    match => { "message" => "%{DATE:date} %{TIME:time}: %{GREEDYDATA:msg}" }
  }
  # grok blocks have no else clause; instead, drop any event the pattern
  # failed to parse (grok tags those with _grokparsefailure).
  if "_grokparsefailure" in [tags] {
    drop { }
  }
}
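To show what that grok pattern produces (the log line below is a made-up example, not verified Deadline output), a line like:

10-12-2018 09:15:02: Scheduler Thread - job scan complete

would come out as date => "10-12-2018", time => "09:15:02", and msg => "Scheduler Thread - job scan complete", each searchable as its own field in Kibana.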

I haven't tested this, as our last ELK stack died (we're waiting for a new instance to be spun up), but you will still need to add an output. For testing, you could use the stdout output plugin like this:

output {
  stdout { }
}
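When you're ready to index for real, the stdout block gets swapped for the elasticsearch output plugin. A minimal sketch, assuming a reachable Elasticsearch endpoint (the hostname and index name here are placeholders, not anything Deadline-specific):

output {
  elasticsearch {
    # Placeholder endpoint; point this at your own Elasticsearch host.
    hosts => ["http://elasticsearch.example.com:9200"]
    # Daily indices make it easy to expire old render logs later.
    index => "deadline-logs-%{+YYYY.MM.dd}"
  }
}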

Last note about doing this: unless you have a pile of nodes to load-balance the input (or have a tiny farm), I would suggest setting up a message queue/broker to collect all the logs and then pass them to Elasticsearch. I have found it's very easy to overload the input nodes.
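As a rough sketch of that broker pattern (Redis is just one option, and the hostname and key below are placeholder assumptions), each render node ships events to the broker, and a single indexing pipeline drains it:

# On each render node: push events to the broker instead of Elasticsearch.
output {
  redis {
    host => "broker.example.com"   # placeholder broker host
    data_type => "list"
    key => "deadline-logs"
  }
}

# On the indexing node: drain the broker and forward to Elasticsearch.
input {
  redis {
    host => "broker.example.com"
    data_type => "list"
    key => "deadline-logs"
  }
}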

Hope this helps.

Hoping we get a new stack going in the new year, so I can play with this a little.

Out of interest, what are you hoping to glean from this information?

Cheers
Kym

Thanks for the info!

I'm gathering metrics for our team to measure how much work is going through our render farm using Deadline. It's also good for management to see how work is getting done and that the system is able to produce measurable output.

I actually set up a similar system when I was working for Microsoft.