AWS Thinkbox Discussion Forums

Farm reports

Hey gang. I was playing around with generating some farm reports so we can see which jobs required the most RAM, the most CPUs, the longest render times, etc. I am noticing that most of the data that is generated can't be sorted because the fields use different units of measurement. For example, if I sort by average RAM used, I get a field that mixes MB, KB, and GB as units of measurement, so I can't sort it from highest to lowest.

We have completed over 14k jobs this week alone, so sorting this kind of info with a consistent unit of measurement is necessary to get anything from it. May I request that the fields for the farm reports be standardized so they can be sorted? For example, I would rather see something like 0.002 GB than 2 MB.

Hmmm, it should be using the actual data value (which is in bytes) – and not the display value – to do the sorting. What kind of report are you generating? Are you using one of the pre-made ones (i.e., Completed Job Stats, Farm Overview, Slave Resource Usage), or a custom one?

Is it actually sorting by display value, or is the sorting just kinda all over the place and seemingly random?

I am generating a “Completed Job Stats” standard report.

In the Average RAM Usage field I see the following:

112.271 MB
1.992 GB
120.000 KB
988.000 KB
I also see this in other fields including Total Image File Size, Auxiliary File Size, etc.

So what happens is that I can click on the sorting for Peak RAM Usage, but I end up with a list that isn’t actually in order according to how much RAM the jobs used. It looks like this:
580.648 MB
5.494 GB
5.857 GB
7.524 GB
7.778 GB
7.818 GB
7.863 GB
11.235 GB
11.758 GB
425.328 MB
14.164 GB
120.000 KB
988.000 KB
8.152 GB
8.164 GB
8.176 GB
8.230 GB
And so on…

It’s mostly in order, but further down the list I see more entries in GB, KB, and MB that don’t seem to fall into the correct place when sorted by amount of RAM. This is also making it a bit difficult to sort in Excel after exporting the data. Right now I’m trying to generate some better reports so we can compare/contrast the different shows we have currently and come up with resource requirements for each project. I’m also going to try using this data to figure out which of our artists are the biggest farm hogs and other things like that. I’m hitting a snag when I try to see who’s using the most RAM.
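Until the report itself sorts on the raw byte values, the exported data can be normalized in a small script. Here is a minimal Python sketch (the unit parsing and the binary multipliers, 1 KB = 1024 bytes, are my assumptions about how the display values are formatted, not anything Deadline guarantees):

```python
import re

# Assumed multipliers for the suffixes seen in the report (binary units).
UNITS = {"B": 1, "KB": 1024, "MB": 1024**2, "GB": 1024**3}

def to_bytes(value):
    """Convert a display string like '5.494 GB' into a byte count."""
    match = re.match(r"([\d.]+)\s*([KMG]?B)", value.strip())
    if not match:
        raise ValueError("unrecognized value: " + value)
    number, unit = match.groups()
    return float(number) * UNITS[unit]

# A few of the Peak RAM Usage values from the report above:
ram = ["580.648 MB", "5.494 GB", "120.000 KB", "988.000 KB", "14.164 GB"]
ram.sort(key=to_bytes)
# Now the list runs from smallest to largest regardless of unit.
```

Sorting on the converted byte counts (or writing an extra bytes column back into the spreadsheet) sidesteps the mixed-unit problem in Excel as well.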

Thanks for the help! We’re using Deadline v6.1.0.54178 until this Sunday, then we’ll upgrade to RC3

Hmm, it seems to be getting messed up with larger values, since it’s sorting fine for me, but all my values are < 2GB. Unfortunately the fix will have to wait for 6.2, though, since we’re locking down 6.1 to only mission-critical stuff in preparation for the release.

In the meantime, would it be useful to create an “Ends With: GB” filter for the RAM column you want to sort? That way it should sort more intelligently after exporting to Excel, at least.

I am actually starting to script out the information that I need. I thought I'd mention a little snag I hit when I was doing this. I was going through the completed jobs report that I exported in .csv format, using the split function to get the different fields like RAM usage and whatnot. I noticed that the script didn't like the modified frame ranges that were used in some cases. For example, in some cases I saw frame ranges that looked like:

“1009,1019,1029,1039,1009-1040”

In a .csv format, the script didn't like the extra commas. I was getting a lot of errors because the script thought those extra commas designated new columns. I had to export again to .tsv format so that I could properly parse the whole report without getting snagged on the modified frame ranges. No biggie, first time working with a .csv file. Was a good learning experience.
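For what it's worth, if the exporter quotes fields containing commas (standard CSV practice), Python's csv module will keep a quoted frame range as a single field where a naive split() breaks. A quick sketch with a made-up row (the column layout here is illustrative, not the report's actual headers):

```python
import csv

# A hypothetical exported row: the frame-range field is quoted,
# so its internal commas are data, not column separators.
line = 'job_01,"1009,1019,1029,1039,1009-1040",1.992 GB'

row = next(csv.reader([line]))
# csv.reader honors the quoting, so the frame range stays one field:
# ['job_01', '1009,1019,1029,1039,1009-1040', '1.992 GB']
```

If the export doesn't quote those fields, then TSV is indeed the simpler workaround.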
