This morning I had to delete tens of thousands of bogus job errors that had accumulated due to slave startup bugs, and doing so blew my Monitor's memory usage up to about 1,300 MB. I ran Force Garbage Collection to see what would happen, and the Monitor echoed this to the console:
Before Collection:
[memory usage] 122.348 MB / 0.000 Bytes
After Collection:
[memory usage] 95.562 MB / 0.000 Bytes
My Monitor is still using over a gig of RAM (although it seems to drop in chunks every couple of minutes), so I'm curious why the Monitor thinks its footprint is so svelte. I realize that running in a memory-managed environment complicates a process's ability to track its own memory usage, but the discrepancy between what the Monitor thinks it's using and what the process actually holds is pretty staggering. Is the .NET GC really this lazy about returning memory to the OS, or could this be a leak?
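For what it's worth, I'm assuming the Monitor's "memory usage" line reports something like the managed heap size rather than the full process footprint. Here's a minimal C# sketch (not Deadline code, just my own illustration) of the two kinds of numbers I'm comparing: GC.GetTotalMemory() for the managed heap versus the working set / private bytes that Task Manager shows for the process:

using System;
using System.Diagnostics;

class MemoryReport
{
    static void Main()
    {
        // Managed heap as the GC sees it -- presumably what the
        // Monitor's "memory usage" line is reporting.
        long before = GC.GetTotalMemory(false);

        // Roughly what a "Force Garbage Collection" would trigger.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        long after = GC.GetTotalMemory(true);

        // What Task Manager sees for the process: includes GC segments
        // the CLR hasn't returned to the OS yet, plus native allocations.
        using (Process self = Process.GetCurrentProcess())
        {
            Console.WriteLine("Managed heap: {0:F3} MB -> {1:F3} MB",
                before / 1048576.0, after / 1048576.0);
            Console.WriteLine("Working set:   {0:F3} MB",
                self.WorkingSet64 / 1048576.0);
            Console.WriteLine("Private bytes: {0:F3} MB",
                self.PrivateMemorySize64 / 1048576.0);
        }
    }
}

If the same applies to the Monitor, the managed-heap figure could legitimately read ~95 MB while the working set stays above a gig until the CLR releases its heap segments and the OS trims the working set, which might be the chunk-by-chunk drop I'm seeing.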
Thanks.