In a number of situations as a programmer, I've found that my compile times are slower than I would like, and I want to understand why and fix that. Language-specific tricks (yes, I'm using C/C++) have already been discussed, and we apply many of them. I've also seen this question and realize it's related. What I'm more interested in is which tools people use to diagnose hardware/system bottlenecks in the build process. Is there a standard way to prove "Disk reads/writes are too slow for our builds - we need SSDs!" or "The anti-virus settings are killing our build times!", etc.?
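One crude first check that's cheap to run before breaking out any of the tools below: compare the build's total CPU time against its wall-clock time. If CPU time is far below wall time (times the core count), the machine is waiting on something - disk, network, or an anti-virus filter driver - rather than compiling. A minimal sketch along those lines (a hypothetical helper I'm calling `buildtriage`, not a finished tool), assuming the whole build can be driven from a single command line; the job object is there so the CPU time of child processes (cl.exe, link.exe, ...) gets counted, not just the top-level make:

```cpp
// buildtriage.cpp - crude first cut: is the build CPU-bound or mostly waiting?
// Sketch only. Build: cl /EHsc buildtriage.cpp
#include <windows.h>
#include <cstdio>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: buildtriage \"<build command>\"\n");
        return 1;
    }

    // Put the build in a job object so the CPU time of child processes is
    // accounted for, not just the top-level make. (AssignProcessToJobObject
    // can fail if we're already inside a job on pre-Windows 8 systems,
    // which don't support nested jobs.)
    HANDLE job = CreateJobObjectA(NULL, NULL);

    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};
    DWORD t0 = GetTickCount();
    if (!CreateProcessA(NULL, argv[1], NULL, NULL, FALSE, CREATE_SUSPENDED,
                        NULL, NULL, &si, &pi)) {
        std::fprintf(stderr, "CreateProcess failed: %lu\n", GetLastError());
        return 1;
    }
    AssignProcessToJobObject(job, pi.hProcess);
    ResumeThread(pi.hThread);
    WaitForSingleObject(pi.hProcess, INFINITE);
    double wall = (GetTickCount() - t0) / 1000.0;

    JOBOBJECT_BASIC_ACCOUNTING_INFORMATION acct = {};
    QueryInformationJobObject(job, JobObjectBasicAccountingInformation,
                              &acct, sizeof(acct), NULL);
    // Kernel/user times are reported in 100-nanosecond units.
    double cpu = (acct.TotalKernelTime.QuadPart + acct.TotalUserTime.QuadPart) / 1e7;

    // If cpu/wall is far below (core count * 100%), the build is waiting
    // on something: disk, network, or an anti-virus filter driver.
    std::printf("wall %.1fs, cpu %.1fs (%.0f%% of one core)\n",
                wall, cpu, 100.0 * cpu / wall);

    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    CloseHandle(job);
    return 0;
}
```

Run it as `buildtriage "nmake all"` (or whatever drives your build). It won't say *which* resource you're waiting on, but it cheaply separates "buy faster/more cores" from "go hunting with PerfMon".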
Resources I've found, though none are directly about diagnosing compile performance:
- A TechNet article about using PerfMon (Quite good, close to what I'd like)
- This IBM link detailing some PerfMon information, but it's not specific to compiling and appears somewhat out of date.
- A webpage specifically describing diagnosis of avg disk queue length
Currently, diagnosing a slow build is very much an art, and my tools of choice are:
- PerfMon
- Process Explorer
- Process Monitor
- Push hard enough to get a machine to "just try it". (Basically, trial and error.)
What do others do to diagnose system-level build performance bottlenecks? Can we come up with a list of PerfMon or Process Explorer statistics to watch, with thresholds for what's "acceptable" on a modern machine?
PerfMon:
- CPU -> Processor \ % Processor Time
- MEMORY -> Memory \ Pages/sec
- DISK -> PhysicalDisk \ Avg. Disk Queue Length
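Those three counters can also be sampled from code via the PDH API, which makes it easy to log them over a whole build instead of babysitting the PerfMon UI. A sketch (a hypothetical helper, assuming Vista or later since PdhAddEnglishCounter keeps the paths working on localized Windows); link against pdh.lib:

```cpp
// perfsample.cpp - sample the three counters above once a second during a
// build. Sketch only. Build: cl perfsample.cpp pdh.lib
#include <windows.h>
#include <pdh.h>
#include <cstdio>

int main() {
    const char* paths[3] = {
        "\\Processor(_Total)\\% Processor Time",
        "\\Memory\\Pages/sec",
        "\\PhysicalDisk(_Total)\\Avg. Disk Queue Length",
    };

    PDH_HQUERY query;
    PDH_HCOUNTER counters[3];
    PdhOpenQueryA(NULL, 0, &query);
    for (int i = 0; i < 3; ++i)
        // English paths so this works regardless of the OS language (Vista+).
        PdhAddEnglishCounterA(query, paths[i], 0, &counters[i]);

    PdhCollectQueryData(query);  // first sample just primes the rate counters
    for (;;) {
        Sleep(1000);
        PdhCollectQueryData(query);
        PDH_FMT_COUNTERVALUE v[3];
        for (int i = 0; i < 3; ++i)
            PdhGetFormattedCounterValue(counters[i], PDH_FMT_DOUBLE, NULL, &v[i]);
        std::printf("cpu %6.1f%%   pages/sec %8.1f   avg disk queue %6.2f\n",
                    v[0].doubleValue, v[1].doubleValue, v[2].doubleValue);
    }
    // Unreachable in this sketch; stop with Ctrl+C.
    // PdhCloseQuery(query);
}
```

Start it, kick off the build, stop it with Ctrl+C. The usual rule of thumb is that a sustained Avg. Disk Queue Length above roughly 2 per spindle points at the disk as the bottleneck.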
Process Explorer:
- CPU -> CPU
- DISK -> I/O Delta Total
- MEMORY -> Page Faults
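Those Process Explorer columns have programmatic equivalents as well, handy for logging a specific compiler or linker process over a long build. A sketch (a hypothetical helper taking a PID); the I/O and page-fault counters are cumulative, and Process Explorer's delta columns come from sampling twice and subtracting, so a real tool would do the same. Needs psapi.lib:

```cpp
// procsnap.cpp - snapshot one process's CPU time, cumulative I/O, and page
// faults by PID. Sketch only. Build: cl procsnap.cpp psapi.lib
#include <windows.h>
#include <psapi.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: procsnap <pid>\n"); return 1; }

    HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ,
                           FALSE, std::atoi(argv[1]));
    if (!h) { std::fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError()); return 1; }

    FILETIME createTime, exitTime, kernelTime, userTime;
    GetProcessTimes(h, &createTime, &exitTime, &kernelTime, &userTime);

    IO_COUNTERS io = {};            // cumulative read/write/other bytes and ops
    GetProcessIoCounters(h, &io);

    PROCESS_MEMORY_COUNTERS mem = { sizeof(mem) };
    GetProcessMemoryInfo(h, &mem, sizeof(mem));

    ULARGE_INTEGER k, u;            // FILETIME -> 64-bit 100-ns tick counts
    k.LowPart = kernelTime.dwLowDateTime;  k.HighPart = kernelTime.dwHighDateTime;
    u.LowPart = userTime.dwLowDateTime;    u.HighPart = userTime.dwHighDateTime;

    std::printf("cpu %.1fs   read %llu B   written %llu B   page faults %lu\n",
                (k.QuadPart + u.QuadPart) / 1e7,
                io.ReadTransferCount, io.WriteTransferCount,
                mem.PageFaultCount);
    CloseHandle(h);
    return 0;
}
```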