We are doing a series of client-side performance tests for a large website, and we primarily use Fiddler and DynaTrace for our measurements. We've run into two major issues:
- The clients are intent on measuring the asynchronous part of the page load (the stuff that gets kicked off after document.ready). Using Fiddler, it's hard to tell which HTTP requests were part of the original page fetch and which ones were kicked off by the document.ready event. We could do a quick test with JS disabled to get an idea of which fetches are asynchronous, but that isn't foolproof either, since it would miss requests made by JS files that were themselves loaded synchronously. (The first sketch after this list shows the kind of workaround I mean.)
On paper, DynaTrace seems like the ideal solution to this problem, but the metrics I've seen come out of it don't seem to correlate with what we see otherwise. (For example, I created a dummy page with a jQuery document.ready handler containing a built-in delay via setTimeout. I was expecting that delay to closely match the time DynaTrace reports for the load event, but I did not see that correlation.)
- The problem of scripting and continuous measurement. Neither tool seems scriptable in a way that would let us hook it up to a CI system and spit out a bunch of numbers every build. (The sort of per-build beacon I have in mind is sketched at the end of this post.)
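
To make the first problem concrete, something like the following is the kind of homegrown classification I mean. It's only a sketch, and it assumes a browser that exposes the Navigation Timing and Resource Timing APIs (window.performance.timing and performance.getEntriesByType); since jQuery's ready handlers fire on DOMContentLoaded, that timestamp is used as a stand-in for document.ready.

```javascript
// Rough sketch: split resource fetches into those that started before
// document.ready fired and those kicked off afterwards.
// Assumes Navigation Timing (performance.timing) and Resource Timing
// (performance.getEntriesByType) are available; run it once the page
// has settled, e.g. from the console or a window.onload handler.
function splitRequestsByReady() {
  var t = window.performance.timing;
  // Offset of DOMContentLoaded from the start of navigation, in ms.
  // jQuery fires document.ready on DOMContentLoaded, so this is a close proxy.
  var readyOffset = t.domContentLoadedEventStart - t.navigationStart;

  var beforeReady = [];
  var afterReady = [];
  window.performance.getEntriesByType('resource').forEach(function (entry) {
    // entry.startTime is already measured relative to navigationStart.
    (entry.startTime < readyOffset ? beforeReady : afterReady).push(entry.name);
  });

  console.log('document.ready fired at +' + readyOffset + 'ms');
  console.log('part of the original page fetch:', beforeReady);
  console.log('kicked off after document.ready:', afterReady);
}
```

That at least gives a defensible split without disabling JS, though it only sees what the browser reports through Resource Timing.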
Is there another tool that can be used for this purpose? Is there an industry-standard best practice that people follow? Any homegrown hacks?
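
For the second problem, the kind of homegrown hack I have in mind is a small beacon that reports the Navigation Timing numbers so a CI job can collect per-build figures. Again just a sketch: the /perf-beacon endpoint and the BUILD_NUMBER global are placeholders that would have to exist on our side.

```javascript
// Rough sketch of a per-build metrics beacon. Assumes Navigation Timing
// (window.performance.timing) is available. The /perf-beacon endpoint and
// window.BUILD_NUMBER are placeholders, not real APIs.
window.addEventListener('load', function () {
  // Defer one tick so loadEventEnd has been populated.
  setTimeout(function () {
    var t = window.performance.timing;
    var metrics = {
      build: window.BUILD_NUMBER || 'unknown',                  // injected by the build (placeholder)
      backend: t.responseEnd - t.navigationStart,               // main document delivered
      domReady: t.domContentLoadedEventEnd - t.navigationStart, // document.ready territory
      load: t.loadEventEnd - t.navigationStart                  // full load event
    };

    // Fire-and-forget POST; the CI job would aggregate these per build.
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/perf-beacon', true);
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify(metrics));
  }, 0);
});
```

That still wouldn't capture the asynchronous tail by itself, but the split from the first sketch could be folded into the same payload.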