I'm working with a rather large .NET web application.
Users want to be able to export reports to PDF. Since the reports are based on aggregation of many layers of data, the best way to get an accurate snapshot is to actually take a snapshot of the UI: I can take the HTML of the UI and convert it to a PDF file.
Since the UI may take up to 30 seconds to load but the results never change, I want to cache a PDF as soon as an item gets saved, using a background thread.
My main concern with this method is that if I go through the UI, I have to worry about timeouts. While background threads can run as long as they want, .aspx requests are terminated once they exceed the request execution timeout.
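For what it's worth, that request timeout is configurable: the `executionTimeout` attribute on `<httpRuntime>` in web.config (value in seconds, and only enforced when `debug="false"`) can be raised, or scoped to just the snapshot page with a `<location>` element. A minimal fragment:

```xml
<!-- web.config: raise the request timeout (seconds). -->
<!-- executionTimeout is only enforced when debug="false". -->
<configuration>
  <system.web>
    <httpRuntime executionTimeout="300" />
  </system.web>
</configuration>
```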
I have two ideas for taking care of this. The first is to create an .aspx page that loads the UI, overrides Render, and stores the rendered HTML in the database. A background thread would make a WebRequest to that page internally and then grab the results from the database. This obviously has security implications, and it still has to worry about timeouts if the UI takes too long to generate.
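A rough sketch of what I mean, assuming a snapshot page with a hypothetical `SaveSnapshot` helper and an `id` query-string parameter; the page captures its own rendered HTML inside `Render`, and the background thread requests it with a generous client-side timeout:

```csharp
using System.IO;
using System.Net;
using System.Web.UI;

// Code-behind of the snapshot page: capture the rendered HTML.
public partial class ReportSnapshot : Page
{
    protected override void Render(HtmlTextWriter writer)
    {
        using (var buffer = new StringWriter())
        using (var bufferedWriter = new HtmlTextWriter(buffer))
        {
            base.Render(bufferedWriter);            // render the whole UI into the buffer
            string html = buffer.ToString();
            SaveSnapshot(Request.QueryString["id"], html); // hypothetical DB helper
            writer.Write(html);                     // still emit the page normally
        }
    }

    private void SaveSnapshot(string reportId, string html)
    {
        // persist the HTML to the database (omitted)
    }
}

// Background thread: request the page internally with a long timeout.
public static class SnapshotFetcher
{
    public static void Fetch(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = 120000; // 2 minutes; the UI can take ~30s
        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            reader.ReadToEnd(); // the HTML is also in the DB at this point
        }
    }
}
```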
The other idea is to create a Page object in code, populate it manually, call the relevant lifecycle methods by hand, and then grab the rendered output. The problems with that method, aside from my having no idea how to do it, are that I'm afraid I may forget to call a method, or that something may not work correctly because the page isn't associated with a real session or web server.
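From what I can tell, ASP.NET does offer a middle ground here: `HttpRuntime.ProcessRequest` with a `SimpleWorkerRequest` runs a page through the normal pipeline (full page lifecycle, no methods to call by hand) and writes the response to a `TextWriter`. A minimal sketch, assuming the snapshot page lives at `Reports/Snapshot.aspx`:

```csharp
using System.IO;
using System.Web;
using System.Web.Hosting;

public static class PageRunner
{
    // Executes an .aspx page in-process and returns the rendered HTML.
    // Must run inside the ASP.NET application domain (e.g. a thread
    // started from the web app), not in a standalone process.
    public static string RenderPage(string virtualPath, string queryString)
    {
        using (var output = new StringWriter())
        {
            var workerRequest = new SimpleWorkerRequest(virtualPath, queryString, output);
            HttpRuntime.ProcessRequest(workerRequest); // full page lifecycle runs
            return output.ToString();
        }
    }
}

// Usage (from a background thread inside the web app):
// string html = PageRunner.RenderPage("Reports/Snapshot.aspx", "id=42");
```

One caveat: this executes without the original user's session, cookies, or authentication, so the snapshot page would have to be written not to depend on them.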
What is the best way to simulate the UI of a page in a background thread?