I have a website that is dynamic in the sense that a lot of its content is generated from a database, but the contents of the database change rarely (about 1-3 times a week). These changes are manual and controlled.
Instead of carrying the overhead of a dynamic website, I would prefer to serve static pages. I'm debating which is the best solution:
curl/wget/spider
This question mentions it; a typical wget invocation is sketched after the list below. The disadvantages I see might be:
- manual clean-up needed (links, missing images, etc.)
- cannot mix static and dynamic pages
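For reference, the kind of invocation I'd be looking at is something like this (the URL is a placeholder); --convert-links and --adjust-extension reduce, but don't eliminate, the clean-up mentioned above:

```
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent \
     http://www.example.com/
```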
proxy
I could use a proxy to cache the generated pages for a certain number of days (a sketch of one way to set the cache lifetime follows this list). Disadvantages:
- hard to manage the cache of each page
- need to clear the cache after each manual change?
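On the per-page cache management: most caching proxies honor a Cache-Control header sent by the origin, so the dynamic pages themselves could declare how long they may be cached. A minimal sketch, assuming the pages are plain Perl CGI scripts and picking 7 days as an arbitrary lifetime:

```
#!/usr/bin/perl
# Sketch: the page tells any caching proxy in front of it how long it may be kept.
use strict;
use warnings;

my $days = 7;                                   # arbitrary example lifetime
printf "Cache-Control: public, max-age=%d\n", $days * 24 * 60 * 60;
print  "Content-Type: text/html\n\n";
print  "<html><body><!-- body generated from the database --></body></html>\n";
```

Clearing the cache immediately after a manual change would still have to be handled on the proxy side.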
Use a program to generate static pages
My current choice: I use Perl programs to generate static pages from the dynamic content. This doesn't scale very well, as I have to hard-code a lot of HTML, especially the page structure.
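Roughly what I have in mind to fix that (just a sketch, not my actual setup; the database, table, template and output names are invented) would be DBI plus a templating module such as Template Toolkit, so the page structure lives in one template file instead of in the Perl code:

```
#!/usr/bin/perl
# Sketch only: the DSN, table and file names below are placeholders.
use strict;
use warnings;
use DBI;
use Template;

# Pull the rows that drive the pages.
my $dbh  = DBI->connect( 'dbi:SQLite:dbname=site.db', '', '', { RaiseError => 1 } );
my $rows = $dbh->selectall_arrayref(
    'SELECT id, title, body FROM articles',
    { Slice => {} },
);

# The whole page layout lives in templates/article.tt,
# so no HTML is hard-coded in the generator itself.
my $tt = Template->new({ INCLUDE_PATH => 'templates' })
    or die Template->error();

# One output file per row; public/ is assumed to exist.
for my $row (@$rows) {
    $tt->process( 'article.tt', { article => $row },
                  "public/article-$row->{id}.html" )
        or die $tt->error();
}
```

With that split, changing the layout means editing article.tt once, and rebuilding the site after one of the weekly edits is just re-running the script.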
Any other ways to do it? Which approach would you prefer, or which do you already use?