When it comes to denormalizing data in a transactional database for performance, there are (at least) three different approaches:
1. Push updates through stored procedures that update both the normalized transactional data and the denormalized reporting/analysis data (see the first sketch after this list);
2. Implement triggers on the transactional tables that update the secondary tables (second sketch below); this is almost always the route taken when maintaining histories;
3. Defer the processing to a nightly batch process, possibly doing an ETL into a data mart/warehouse.
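For concreteness, here's a rough sketch of what I mean by #1. The table names (Orders, CustomerTotals) and columns are hypothetical, and the syntax is T-SQL; the point is just that one procedure owns both writes inside a single transaction:

```sql
-- Approach #1: a stored procedure updates both the normalized table
-- and the denormalized aggregate in one transaction.
-- (Orders and CustomerTotals are hypothetical tables for illustration.)
CREATE PROCEDURE InsertOrder
    @CustomerId INT,
    @Amount     DECIMAL(18, 2)
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    -- Normalized transactional write
    INSERT INTO Orders (CustomerId, Amount, CreatedAt)
    VALUES (@CustomerId, @Amount, GETDATE());

    -- Denormalized aggregate kept consistent in the same transaction
    -- (assumes the CustomerTotals row already exists; a real version
    -- would need an upsert)
    UPDATE CustomerTotals
    SET TotalAmount = TotalAmount + @Amount,
        OrderCount  = OrderCount + 1
    WHERE CustomerId = @CustomerId;

    COMMIT TRANSACTION;
END;
```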
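And the equivalent sketch of #2, where whatever writes to the base table doesn't need to know about the aggregate and a trigger keeps it in step (same hypothetical tables; note the grouping over the `inserted` pseudo-table so the logic stays correct for multi-row inserts):

```sql
-- Approach #2: an AFTER INSERT trigger maintains the aggregate, so
-- callers can write to Orders however they like.
CREATE TRIGGER trg_Orders_MaintainTotals
ON Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- 'inserted' can hold many rows, so aggregate per customer first.
    UPDATE ct
    SET ct.TotalAmount = ct.TotalAmount + i.Amount,
        ct.OrderCount  = ct.OrderCount + i.OrderCount
    FROM CustomerTotals AS ct
    JOIN (SELECT CustomerId,
                 SUM(Amount) AS Amount,
                 COUNT(*)    AS OrderCount
          FROM inserted
          GROUP BY CustomerId) AS i
        ON i.CustomerId = ct.CustomerId;
END;
```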
Let's assume for the purposes of this question that option #3 isn't viable, because the domain requires the denormalized data to be consistent with the normalized data at all times. Hierarchical aggregates, which I deal with rather frequently, are one example of this.
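To illustrate what I mean by a hierarchical aggregate, here's a trigger-based sketch (hypothetical Sales, CategoryTotals, and CategoryAncestors tables; the last is a closure table that includes self-rows) where every node in a category tree stores the rolled-up total of its entire subtree:

```sql
-- Hierarchical aggregate: every insert into Sales must immediately be
-- reflected in the subtree total of its category and all its ancestors.
CREATE TRIGGER trg_Sales_RollUpTotals
ON Sales
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Fan each new amount out to every ancestor via the closure table.
    UPDATE ct
    SET ct.SubtreeTotal = ct.SubtreeTotal + d.Amount
    FROM CategoryTotals AS ct
    JOIN (SELECT ca.AncestorId, SUM(i.Amount) AS Amount
          FROM inserted AS i
          JOIN CategoryAncestors AS ca
              ON ca.CategoryId = i.CategoryId
          GROUP BY ca.AncestorId) AS d
        ON d.AncestorId = ct.CategoryId;
END;
```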
I've used both of the first two approaches a fair bit, and lately I've been leaning toward the trigger-based approach. But I'm wondering whether there are any "gotchas" I haven't discovered yet, so I thought it would be worth asking this question to gather some ideas to keep in mind when making long-term decisions in the future.
So, in your experience, what are the pros and cons of each approach for the specific purpose of maintaining real-time denormalized data? In what situations would you choose one over the other, and why?
(P.S. Please, no answers like "triggers are too complicated" or "all updates should always go through a stored proc" - keep answers appropriate to the context of the question.)