I recently joined the team of developers working on our "flagship" product. It's primarily a read-intensive web app (ASP.NET (C#) and Oracle) implemented as an N-tier system, and most of the writes to the DB are done through external services rather than through the web app itself.

Instead of scheduling normal batch jobs in the database for data aggregation, they push everything up the tiers to the business layer (sometimes creating a hundred million objects). While this does keep all the "business logic" in one place, it also takes about 200 times longer than running the equivalent query in the database.

This seems like a terrible idea to me. Am I wrong here, and is this standard, good stuff? Does anybody have any real case studies I can point my co-workers towards (or myself, if I'm the one in the wrong)?
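To make the contrast concrete, here's a rough sketch of the two approaches as I understand them. The `orders` table, the `amount` column, and the use of the managed ODP.NET driver are just placeholders for illustration, not our actual schema or code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Oracle.ManagedDataAccess.Client; // assumes the managed ODP.NET driver; placeholder only

class AggregationSketch
{
    // Approach 1 (what we do now): materialize every row as an object in the
    // business layer, then aggregate in C#. Every row crosses the network.
    static decimal TotalInBusinessLayer(string connectionString)
    {
        var amounts = new List<decimal>();
        using (var conn = new OracleConnection(connectionString))
        using (var cmd = new OracleCommand("SELECT amount FROM orders", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    amounts.Add(reader.GetDecimal(0)); // one object per row
            }
        }
        return amounts.Sum(); // aggregation happens in the app tier
    }

    // Approach 2 (what I'd expect): let the database aggregate and return
    // only the result. A single value crosses the network.
    static decimal TotalInDatabase(string connectionString)
    {
        using (var conn = new OracleConnection(connectionString))
        using (var cmd = new OracleCommand("SELECT SUM(amount) FROM orders", conn))
        {
            conn.Open();
            return Convert.ToDecimal(cmd.ExecuteScalar());
        }
    }
}
```

The first version drags every row over the wire and through the garbage collector just to add numbers up; the second lets Oracle do the work where the data lives, which is where the ~200x difference we're seeing comes from.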
I'm not debating whether N-tier is good or bad in general, but is it a good fit for data aggregation and similar processing?