Okay, here's the scenario. I have a utility that processes a large number of records and writes information to the database accordingly.
It works on these records in multi-threaded batches. Each batch writes to the same log file to build a workflow trace for each record. We could potentially be making close to a million log writes a day.
Should this log be moved into a database residing on another server? Considerations:
- The obvious disadvantage of multiple threads writing to the same log file is that the log messages get interleaved with one another. In a database, they could be grouped by batch id (roughly what I mean is sketched after this list).
- Performance: which would slow down the batch processing more, writing to a local file or sending log data to a database on another server on the same network? Intuitively the local file should be faster, but is there a gotcha here?
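To illustrate the grouping point, here's a rough sketch of the kind of per-batch tagging I have in mind (the `BatchLogger` name and the Java syntax are just for illustration; my actual code differs): every line carries the batch id, so even though lines from different batches interleave in the shared file, they can still be pulled apart afterwards.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Illustrative only: each worker prefixes its lines with the batch id, so the
// interleaved shared file can still be grouped per batch after the fact.
public class BatchLogger implements AutoCloseable {
    private final BufferedWriter writer;

    public BatchLogger(Path logFile) throws IOException {
        this.writer = Files.newBufferedWriter(
                logFile, StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    // Synchronized so each line is written whole; lines from other batches can
    // still land in between calls, hence the batchId prefix on every line.
    public synchronized void log(String batchId, String recordId, String message)
            throws IOException {
        writer.write(String.format("%d|batch=%s|record=%s|%s%n",
                System.currentTimeMillis(), batchId, recordId, message));
    }

    @Override
    public synchronized void close() throws IOException {
        writer.close();
    }
}
```

That works, but pulling a million lines a day back apart by batch id feels like reinventing the GROUP BY the database would give me for free.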
Are there any optimizations that can be done on either approach?
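One optimization I've been wondering about, and which seems to apply to both approaches, is to decouple the worker threads from the actual writing: workers drop entries onto an in-memory queue, and a single background thread drains it and writes in chunks (a file append, or a batched INSERT in the database case). This is purely a sketch under that assumption; `AsyncLogWriter` and `flush` are made-up names.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Sketch: worker threads enqueue entries (cheap, in-memory); one background
// thread drains the queue and flushes in chunks to the file or the database.
public class AsyncLogWriter implements Runnable {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>(10_000);
    private volatile boolean running = true;

    public void enqueue(String line) throws InterruptedException {
        queue.put(line); // blocks only when the 10,000-entry buffer is full
    }

    public void shutdown() {
        running = false;
    }

    @Override
    public void run() {
        List<String> chunk = new ArrayList<>();
        while (running || !queue.isEmpty()) {
            try {
                String first = queue.poll(500, TimeUnit.MILLISECONDS);
                if (first == null) continue;
                chunk.add(first);
                queue.drainTo(chunk, 499); // flush in chunks of up to 500
                flush(chunk);
                chunk.clear();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
    }

    private void flush(List<String> chunk) {
        // Placeholder: append the chunk to the log file, or run one batched
        // INSERT against the remote database.
    }
}
```

Would something along those lines be worth the added complexity, or are buffered file writes and batched inserts on the database side enough on their own?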
Thanks.