I have a SQL Server 2005 database. I am logging data to a table. I want to prevent the table data from getting too big.

How can I limit the size of the table to x number of rows, and keep logging? I want the oldest rows to drop off.

+7  A: 

You have to build this process yourself. You might want to look into creating a SQL Server Agent job that runs a DELETE statement based on the criteria you define.
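For example, a minimal sketch of such a DELETE, assuming a hypothetical log table dbo.LogEntries with an identity key LogId and a cap of 100,000 rows (names and numbers are illustrative):

    -- Keep only the newest 100000 rows; everything older drops off
    DELETE FROM dbo.LogEntries
    WHERE LogId NOT IN (
        SELECT TOP 100000 LogId
        FROM dbo.LogEntries
        ORDER BY LogId DESC
    );

Schedule that statement in a SQL Server Agent job to run nightly, weekly, or however often you need.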

bobs
A job is probably better than a trigger if you don't mind the total going a bit over. Just set it up to run once nightly or weekly or whatever.
Mark Ransom
+3  A: 

This is the one example where triggers might actually be a good idea in SQL Server. (My personal feeling is that triggers in SQL are like GOTOs in code.)

Just write an INSERT trigger which, when fired, will check the number of rows in the table and execute a DELETE according to whatever rules you specify.
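A rough sketch of such a trigger, again assuming a hypothetical dbo.LogEntries table with an identity key LogId and a 100,000-row cap:

    CREATE TRIGGER trg_LogEntries_Trim
    ON dbo.LogEntries
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Find the smallest LogId among the newest 100000 rows,
        -- then delete anything older than that cutoff
        DECLARE @cutoff INT;
        SELECT @cutoff = MIN(LogId)
        FROM (SELECT TOP 100000 LogId
              FROM dbo.LogEntries
              ORDER BY LogId DESC) AS newest;

        DELETE FROM dbo.LogEntries
        WHERE LogId < @cutoff;
    END;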

Here's a link to trigger basics. And another, this time with screen caps.

Paul Sasik
If the table in question is changing often, then this would not be a good idea. It is usually much better to schedule maintenance like this for times when the system is under the least load.
Chris Lively
If you do go this route you should evaluate performance. With triggers and large amounts of data, INSERT statements will take a lot longer to complete. You also have to anticipate failures in the delete process and decide whether it's acceptable for an insert to fail because the trigger failed.
bobs
+2  A: 

Place the table on its own filegroup. Limit the size of the filegroup. See:

Then add a job that deletes the old log records, but this is usually trickier than it sounds. The most efficient way is a sliding window; see How to Implement an Automatic Sliding Window in a Partitioned Table. If that is not possible, the next best thing is to make the date the clustered key on the table, so that deletes can remove the old rows efficiently.
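If you go the clustered-date route, the cleanup job can be as simple as this sketch (the table name, the LogDate column, and the 30-day retention are assumptions):

    -- Efficient range delete when LogDate is the clustered key
    DELETE FROM dbo.LogEntries
    WHERE LogDate < DATEADD(DAY, -30, GETDATE());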

Remus Rusanu
A: 

If you want to restrict the size of a table for logging purposes, I would not advise limiting the number of rows stored in the table. Instead, set up an archive or purge process for the table that stores the logs. That process can be configured to purge or archive the logs once X number of rows is reached, and later reconfigured to run every X minutes/hours/etc. if you prefer. If you are concerned about the actual space, it is best to analyze how much space your logs really take up. Once you know how much physical space is available for the database, restrict data growth in SQL Server so that the data file holding the logging information does not exceed your expectations.
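For the space-restriction part, one possible sketch, assuming the logging data lives in a database named LogDb whose data file has the logical name LogDb_Data (both names hypothetical):

    -- Cap the data file that holds the logging table at 500 MB
    ALTER DATABASE LogDb
    MODIFY FILE (NAME = LogDb_Data, MAXSIZE = 500MB);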

Vijay Selvaraj