Using classic ASP (VBScript) with MS SQL Server, we have a function that takes 3 integers: page_id, heading_id, and language_id.

It looks up the page heading in a translation table...

This table contains roughly 10,000 rows...

The heading column is nvarchar(1000), plus the 3 int columns, so it's around 2 KiB per row at most...

So my question is: is it worth copying this table into memory (into the Application object, for example) using ADO's GetRows or some other method, or is it not worth it given the "large" data size?

There are approximately 1 to 250 headings per page across 462 pages, with an average of 34 headings per page.

So to save ~34 database calls per page, I'd use a large chunk of memory and lookups against the Application object, and possibly it's slower anyway?
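Something along these lines is what I have in mind (a rough sketch; the connection string, table name, and column names are placeholders for ours):

' Run once, e.g. on first request or in Application_OnStart,
' and keep the whole translation table as a 2-D array.
Dim conn, rs, headings
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "DSN=ourdb"   ' placeholder connection string
Set rs = conn.Execute("SELECT page_id, heading_id, language_id, heading FROM page_headings")
If Not rs.EOF Then
    headings = rs.GetRows()   ' variant array laid out as (column, row)
End If
rs.Close : Set rs = Nothing
conn.Close : Set conn = Nothing

Application.Lock
Application("HeadingsCache") = headings
Application.Unlock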

Thoughts?

A: 

For most data-driven applications, the most expensive part is usually the database connection. At 2 KiB per row and 10,000 rows, that's still only about 20 MB. If the data is used very frequently, then that seems like a very small price to pay to eliminate unnecessary database hits.

Peter
Yep, that's what I was thinking. I was mainly asking whether there is a better or "best-practice" way of doing this, or whether I should just rely on a VBScript multi-dimensional array stored in the Application object.
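Roughly the lookup I'm picturing (GetHeading and the Application key name are just working names; GetRows lays the array out as (column, row)):

Function GetHeading(pageId, headingId, languageId)
    Dim cache, i
    cache = Application("HeadingsCache")   ' copy of the cached 2-D array
    GetHeading = ""
    For i = 0 To UBound(cache, 2)          ' second dimension = rows
        If cache(0, i) = pageId And cache(1, i) = headingId And cache(2, i) = languageId Then
            GetHeading = cache(3, i)       ' heading column
            Exit For
        End If
    Next
End Function

With ~10,000 rows that's a linear scan per heading, and reading an array out of Application hands back a copy each time, so I'd probably pull it into a local variable once per page (or keep a per-page slice) rather than hitting Application for each of the 34 headings.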
Gaspard Leon
+1  A: 

Probably worth it; save the calls.

Also, 34 database calls don't have to be 34 round trips. Batch your database calls.
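For example, a single parameterized query per page could look roughly like this (table and column names assumed from the question; pageId and languageId hold the current page and language):

' One round trip per page instead of one per heading.
Dim conn, cmd, rs, pageHeadings
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "DSN=ourdb"   ' placeholder connection string
Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = conn
cmd.CommandText = "SELECT heading_id, heading FROM page_headings " & _
                  "WHERE page_id = ? AND language_id = ?"
cmd.Parameters.Append cmd.CreateParameter("page_id", 3, 1, , pageId)        ' 3 = adInteger, 1 = adParamInput
cmd.Parameters.Append cmd.CreateParameter("language_id", 3, 1, , languageId)
Set rs = cmd.Execute()
If Not rs.EOF Then pageHeadings = rs.GetRows()   ' small per-page array to read from while rendering
rs.Close : Set rs = Nothing
conn.Close : Set conn = Nothing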

Corey Trager
Pretty much what I thought as well... mainly just wondering whether you or anyone else has experience with ASP and arrays, and whether I should avoid them or embrace them.
Gaspard Leon
I use arrays in classic ASP for many things, not least as a cache.
Pittsburgh DBA