I have a requirement to create an HttpHandler that will serve an image file (a simple static file) and also insert a record into a SQL Server table (e.g. http://site/some.img, where some.img is handled by the HttpHandler). I need an in-memory object (like a generic List object) that I can add items to on each request (I also have to handle a few hundred or thousand requests per second), and I should be able to unload this in-memory object to a SQL table using SqlBulkCopy.

List --> DataTable --> SqlBulkCopy
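
Something along these lines is what I have in mind for the flush step (the table name dbo.ImageRequests and the ImageName column are just placeholders):

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static void BulkInsert(List<string> batch, string connectionString)
{
    var table = new DataTable();
    table.Columns.Add("ImageName", typeof(string));
    foreach (string name in batch)
    {
        table.Rows.Add(name);
    }

    using (var connection = new SqlConnection(connectionString))
    using (var bulk = new SqlBulkCopy(connection))
    {
        connection.Open();
        bulk.DestinationTableName = "dbo.ImageRequests";
        bulk.WriteToServer(table);
    }
}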

I thought of using the Cache object: create a generic List object, save it in HttpContext.Cache, and add a new item to it on every request. This will NOT work, as the CacheItemRemovedCallback fires right away when the HttpHandler tries to add a new item. I can't use the Cache object as an in-memory queue.

Can anybody suggest anything? Will I be able to scale in the future if the load increases?

A: 

How about just using the generic List to store requests and using a different thread to do the SqlBulkCopy?

This way, storing requests in the list won't block the response for long, and the background thread will be able to update SQL Server on its own schedule, every 5 minutes or so.

You can even base the background thread on the Cache mechanism by performing the work in CacheItemRemovedCallback.

Just insert some object with a removal time of 5 minutes and re-insert it at the end of the processing work.
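
A minimal sketch of that idea (RequestLog, the cache key, and the FlushToSql helper are just illustrative names; FlushToSql would be the List -> DataTable -> SqlBulkCopy step from the question):

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class RequestLog
{
    private static readonly object Sync = new object();
    private static List<string> pending = new List<string>();

    // Called from the HttpHandler on every request; only holds the lock briefly.
    public static void Add(string imageName)
    {
        lock (Sync) { pending.Add(imageName); }
    }

    // Insert a dummy cache item that expires in 5 minutes; call once at startup.
    public static void StartTimer()
    {
        HttpRuntime.Cache.Add("request-log-flush", DateTime.Now, null,
            DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnExpired);
    }

    private static void OnExpired(string key, object value, CacheItemRemovedReason reason)
    {
        List<string> batch;
        lock (Sync)
        {
            batch = pending;
            pending = new List<string>();   // hand the handler a fresh list
        }
        if (batch.Count > 0)
        {
            FlushToSql(batch);              // List -> DataTable -> SqlBulkCopy
        }
        StartTimer();                       // re-insert the item to re-arm the "timer"
    }

    private static void FlushToSql(List<string> batch)
    {
        // Do the DataTable/SqlBulkCopy work here (see the sketch in the question).
    }
}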

Alex Reitbort
+1  A: 

Why would CacheItemRemovedCallback fire when you ADD something to the queue? That doesn't make sense to me... Even if it does fire, there's no requirement to do anything there. Perhaps I am misunderstanding your requirements?

I have quite successfully used the Cache object in precisely this manner. That is what it's designed for and it scales pretty well. I stored a Hashtable which was accessed on every app page request and updated/cleared as needed.

Option two... do you really need the queue? SQL Server will also scale pretty well if you just want to write directly into the DB. Use a shared connection object and/or connection pooling.
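
If you go that route, a per-request insert that simply relies on ADO.NET connection pooling would look roughly like this (table and column names are placeholders):

using System.Data.SqlClient;

static void LogRequest(string imageName, string connectionString)
{
    // Opening/closing per request is cheap when connection pooling is on (the default).
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "INSERT INTO dbo.ImageRequests (ImageName) VALUES (@name)", connection))
    {
        command.Parameters.AddWithValue("@name", imageName);
        connection.Open();
        command.ExecuteNonQuery();
    }
}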

Bryan
A: 

Thanks Alex & Bryan for your suggestions.

Bryan: When I try to replace the List object in the Cache for the second request (now the count should be 2), the CacheItemRemovedCallback gets fired because I'm replacing the current Cache object with the new one. Initially I also thought this was weird behavior, so I'll have to look deeper into it. Also, for the second suggestion, I will try to insert a record (with the cached SqlConnection object) and see what performance I get when I do the stress test. I doubt I'll get fantastic numbers, as it's an I/O operation.

I'll keep digging on my side for an optimal solution meanwhile with your suggestions.

Rac123
If you try to insert records one by one, I doubt you will get acceptable performance. What I was trying to suggest is to update the database in bulk, with a batch of records. Take a look at SqlBulkCopy on MSDN.
Alex Reitbort
Ah... OK. I think I see your problem now. You don't need to put the list object back into the Cache on each request! It stays there until it expires or is explicitly removed. Just use the same list over and over again.
Bryan
A: 

You can add a condition within the callback to ensure you only do the work when the cache entry was removed because it expired, rather than because it was removed or replaced (in VB since I had it handy):

Private Shared Sub CacheRemovalCallbackFunction(ByVal cacheKey As String, ByVal cacheObject As Object, ByVal removalReason As Web.Caching.CacheItemRemovedReason)
    Select Case removalReason
        Case Web.Caching.CacheItemRemovedReason.Expired, Web.Caching.CacheItemRemovedReason.DependencyChanged, Web.Caching.CacheItemRemovedReason.Underused
            ' By leaving off Web.Caching.CacheItemRemovedReason.Removed, this will exclude items that are replaced or removed explicitly (Cache.Remove)
    End Select
End Sub

Edit: Here it is in C# if you need it:

private static void CacheRemovalCallbackFunction(string cacheKey, object cacheObject, System.Web.Caching.CacheItemRemovedReason removalReason)
{
    switch(removalReason)
    {
        case System.Web.Caching.CacheItemRemovedReason.DependencyChanged:
        case System.Web.Caching.CacheItemRemovedReason.Expired:
        case System.Web.Caching.CacheItemRemovedReason.Underused:
            // This excludes the option System.Web.Caching.CacheItemRemovedReason.Removed, which is triggered when you overwrite a cache item or remove it explicitly (e.g., HttpRuntime.Cache.Remove(key))
            break;
    }
}
patridge
A: 

To expand on my previous comment... I get the impression you are thinking about the Cache incorrectly. If you have an object stored in the Cache, say a Hashtable, any update made to that Hashtable is kept without you explicitly putting it back into the Cache. You only need to add the Hashtable to the Cache once, either at application startup or on the first request.

If you are worried about the bulk copy and page-request updates happening simultaneously, then I suggest you simply have TWO cached lists: one that is updated as page requests come in, and one for the bulk copy operation. When one bulk copy is finished, swap the lists and repeat, as in the sketch below. This is similar to double-buffering video RAM for video games or video apps.
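
A rough sketch of the swap (the names are placeholders, and the actual SqlBulkCopy step is left out):

using System.Collections.Generic;

public static class DoubleBufferedLog
{
    private static readonly object Sync = new object();
    private static List<string> active = new List<string>();    // written to by page requests
    private static List<string> flushing = new List<string>();  // read by the bulk copy

    public static void Add(string imageName)
    {
        lock (Sync) { active.Add(imageName); }
    }

    // Called only from the bulk copy thread.
    public static void Flush()
    {
        lock (Sync)
        {
            // Swap buffers so requests keep writing to an (empty) list.
            List<string> temp = active;
            active = flushing;
            flushing = temp;
        }

        if (flushing.Count > 0)
        {
            // SqlBulkCopy the 'flushing' list here, then empty it for the next swap.
            flushing.Clear();
        }
    }
}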

Bryan