views: 211
answers: 2

Let's say that, theoretically, I have a page / controller action in my website that does some very heavy stuff. It takes about 10 seconds to complete its operation.

Now, I use .NET's OutputCache mechanism to cache it for 15 minutes (for example, I use [OutputCache(Duration = 900)]). What happens if, after 15 minutes, the cache has expired and 100 users request the page again within those 10 seconds that it takes to do the heavy processing?

  1. The heavy stuff is done only the first time, and there is some locking mechanism so that the other 99 users get the cached result
  2. The heavy stuff is done 100 times (and the server is crippled, as it can take up to 100 * 10 seconds)

Easy question maybe, but I'm not 100% sure. I hope it is number one, though :-)
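
For reference, here is a minimal sketch of the kind of action I mean (the controller and action names are made up, and the Sleep just stands in for the real ~10 seconds of work):

using System.Threading;
using System.Web.Mvc;

public class ReportController : Controller
{
    // Cache the rendered result for 15 minutes (900 seconds).
    [OutputCache(Duration = 900, VaryByParam = "none")]
    public ActionResult Heavy()
    {
        Thread.Sleep(10000); // stand-in for the real work
        return Content("expensive result");
    }
}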

Thanks!

+2  A: 

Well, it depends upon how you have IIS configured. If you have fewer than 100 worker threads (say, 50), then the "heavy stuff" is done 50 times, crippling your server, and then the remaining 50 requests will be served from the cache.

But no, there is no "locking mechanism" on a cached action result; that would be counterproductive, for the most part.

Edit: I believe this to be true, but Nick's tests say otherwise, and I don't have time to test now. Try it yourself! The rest of the answer is not dependent on the above, though, and I think it's more important.

Generally speaking, however, no web request, cached or otherwise, should take 10 seconds to return. If I were in your shoes, I would look at somehow pre-computing the hard part of the request. You can still cache the action result if you want to cache the HTML, but it sounds like your problem is somewhat bigger than that.
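
As an illustration of what I mean by pre-computing, here is a rough sketch using the ASP.NET application cache (the cache key, names, and timings are all made up; ideally a scheduled task would populate the entry so no request ever pays the 10 seconds, and the cache-miss branch below is only a fallback):

using System;
using System.Threading;
using System.Web;
using System.Web.Caching;
using System.Web.Mvc;

public class ReportController : Controller
{
    public ActionResult Heavy()
    {
        // The request itself only reads a value that has (ideally) already
        // been computed elsewhere.
        var result = HttpRuntime.Cache["heavy-report"] as string;
        if (result == null)
        {
            result = ComputeHeavyReport(); // the ~10-second part
            HttpRuntime.Cache.Insert("heavy-report", result, null,
                DateTime.UtcNow.AddMinutes(15), Cache.NoSlidingExpiration);
        }
        return Content(result);
    }

    private static string ComputeHeavyReport()
    {
        Thread.Sleep(10000); // stand-in for the real work
        return "expensive result";
    }
}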

You might also want to consider asynchronous controllers. Finally, note that although IIS and ASP.NET MVC will not lock on this heavy computation, you could. If you use asynchronous controllers combined with a lock on the computation, then you would get effectively the behavior you're asking for. I can't really say if that's the best solution without knowing more about what you're doing.
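
To illustrate just the locking part, here is a rough synchronous sketch (names are made up, and expiration is omitted for brevity; combining it with an asynchronous controller, as suggested above, is left out to keep the sketch short):

using System.Threading;
using System.Web.Mvc;

public class ReportController : Controller
{
    private static readonly object _sync = new object();
    private static string _result;

    public ActionResult Heavy()
    {
        // Only the first request pays for the computation; the rest block on
        // the lock and then reuse the stored result.
        lock (_sync)
        {
            if (_result == null)
            {
                Thread.Sleep(10000); // stand-in for the heavy work
                _result = "expensive result";
            }
        }
        return Content(_result);
    }
}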

Craig Stuntz
Thank heavens I really don't have a request that takes 10 seconds; I greatly exaggerated to illustrate a point. I was just curious what would happen in such a scenario. Thanks! I might consider implementing async controllers, though.
Razzie
been a while... just did a test myself though and I'm sure it does not lock, like you said. Thanks.
Razzie
+2  A: 

It seems to lock here, doing a simple test:

<%@ OutputCache Duration="10" VaryByParam="*" %>

protected void Page_Load(object sender, EventArgs e)
{
    // Sleep a random 1-30 seconds so it's easy to see whether overlapping
    // requests actually run Page_Load or wait for the first one.
    System.Threading.Thread.Sleep(new Random().Next(1000, 30000));
}

The first request hits a breakpoint there, even though it's left sleeping... no other request hits a breakpoint in the Page_Load method... they wait for the first one to complete, and that result is returned to everyone who's requested the page.

Note: this was simpler to test in a webforms scenario, but given this is a shared aspect of the frameworks, you can do the same test in MVC with the same result.
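
For reference, a rough MVC equivalent of the same test would look something like this (controller and action names are made up):

using System;
using System.Threading;
using System.Web.Mvc;

public class HomeController : Controller
{
    [OutputCache(Duration = 10, VaryByParam = "*")]
    public ActionResult Slow()
    {
        // Same idea: sleep a random amount so it's obvious whether
        // overlapping requests run the action or wait for the first one.
        Thread.Sleep(new Random().Next(1000, 30000));
        return Content(DateTime.Now.ToString());
    }
}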

Here's an alternative way to test:

<asp:Literal ID="litCount" runat="server" />

public static int Count = 0;

protected void Page_Load(object sender, EventArgs e)
{
  // Every request that actually executes Page_Load bumps the counter;
  // requests served from the output cache never reach this code.
  litCount.Text = Count++.ToString();
  System.Threading.Thread.Sleep(10000);
}

All requests queued up while the first one is sleeping get the same count output.

Nick Craver
Are you testing on WebDev? It behaves *very* differently than IIS on a multi-core server.
Craig Stuntz
@Craig: Testing using IIS 7.5, Windows 7 x64 on a quad-core
Nick Craver
That seems odd. Perhaps you're using debug mode? This really *shouldn't* lock. The other requests should get a cache miss.
Craig Stuntz
@Craig: Nope, release mode... just attaching the debugger in IIS, but of course it's possible this is affecting the behavior. On the other hand, this would be the optimal behavior: for IIS to have the other requests wait and all get the result of the first hit that's processing. If you're specifying output cache, you're kind of saying that you don't want this thing to process often, and IIS acting this way would accomplish that and still serve all the requests as quickly as possible (the first request's thread should finish first, in most cases).
Nick Craver
If the request is still in process, it's not at all obvious that the next request will be a cache hit even if the first request completes. I'd suggest testing with logging rather than the debugger.
Craig Stuntz
@Craig: Can you explain a bit more what you mean? If a user makes request #2 that would hit the same output (as in, the VaryByWhatever matches up), shouldn't IIS wait for the request to complete and return that result to everyone who would have hit the cache had they hit the same thing moments later? It should be the same result. (If it's not, you shouldn't be using OutputCache.) If it didn't do this and you got 1000 hits at once, you'd be processing the same result 1000 times instead of once quickly and returning it to 1000 requests... it makes sense that it would behave this way.
Nick Craver
The cache can have a dependency. It can also be cleared by other code.
Craig Stuntz
@Craig, that is a fair point, but they would all be asking for the page as it was at the instant they all went for it, so it's still a maybe. I tried a similar test to what you suggested here: release mode, no debugging, with a static int that gets incremented on every page process... same result, the number only gets bumped once. I'll update the answer to show that approach.
Nick Craver
I believe that *something* isn't right here, but I don't have time to go through it myself. So +1 for testing, and I'll update my answer to reflect what you did.
Craig Stuntz
@Craig: I appreciate the intelligent discussion. Please do test...as I'll be using this in the near future I'd be very curious if you get different behavior.
Nick Craver