views: 49
answers: 3
I build ASP.NET websites (hosted under IIS 6 usually, often with SQL Server backends and forms authentication).

Clients sometimes ask if I can check whether anyone is currently browsing their website (and/or whether any users are currently logged in) at a given moment, usually so they can safely do a deployment (a hotfix, for example).

I know the web is basically stateless, so I can't be sure whether someone has closed the browser window, but I imagine there'd be some count of not-yet-timed-out sessions or something, and surely a count of logged-in users...

Is there a standard and/or easy way to check this?

A: 

You may be looking for the Membership.GetNumberOfUsersOnline method, although I'm not sure how reliable it is.
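A minimal sketch of calling it (this assumes a configured Membership provider; the admin page and its `StatusLabel` Label control are hypothetical names for illustration):

```csharp
// Sketch only: requires a configured ASP.NET Membership provider.
// StatusLabel is a hypothetical Label control on an admin page.
using System;
using System.Web.Security;

public partial class AdminStatus : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Counts users whose LastActivityDate falls within
        // Membership.UserIsOnlineTimeWindow (15 minutes by default).
        int onlineUsers = Membership.GetNumberOfUsersOnline();
        StatusLabel.Text = string.Format("Approximately {0} user(s) online.", onlineUsers);
    }
}
```

Note that it only counts activity within `Membership.UserIsOnlineTimeWindow` (15 minutes by default), so it's an approximation at best.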

Jakob Gade
I believe I'd have to be using the membership provider for that.
MGOwen
+1  A: 

Jakob's answer is correct but does rely on installing and configuring the Membership features.

A crude but simple way of tracking users online would be to store a counter in the Application object, incrementing and decrementing it as sessions start and end. There's an example of this on the MSDN website:

Session-State Events (MSDN Library)

Because the default session timeout is 20 minutes, the accuracy of this method isn't guaranteed (but that applies to any web application, due to the stateless, disconnected nature of HTTP).
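A minimal Global.asax.cs sketch of that approach (assuming InProc session state, since Session_End doesn't fire for the StateServer or SQLServer session modes):

```csharp
// Sketch of the Application-counter approach described above.
using System;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        Application["OnlineCount"] = 0;
    }

    protected void Session_Start(object sender, EventArgs e)
    {
        Application.Lock();   // serialize access to the shared counter
        Application["OnlineCount"] = (int)Application["OnlineCount"] + 1;
        Application.UnLock();
    }

    protected void Session_End(object sender, EventArgs e)
    {
        // Only raised for InProc session state, and only once the
        // session timeout has elapsed, so the count lags reality.
        Application.Lock();
        Application["OnlineCount"] = (int)Application["OnlineCount"] - 1;
        Application.UnLock();
    }
}
```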

Kev
Thanks, I'll look into this and accept if I use it...
MGOwen
A: 

Sessions, suggested in the other answers, are a basic way of doing this, but they are not very reliable: they work well in some circumstances and poorly in others.

For example, if users are downloading large files, watching videos, or listening to podcasts, they may stay on the same page for hours (unless the requests for the binary data are also tracked by ASP.NET), but they are still using your website.

Thus, my suggestion is to use the server logs to detect whether the website is currently being used by many people. This gives you the ability to:

  • See what sort of requests are being made. It's quite easy to distinguish humans from crawlers, and with some experience, it's also possible to see whether a human is currently doing something critical (such as writing a comment, editing a document, or typing in a credit card number to place an order) or not (such as just browsing).
  • See who is making those requests. For example, if Google is crawling your website, it is a very bad idea to go offline, unless search ranking doesn't matter to you. On the other hand, if a bot has been trying for two hours to crack your website by making requests to different pages, you can safely go offline.
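As a rough sketch of the idea (the log path and field positions below are assumptions; check the "#Fields:" header of your actual W3C log and adjust):

```csharp
// Sketch: count distinct client IPs seen in the last 15 minutes of an
// IIS W3C log. Field indices (date=0, time=1, c-ip=8) depend on the
// "#Fields:" directive in your log, so treat these as placeholders.
using System;
using System.Collections.Generic;
using System.IO;

class LogActivity
{
    static void Main()
    {
        DateTime cutoff = DateTime.UtcNow.AddMinutes(-15);
        var clients = new HashSet<string>();

        // Hypothetical log path; IIS 6 logs live under %windir%\system32\LogFiles by default.
        foreach (string line in File.ReadAllLines(@"C:\WINDOWS\system32\LogFiles\W3SVC1\ex100101.log"))
        {
            if (line.StartsWith("#")) continue;   // skip directive/comment lines
            string[] fields = line.Split(' ');
            DateTime stamp;
            if (!DateTime.TryParse(fields[0] + " " + fields[1], out stamp)) continue;
            if (stamp >= cutoff) clients.Add(fields[8]);   // c-ip column
        }

        Console.WriteLine("{0} distinct client(s) active in the last 15 minutes.", clients.Count);
    }
}
```

Filtering out requests for images, CSS, and known crawler user agents would make the count more meaningful.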

Note: if a website has some critical areas (for example, having written this long answer, I would be angry if Stack Overflow went offline just before I submitted it), you can also send regular AJAX requests to the server while the user stays on the page. Of course, you must be careful when implementing such a feature, and take into account that it will increase the bandwidth used and will not work if the user has JavaScript disabled.
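On the server side, such a heartbeat could be a simple generic handler (the names here are illustrative) that records a last-seen time per session:

```csharp
// Sketch of a heartbeat endpoint (e.g. Heartbeat.ashx) that a page
// could ping periodically via AJAX. It stores a last-seen timestamp
// per session in Application state.
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.SessionState;

public class Heartbeat : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        HttpApplicationState app = context.Application;
        app.Lock();   // serialize access to the shared dictionary
        var seen = (Dictionary<string, DateTime>)app["LastSeen"]
                   ?? new Dictionary<string, DateTime>();
        seen[context.Session.SessionID] = DateTime.UtcNow;
        app["LastSeen"] = seen;
        app.UnLock();

        context.Response.ContentType = "text/plain";
        context.Response.Write("ok");
    }

    public bool IsReusable { get { return true; } }
}
```

On the client, a `setInterval` that requests this handler every minute or so keeps the timestamps fresh; any session not seen for a few minutes can be treated as gone.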

MainMa
Thanks @MainMa, but which logs do you mean by "server logs"? Not the server's event log?
MGOwen
@MGOwen: I am talking about the logs where you see *every* HTTP request to the server. Of course, filters can be used to get rid of the meaningless entries (images and CSS files for an ordinary website, for example).
MainMa