I'm currently building a web-app for a client who would like to utilise cookies to better shape their traffic, etc, etc, you know the drill.

So at the end of last week I added a simple cookie write (basically set cookie="helloworld") and a cookie read with a counter to the app. The app is effectively a single page, so every request to the service definitely goes through this read/write.

A few million requests after deployment cookie read rates were at ~25% of total requests, and now after the weekend they've only crept up to ~33%.


For reference: no path is set, the domain is fixed, the expiry is a month from creation. The number of repeat visits is relatively high. I am 100% confident in the technology for setting/getting and counting.


Update: further investigation reveals that the acceptance rate is 90% for browsers which (surprise, surprise) aren't IE. Requests which identify themselves as IE (which no doubt includes a number of bots) have a 10% acceptance rate. Based on a Fiddler session (confirmed with Firebug), the response headers look like this:

HTTP/1.1 200 OK
Content-Encoding: gzip
Expires: Wed, 09 Jun 1993 00:00:00 GMT
Vary: Accept-Encoding
Set-Cookie: foo; domain=bar.com; expires=Sat, 11-Jul-2009 11:10:19 GMT; path=/; HttpOnly
Cache-Control: no-cache
Cache-Control: private
Cache-Control: no-store
Cache-Control: must-revalidate
Cache-Control: max-stale=0
Cache-Control: post-check=0
Cache-Control: pre-check=0
Date: Thu, 11 Jun 2009 11:10:19 GMT
Transfer-Encoding: chunked
X-AspNet-Version: 2.0.50727
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
Content-Type: text/html
Pragma: no-cache

So I have two theories:

1) IE sees a conflict in the headers which causes it to ignore the cookie - possibly related to the Cache-Control fields?

2) Something is missing or malformed which IE requires. I can find no evidence of this from googling.

Can anyone find fault with the headers above or have similar experience?
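For comparison, a sketch (not the app's actual C# code - just Python's standard http.cookies module, reusing the names from the headers above) of what a fully-declared Set-Cookie header looks like. Note it carries an explicit name=value pair, where the response above sends only "foo" with no value:

```python
from http.cookies import SimpleCookie

# Rebuild the cookie from the response above, but with an
# explicit name=value pair ("foo" alone has no value).
cookie = SimpleCookie()
cookie["foo"] = "helloworld"
cookie["foo"]["domain"] = "bar.com"
cookie["foo"]["path"] = "/"
cookie["foo"]["expires"] = "Sat, 11-Jul-2009 11:10:19 GMT"
cookie["foo"]["httponly"] = True

# Emits a complete "Set-Cookie: foo=helloworld; ..." header line.
print(cookie.output())
```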

+1  A: 

I will update the list as I think of more reasons, but these are all possible.

  • Does your website utilize any creative form of frames?
  • Are you setting cookies with Javascript and are your users likely to have it enabled?
  • Is the Javascript cross browser compatible?
  • Is the target audience likely to have cookies disabled?
  • Do you have lots of robots crawling your site?
  • Is the tracking code on every page?

Are you comparing cookie responses to total number of HTTP requests? Do you realize some of your requests are probably images, css or other content that might not be sending the cookie header? I think I read somewhere that browsers didn't use the cookie header for certain types of content.

Sam152
No. No and No (there is no JS, this is all done with C# as it happens). It's a very large audience but most will be random public users. No. Yes, there is only one page.
annakata
Response to edit: no, definitely only comparing actual data requests, but I'm not sure you're right about browser discrimination with cookies.
annakata
A: 

I suppose the cookies are rejected by the browser due to an inaccuracy in their declaration. In particular you may be missing the validity information (Domain, Path and Expires/Max-Age). In fact, missing Expires or Max-Age information makes the cookie a session cookie, which expires when the browser closes.

Gumbo
Sure, but all the data is set correctly in at least some cases (see edit) and I can't think of a way to catch if/when it fails.
annakata
Are these actually sent by your application or are these just assumptions by your browser? And have you checked whether it only happens with a particular browser (esp. IE)?
Gumbo
A: 

For the benefit of future readers, it turns out it's just fecking P3P: IE was silently rejecting the cookie for want of a P3P policy, on an implementation where that was not expected.
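For anyone landing here later: IE6+ consults a P3P compact policy header when deciding whether to accept cookies (particularly in framed or third-party contexts) and drops them without any visible error if the policy is missing or unsatisfactory. The fix is to send one alongside the Set-Cookie - the tokens below are placeholders only, and a real policy must reflect your site's actual privacy practices:

    P3P: CP="CAO PSA OUR"

In ASP.NET this can be added per-response (e.g. Response.AddHeader("P3P", "...")) or configured as a custom header at the IIS level.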

DIE IE DIE

annakata