views: 196
answers: 8

My company has a web analytics package which we use for our own and customer marketing campaign tracking. It uses a combination of server logs, JS & image web bugs, cookies, unique cached files, and ETag headers to collect and collate user activity.
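
(For the curious, the ETag technique works roughly like the sketch below. This is illustrative only, not our production code; the resource, port, and logging are made up.)

    // Illustrative sketch of ETag re-identification (not our production code).
    // The server tags a tiny cached resource with a unique ETag; the browser's
    // conditional revalidation echoes it back via If-None-Match, re-identifying
    // the visitor without cookies.
    import * as http from "http";
    import * as crypto from "crypto";

    // 1x1 transparent GIF used as the image web bug
    const PIXEL = Buffer.from(
      "R0lGODlhAQABAIAAAAAAAP///ywAAAAAAQABAAACAUwAOw==", "base64");

    http.createServer((req, res) => {
      const raw = req.headers["if-none-match"];
      const echoed = typeof raw === "string" ? raw.replace(/"/g, "") : undefined;
      const id = echoed ?? crypto.randomUUID(); // reuse the echoed ID or mint one

      console.log(`hit from visitor ${id}`); // stand-in for the real log collation

      res.writeHead(echoed ? 304 : 200, {
        "ETag": `"${id}"`,
        "Cache-Control": "private, max-age=31536000",
        "Content-Type": "image/gif",
      });
      res.end(echoed ? undefined : PIXEL);
    }).listen(8080);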

Recently we have found that a certain (unnamed) privacy-guard application which plugs into the user's browser is munging certain tracking codes with the apparent intent of preventing the user's activity from being tracked. We have purchased a copy of the app and tested locally, and it does the same for many other web bug and analytics applications including Google Analytics.

For most of these, the way in which the data is altered would prevent the tracking software from operating properly. However, they use a consistent pattern for the alterations, and due to the way our collation works, their changes have no effect on the operation of our tracking and analytics package. (Well, there is one side effect, which reduces the accuracy of some timing calculations from milliseconds to seconds.)

In a nutshell, the situation is:

  1. Our analytics results are unaffected by the application's attempt to subvert the data

  2. The user clearly intends to prevent analysis of their online activity

  3. It is possible for us to alter our application to detect the attempted blocking

  4. We would have to spend time and money patching and testing our application in order to make the attempted privacy blocking actually successful

So there is an ethical quandary as to how much effort we should expend to detect and honor the user's wishes. Some of the issues involved are:

  1. Isn't it the responsibility of the privacy app to perform as expected? There are ways they could alter the data which would prevent our analytics from tracking their users.

  2. Is it our responsibility to enhance our application to detect the user's intent? This would incur development cost and eliminate valuable data (roughly 2% of our traffic uses this app).

What do you think our ethical responsibility should be?

  • We should ignore it and have our application work as-is

  • We should take the expense, lose the data, and honor the users' implied desire

  • We should contact the developers of the app and tell them a better way to stop our system from working

  • We should publicize that their software does not perform as expected

  • Other...?

EDIT: To clarify, the privacy tool simply doesn't work. Our application, without alteration, still tracks users who use it. We would have to change our app in order to not track these users.

EDIT: We do have a cookie-based opt-out which the user can select from the tracker's home page.

UPDATE: We sent a note to the company that developed the privacy application, and they said they would look into it.

+2  A: 

What ethical obligation do you have to assist me in anything that you suspect I am attempting and failing at?

Shmoopty
You've misunderstood; the user installed the app (to protect privacy) and the OP is wondering if that implies he should let the app 'destroy' his cookies even if he knows how to fix it. He's trying to *help* you. (At least, wondering *how* to).
Noon Silk
@silky That's right--the point is that the munged data tells us the user prefers not to be tracked, and the question is whether we should go out of our way to honor their implied privacy preference.
ryandenki
@silky: More specifically, OP's asking *how much effort* they should take in helping me and whether it's their responsibility.
Shmoopty
+4  A: 

I would provide a way to disable your tracking, and contact the authors of the tool and ask them to use it explicitly. Don't get into an arms race trying to undo their work (it will only continue); provide a trivial 'off' switch and everyone will be happy.
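
Something as simple as a documented global flag or cookie would do. A rough sketch (all names hypothetical) of what the tracker snippet could honour:

    // Hypothetical opt-out contract a privacy tool could set explicitly,
    // instead of mangling data: a documented cookie and/or global flag.
    declare global {
      interface Window { _trackerOptOut?: boolean; }
    }

    function optedOut(): boolean {
      return window._trackerOptOut === true ||
        document.cookie.split("; ").includes("tracker_optout=1");
    }

    export function maybeTrack(event: string): void {
      if (optedOut()) return; // honour the documented signal; no guessing needed
      const img = new Image(); // classic image web bug
      img.src = "https://tracker.example/hit?e=" + encodeURIComponent(event);
    }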

Noon Silk
We do already have an opt-out cookie option that users can select from the home page of the tracker.
ryandenki
I mean an API for 3rd-party privacy apps to disable it (so they don't need to 'guess' how). It should be fairly straightforward. In the meantime, to honour the goal, I would disregard the 'bad' data (perhaps for a month or some period X), after which you can make it clear that third parties should block via the 'new' method. If they still don't after a significant period, you could either start tracking again or keep ignoring the data; I would feel comfortable either way.
Noon Silk
NOTE: I just read your edit. In that case, I would publish the 'blocking' API approach, say that you will not change your application to disregard data that is currently valid, and strongly encourage the new API instead. That is very honourable, and I don't believe anyone could say you've done anything wrong with that approach.
Noon Silk
A: 

I don't see you as having even a remote social responsibility to tell the user "hey, you're not evading our system properly", so I'd tend towards "keep going as-is".

If it sincerely bothers you, then maybe contact the developers like you mention, but there's no reason for you to expend anything on it.

Sapph
+2  A: 

I think the correct solution is to let the user decide if he wants to be tracked. As I see it, there are two ways to reach this goal:

  1. Filter those users out in your application.
  2. Tell the developer of the other application of its weaknesses.

I'd choose the approach that is less work for you. Write them an e-mail. If they don't improve their app, I would happily continue tracking. (At the same time, you could consider an opt-out API like others have suggested.)

I could even imagine you benefiting from having someone at that other company who knows you or your company in a positive light.

Georg
+1 for "I'd chose the approach that is less work for you". You're already acting very fair. No need to sink resources into that as well. You're in the business of making money after all.
Joachim Sauer
+8  A: 

I have been active in computer privacy issues for more than 20 years and this is the very first time I have come across a question such as yours. It is very interesting.

You have no obligation to attempt to modify your application to detect the user's efforts, and there are several reasons why I would recommend that you not follow this course of action:

  • There may be other applications that you are also rendering ineffective. You don't want to favor one application over another.
  • If you take this action, you will need to watch for upgrades to both your application and the privacy application.
  • If you just silently modify your application, the privacy community will lose a valuable "teachable moment."

Sadly, the "privacy negotiation" part of P3P was never really implemented. It would have been ideal for a situation like this.

If you feel strongly about this, you are welcome to contact the developer and tell them what they are doing wrong. Alternatively, if you have an academic bent, you could write an article for a privacy conference; it would be an interesting "lessons learned" piece. You could also write a blog post, but I suspect that you do not wish the publicity.

If you want to send me a private message, I would be happy to relay the message to the developer.

vy32
You're right on all counts. We ended up contacting them, explaining what part of their approach was ineffectual. You might be interested in the details of the bug: we have some hex-encoded sequences as part of a per-user unique key. They replaced some characters in the sequence with non-hex characters so that it can't be decoded. Unfortunately, whatever algorithm they chose always changes the same characters in the same way for any given sequence. And, our application doesn't actually decode it as hex, but uses the text sequence as-is. So any given key was still unique after their alterations.
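
A toy reconstruction of the effect (the real keys and substitution map differ):

    // Toy reconstruction: a fixed character-for-character substitution is
    // injective on hex strings (each remapped hex char goes to a distinct
    // non-hex char, the rest map to themselves), so the mangled keys collide
    // exactly when the originals did, i.e. never.
    const MANGLE: Record<string, string> = { a: "x", b: "y", 0: "z" }; // hypothetical map

    const mangle = (key: string): string =>
      key.split("").map(c => MANGLE[c] ?? c).join("");

    const keys = ["a0b4f9", "b0a4f9", "a0b4f8"]; // distinct per-user hex keys
    const mangled = keys.map(mangle);            // ["xzy4f9", "yzx4f9", "xzy4f8"]

    // No longer valid hex, but still pairwise distinct; collation that treats
    // the key as an opaque string keeps working unchanged.
    console.log(new Set(mangled).size === keys.length); // true
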
ryandenki
+3  A: 

I would just leave everything as-is and not contact the developer. If you get on their radar they might update the app to break your analytics completely.

I think the whole "tracking" fear of users is very interesting.

When tracking is used in an ethical manner it actually benefits users, because the company can tell what works from a marketing perspective. This means less money wasted on marketing, money that can be used in other areas such as product development or even selling products at a lower cost.

IMO, makers of some of these "privacy" applications are guilty of exaggerating the "dangers" of ethical tracking to boost the demand for their products.

Dana Holt
+1  A: 

I would first let the developers know about their product's deficiency, with as much information as you can give them to make it easy for them to fix. I would then add something to detect the tool in your own code and automatically notify any sites that think they are protected, but aren't, with an explanation of what you have done. I would not alter your tracker in such a way as to respect the poorly implemented privacy guard's intention - that seems, to me, to be asking too much, and sends you down a dangerous path where you wind up (at least being perceived as) responsible for other people's software. Our software systems are generally too complex as it is - writing in special cases for broken software would, I think, just add to your system's complexity, at no great value to you, your customers, or web users generally.
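
Detection could be as cheap as checking for the characteristic damage, say non-hex characters in a field that should always be pure hex (a sketch; the field name and signature are illustrative):

    // Sketch: flag hits whose tracking key shows the tool's signature,
    // i.e. non-hex characters in a field that is always hex when untouched.
    // (Field name and alphabet are illustrative.)
    function looksMangled(trackingKey: string): boolean {
      return !/^[0-9a-f]+$/.test(trackingKey);
    }

    function handleHit(params: { uid: string }): void {
      if (looksMangled(params.uid)) {
        // e.g. exclude the hit from analytics and/or notify the site owner
        return;
      }
      // ...normal collation path...
    }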

If you do notify the developers and nothing has happened in a reasonable amount of time, I would consider making the product's deficiencies public. Not to shame them, but to protect users; if your software can (however inadvertently) overcome their defenses, malicious software certainly could as well.

I can't cite chapter and verse of why this course of action is ethical, but it's what makes sense, what feels right, to me. This is a great question.

Carl Manaster
This is actually what we did--we wrote out a reasonably technical description of what was wrong with the approach they were taking, and recommended a different action to take for our cookie and tracking codes. We received a "thanks, we'll look into it" mail, so we'll see if anything happens from here.
ryandenki
A: 

Personally, I think you should just ignore their efforts.

You're already inconsiderate of others trying to protect their privacy; why should people using this software be any different?

Consider this: what is the primary reason "cookies" don't work for tracking? It's because people disable them in their browsers. I mean, I'm sure there's a browser out there somewhere that doesn't support cookies, maybe something ca. 1993.

But today? Seriously? The only reason a cookie wouldn't work is because the user chose to disable cookies.

Sure, there may be some minute portion of the market that isn't "allowed" to use cookies, maybe a school browser, or one locked down in a business. But what fraction of the total cookie blockages can that really be? (I don't know your application or what kind of traffic you're getting.)

Since cookies don't work, you've resorted to JS tricks, again something that the bulk of folks turn off out of a desire to protect themselves. Obviously, JS support isn't as widespread as cookie support (notably on mobile browsers). But the argument is the same in terms of overall data "lost" to those edge-case browsers.

So, I don't know why suddenly this new system concerns you so. I'd just push on like you're doing already.

Will Hartung
You may have misread the question--the issue is how far we should go in being considerate when somebody's "privacy" software is ineffective. We _want_ to let people opt out, and we _want_ to honor their intent. "JS tricks" don't work if people disable cookies, and we provide an opt-out feature directly from the tracking server's home page. The problem in this case is that the privacy protection software they are using doesn't do something sensible like disabling cookies; instead it makes a pathetic attempt at mangling the data, hoping this will block tracking, but it doesn't.
ryandenki
The reason "this new system concerns" us, is that it is clear that these users don't want to be tracked, and we would like to accommodate them. But since honoring their desire requires us to implement a specific set of workarounds to detect the privacy tool they are using and exclude their traffic, we, in effect, have to do extra work to make another company's broken software have the desired effect.
ryandenki