views:

83

answers:

4

What approach should I take to develop software that blocks traffic to sites, by domain name, at the desktop level on Windows?

Editing the hosts file was easy, but the browser seems to take too long to recognize changes to it.

My ideal solution would be a simple Ruby script to disable sites during work hours and then re-enable them later.
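Not part of the original question, but a minimal sketch of such a script, assuming the hosts-file path, marker comment, and domain list shown (all illustrative): it appends marked entries that point blocked domains at 127.0.0.1, and strips them again, so two scheduled tasks could run it at the start and end of work hours.

```ruby
# Sketch: toggle hosts-file entries for a list of distracting sites.
# HOSTS_PATH, MARKER, and DOMAINS are assumptions; adjust for your setup.
HOSTS_PATH = 'C:/Windows/System32/drivers/etc/hosts'
MARKER     = '# work-block'
DOMAINS    = %w[reddit.com news.ycombinator.com]

# Return hosts-file text with our marked block entries appended.
def block(text, domains)
  entries = domains.map { |d| "127.0.0.1 #{d} #{MARKER}" }
  text.rstrip + "\n" + entries.join("\n") + "\n"
end

# Return hosts-file text with our marked entries removed.
def unblock(text)
  text.lines.reject { |line| line.include?(MARKER) }.join
end

# Usage (needs admin rights to write the hosts file):
#   text = File.read(HOSTS_PATH)
#   File.write(HOSTS_PATH, ARGV[0] == 'block' ? block(text, DOMAINS) : unblock(text))
#   system('ipconfig /flushdns')  # clear the Windows DNS cache so the change applies
```

Flushing the DNS cache after each edit is what avoids the recognition delay mentioned above.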

A: 

Personally, depending on your setup, I would look at doing this at the router level. Many routers can filter by MAC address, and that would be a better way of doing it.

I know that my DLink has abilities to do this, including the on at X and off at Y.

Also, you might ask this over at ServerFault for more answers.

Mitchel Sellers
Thanks, but unfortunately I am not looking for a router or server solution.
jrhicks
Doing it at the workstation level leaves it prone to users modifying or getting around it; hopefully someone else can come up with a solution for you.
Mitchel Sellers
Getting "around" it is not an issue; I'm working toward self-imposed Internet reduction.
jrhicks
+1  A: 

MouseHole is a proxy written in Ruby. You can easily customize it to block pages, and even rewrite page contents while you browse. You run the proxy on your own machine and configure your favorite browser not to connect to the internet directly. More info at http://github.com/whymirror/mousehole/tree.
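To illustrate the local-proxy idea in general terms (this is not MouseHole's actual API), Ruby's WEBrick can also host a small blocking proxy. The domain list and port here are placeholders, and the server start is guarded behind an environment variable:

```ruby
# Illustrative only: a local blocking proxy in plain Ruby/WEBrick
# (not MouseHole's API; BLOCKED and the port are placeholders).
BLOCKED = %w[reddit.com news.ycombinator.com]

# True when a request host is a blocked domain or a subdomain of one.
def blocked?(host)
  BLOCKED.any? { |d| host == d || host.end_with?(".#{d}") }
end

if ENV['RUN_PROXY']  # guarded so the file can be loaded without starting a server
  require 'webrick'
  require 'webrick/httpproxy'

  # Called after each proxied response; overwrite the body for blocked hosts.
  handler = proc do |req, res|
    if blocked?(req.host.to_s)
      res.body = 'Blocked during work hours.'
      res['content-type'] = 'text/plain'
    end
  end

  proxy = WEBrick::HTTPProxyServer.new(Port: 8080, ProxyContentHandler: handler)
  trap('INT') { proxy.shutdown }
  proxy.start  # point the browser's HTTP proxy setting at localhost:8080
end
```

A scheduled task could start and stop the proxy (or swap the browser's proxy setting) around work hours.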

johannes
_why is awesome!
jrhicks
+2  A: 

I think this is just what you're looking for: The LeechBlock Firefox extension can block domains or even paths on domains (e.g. google.com/reader/) during set hours.

DataWraith
+1  A: 

IE does not ignore the hosts file. You should double-check that you modified the right one and added the entry correctly.
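For reference, a hosts entry that blocks a site maps its name to the loopback address (the domain below is just an example):

```
127.0.0.1    www.reddit.com
```

On Windows the file lives at C:\Windows\System32\drivers\etc\hosts, and running `ipconfig /flushdns` afterward clears the DNS client cache, which is a common cause of the delay described in the question.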

jeffamaphone
good to know, thanks
jrhicks