If this is possible...

What would you have to use/learn in order to write something that consistently checks (every 20 minutes) whether a page has been updated? It would involve logging into the system and navigating the site. As an example of something I wish I'd had in the past: a script to log onto my school's website and check for updated grades.

Thanks for all suggestions :)

A: 

Will it involve things such as changing window focus, mouse movement, etc.?

If so, take a look at AutoIt.

yx
Sounds like "logon" means "log on to the website", so no fancy mouse movements, etc.; instead it may involve submitting forms.
Cheeso
AutoIt is definitely the easiest to throw together: it repeats exactly what you tell it to do, on a timer. Granted, while it's sending input you won't be able to use the keyboard/mouse yourself, but you can use them while it sleeps (in between the 20-minute runs).
yx
A: 

I personally would use Perl, but I'd imagine there are a number of scripting languages that can accomplish this.

Cheeto
A: 

It sounds like you want to knock together something quick and dirty. Use what you know; most languages have a library for this, so you don't need to learn something new!

thatismatt
A: 

Or, if you are just interested in getting notified when the page has changed, you could use some existing service, such as:

Watch That Page

Ulf Lindback
A: 

Consider Selenium, specifically the Selenium IDE for Firefox; it's generally used for testing, but it lets you record and replay scripts in your browser -- and then edit them as HTML and JavaScript. You can just insert a test that stops the script once the grades differ from their "expected" values.

Skill set required:

  • HTML
  • Firefox Addons
  • JavaScript (and the DOM)
  • Selenium
ojrac
A: 

This is basically a shell-script-based solution for a Unix system, or Windows with Cygwin.

To fetch the page of interest,

wget -N <URI>

Do this once from the command line to get the file. Open the file in an editor and check for your pattern of interest. You can then use grep, sed, awk, or even Perl to filter out the things that matter.

You could then write this all into a shell script and either loop it with a sleep or plug it into a cron job.

You would basically need to learn some sed, awk, and grep for this. That should be quick for the purpose at hand, and what you learn will be reusable for other such needs. You could also dive into Perl or Python, but I would not suggest starting from scratch on those for what you need here.
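As a rough sketch of the loop described above (the URL, file names, and notification command are placeholders, and this assumes the page needs no login; a real school site would likely need `wget --post-data` or saved cookies for the login form):

```shell
#!/bin/sh
# Poll a page and report when it changes.

check_changed() {
    # Compare the newly fetched copy against the previous one;
    # print "changed" and return 0 if they differ.
    old="$1"; new="$2"
    if [ -f "$old" ] && ! cmp -s "$old" "$new"; then
        echo "changed"
        return 0
    fi
    return 1
}

# Main loop -- shown commented out because it runs forever and
# needs a real URL; alternatively drop the loop and run the body
# from a cron job every 20 minutes.
#
# URL="http://school.example.com/grades"    # placeholder
# while true; do
#     wget -q -O page.new "$URL"
#     check_changed page.old page.new && echo "Page changed at $(date)"
#     mv page.new page.old
#     sleep 1200    # 20 minutes
# done
```

Piping the fetched file through `grep` for the grade pattern before the comparison would cut down on false alarms from ads or timestamps elsewhere on the page.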

nik