views: 415
answers: 5

I am tired of clicking "File" and then "Save Page As" in Firefox when I want to save some websites.

Is there any script to do this in Python? I would like to save the pictures and CSS files so that when I read it offline, it looks normal.

+1  A: 

Probably a tool like wget is more appropriate for this type of thing.

cobbal
A: 

This is a non-Python answer and I'm not sure what your machine is running, but have you considered using a site ripper such as wget?

import os

# Shell out to wget; fill in <parameters> with whatever flags you need.
cmd = 'wget <parameters>'
os.system(cmd)
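
If you take that route, a minimal sketch using subprocess instead of os.system avoids shell-quoting problems with odd URLs; the URL below is just a placeholder, and the flags are wget's page-requisites and link-conversion options:

import subprocess

url = 'http://www.example.com'  # placeholder; substitute the page you want
# -p (--page-requisites) grabs the images/CSS the page needs;
# -k (--convert-links) rewrites links so the saved copy works offline.
subprocess.check_call(['wget', '-p', '-k', url])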
NoahD
+7  A: 

You could use wget:

wget -m -k -E [url]

-E, --html-extension   save HTML documents with `.html' extension.
-m, --mirror           shortcut for -N -r -l inf --no-remove-listing.
-k, --convert-links    make links in downloaded HTML point to local files.
Cato Johnston
Thanks, this was the most helpful, but I found that the "-p" (--page-requisites) flag is the one that's needed: wget -k -p www.google.com
Unknown
+1  A: 

Like cobbal stated, this is largely what wget is designed to do. I believe there are some flags you can set to make it download the entire page, CSS and all. I suggest aliasing it to something more convenient to type, or tossing it into a quick script like the one sketched below.
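
A minimal version of that quick script, assuming wget is on your PATH; the flag set follows the wget answers above, and the filename savepage.py is made up:

import subprocess
import sys

# Usage: python savepage.py <url> [<url> ...]
# -E adds .html extensions, -k converts links, -p pulls in images/CSS.
for url in sys.argv[1:]:
    subprocess.check_call(['wget', '-E', '-k', '-p', url])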

Sector Corrupt
A: 

Have you looked at HTTrack?
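
HTTrack also has a command-line client, so you could drive it from Python the same way as wget; a hedged sketch, assuming the httrack binary is installed (-O sets the output directory, and the URL and directory here are placeholders):

import subprocess

# Mirror the site into ./mirror for offline reading.
subprocess.check_call(['httrack', 'http://www.example.com', '-O', './mirror'])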

DoxaLogos