I am tired of clicking "File" and then "Save Page As" in Firefox whenever I want to save a website.
Is there a way to script this in Python? I would like to save the pictures and CSS files too, so that the page looks normal when I read it offline.
This is a non-Python answer, and I'm not sure what your machine is running, but have you considered using a site ripper such as wget? You can still call it from Python:
import os

# Shell out to wget; fill in <parameters> with the flags you need
cmd = 'wget <parameters>'
os.system(cmd)
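If you want to stay in Python, here is a minimal sketch using subprocess instead of os.system, assuming wget is installed and on your PATH; the flags are the ones described in the answer below, and the URL is just a placeholder:

import subprocess

# Placeholder URL -- swap in the site you actually want to save.
url = 'http://example.com'

# -m mirrors the site, -k rewrites links so they work offline,
# -E saves pages with an .html extension.
subprocess.run(['wget', '-m', '-k', '-E', url], check=True)

Passing the arguments as a list avoids shell-quoting problems with URLs that contain characters like & or ?.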
You could use wget:

wget -m -k -E [url]

-E, --html-extension   save HTML documents with `.html' extension.
-m, --mirror           shortcut for -N -r -l inf --no-remove-listing.
-k, --convert-links    make links in downloaded HTML point to local files.
Like Cobbal stated, this is largely what wget is designed to do. I believe there are some flags/arguments you can set to make it download the entire page, CSS and all. I suggest alias-ing it into something more convenient to type, or tossing it into a quick script.
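As a sketch of that quick-script idea (assuming wget is installed; savepage.py is just a made-up name):

import subprocess
import sys

# Usage: python savepage.py <url>
# Mirrors the page with the flags from the answer above so it
# looks normal offline (images, CSS, links rewritten).
if len(sys.argv) != 2:
    sys.exit('usage: savepage.py <url>')

subprocess.run(['wget', '-m', '-k', '-E', sys.argv[1]], check=True)

The alias route is even shorter: something like alias getsite='wget -m -k -E' in your shell config gives you the same thing straight from the command line.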