views: 121
answers: 6
Don't think that I'm mad, I understand how PHP works!

That being said: I develop personal websites, and I usually take advantage of PHP to avoid repetition during the development phase. Nothing truly dynamic, only includes for the menus, a couple of foreach loops and the like.

When the development phase ends, I need to deliver the website to the client as plain HTML files. Is there a tool (a crawler?) that can do this for me, instead of my visiting each page and saving the interpreted HTML by hand?

A: 

Maybe the command line will help?

Anton Gogolev
How can it help?
Col. Shrapnel
@Col I suppose it can execute your PHP scripts and spit out the output.
Anton Gogolev
Yes it can, just as the regular version does. The solution is still unclear to me.
Col. Shrapnel
Col. Shrapnel
+6  A: 

You can use wget to download recursively all the pages linked.

You can read more about this here: http://en.wikipedia.org/wiki/Wget#Recursive_download
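As a sketch, one possible invocation (the URL here is an assumption; point it at wherever your dev copy of the site is served):

```shell
# Recursively mirror the site, rewriting links so the saved pages work
# offline and giving each saved page an .html extension.
# Assumes the dev site is served at http://localhost/ -- adjust as needed.
wget --recursive --convert-links --adjust-extension \
     --page-requisites --no-parent http://localhost/
```

The `--convert-links` and `--adjust-extension` options are what make the result hand-off-ready: links are rewritten to work locally, and pages generated by PHP are saved with an `.html` suffix.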

Adirael
+1 for wget, that's how I'd do it.
Josh
+1 for wget. That's how I've done it in the past and would do it again.
Rob Wilkerson
+3  A: 

If you need something more powerful than recursive wget, httrack works pretty well. http://www.httrack.com/
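For example, a basic mirror (the URL and output directory here are assumptions, substitute your own):

```shell
# Mirror the site into ./mirror. Assumes httrack is installed and the
# dev site is served at http://localhost/.
httrack "http://localhost/" -O ./mirror
```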

Daniel Papasian
Nice tool, thank you. Exactly what I was searching for.
0plus1
And I happen to benefit from now knowing of this tool too. Thanks!
erisco
A: 

If you're on windows, you can use Free Download Manager to crawl a web-site.

jeroen
+1  A: 

If you want to use a crawler, I would go for the mighty wget.

Otherwise you could also use a build tool like make.

You need to create a file named Makefile in the same folder as your PHP files.
It should contain this:

all: 1st_page.html 2nd_page.html 3rd_page.html

1st_page.html: 1st_page.php
	php 1st_page.php > 1st_page.html

2nd_page.html: 2nd_page.php
	php 2nd_page.php > 2nd_page.html

3rd_page.html: 3rd_page.php
	php 3rd_page.php > 3rd_page.html

Note that each php command is preceded not by spaces but by a tab. (See the PHP manual for the command-line syntax.)

After that, whenever you want to update your html files just type

make

in your terminal to automatically generate them.

It might seem like a lot of work for such a simple job, but make is a very handy tool that you will find useful for automating other tasks as well.
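If you'd rather not list every target by hand, a small shell loop achieves the same rendering step (a sketch, assuming the php CLI is on your PATH):

```shell
# Render every .php file in the current directory to a matching .html
# file, e.g. index.php becomes index.html.
for f in *.php; do
    php "$f" > "${f%.php}.html"
done
```

Unlike the Makefile, this re-renders every page on each run; make only rebuilds the pages whose source has changed.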

Dom De Felice
+1  A: 

Pavuk offers much finer control than wget, and will rewrite the URLs in the grabbed pages if required.

symcbean
Wow, that tool is incredible! Thank you!
0plus1