Hi, I am writing a shell script to download content from a site, display it, and save it in my local file system.

I have used the following wget command in the script to fetch the content:

/usr/sfw/bin/wget -q -p -nH -np --referer=$INFO_REF --timeout=300 -P $TMPDIR $INFO_URL

where INFO_REF is the page on which I need to display the content fetched from INFO_URL.

The problem is that I am able to get the content (images/CSS) as an HTML page, but in this HTML the links on the images and headlines, which point to a different site, are not working: those URLs (image links) are being rewritten to my local file-system paths.

I tried adding the -k option to wget, and with it those URLs point to the correct locations, but now the images do not load because their paths change from relative to absolute local paths. (Without -k, the images load properly.)

Please tell me what option I can use so that both the images and the links in the page work properly. Do I need to use two separate wget commands, one for the images and another for the links in the page?
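In case it helps clarify what I am trying, here is a rough sketch of a possible workaround I have been considering (an assumption, not a verified fix): keep the -k download so the external links stay correct, then post-process the saved HTML with sed to strip the local prefix that -k inserted into the image paths, turning them back into relative paths. The function name fix_img_paths and its arguments are invented for this sketch.

```shell
#!/bin/sh
# Sketch of a post-processing step after:
#   /usr/sfw/bin/wget -q -p -k -nH -np --referer=$INFO_REF --timeout=300 -P $TMPDIR $INFO_URL
# (hypothetical helper, not part of the original script)

fix_img_paths() {
  # $1 = saved HTML file
  # $2 = local path prefix that -k inserted into src attributes, e.g. "$TMPDIR/"
  # Rewrite src="<local prefix>foo.png" back to the relative src="foo.png"
  sed "s|src=\"$2|src=\"|g" "$1"
}
```

Usage would be something like: fix_img_paths "$TMPDIR/index.html" "$TMPDIR/" > "$TMPDIR/index.fixed.html". I am not sure this is the right approach, which is why I am asking whether wget has an option that avoids the extra pass.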