I've written Perl code using WWW::Mechanize to retrieve a webpage. Retrieving HTTP pages works fine, but it doesn't work for HTTPS. I've checked that I have the Crypt::SSLeay package installed. What else could be wrong?

The error message is:

Error GETing https://www.temp.com: Can't Connect to www.temp.com:443 <Bad hostname 'www.temp.com'> at scrape.pl line 8
+1  A: 

Apparently, I needed to add the following to my script so that Crypt::SSLeay would use the proxy:

$ENV{'HTTPS_PROXY'} = 'http://proxy:port/';

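For context, a minimal sketch of how that line fits into a script like the one in the question (the proxy host and port are placeholders to fill in; Crypt::SSLeay reads HTTPS_PROXY from the environment when the request is made):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    # Placeholder proxy address -- substitute your own host and port.
    # Crypt::SSLeay picks this up from the environment at request time.
    $ENV{'HTTPS_PROXY'} = 'http://proxy:port/';

    my $mech = WWW::Mechanize->new();
    $mech->get('https://www.temp.com');   # now tunnels through the proxy
    print $mech->content;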

Rajesh
+2  A: 

I've seen in your related Mechanize question that you call the proxy method with only the http and ftp schemes. Try again with https included.

It's probably more useful to set the proxy environment variables, since then all programs can take advantage of that central configuration instead of each program being configured separately. Do not forget https_proxy. Call the env_proxy method instead of proxy to use them.
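A sketch contrasting the two approaches from the paragraphs above (the proxy URL is a placeholder; pick one approach, not both):

    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech = WWW::Mechanize->new();

    # Either: configure the proxy explicitly, with https included
    # alongside http and ftp (the URL here is a placeholder).
    $mech->proxy(['http', 'https', 'ftp'], 'http://proxy:port/');

    # Or: pick up http_proxy / https_proxy / ftp_proxy from the
    # environment, so every LWP-based program shares one configuration.
    $mech->env_proxy();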

daxim
Setting https_proxy worked for the login. But I get an error GETing the dashboard page, which is again https://temp.com/dashboard
Rajesh
In fact, I get this error for any 'get' of an SSL page.
Rajesh