I've tried this on a few machines on different networks, all running Ruby 1.8.7, and I get the same result after a long wait.

Net::HTTP.get(URI.parse('https://encrypted.google.com/'))
Timeout::Error: execution expired

but plain HTTP works fine:

Net::HTTP.get(URI.parse('http://www.google.com/'))

After the initial timeout, I get an EOFError instead:

  EOFError: end of file reached

It's really got me stumped. If you have any ideas, or if you can let me know whether you get the same results, I'd really appreciate it.

+2  A: 

I think you need to set use_ssl to true...

example:

require 'net/https'  # pulls in net/http and openssl
require 'uri'

uri = URI.parse("https://www.google.com/")
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true  # without this, Net::HTTP speaks plain HTTP to port 443 and eventually times out

request = Net::HTTP::Get.new(uri.request_uri)
response = http.request(request)

puts response.body

This is cannibalized from a Ruby Inside post.
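Note that on 1.8.7, use_ssl = true alone skips certificate verification (you'll see a "peer certificate won't be verified" warning). If you also want verification, something along these lines should work; the CA bundle path below is just an example and varies by system:

require 'net/https'
require 'uri'

uri = URI.parse("https://www.google.com/")
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_PEER          # actually check the server's certificate
http.ca_file = "/etc/ssl/certs/ca-certificates.crt"   # example path; varies by system

puts http.request(Net::HTTP::Get.new(uri.request_uri)).code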

Brian
Thanks Brian! That's just what I needed to know. After seeing how complicated Net::HTTP is for a simple url_exists? call, I think I'll take Ruby Inside's advice and go with HTTParty, since it gives me the same result with a lot less hassle:

HTTParty.head('https://encrypted.google.com').response.is_a?(Net::HTTPSuccess)
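For reference, the whole helper could look something like this (url_exists? is just my own name for it):

require 'rubygems'
require 'httparty'

# Returns true if a HEAD request to the URL comes back with a 2xx response.
def url_exists?(url)
  HTTParty.head(url).response.is_a?(Net::HTTPSuccess)
rescue SocketError, Timeout::Error, Errno::ECONNREFUSED
  false
end

url_exists?('https://encrypted.google.com')  # => true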
aNoble
I think that's a great idea... HTTParty makes HTTP operations a lot easier to work with, albeit a bit heavy, as it requires several external gems (activesupport, for example).
Brian