Is there a way to decode tinyURL links in R so that I can see which web pages they actually refer to?
I don't know R, but in general you need to make an HTTP request to the TinyURL URL. You should get back a 301 response whose Location header contains the actual URL.
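Concretely, the exchange described above looks roughly like this (headers abbreviated; the Location value shown is made up for illustration):

```
GET /adcd HTTP/1.1
Host: tinyurl.com

HTTP/1.1 301 Moved Permanently
Location: http://example.com/the-original-long-url
```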
Below is a quick-and-dirty solution, but it should get the job done:

library(RCurl)

decode.short.url <- function(u) {
  # Fetch only the response headers (nobody = TRUE) and do not follow
  # the redirect, so the 301's Location header stays visible.
  x <- try(getURL(u, header = TRUE, nobody = TRUE, followlocation = FALSE))
  if (inherits(x, "try-error")) {
    return(u)
  }
  # Extract the Location header; if there is none (the URL was not a
  # redirect), return the input unchanged.
  loc <- strsplit(x, "Location: ")[[1]][2]
  if (is.na(loc)) {
    return(u)
  }
  # Strip the trailing carriage return from the header line.
  strsplit(loc, "\r")[[1]][1]
}
The variable 'u' below contains one shortened URL and one regular URL.
u <- c("http://tinyurl.com/adcd", "http://www.google.com")
You can then get the expanded results by doing the following.
sapply(u, decode.short.url)
The above should work for most URL-shortening services, not just TinyURL. I think.
HTH
Tony Breyal
Re "I don't know R but in general you need to make an http request to the tinyurl-url. You should get back a 301 response with the actual url." from neo: I tried going to tinyurl.com with a TinyURL but didn't get the long one back. So my question is, how does one "make an HTTP request" as described?
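One way to "make an HTTP request" from R without any extra package is base R's curlGetHeaders() (available since R 3.2.0), which issues the request and returns the raw response headers. A sketch, assuming the tinyurl.com link from the thread above:

```r
# curlGetHeaders() issues the request and returns the response headers
# as a character vector; redirect = FALSE stops it from following the
# redirect, so the 301 status line and Location header remain visible.
h <- curlGetHeaders("http://tinyurl.com/adcd", redirect = FALSE)
h[1]  # the status line, e.g. "HTTP/1.1 301 Moved Permanently"

# Pull out the Location header (matched case-insensitively, since some
# servers send "location:") and strip surrounding whitespace.
loc <- grep("^[Ll]ocation: ", h, value = TRUE)
trimws(sub("^[Ll]ocation: ", "", loc))  # the expanded URL
```

Visiting tinyurl.com in a browser won't show you this, because the browser follows the redirect automatically; you have to ask for the headers without following it.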