I have the following code that reads the content of a URL:

public static String DownloadText(String url) {
    StringBuffer result = new StringBuffer();
    BufferedReader in = null;
    try {
        URL jsonUrl = new URL(url);
        InputStreamReader isr = new InputStreamReader(jsonUrl.openStream());
        in = new BufferedReader(isr);

        String inputLine;
        while ((inputLine = in.readLine()) != null) {
            result.append(inputLine);
        }
    } catch (Exception ex) {
        result = new StringBuffer("TIMEOUT");
        Log.e(Util.AppName, ex.toString());
    } finally {
        // Closing the BufferedReader also closes the underlying
        // InputStreamReader and stream.
        if (in != null) {
            try {
                in.close();
            } catch (IOException e) {
                Log.e(Util.AppName, e.toString());
            }
        }
    }
    return result.toString();
}

The problem is that content is missing after 4065 characters in the returned result. Can someone help me solve this problem?

Note: The URL I am trying to read returns a JSON response, so everything is on one line; I think that's why some content is missing.

A: 

How about using HttpClient? I mean org.apache.http.client.HttpClient.

skysign
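For reference, a minimal sketch of what that suggestion might look like with the Apache HttpClient library bundled in older Android versions (the class names `DefaultHttpClient`, `HttpGet`, and `EntityUtils` come from that library, not from the original answer; untested):

```java
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.EntityUtils;

public class HttpClientFetch {
    // Fetch the URL body as a single String. EntityUtils buffers the
    // whole response entity, so nothing is split at line boundaries.
    public static String downloadText(String url) {
        try {
            HttpClient client = new DefaultHttpClient();
            HttpResponse response = client.execute(new HttpGet(url));
            return EntityUtils.toString(response.getEntity(), "UTF-8");
        } catch (Exception ex) {
            return "TIMEOUT"; // mirrors the asker's error convention
        }
    }
}
```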
+2  A: 

Try this:

URLConnection feedUrl = null;
try {
  feedUrl = new URL(url).openConnection();
} catch (MalformedURLException e) {
  Log.v("ERROR", "MALFORMED URL EXCEPTION");
} catch (IOException e) {
  e.printStackTrace();
}

String json = null;
try {
  InputStream in = feedUrl.getInputStream();
  json = convertStreamToString(in);
} catch (IOException e) {
  e.printStackTrace();
}

where convertStreamToString is:

private static String convertStreamToString(InputStream is) throws UnsupportedEncodingException {
  BufferedReader reader = new BufferedReader(new InputStreamReader(is, "UTF-8"));
  StringBuilder sb = new StringBuilder();
  String line = null;
  try {
    while ((line = reader.readLine()) != null) {
      sb.append(line + "\n");
    }
  } catch (IOException e) {
    e.printStackTrace();
  } finally {
    try {
      is.close();
    } catch (IOException e) {
      e.printStackTrace();
    }
  }
  return sb.toString();
}

LucaB
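A quick way to exercise convertStreamToString without touching the network is to feed it an in-memory ByteArrayInputStream. This is a sketch (the class name `StreamDemo` is mine, not from the answer), with the same read-line-and-reappend-newline logic as above:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.UnsupportedEncodingException;

public class StreamDemo {
    // Same logic as the answer's convertStreamToString: read the stream
    // line by line as UTF-8 and re-append a newline after each line.
    static String convertStreamToString(InputStream is) throws UnsupportedEncodingException {
        BufferedReader reader = new BufferedReader(new InputStreamReader(is, "UTF-8"));
        StringBuilder sb = new StringBuilder();
        String line;
        try {
            while ((line = reader.readLine()) != null) {
                sb.append(line).append("\n");
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                is.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Feed it an in-memory stream instead of a network connection.
        InputStream is = new ByteArrayInputStream("{\"a\":1}".getBytes("UTF-8"));
        System.out.println(convertStreamToString(is));
    }
}
```

Note that readLine() strips line terminators, which is why the method re-appends "\n" itself; the asker's original loop silently drops them.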