views: 23
answers: 1

I'm coding this in Java for an Android device. I just want to read all of the data in the RSS feed; I already have a parser ready to act on it. The trouble is that only a small portion of the data is ever read by my code: it then reports that the end of the reader has been reached and returns the partial data to the parser.

The code I have is this:

try
{
    URL contentUrl = new URL(url); 
    URLConnection conn = contentUrl.openConnection();

    HttpURLConnection httpConn = (HttpURLConnection) conn;
    httpConn.setAllowUserInteraction(false);
    httpConn.setInstanceFollowRedirects(true);
    httpConn.setRequestMethod("GET");

    httpConn.connect();

    if(httpConn.getResponseCode() == HttpURLConnection.HTTP_OK)
    {
        InputStreamReader inStr = new InputStreamReader(httpConn.getInputStream());                                 

        int charsRead;
        char[] buff = new char[BUFFER_SIZE];
        String data = "";
        while((charsRead = inStr.read(buff,0,BUFFER_SIZE))!= -1)
        {
            data += String.copyValueOf(buff, 0, charsRead);
            buff = new char[BUFFER_SIZE];
        }
        Log.d("Http","Data: "+data.);
        return data;
    }
    else
    {
        return null;
    }
}
catch(IOException e)
{
    return null;
}

The debug log output looks like this:

D/dalvikvm(19134): GC_FOR_MALLOC freed 24 objects / 757272 bytes in 27ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 2 objects / 48 bytes in 26ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 3 objects / 301112 bytes in 29ms
I/dalvikvm-heap(19134): Grow heap (frag case) to 3.677MB for 305028-byte allocation
D/dalvikvm(19134): GC_FOR_MALLOC freed 0 objects / 0 bytes in 37ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 10 objects / 760856 bytes in 27ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 3 objects / 305112 bytes in 26ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 10 objects / 770856 bytes in 26ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 2 objects / 48 bytes in 26ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 3 objects / 309112 bytes in 27ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 22 objects / 777648 bytes in 27ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 2 objects / 48 bytes in 30ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 3 objects / 309288 bytes in 26ms
D/dalvikvm(19134): GC_FOR_MALLOC freed 27 objects / 778696 bytes in 27ms
D/Http    (19134): Data: <?xml version="1.0" encoding="UTF-8"?>
D/Http    (19134): <?xml-stylesheet type="text/xsl" media="screen" href="/~d/sty
les/atom10full.xsl"?><?xml-stylesheet type="text/css" media="screen" href="http:
//feeds.feedburner.com/~d/styles/itemcontent.css"?><feed xmlns="http://www.w3.or
g/2005/Atom" xmlns:openSearch="http://a9.com/-/spec/opensearch/1.1/" xmlns:geors
s="http://www.georss.org/georss" xmlns:thr="http://purl.org/syndication/thread/1
.0" xmlns:gd="http://schemas.google.com/g/2005" xmlns:feedburner="http://rssname
space.org/feedburner/ext/1.0" gd:etag="W/&quot;CUcFSHc7fip7ImA9Wx5UFk8.&quot;"><
id>tag:blogger.com,1999:blog-11300808</id><updated>2010-10-20T18:03:39.906-07:00
</updated><title type="text">Google Code Blog</title><subtitle type="html">Updat
es from Google's open source projects.</subtitle><link rel="http://schemas.googl
e.com/g/2005#feed" type="application/atom+xml" href="http://googlecode.blogspot.
com/feeds/posts/default" /><link rel="alternate" type="text/html" href="http://g
ooglecode.blogspot.com/" /><link rel="next" type="application/atom+xml" href="ht
tp://www.blogger.com/feeds/11300808/posts/default?start-index=26&amp;max-results
=25&amp;redirect=false&amp;v=2" /><author><name>Chris DiBona</name><email>norepl
[email protected]</email></author><generator version="7.00" uri="http://www.blogger.
com">Blogger</generator><openSearch:totalResults>753</openSearch:totalResults><o
penSearch:startIndex>1</openSearch:startIndex><openSearch:itemsPerPage>25</openS
earch:itemsPerPage><atom10:link xmlns:atom10="http://www.w3.org/2005/Atom" rel="
self" type="application/atom+xml" href="http://feeds.feedburner.com/blogspot/Dcn
i" /><feedburner:info uri="blogspot/dcni" /><atom10:link xmlns:atom10="http://ww
w.w3.org/2005/Atom" rel="hub" href="http://pubsubhubbub.appspot.com/" /><entry g
d:etag="W/&quot;CUcFSHc6fCp7ImA9Wx5UFk8.&quot;"><id>tag:blogger.com,1999:blog-11
300808.post-3825033765857847156</id><published>2010-10-20T18:03:00.000-07:00</pu
blished><updated>2010-10-20T18:03:39.914-07:00</updated><app:edited xmlns:app="h
ttp://www.w3.org/2007/app">2010-10-20T18:03:39.914-07:00</app:edited><category s
cheme="http://www.blogger.com/atom/ns#" term="speed tracer" /><category scheme="
http://www.blogger.com/atom/ns#" term="gwt" /><category scheme="http://www.blogg
er.com/atom/ns#" term="eclipse" /><category scheme="http://www.blogger.com/atom/
ns#" term="cloud portability" /><title>Advancing cloud computing with integrated
 developer tools by Google and VMware</title><content type="html">&lt;p&gt;&lt;i
&gt;Cross-posted from the &lt;a href="http://googlewebtoolkit.blogspot.com/"&amp;gt;
Google Web Toolkit Blog&lt;/a&gt;&lt;/i&gt;&lt;/p&gt;
D/Http    (19134):
D/Http    (19134): &lt;p&gt;
D/Http    (19134): Earlier this year at Google I/O, we &lt;a href="http://google
code.blogspot.com/2010/05/enabling-cloud-portability-with-google.html"&gt;announ
ced&lt;/a&gt; a collaboration between &lt;a href="http://www.google.com"&amp;gt;Goog
le&lt;/a&gt; and &lt;a href="http://www.vmware.com"&amp;gt;VMware&amp;lt;/a&amp;gt; focused
on making it easy to build business-oriented, &lt;a href="http://code.google.com
/cloudportability/"&gt;cloud portable&lt;/a&gt; web apps. We showed how business
es could use our integrated developer tools to build modern web apps that are “
cloud ready” from the start, and can be deployed to any standard environment,
 including &lt;a href="http://code.google.com/appengine/"&amp;gt;Google App Engine&l
t;/a&gt; and on &lt;a href="http://www.springsource.com/products/cloud-applicati
on-platform"&gt;VMware vFabric&lt;/a&gt; on-premise solutions. Today we are happ
y to announce that these tools will be generally available within the next few w
eeks.
D/Http    (19134): &lt;/p&gt;
D/Http    (19134):
D/Http    (19134): &lt;p&gt;
D/Http    (19134): Of course, if you’re itching to get a head start, you can j
ump right in by downloading the release candidate version of &lt;a href="http://
www.springsource.com/landing/best-development-tool-enterprise-java"&gt;SpringSou
rce Tool Suite&lt;/a&gt;.
D/Http    (19134): &lt;/p&gt;
D/Http    (19134):
D/Http    (19134): &lt;p&gt;
D/Http    (19134): If you’d prefer to wait for the general release, you can &l
t;a href="http://code.google.com/cloudportability/"&amp;gt;sign up&lt;/a&gt; to be n
otified as soon as they are

If you check the feed (http://feeds2.feedburner.com/blogspot/Dcni) you'll see that this is only a tiny fraction of the content. Since I was getting the heap messages, I thought the phone might be running out of heap space and giving up, but I found an RSS TTS project (talkingrssreader) that has its own HTTP code, and when I pulled that code out and used it with mine, it had exactly the same issue. Running the Talking RSS Reader app itself against that feed doesn't show the problem.

I'm at a loss now as to what else to try. I can fetch and parse shorter RSS feeds without a problem, but beyond a certain length the code just refuses to gather all the data. I've gone through a lot of RSS readers on the Android Market and haven't seen any of them cut this feed short, which makes me believe it's an issue in my code rather than the device running out of memory. This is further evidenced by the fact that my code will happily read a 119 KB XML file but fails on the Google RSS feed, which is only 32 KB.
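
In case it helps with diagnosis, one check I can add (a rough sketch only; it reuses the httpConn and data variables from the code above, and the header names are just standard HTTP) is to log what the server says it is sending next to what the loop actually read:

// Diagnostic sketch only: compare what the server reports with what was read.
// getContentLength() returns -1 when the length is unknown (e.g. chunked replies).
int contentLength = httpConn.getContentLength();
String contentEncoding = httpConn.getContentEncoding();                  // e.g. "gzip" or null
String transferEncoding = httpConn.getHeaderField("Transfer-Encoding");  // e.g. "chunked" or null

Log.d("Http", "Content-Length: " + contentLength
        + " Content-Encoding: " + contentEncoding
        + " Transfer-Encoding: " + transferEncoding
        + " chars read: " + data.length());
// Note: data.length() counts chars, not bytes, so this is only a rough
// comparison unless the feed is a single-byte encoding.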

Can anyone give me some ideas on what the issue could be?

edit: After adding the changes suggested by the100rabh and Isaac, the code now looks like this:

InputStreamReader inStr = new InputStreamReader(httpConn.getInputStream());                                 

int charsRead;
char[] buff = new char[BUFFER_SIZE];
StringBuffer data = new StringBuffer();
while((charsRead = inStr.read(buff,0,BUFFER_SIZE))!= -1)
{
    data.append(buff, 0, charsRead);
}
Log.d("Http","Data: "+data.toString());
return data.toString();

The problem is still there though. It cuts the data short in exactly the same place as it did with the previous code.
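
If it would help narrow things down, the next thing I plan to try is reading the raw bytes before any character decoding, roughly like this (a sketch only; it needs java.io.InputStream and java.io.ByteArrayOutputStream, and httpConn and BUFFER_SIZE are the same as in the code above), to see whether the byte stream itself ends early or the InputStreamReader is at fault:

// Read raw bytes first and decode afterwards, so truncation in the byte stream
// can be told apart from a problem in the Reader layer.
InputStream in = httpConn.getInputStream();
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
byte[] buf = new byte[BUFFER_SIZE];
int bytesRead;
while ((bytesRead = in.read(buf)) != -1)
{
    bytes.write(buf, 0, bytesRead);
}
Log.d("Http", "Total bytes read: " + bytes.size());
// The feed declares UTF-8; UnsupportedEncodingException is an IOException,
// so the existing catch block covers it.
return new String(bytes.toByteArray(), "UTF-8");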

+1  A: 

I think if you avoid creating buff on each iteration, your problem will go away. Remove this line from the loop:

buff = new char[BUFFER_SIZE];

the100rabh
Plus, the string concatenation technique that you use is very wasteful. Define `data` as a `StringBuffer` and use the `StringBuffer.append(char[] str, int offset, int len)` instead of what you're doing right now.
Isaac
Thank you for the help. I have changed the code to apply both suggestions, but the problem persists. There are far fewer GROW_HEAP and GC_FOR_MALLOC messages in the output, but the data is still truncated in exactly the same place it was before. I'll post the new code at the end of my original question as it won't fit here.
Teario