I have the following piece of code:

try {
    SAXParserFactory spf = SAXParserFactory.newInstance();
    SAXParser sp = spf.newSAXParser();

    // Get the XMLReader of the SAXParser we created.
    XMLReader r = sp.getXMLReader();

    // This handler processes the XML and populates the entries array.
    XMLHandler handler = new XMLHandler();

    // Register the event handler.
    r.setContentHandler(handler);

    String url = "http://news.library.ryerson.ca/api/isbnsearch.php?isbn=" + ISBN;
    r.parse(url);

    return handler.getEntries();
}

This code works fine most of the time, but there are several cases where a user can enter the ISBN of a popular book with 100+ related ISBNs (Harry Potter, for example). When that happens, the XML feed does not break, but it takes longer to load (30+ seconds in extreme cases). While the page is loading, the connection is never dropped; it just takes its time.

Is there a way to increase the timeout time for the function?

Thanks

A: 
// Open the URL as a stream so the parse does not time out prematurely.
URL url = new URL("http://news.library.ryerson.ca/api/isbnsearch.php?isbn=" + ISBN);
InputStream stream = url.openStream();

r.parse(new InputSource(stream));
stream.close();

Solved this one myself by adding this in.
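If you do want explicit control over the limits rather than relying on `openStream()`'s defaults, `java.net.URLConnection` exposes `setConnectTimeout` and `setReadTimeout` (both in milliseconds). A minimal sketch, where `FeedClient`/`openFeed` and the chosen timeout values are illustrative, not from the original post:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class FeedClient {
    // Hypothetical helper: open the feed with explicit timeouts so a slow
    // response with many related ISBNs doesn't hit the default limits.
    static InputStream openFeed(String isbn) throws IOException {
        URLConnection conn = new URL(
                "http://news.library.ryerson.ca/api/isbnsearch.php?isbn=" + isbn)
                .openConnection();
        conn.setConnectTimeout(10000); // ms allowed to establish the connection
        conn.setReadTimeout(60000);    // ms allowed between successive reads
        return conn.getInputStream();
    }
}
```

The stream it returns can then be handed to the parser the same way as above: `r.parse(new InputSource(FeedClient.openFeed(ISBN)));`.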

Steven1350