views:

92

answers:

1

Hey, I have a fairly large XML file, about 1 MB in size, that I host on S3. I need to parse that entire XML file into my App Engine datastore.

I have written a simple DOM parser that works fine locally, but online it hits the 30-second request deadline and stops.

I tried to speed things up by first downloading the XML file into a blob and then parsing it from the blob, but blobs are limited to 1 MB, so that fails too.

The multiple datastore inserts are what push the request past 30 seconds. I saw a recommendation somewhere to use the Mapper class and record where the process stopped so it can resume later, but as a Python newbie I can't figure out how to apply that to a DOM or SAX parser (please provide an example of how to use it?).
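For what it's worth, the SAX side of this can be sketched roughly as below. This is a minimal, hedged example: it assumes the XML looks like `<items><item><name>…</name></item>…</items>`, and the `flush` callback is a stand-in for whatever actually writes a batch to the datastore (e.g. a batched `db.put()` on App Engine). `ItemHandler` and its batch size are illustrative names, not anything from the App Engine SDK; resuming after a deadline (the Mapper-style checkpointing mentioned above) would additionally require recording how many items have been flushed.

```python
import xml.sax
from io import BytesIO


class ItemHandler(xml.sax.ContentHandler):
    """Streams <item> elements into dicts and hands them off in small
    batches, so memory stays low and each write stays short."""

    def __init__(self, flush, batch_size=50):
        super().__init__()
        self.flush = flush          # callback that persists one batch
        self.batch_size = batch_size
        self.batch = []
        self.current = None         # dict for the <item> being built
        self.field = None           # name of the child element being read
        self.text = []

    def startElement(self, name, attrs):
        if name == 'item':
            self.current = {}
        elif self.current is not None:
            self.field = name
            self.text = []

    def characters(self, content):
        if self.field is not None:
            self.text.append(content)

    def endElement(self, name):
        if name == 'item':
            self.batch.append(self.current)
            self.current = None
            if len(self.batch) >= self.batch_size:
                self.flush(self.batch)
                self.batch = []
        elif self.field == name:
            self.current[name] = ''.join(self.text).strip()
            self.field = None

    def endDocument(self):
        if self.batch:              # flush the final partial batch
            self.flush(self.batch)
            self.batch = []


# Usage with a stand-in flush function (collects into a list):
stored = []
handler = ItemHandler(stored.extend, batch_size=2)
xml.sax.parse(BytesIO(b"<items><item><name>a</name></item>"
                      b"<item><name>b</name></item>"
                      b"<item><name>c</name></item></items>"), handler)
```

After parsing, `stored` holds one dict per `<item>`. On App Engine you would replace `stored.extend` with a function that builds entities from each dict and writes them in one batched put per flush.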

What I'm doing right now is a bad workaround: I parse the XML with PHP outside App Engine and push the data to App Engine via HTTP POST through a proprietary API. It works fine, but it's clumsy and forces me to maintain two codebases.

Can you please help me out?

A: 

For uploading large volumes of data, take a look at the Uploading and Downloading Data help page.

Jack M.
Yeah, that's not a good fit for my use case. I need to map the XML to different entities, so it's no use to me.
Alon Carmel