views: 569
answers: 1

I'm currently developing an Android system where the user will end up with large arrays in memory. Since the JVM heap is at risk of running out, I was thinking of creating a temporary database and storing the data there. One concern is that the SD card has a limited number of read/write cycles; another is the overhead of such an operation. Can anyone clear up these concerns, and also suggest a good alternative for handling large arrays? (In the end, these arrays will be written to a CSV file and uploaded to a website.)

Thanks, Faisal

A: 

A couple of thoughts:

  1. You could store them using a DBMS like Derby, which ships with many Java distributions.
  2. You could write them to a compressed output stream backed by an in-memory byte array. This works especially well if the data compresses easily, e.g. regularly repeating numbers or text (a minimal sketch follows this list).
  3. You could upload portions of the arrays as you generate them, sending the data up to the server in chunks.
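
A minimal sketch of the second option, assuming the arrays are flushed as CSV rows of doubles (the class and method names here are just for illustration):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.util.zip.GZIPOutputStream;

    public class CompressedCsvBuffer {
        // Holds the compressed CSV bytes entirely in memory.
        private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        private final Writer writer;

        public CompressedCsvBuffer() throws IOException {
            // Wrap the in-memory buffer in a GZIP stream so repetitive
            // numeric/text rows take up far less heap.
            writer = new OutputStreamWriter(new GZIPOutputStream(buffer), "UTF-8");
        }

        public void appendRow(double[] row) throws IOException {
            StringBuilder line = new StringBuilder();
            for (int i = 0; i < row.length; i++) {
                if (i > 0) line.append(',');
                line.append(row[i]);
            }
            writer.write(line.append('\n').toString());
        }

        // Finish the GZIP stream and return the compressed CSV,
        // ready to be uploaded or written to storage in one go.
        public byte[] toCompressedBytes() throws IOException {
            writer.close();
            return buffer.toByteArray();
        }
    }

Closing the writer finishes the GZIP stream, so the compressed bytes can be kept in memory until the device is online and then uploaded or written out once.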
aperkins
Well, with the first two options: what's the point of using Derby instead of SQLite? And with the second option, wouldn't that still use up the SD card's reads and writes? Or do I have a misconception about how SD card reads and writes work?
Faisal Abid
With the third option, the user may not be connected to the internet all the time.
Faisal Abid
With the second option, you could be writing to a byte array that remains in memory, using a ByteArrayOutputStream (or something similar; the exact class name escapes me this morning).
aperkins
Won't that still cause a Java out-of-memory error?
Faisal Abid
Not if you are compressing it: wrap the stream in a compression stream, something like a GZIPOutputStream.
aperkins
Great, thanks for the help!
Faisal Abid
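
On the Derby vs. SQLite question above: on Android the database that ships with the platform is SQLite, so it is the natural choice for a temporary store. A minimal sketch using SQLiteOpenHelper, with a hypothetical single-column table for the array values (file, table, and column names are placeholders):

    import android.content.ContentValues;
    import android.content.Context;
    import android.database.sqlite.SQLiteDatabase;
    import android.database.sqlite.SQLiteOpenHelper;

    public class TempArrayDb extends SQLiteOpenHelper {
        public TempArrayDb(Context context) {
            // Database file name and version are placeholders.
            super(context, "temp_arrays.db", null, 1);
        }

        @Override
        public void onCreate(SQLiteDatabase db) {
            db.execSQL("CREATE TABLE readings (value REAL NOT NULL)");
        }

        @Override
        public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
            db.execSQL("DROP TABLE IF EXISTS readings");
            onCreate(db);
        }

        // Insert one chunk of the array inside a transaction to keep
        // the number of SD card writes down.
        public void insertChunk(double[] chunk) {
            SQLiteDatabase db = getWritableDatabase();
            db.beginTransaction();
            try {
                for (double v : chunk) {
                    ContentValues cv = new ContentValues();
                    cv.put("value", v);
                    db.insert("readings", null, cv);
                }
                db.setTransactionSuccessful();
            } finally {
                db.endTransaction();
            }
        }
    }

Batching the inserts in a single transaction keeps the per-row overhead and the number of flash writes much lower than inserting each value individually.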