To make a long story short, one part of the application I'm working on needs to store a somewhat large volume of data in a database for another part of the application to pick up later on. Normally this would be under 2,000 rows, but it can occasionally exceed 300,000. The data only needs to be stored temporarily and can be deleted afterwards.
I've been playing around with various ideas, and one came to mind today: the LONGTEXT datatype can store a maximum of 2^32 − 1 bytes, which equates to just under 4 GB. Now, that's a lot of stuff to cram into one table row. Mind you, the data would probably not exceed 60–80 MB at the very most. But my question is, is it actually a good idea to do that?
The two solutions I'm currently toying with are:
- Inserting all the data as individual rows into a "temporary" table that would be truncated once processing finishes.
- Inserting all the data as a single serialized string into a LONGTEXT column, in a row that would be deleted once processing finishes.
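To make the comparison concrete, here is a rough micro-benchmark of the two patterns. It uses Python's built-in sqlite3 as a stand-in for MySQL, so the table names, row counts, and timings are illustrative assumptions only; real numbers on MySQL/InnoDB will differ.

```python
# Compare bulk-inserting N individual rows vs. one serialized blob.
# SQLite in-memory is used as a stand-in for MySQL (assumption for
# illustration only); table names and payloads are made up.
import json
import sqlite3
import time

N_ROWS = 10_000  # scale toward 300,000 to approximate the worst case
rows = [(i, f"value-{i}") for i in range(N_ROWS)]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Approach 1: one row per record in a throwaway table.
cur.execute("CREATE TABLE temp_rows (id INTEGER, payload TEXT)")
t0 = time.perf_counter()
cur.executemany("INSERT INTO temp_rows VALUES (?, ?)", rows)
conn.commit()
many_rows_time = time.perf_counter() - t0

# Approach 2: the whole dataset serialized into a single text column.
cur.execute("CREATE TABLE temp_blob (payload TEXT)")
blob = json.dumps(rows)
t0 = time.perf_counter()
cur.execute("INSERT INTO temp_blob VALUES (?)", (blob,))
conn.commit()
one_blob_time = time.perf_counter() - t0

# Reading back: the single-column version deserializes in one step.
restored = json.loads(
    cur.execute("SELECT payload FROM temp_blob").fetchone()[0]
)
print(f"{N_ROWS} rows took {many_rows_time:.4f}s; "
      f"one blob took {one_blob_time:.4f}s")
```

Note that on MySQL specifically, a single 60 MB insert would also require raising max_allowed_packet, which defaults to a much smaller value on many installations.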
Purely from a performance perspective, would it be better to store the data as potentially more than 300,000 individual rows, or as a single 60 MB LONGTEXT entry?
If it's a wash, I will probably go with the LONGTEXT option, as it would make the part of the application that picks up the data easier to write. It would also tie in better with yet another part, which would improve the overall performance of the application.
I would appreciate any thoughts on this.