tags:
views: 284
answers: 5

In Oracle, LONG RAW and VARCHAR2 have a max length of 4 KB, but I need to store objects of 8 KB and 16 KB, so I'm wondering what a good solution is. I know I could use a BLOB, but a BLOB has variable length and is basically an extra file behind the scenes if I'm correct, a feature and a price I'm not interested in paying for my objects. Are there any other solutions or datatypes better suited to this kind of need?

Thanks

+1  A: 

VARCHAR2 is variable-length just as well. If you need to store binary data of anything bigger than a small size in your database, you'll have to look in the BLOB's direction. Another solution is, of course, storing the binary data somewhere on the file system and storing the path to the file as a VARCHAR2 in the database.

tehvan
Yeah, the file solution is a possibility, but I hope to keep it all in the database because I'll be needing the replication/clustering features. With files I'd need to roll my own clustering, I guess.
Robert Gould
+2  A: 

Why don't you segment the binary data and store it in 4K chunks? You could either have four different columns for these chunks (and a length column for rebuilding them into your big structure) or the more normalized way of another table with the chunks in it tied back to the original table record.

This would provide for expansion should you need it in future.

For example:

    create table primary_tab (
        -- normal columns --
        chunk_id       integer,
        chunk_last_len integer
    );

    create table chunk_tab (
        chunk_id       integer,
        chunk_sequence integer,
        chunk          varchar2(4000),  -- or whatever size suits
        primary key (chunk_id, chunk_sequence)
    );
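On the application side, the split-and-reassemble logic this answer implies might look like the following minimal Python sketch (the 4096-byte chunk size and function names are illustrative, not from the answer):

```python
CHUNK_SIZE = 4096  # illustrative: one row in the chunk table per 4 KB slice

def split_into_chunks(data: bytes, chunk_size: int = CHUNK_SIZE) -> list:
    """Split a binary object into fixed-size chunks for row-per-chunk storage."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def reassemble(chunks: list) -> bytes:
    """Rebuild the original object from chunks ordered by chunk_sequence."""
    return b"".join(chunks)
```

Each chunk would then be inserted as one row keyed by (chunk_id, chunk_sequence), and the last chunk's length recorded in chunk_last_len.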

Of course, you may find that your DBMS does exactly that sort of behavior under the covers for BLOBs and it may be more efficient to let Oracle handle it, relieving you of the need to manually reconstruct your data from individual chunks. I'd measure the performance of each to figure out the best approach.

paxdiablo
This is the solution I was considering, but I kind of hoped there was a "better" standard solution. Anyway, I'm thinking I'll do this unless someone has a better idea.
Robert Gould
The better solution is to use a BLOB in the first place.
WW
+3  A: 

A BLOB is not a file behind the scenes. It is stored in the database. Why does it matter that it has variable length? You can just use a BLOB column (or a CLOB if your data is text data) and it gets stored in its own segment.

Yas
+2  A: 

Don't store binary data in VARCHAR2 columns unless you are willing to encode it (Base64 or similar). Character set issues might otherwise corrupt your data!

Try the following statement to see the effect:

select *
  from (select rownum - 1 original,
               ascii(chr(rownum - 1)) data
          from user_tab_columns
         where rownum <= 256)
 where original <> data;
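If you do go the encoding route this answer mentions, it could be sketched in Python like this (function names are illustrative; note that Base64 inflates the payload by about a third, so a 4000-byte VARCHAR2 holds roughly 3000 bytes of binary data):

```python
import base64

def encode_for_varchar2(data: bytes) -> str:
    """Base64-encode binary data so it survives character-set conversion."""
    return base64.b64encode(data).decode("ascii")

def decode_from_varchar2(text: str) -> bytes:
    """Recover the original binary data from its Base64 text form."""
    return base64.b64decode(text)
```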

ammoQ
had forgotten about this one!
Robert Gould
+2  A: 

You should use a BLOB.

A BLOB is not stored as an extra file; it's stored as a block in one of your datafiles (just like other data). If the BLOB becomes too large for a single block (which may not happen in your case), it will continue in another block.

If your BLOB data is really small, you can get Oracle to store it inline with other data in your row (like a varchar2).

Internally, Oracle does something similar to what paxdiablo suggested: the chunks are as big as a DB block minus some overhead. If you try to re-invent Oracle features on top of Oracle, it's only going to be slower than the native feature.

You will also have to re-implement a whole heap of functionality that DBMS_LOB already provides (length, comparisons, etc.).

WW