What is the most elegant method for dumping a list in Python into an sqlite3 DB as binary data (i.e., a BLOB cell)?
data = [ 0, 1, 2, 3, 4, 5 ]
# now write this to db as binary data
# 0000 0000
# 0000 0001
# ...
# 0000 0101
Assuming you want it treated as a sequence of 8-bit unsigned values, use the array
module.
>>> import array
>>> a = array.array('B', data)
>>> a.tostring()
'\x00\x01\x02\x03\x04\x05'
Use a different typecode than 'B' if you want to treat the data as a different type, e.g. 'b' for a sequence of signed bytes or 'i' for signed integers.
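For instance, a rough sketch of how the typecode changes the byte layout, using the data list from the question (output shown for a little-endian platform; in Python 3 the method is tobytes() rather than tostring()):

import array

data = [0, 1, 2, 3, 4, 5]

# 'B': unsigned 8-bit values -- one byte per list element
array.array('B', data).tostring()   # '\x00\x01\x02\x03\x04\x05'

# 'b': signed 8-bit values -- same bytes here since all values are small and non-negative
array.array('b', data).tostring()   # '\x00\x01\x02\x03\x04\x05'

# 'i': signed integers (usually 4 bytes each), so the result is four times as long
array.array('i', data).tostring()   # '\x00\x00\x00\x00\x01\x00\x00\x00...'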
It seems that Brian's solution fits your needs, but keep in mind that with that method you're just storing the data as a string.
If you want to store the raw binary data in the database (so it doesn't take up as much space), convert your data to a Binary sqlite object and then add it to your database.
import sqlite3

query = u'''insert into testtable VALUES(?)'''
b = sqlite3.Binary(some_binarydata)   # wrap the raw bytes so they are bound as a BLOB, not text
cur.execute(query, (b,))
con.commit()
(For some reason this doesn't seem to be mentioned in the Python documentation.)
Note that sqlite places some restrictions on BLOB data (for example a maximum length), so check the sqlite documentation for details.
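Putting the two answers together, here is a minimal end-to-end sketch (Python 2 style, matching the code above; the in-memory database and table name are just for illustration, and on Python 3 tostring() becomes tobytes()):

import array
import sqlite3

data = [0, 1, 2, 3, 4, 5]

con = sqlite3.connect(':memory:')
cur = con.cursor()
cur.execute('create table testtable (d blob)')

# pack the list into raw bytes and wrap them so sqlite binds a BLOB, not text
raw = array.array('B', data).tostring()
cur.execute('insert into testtable values (?)', (sqlite3.Binary(raw),))
con.commit()

# read the BLOB back and unpack it into a plain list again
blob = cur.execute('select d from testtable').fetchone()[0]
print(array.array('B', bytes(blob)).tolist())   # [0, 1, 2, 3, 4, 5]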
See this general solution at SourceForge which covers an arbitrary Python object (including list, tuple, dictionary, etc.):
y_serial.py module :: warehouse Python objects with SQLite
"Serialization + persistance :: in a few lines of code, compress and annotate Python objects into SQLite; then later retrieve them chronologically by keywords without any SQL. Most useful "standard" module for a database to store schema-less data."