I would like to store Python objects into a SQLite database. Is that possible?
If so what would be some links / examples for it?
You can use pickle to serialize the object. The serialized object can then be inserted into the SQLite DB as a binary (BLOB) field.
import pickle

with open('object.dump', 'wb') as f:    # pickle needs a binary-mode file
    pickle.dump(obj, f)
Now read object.dump back from the file and write it to the SQLite DB. You might want to write it as a binary data type; read about storing binary data and BLOBs in SQLite here. Note that according to this source, SQLite limits the size of such a field to 1 MB.
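A minimal sketch of that BLOB approach with the standard library sqlite3 module (the table and column names here are made up for illustration):

import pickle
import sqlite3

conn = sqlite3.connect('objects.db')
conn.execute('CREATE TABLE IF NOT EXISTS dumps (id INTEGER PRIMARY KEY, data BLOB)')

obj = {'answer': 42, 'items': [1, 2, 3]}        # any picklable object

# pickle straight to bytes and store them as a BLOB parameter
conn.execute('INSERT INTO dumps (data) VALUES (?)', (pickle.dumps(obj),))
conn.commit()

# read the bytes back and reconstruct the object
row = conn.execute('SELECT data FROM dumps').fetchone()
restored = pickle.loads(row[0])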
I think that a better option would be serializing your object into a file, and keeping the file name, not contents, in the database.
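A sketch of that file-based variant, keeping only the file name in the database (the path and table layout are again just for illustration):

import pickle
import sqlite3

obj = {'answer': 42}

# serialize the object to a file on disk (binary mode is required for pickle)
with open('/tmp/object.pickle', 'wb') as f:
    pickle.dump(obj, f)

# store only the path to that file in the database
conn = sqlite3.connect('objects.db')
conn.execute('CREATE TABLE IF NOT EXISTS dumps (id INTEGER PRIMARY KEY, path TEXT)')
conn.execute('INSERT INTO dumps (path) VALUES (?)', ('/tmp/object.pickle',))
conn.commit()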
One option is to use an O/R mapper like SQLObject. It will do most of the plumbing to persist the Python object to a database, and it supports SQLite. As mentioned elsewhere, you can also serialise the object using a method such as pickle, which produces a representation of the object that can be reconstructed later by reading it back in.
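For instance, a rough sketch of the SQLObject route (the Person class, its columns and the connection URI are made up for illustration, and the exact API may differ between SQLObject versions):

from sqlobject import SQLObject, StringCol, IntCol, sqlhub, connectionForURI

# point SQLObject at an SQLite database
sqlhub.processConnection = connectionForURI('sqlite:/:memory:')

class Person(SQLObject):
    name = StringCol()
    age = IntCol(default=None)

Person.createTable()

p = Person(name='alice', age=30)        # inserting a row is just instantiating the class
for person in Person.select(Person.q.name == 'alice'):
    print(person.name, person.age)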
You can't store the object itself in the DB. What you do is to store the data from the object and reconstruct it later.
A good way is to use the excellent SQLAlchemy library. It lets you map your defined class to a table in the database. Every mapped attribute will be stored, and can be used to reconstruct the object. Querying the database returns instances of your class.
With it you can use not only sqlite, but most databases - It currently also supports Postgres, MySQL, Oracle, MS-SQL, Firebird, MaxDB, MS Access, Sybase, Informix and IBM DB2. And you can have your user choose which one she wants to use, because you can basically switch between those databases without changing the code at all.
There are also a lot of cool features, like automatic JOINs and polymorphic loading.
A quick, simple example you can run:
from sqlalchemy import Column, Integer, Unicode, UnicodeText, String
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base
from random import choice
from string import ascii_letters

engine = create_engine('sqlite:////tmp/teste.db', echo=True)
Base = declarative_base()

class User(Base):
    __tablename__ = 'users'

    id = Column(Integer, primary_key=True)
    name = Column(Unicode(40))
    address = Column(UnicodeText, nullable=True)
    password = Column(String(20))

    def __init__(self, name, address=None, password=None):
        self.name = name
        self.address = address
        if password is None:
            password = ''.join(choice(ascii_letters) for n in range(10))
        self.password = password

Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)
s = Session()
Then I can use it like this:
# create instances of my user object
u = User('nosklo')
u.address = '66 Some Street #500'
u2 = User('lakshmipathi')
u2.password = 'ihtapimhskal'
# testing
s.add_all([u, u2])
s.commit()
That would run INSERT statements against the database.
# When you query the data back it returns instances of your class:
for user in s.query(User):
    print(type(user), user.name, user.password)
That query would run SELECT users.id AS users_id, users.name AS users_name, users.address AS users_address, users.password AS users_password.
The printed result would be:
<class '__main__.User'> nosklo aBPDXlTPJs
<class '__main__.User'> lakshmipathi ihtapimhskal
So you're effectively storing your objects in the database, which is the best way to do it.
Your other choice instead of pickling is to use an ORM. This lets you map rows in a database to an object. See http://wiki.python.org/moin/HigherLevelDatabaseProgramming for a starting point. I'd recommend SQLAlchemy or SQLObject.
Yes, it's possible, but there are different approaches, and which one is suitable will depend on your requirements.
Pickling
You can use the pickle module to serialize objects, then store those objects in a BLOB in sqlite3 (or in a text field, if the dump is e.g. base64-encoded). Be aware of some possible problems: https://stackoverflow.com/questions/198692/can-i-pickle-a-python-dictionary-into-a-sqlite3-text-field
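A short sketch of the text-field variant (table and column names are made up for illustration): the pickle bytes are base64-encoded so they can live in an ordinary TEXT column.

import base64
import pickle
import sqlite3

obj = {'answer': 42}
encoded = base64.b64encode(pickle.dumps(obj)).decode('ascii')

conn = sqlite3.connect('objects.db')
conn.execute('CREATE TABLE IF NOT EXISTS dumps (id INTEGER PRIMARY KEY, data TEXT)')
conn.execute('INSERT INTO dumps (data) VALUES (?)', (encoded,))
conn.commit()

# decode and unpickle to get the object back
row = conn.execute('SELECT data FROM dumps').fetchone()
restored = pickle.loads(base64.b64decode(row[0]))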
Object-Relational-Mapping
You can use object-relational mapping. This creates, in effect, a "virtual object database" that can be used from within the programming language (Wikipedia). For Python, there is a nice toolkit for that: SQLAlchemy.
You can use pickle.dumps, which returns the pickled object as a string, so you don't need to write it to a temporary file.
Return the pickled representation of the object as a string, instead of writing it to a file.
import pickle
class Foo:
    attr = 'a class attr'

picklestring = pickle.dumps(Foo)
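To get the object back, pass that string (bytes in Python 3) to pickle.loads. For example, continuing with the Foo class above:

restored = pickle.loads(picklestring)
print(restored is Foo)      # True: classes are pickled by reference, so Foo itself comes back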
In short, the answer is yes. In fact, if the BLOB is compressed and generally under 2 MB, using SQLite is very fast. There is an easy-to-implement module called y_serial which can be used for any arbitrary Python object, including classes: http://yserial.sourceforge.net
Its syntax is NoSQL-style, so even queries are simple ;-)