I have a 100 MB SQLite database file that I would like to load into memory before performing SQL queries. Is it possible to do that in Python?
Thanks
Here's an example (taken from here) in Tcl, which could be useful for getting the general idea across:
proc loadDB {dbhandle filename} {
    if {$filename != ""} {
        # attach the persistent DB to the target DB
        $dbhandle eval "ATTACH DATABASE '$filename' AS loadfrom"

        # copy each table to the target DB
        foreach {tablename} [$dbhandle eval "SELECT name FROM loadfrom.sqlite_master WHERE type = 'table'"] {
            $dbhandle eval "CREATE TABLE '$tablename' AS SELECT * FROM loadfrom.'$tablename'"
        }

        # create the indices in the loaded tables
        foreach {sql_exp} [$dbhandle eval "SELECT sql FROM loadfrom.sqlite_master WHERE type = 'index'"] {
            $dbhandle eval $sql_exp
        }

        # detach the source DB
        $dbhandle eval {DETACH loadfrom}
    }
}
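A rough Python translation of the same idea, using the standard sqlite3 module. The file name file.db is a placeholder for your database, and note that CREATE TABLE ... AS SELECT does not preserve primary keys or other constraints (the Tcl version has the same limitation):

import sqlite3

# Open an empty in-memory database, then attach the on-disk file to it.
memdb = sqlite3.connect(":memory:")
memdb.execute("ATTACH DATABASE 'file.db' AS loadfrom")

# Copy each table into the in-memory database.
tables = memdb.execute(
    "SELECT name FROM loadfrom.sqlite_master WHERE type = 'table'"
).fetchall()
for (name,) in tables:
    memdb.execute(f'CREATE TABLE "{name}" AS SELECT * FROM loadfrom."{name}"')

# Recreate the indices (sql is NULL for implicit indices, so skip those).
for (sql,) in memdb.execute(
    "SELECT sql FROM loadfrom.sqlite_master WHERE type = 'index'"
).fetchall():
    if sql:
        memdb.execute(sql)

# Make sure no transaction is open, then detach the source database;
# queries on memdb now run purely in memory.
memdb.commit()
memdb.execute("DETACH DATABASE loadfrom")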
apsw is an alternative Python wrapper for SQLite which lets you back up an on-disk database into memory before running your queries.
From the docs:
###
### Backup to memory
###

# We will copy the disk database into a memory database
memcon = apsw.Connection(":memory:")

# Copy into memory
with memcon.backup("main", connection, "main") as backup:
    backup.step()  # copy whole database in one go

# There will be no disk accesses for this query
for row in memcon.cursor().execute("select * from s"):
    pass
connection above is your existing on-disk database connection.
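If you'd rather avoid the extra dependency, Python's built-in sqlite3 module provides an equivalent Connection.backup() method since Python 3.7. A minimal sketch (file.db is a placeholder, and the table s is borrowed from the apsw example above):

import sqlite3

# Open the on-disk database and an empty in-memory database.
disk = sqlite3.connect("file.db")
mem = sqlite3.connect(":memory:")

# Copy the whole database into memory in one go.
disk.backup(mem)
disk.close()

# Queries on mem now run without touching the disk.
for row in mem.execute("SELECT * FROM s"):
    pass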
Note that you may not need to explicitly load the database into memory at all. You can simply prime your operating system's disk cache by copying the file to the null device.
Windows: copy file.db nul:
Unix/Mac: cp file.db /dev/null
This has the advantage that the operating system takes care of the memory management, and in particular will discard the cached data if something more important comes along.
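If you want to do the priming from within your Python program instead of the shell, reading the file once has the same effect (file.db is again a placeholder):

# Read the file once to pull it into the OS page cache.
with open("file.db", "rb") as f:
    while f.read(1 << 20):  # 1 MiB chunks; the data itself is discarded
        pass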