tags:
views: 84
answers: 2

I am creating a master database using SQLite. This single file contains a dozen tables. I want everybody in my group to have access to it (either through Python or through the SQLite command line), and I was thinking of simply putting the file in a group-readable directory and calling it 'master.db'.

Now, the buzzword 'ODBC' comes to mind.

I remember my old Windows days (thank God they're gone), when I could "register" an Excel spreadsheet into my "data sources"; it would then be visible from within any application.

Can I take advantage of this in the Linux world? Does this make any sense?

Many thanks.

A: 

First off, here's the Wikipedia article for ODBC. It's a standard software API for interacting with databases. However, I doubt you need it if you just want to use SQLite and Python together. Python has a built-in module called sqlite3 that should help you along the way. Good luck.
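A minimal sketch of that route, assuming the shared file lives at /shared/group/dir/master.db and has a table called people (both names are placeholders):

    import sqlite3

    # Open the shared, group-readable file (path is a placeholder).
    conn = sqlite3.connect("/shared/group/dir/master.db")

    # 'people' stands in for one of the dozen tables.
    for row in conn.execute("SELECT * FROM people LIMIT 5"):
        print(row)

    conn.close()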

BenHayden
+1  A: 

If only a small number of people access your db then you should be OK. However, SQLite sucks at large numbers of concurrent accesses - I switched over from SQLite to MySQL when I had lots of concurrent access because the db kept getting locked.

For many concurrent accesses, you need something in between that takes the brunt of the requests - a (small) webserver, or even a Python program that queues those requests and regulates access to the SQLite db. Use an ORM like SQLAlchemy or SQLObject that can access SQLite in a threadsafe way; a rough sketch follows below.
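A rough sketch of the SQLAlchemy route, assuming the same placeholder path and table as above:

    from sqlalchemy import create_engine, text

    # Four slashes = absolute path; the path is a placeholder.
    engine = create_engine("sqlite:////shared/group/dir/master.db")

    # The engine hands out pooled connections, so callers share access
    # through it rather than each opening the file on their own.
    with engine.connect() as conn:
        result = conn.execute(text("SELECT count(*) FROM people"))
        print(result.scalar())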

As for using ODBC on Linux to make SQLite accessible as a data source to the Linux users: ODBC certainly works on Linux, and an SQLite ODBC driver exists ( http://www.ch-werner.de/sqliteodbc/ ), but I don't think you can 'register' your SQLite file via ODBC into a central list of data sources - that's pure Windows...
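If you do want the ODBC route from Python, the driver can be used through pyodbc; a hedged sketch, assuming the sqliteodbc driver is installed and registered as 'SQLite3' in odbcinst.ini (the driver name, path and table are all assumptions here):

    import pyodbc

    # Driver name and path are assumptions; match them to your setup.
    conn = pyodbc.connect("DRIVER={SQLite3};DATABASE=/shared/group/dir/master.db")

    cursor = conn.cursor()
    cursor.execute("SELECT * FROM people LIMIT 5")  # placeholder table
    for row in cursor.fetchall():
        print(row)

    conn.close()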

Alex Boschmans
Please define "lots" of concurrent access. Since SQLite locks the whole database for a write, a few writers may not interfere much with each other. But how many became a problem for you?
S.Lott
At that time I was using threads to query a webservice and store the results in the SQLite database. I had 4 threads running and ran into "database is locked" errors when I had more than 10 keywords queueing up (which is strange, methinks, as I did not change the number of threads, just the number of keywords to look for). I found a post similar to http://stackoverflow.com/questions/524797/python-sqlite-and-threading that discussed this and recommended switching to MySQL. I did that and have had no more problems.
Alex Boschmans