Due to "circumstances beyond my control"tm, I am using sqlalchemy with an MS Access backend. I'm having an issue that the following code:
def delete_imports(self, files_imported_uid):
    table_name = 'my_table'
    delete_raw = self.meta.tables[table_name].delete()
    self.engine.execute(delete_raw.where(
        self.meta.tables[table_name].c.files_imported_uid == files_imported_uid
    ))
throws a "File sharing lock count exceeded." error with large tables. The statement generated is just:
DELETE FROM my_table WHERE my_table.files_imported_uid = ?
with a parameter of the uid. The statement is then executed via pyodbc. MSDN provided some suggestions to get around the problem, before informing me that they won't work if the database is on a Novell NetWare server, which it is.
Is there a known workaround that I can use (preferably at the SQLAlchemy layer), or do I need to create some ugly hack that selects the top 9,000 records at a time to delete and loops until finished?
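For reference, the batched fallback I have in mind would look roughly like the sketch below. It is only a sketch: delete_imports_batched is a hypothetical name, the batch size of 9,000 and the primary-key column id are assumptions, and since Access's TOP clause only accepts a literal, the batch size is formatted into the SQL string rather than bound as a parameter.

from sqlalchemy import text

def delete_imports_batched(self, files_imported_uid, batch_size=9000):
    # Hypothetical fallback: delete in batches small enough to stay under
    # the Jet lock limit. Assumes the table has a primary key column named
    # `id` (adjust to the real key); TOP must be a literal, so it is
    # inlined into the statement text.
    delete_batch = text(
        "DELETE FROM my_table WHERE id IN ("
        " SELECT TOP %d id FROM my_table"
        " WHERE files_imported_uid = :uid)" % batch_size
    )
    while True:
        result = self.engine.execute(delete_batch, uid=files_imported_uid)
        if result.rowcount == 0:
            break  # no rows left matching the uid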