views: 411

answers: 3
I am working on an application that uses Oracle's built-in authentication mechanisms to manage user accounts and passwords. The application also uses row level security. Basically, every user that registers through the application gets an Oracle username and password instead of the typical entry in a "USERS" table. The users also receive labels on certain tables. This type of functionality requires DML and DDL statements to be combined in many operations, which poses a problem because DDL statements perform implicit commits. If an error occurs after a DDL statement has executed, the transaction management cannot roll everything back. For example, when a new user registers with the system the following might take place:

  1. Start transaction
  2. Insert the person's details into a table (e.g. first name, last name). DML.
  3. Create an Oracle account (create user testuser identified by password;). DDL, which performs an implicit commit, so the transaction ends.
  4. A new transaction begins.
  5. Perform more DML statements (inserts, updates, etc.).
  6. An error occurs, but the rollback only undoes work back to step 4.
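To make the failure mode concrete, here is a rough sketch of the registration flow with Spring managing the transaction (class, table, and column names are made up for illustration; only the JdbcTemplate and @Transactional usage is real Spring API):

    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.transaction.annotation.Transactional;

    public class RegistrationService {

        private final JdbcTemplate jdbcTemplate;

        public RegistrationService(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        @Transactional
        public void register(String username, String password,
                             String firstName, String lastName) {
            // Step 2: DML, part of the Spring-managed transaction
            jdbcTemplate.update(
                "INSERT INTO person (first_name, last_name) VALUES (?, ?)",
                firstName, lastName);

            // Step 3: DDL. Oracle issues an implicit COMMIT here, which also
            // commits the INSERT above and silently ends the transaction.
            // (DDL cannot use bind variables, hence the concatenation.)
            jdbcTemplate.execute("CREATE USER " + username + " IDENTIFIED BY " + password);

            // Steps 4-5: further DML now runs in a new transaction
            jdbcTemplate.update("INSERT INTO person_label (username) VALUES (?)", username);

            // Step 6: if an exception is thrown here, Spring rolls back only the
            // work done since the implicit commit, not the INSERT or CREATE USER above.
        }
    }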

I understand that the above logic is working as designed, but I'm finding it difficult to unit test this type of functionality and to manage it in the data access layer. I have had the database go down, or errors occur, during unit tests, leaving the test schema contaminated with test data that should have been rolled back. It's easy enough to wipe the test schema when this happens, but I'm worried about database failures in a production environment. I'm looking for strategies to manage this.

This is a Java/Spring application. Spring is providing the transaction management.

+2  A: 

First off I have to say: doing it this way is a bad idea, for two reasons:

  1. Connections are based on the user. That means you largely lose the benefits of connection pooling. It also doesn't scale terribly well. If you have 10,000 users on at once, you're going to be continually opening and closing hard connections (rather than borrowing soft connections from a pool); and
  2. As you've discovered, creating and removing users is DDL, not DML, and thus you lose transactionality.

I'm not sure why you've chosen to do it this way, but I would strongly recommend you implement users at the application layer and not the database layer.

As for how to solve your problem, basically you can't. Same as if you were creating a table or an index in the middle of your sequence.

cletus
I wish I could upvote this more than once...
Dan Vinton
I didn't get to choose the architecture, but I believe it was chosen to support row-level security. Also, it's interesting that you brought up connections. We have noticed that a large number of connections (around 150) are being used for only five users. The application injects the user's credentials into Spring's UserCredentialsDataSourceAdapter on each request.
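Roughly, the wiring looks like the sketch below (class and method names are made up; the UserCredentialsDataSourceAdapter calls are the actual Spring methods). Because the adapter hands the per-user credentials to the underlying DataSource, each distinct user tends to get its own physical connection rather than sharing pooled ones, which would be consistent with the connection growth we're seeing.

    import javax.sql.DataSource;
    import org.springframework.jdbc.datasource.UserCredentialsDataSourceAdapter;

    public class PerUserDataSourceSetup {

        public DataSource perUserDataSource(DataSource target) {
            UserCredentialsDataSourceAdapter adapter = new UserCredentialsDataSourceAdapter();
            adapter.setTargetDataSource(target); // the underlying (pooled) DataSource
            return adapter;
        }

        // Called at the start of each request with the logged-in user's Oracle credentials
        public void bindUser(UserCredentialsDataSourceAdapter adapter, String user, String password) {
            adapter.setCredentialsForCurrentThread(user, password);
        }

        // Called when the request completes
        public void unbindUser(UserCredentialsDataSourceAdapter adapter) {
            adapter.removeCredentialsFromCurrentThread();
        }
    }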
hoffmandirt
A: 

I'll disagree with some of the previous comments and say that there are a lot of advantages to using the built-in Oracle account security. If you have to augment this with some sort of shadow table of users with additional information, how about wrapping the Oracle account creation in a separate packaged procedure declared with PRAGMA AUTONOMOUS_TRANSACTION, which returns a success/failure status to the package that is doing the insert into the shadow table? I believe this would isolate the Oracle account creation from the main transaction.
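As a rough sketch of the calling side, assuming a hypothetical user_admin_pkg.create_account procedure declared with PRAGMA AUTONOMOUS_TRANSACTION that returns a numeric status (0 = success): the autonomous transaction commits only its own work, so the surrounding Spring transaction around the shadow-table insert can still be rolled back.

    import java.sql.Types;
    import java.util.Map;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.core.SqlOutParameter;
    import org.springframework.jdbc.core.SqlParameter;
    import org.springframework.jdbc.core.namedparam.MapSqlParameterSource;
    import org.springframework.jdbc.core.simple.SimpleJdbcCall;
    import org.springframework.transaction.annotation.Transactional;

    public class AccountService {

        private final JdbcTemplate jdbcTemplate;
        private final SimpleJdbcCall createAccountCall;

        public AccountService(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
            // user_admin_pkg.create_account is a hypothetical package/procedure name
            this.createAccountCall = new SimpleJdbcCall(jdbcTemplate)
                    .withCatalogName("user_admin_pkg")
                    .withProcedureName("create_account")
                    .withoutProcedureColumnMetaDataAccess()
                    .declareParameters(
                            new SqlParameter("p_username", Types.VARCHAR),
                            new SqlParameter("p_password", Types.VARCHAR),
                            new SqlOutParameter("p_status", Types.INTEGER));
        }

        @Transactional
        public void register(String username, String password,
                             String firstName, String lastName) {
            // Shadow-table insert stays inside the Spring-managed transaction
            jdbcTemplate.update(
                "INSERT INTO person (first_name, last_name) VALUES (?, ?)",
                firstName, lastName);

            // The procedure's autonomous transaction commits only its own work
            Map<String, Object> out = createAccountCall.execute(
                new MapSqlParameterSource()
                    .addValue("p_username", username)
                    .addValue("p_password", password));

            Number status = (Number) out.get("p_status");
            if (status == null || status.intValue() != 0) {
                // Rolling back here undoes the shadow-table insert; the account
                // creation never happened inside this transaction in the first place.
                throw new IllegalStateException("Oracle account creation failed for " + username);
            }
        }
    }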

dpbradley
+1  A: 

You should use Oracle proxy authentication in combination with row level security.

Read this: http://www.oracle.com/technology/pub/articles/dikmans-toplink-security.html
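Roughly, a proxy session is opened on top of a pooled connection as in the sketch below (the application pool account and the GRANT CONNECT THROUGH setup are assumptions on my part, not from the article); row level security then sees the real end user while the pool keeps reusing the same physical connections.

    import java.sql.SQLException;
    import java.util.Properties;
    import javax.sql.DataSource;
    import oracle.jdbc.OracleConnection;

    public class ProxySessionExample {

        public OracleConnection connectAs(DataSource pool, String endUser) throws SQLException {
            // Physical connection is owned by a single application account
            OracleConnection conn = pool.getConnection().unwrap(OracleConnection.class);

            // Lightweight proxy session for the real end user.
            // Requires: ALTER USER <endUser> GRANT CONNECT THROUGH <app account>;
            Properties props = new Properties();
            props.setProperty(OracleConnection.PROXY_USER_NAME, endUser);
            conn.openProxySession(OracleConnection.PROXYTYPE_USER_NAME, props);
            return conn;
        }

        public void release(OracleConnection conn) throws SQLException {
            // Close only the proxy session before the physical connection goes back to the pool
            conn.close(OracleConnection.PROXY_SESSION);
            conn.close();
        }
    }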

tuinstoel