I need to audit all database activity, regardless of whether it comes from the application or from someone issuing SQL by some other means, so the auditing must be done at the database level. The database in question is Oracle. I looked at doing it via triggers and also via something called Fine Grained Auditing that Oracle provides. In both cases, we turned on auditing for specific tables and specific columns. However, we found that performance really suffers when we use either of these methods.

Since auditing is an absolute must due to data privacy regulations, I am wondering what the best way is to do this without significant performance degradation. Oracle-specific experience would be especially helpful, but general practices around database activity auditing are welcome as well.

+1  A: 

I'm not sure if it's a mature enough approach for a production system, but I had quite a lot of success with monitoring database traffic using a network traffic sniffer.

Send the raw data between the application and database off to another machine and decode and analyse it there.

I used PostgreSQL, and decoding the traffic and turning it into a stream of database operations that could be logged was relatively straightforward. I imagine it'd work on any database where the packet format is documented though.

The main point was that it put no extra load on the database itself.

Also, it was passive monitoring: it recorded all activity but couldn't block any operations, so it might not be quite what you're looking for.

Colin Coghill
A: 

What, exactly, are you looking to audit? Data changes? Queries? Something else?

If you want to audit every query and every change to the data, that is necessarily going to put some load on the database. Intercepting packets probably isn't going to work well because the Oracle packet format isn't documented.

Most organizations, though, are looking to audit a subset of "everything", if only because an audit trail that includes everything is nearly impossible to query, which makes it relatively easy for an attacker to hide an attack in the noise of the baseline audit trail. If we can narrow down the focus a bit, that's probably the most efficient way to move forward.

Also, what version (and edition) of Oracle are you using? Fine-grained auditing implies that you're probably on at least Enterprise Edition. Is this something that you would be thinking about spending money on (e.g. investing in Audit Vault, Total Recall, etc.)? Or is this something where you'd want to just use the basic functionality included with Enterprise Edition?

Justin Cave
A: 

Justin, thanks for the good follow-up questions. The database is Oracle 10g Enterprise Edition. We are looking to audit only part of the total data, namely what is classified as "sensitive" information. But the volume of activity on that data is high, so I want an approach that minimizes the load on the database.

GotoError
A: 

So is the intention to audit data changes? Or are you also auditing queries?

Auditing data changes is something that can obviously cause a fair amount of additional I/O. Total Recall (an extra-cost option in 11g) is a really slick way to capture the changes asynchronously without interfering with the OLTP transactions, but it doesn't sound like it would be terribly useful in your case since you're still on 10g (unless upgrading is an option, of course).

What, exactly, do you need to record when data changes? Are you just recording the fact that the data changed, or do you need information about who changed it? If you need the "who", how are your applications authenticating? Auditing actual Oracle users is obviously a bit easier than if you have a bunch of application users in a USERS table and everything is being done as a single Oracle database user. In the latter case, the application generally has to pass the end user's identity down to the database before the audit trail can attribute anything to that user; see the sketch below.
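
A minimal sketch of that pattern, assuming the application connects as one shared database user and 'jsmith' is a hypothetical end user:

    -- Have the application stamp each session with the end user's identity;
    -- standard auditing records this value in the CLIENT_ID column.
    BEGIN
      DBMS_SESSION.SET_IDENTIFIER('jsmith');
    END;
    /

    -- Later, attribute audited actions to that end user:
    SELECT username, client_id, action_name, obj_name, timestamp
      FROM dba_audit_trail
     WHERE client_id = 'jsmith';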

If this is a data privacy issue, is it possible that you could add appropriate security policies (potentially Fine Grained Access Control / Virtual Private Database / Label Security) to more closely control who has access to what data? If you can limit what rows a particular user can see, that can often obviate the need to audit to verify that the user didn't query unauthorized data.
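
A rough sketch of such a policy using Virtual Private Database, assuming a hypothetical APP.SENSITIVE_DATA table whose rows carry an OWNER_ID matching the session's client identifier:

    -- Policy function: returns the predicate Oracle appends to every
    -- SELECT against the protected table (all names here are hypothetical).
    CREATE OR REPLACE FUNCTION sensitive_rows_policy (
        p_schema IN VARCHAR2,
        p_object IN VARCHAR2
    ) RETURN VARCHAR2 AS
    BEGIN
      RETURN 'owner_id = SYS_CONTEXT(''USERENV'', ''CLIENT_IDENTIFIER'')';
    END;
    /

    -- Register the policy so Oracle enforces it transparently:
    BEGIN
      DBMS_RLS.ADD_POLICY(
        object_schema   => 'APP',
        object_name     => 'SENSITIVE_DATA',
        policy_name     => 'SENSITIVE_DATA_SEL',
        function_schema => 'APP',
        policy_function => 'SENSITIVE_ROWS_POLICY',
        statement_types => 'SELECT');
    END;
    /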

Justin Cave
A: 

There is no need to "roll your own". Just turn on auditing:

  1. Set the database parameter AUDIT_TRAIL = DB.
  2. Restart the instance (AUDIT_TRAIL is a static parameter, so a restart is required for it to take effect).
  3. Log in with SQL*Plus.
  4. Enter the statement
    audit all;
    This turns on auditing for many critical DDL operations, but DML and some other DDL statements are still not audited.
  5. To enable auditing of these other activities, try statements like these:
    audit alter table; -- DDL audit
    audit select table, update table, insert table, delete table; -- DML audit
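
Putting those steps together, a minimal sketch of the session, assuming the instance uses an spfile:

    -- Enable the database audit trail (static parameter; restart required):
    ALTER SYSTEM SET audit_trail = db SCOPE = SPFILE;
    -- ...shut down and start up the instance here...

    AUDIT ALL;           -- many critical DDL operations
    AUDIT ALTER TABLE;   -- additional DDL
    AUDIT SELECT TABLE, UPDATE TABLE, INSERT TABLE, DELETE TABLE;  -- DML

    -- Review the captured records:
    SELECT username, action_name, obj_name, timestamp
      FROM sys.dba_audit_trail;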

Note: All "as sysdba" activity is ALWAYS audited to the O/S. In Windows, this means the Windows event log. In UNIX, this is usually $ORACLE_HOME/rdbms/audit.

Check out the Oracle 10g R2 Audit Chapter of the Database SQL Reference.

The database audit trail can be viewed in the SYS.DBA_AUDIT_TRAIL view.

It should be pointed out that the built-in Oracle auditing is designed to be high-performance, and it is very hard to imagine anything else rivaling it. There is also a high degree of fine-grained control over Oracle auditing; you can make it just as precise as you want. Finally, the SYS.AUD$ table, along with its indexes, can be moved to a separate tablespace to prevent filling up the SYSTEM tablespace.
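
A rough sketch of that last step, assuming a hypothetical AUDIT_TS tablespace created beforehand (a plain table move like this was the common approach on 10g; formally supported relocation via DBMS_AUDIT_MGMT only arrived in later releases):

    ALTER TABLE sys.aud$ MOVE TABLESPACE audit_ts;

    -- A move invalidates any indexes on the table; list them and
    -- rebuild each one in the new tablespace:
    SELECT owner, index_name
      FROM dba_indexes
     WHERE table_name = 'AUD$';

    -- For each index found (the name below is illustrative):
    ALTER INDEX sys.i_aud1 REBUILD TABLESPACE audit_ts;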

Kind regards, Opus

Opus
A: 

If you want to record copies of changed records on a target system, you can do this with Golden Gate software without incurring much in the way of source-side resource drain. Also, you don't have to make any changes to the source database to implement this solution.

Golden Gate scrapes the redo logs for transactions referring to a list of tables you are interested in. These changes are written to a 'Trail File' and can be applied to a different schema on the same database, or shipped to a target system and applied there (ideal for reducing load on your source system).

Once you get the trail file to the target system, there are some configuration tweaks you can make to perform auditing, and if needed you can invoke two Golden Gate functions to get info about the transaction:

1) Set the INSERTALLRECORDS replication parameter to insert a new record in the target table for every change operation made to the source table. Beware: this can eat up a lot of space, but if you need comprehensive auditing, that is probably expected.

2) If you don't already have a CHANGED_BY_USERID and CHANGED_DATE attached to your records, you can use the Golden Gate functions on the target side to get this info for the current transaction. Check out the following functions in the GG Reference Guide: GGHEADER("USERID") and GGHEADER("TIMESTAMP"). A sketch of a parameter file using both options follows below.
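
A rough sketch of what the target-side Replicat parameter file might look like (process, schema, table, and credential names here are hypothetical; the header fields are read with Golden Gate's @GETENV function, and the exact option names should be verified against the GG Reference Guide):

    -- Hypothetical Replicat parameter file
    REPLICAT repaud
    USERID gg_admin, PASSWORD gg_password
    -- Turn every source insert/update/delete into an insert on the
    -- audit table so full change history is preserved:
    INSERTALLRECORDS
    MAP app.sensitive_data, TARGET audit.sensitive_data_hist,
      COLMAP (USEDEFAULTS,
        changed_by_userid = @GETENV ('GGHEADER', 'USERID'),
        changed_date      = @GETENV ('GGHEADER', 'TIMESTAMP'));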

So no, it's not free (it requires licensing through Oracle) and will take some effort to spin up, but it's probably a lot less effort/cost than implementing and maintaining a custom roll-your-own solution, and you have the added benefit of shipping the data to a remote system so you can guarantee minimal impact on your source database.

David Mann
A: 

Use dbsecrets from www.softmagnet.in.

info