As part of a release we run a load of PL/SQL scripts against a database. Recently someone left the ; off the end of a line in one script that called another script, which meant that script did not get run. Because this did not cause an error (the script simply didn't run), it took quite a while to track down what had happened.

I want to check the scripts before they are run for lines that are missing either a ; at the end or a / on the line after. This is made more complicated because a 'line' in a script can actually span more than one physical line if it is a statement or block of code.

It seems to me that to do this I'm going to have to parse the scripts and then check that they meet the conditions above.

I've found ANTLR and wonder if this might be a way to do it, since there seem to be existing PL/SQL grammars, but it looks like that's going to be a steep learning curve for what's just a simple check.

Does anyone know an easy way, or any other tools, Eclipse plugins, etc., that I can use to check the scripts for lines that are missing either a ; at the end or a / on the line after?

Update: We already do most of the stuff Tom H suggested. The scripts are run into our test server and we have a version table that gets updated at the end. The problem was that the missing semi-colon in the container script meant one script did not get run, but the rest, including the one to update the version number, ran without errors. Therefore the problem only got picked up quite a way into testing. The database then had to be restored and the scripts re-run with the missing semi-colon added, so basically half a day of testing time was lost. If there were a simple way to check this before running the scripts into the test server it could save quite a bit of time.

+1  A: 

Even if the scripts are different for every release (and not part of a defined source control structure that creates or replaces database objects) I would adopt a practice of breaking the scripts down into the most fundamental units of work per file and deploying them through Ant with the standard sql task. You probably have these types of scripts:

  • CREATE or REPLACE dbobject...
  • SQL DML scripts
  • Anonymous PL/SQL blocks

If you standardize on a consistent statement delimiter (I suggest using "/" since it works with all of the cases above) and set the deployment to fail on error, then Ant will either deploy all of the files or indicate why it couldn't.
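As a rough sketch of what that standardization might look like (the procedure, table, and file names here are invented for illustration, not taken from the question), a single deployment file could be:

```sql
-- hypothetical file: 042_set_order_status.sql
-- One unit of work per file; every statement is terminated by a "/" on its
-- own line, which SQL*Plus executes and which the Ant sql task can treat as
-- the delimiter (delimiter="/", delimitertype="row", onerror="abort", if I
-- remember the task attributes correctly).

CREATE OR REPLACE PROCEDURE set_order_status (
    p_order_id IN orders.order_id%TYPE,
    p_status   IN orders.status%TYPE
) AS
BEGIN
    UPDATE orders
       SET status = p_status
     WHERE order_id = p_order_id;
END set_order_status;
/

-- plain DML uses the same terminator, so one rule covers every file
UPDATE orders
   SET status = 'MIGRATED'
 WHERE status IS NULL
/

-- anonymous PL/SQL block
BEGIN
    DBMS_OUTPUT.PUT_LINE('release step complete');
END;
/
```

With the deployment set to abort on error, the build either deploys every file or stops and tells you which statement it choked on, rather than the failure surfacing days later in testing.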

I think it would otherwise be very difficult to parse files containing one or more SQL and/or PL/SQL statements and find missing delimiters if there is no standard for delimiter choice or the number of statements per file.

dpbradley
A: 

Just a thought, but are you going about this the wrong way?

I assume that, at the file level, the lack of a semi-colon was not a problem in itself, but that it only became a problem when the file was run via the batch processing? If that's the case, maybe you can change your batch processing to cope with this.

If it was the file itself, then testing should have picked it up. You don't want to have to parse your input files just to make sure they compile, etc.

MattH
+2  A: 

I agree with MattH that you may be going about this the wrong way. I would just add an insert statement to the end of each of your scripts which inserts a "version" row into a table in the database. At the end of your deployment it's then an easy task to check that the version table has all of the correct rows in it.
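A rough sketch of that idea (the table, column, and script names below are made up for illustration):

```sql
-- At the end of every release script, record that it ran (hypothetical names):
INSERT INTO release_script_log (release_no, script_name, run_date)
VALUES ('1.4.2', '042_set_order_status.sql', SYSDATE);

COMMIT;

-- After the deployment, list any expected scripts that never logged a row:
SELECT script_name
  FROM expected_release_scripts
 WHERE release_no = '1.4.2'
MINUS
SELECT script_name
  FROM release_script_log
 WHERE release_no = '1.4.2';
```

Any rows returned by the final query are scripts that never completed, which would have flagged the missing semi-colon case before testing got very far.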

Also, you should have all of your release scripts being run exactly as they will be in production against your QA server. That's where all of the testing takes place. You never do anything to the server besides what is in your release steps - you only run the release scripts and if those release scripts are ever changed then you refresh the QA server with them and redo testing.

When you go to production your release process has then been fully tested. As a fail-safe measure you can also use tools like Red Gate's SQL Compare and SQL Data Compare to check that production matches the QA server. The data compare would only be against certain tables (look-up tables, etc.). If you have data changes to major tables (1M rows, etc.) then you can write a custom script to check that they are correct.
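For the custom-check part, a minimal sketch of what such a script might do (assuming a database link to the other server already exists, and using invented table and column names):

```sql
-- Hypothetical spot check: compare row counts and a crude checksum of a
-- look-up table between this database and the copy reached via a
-- database link named qa_server.
SELECT 'LOCAL' AS source,
       COUNT(*) AS row_count,
       SUM(ORA_HASH(status_code || '|' || description)) AS checksum
  FROM status_codes
UNION ALL
SELECT 'QA',
       COUNT(*),
       SUM(ORA_HASH(status_code || '|' || description))
  FROM status_codes@qa_server;
```

If the two rows disagree, the table needs a closer look; for multi-million-row tables you would probably restrict this to key columns or recent data rather than hashing everything.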

Tom H.
We do most of this already; the issue was that it was only picked up by testing. A lot of time could have been saved if there was a way to quickly check the scripts for this kind of thing **before** they get run into the test server.
Chris R
In that case, as I said, a log table or versions table that tracks all of the scripts that have been successfully run against the database is probably your safest bet.
Tom H.