We are using Visual Studio 2010, but this setup was first conceived back in the VS2003 days.

I will forward the best suggestions to my team. The current setup almost makes me vomit. It is a C# solution with most projects containing .sql files. Because we support Microsoft, Oracle, and Sybase, we home-brewed a pre-processor, much like the C preprocessor, except that the substitutions are performed by a hand-written C# program (no yacc or similar tools). #ifdefs are used for conditional macro definitions, and yeah - macros are the way this is done. A macro can expand into another macro or two, but the expansion must eventually terminate. Only the macro definitions contain #ifdefs; the rest of the SQL-like code just uses the macros.
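
The expansion loop amounts to something like the following simplified sketch. This is not our actual pre-processor; the macro names and the fixpoint loop are made up to illustrate macros expanding into other macros until nothing is left to substitute:

    using System;
    using System.Collections.Generic;

    // Sketch: repeatedly substitute macros until a fixpoint is reached.
    // The expansion is assumed to terminate (no cyclic macro definitions).
    class MacroExpander
    {
        // Hypothetical macro table; the real one comes from #ifdef'd definitions.
        static readonly Dictionary<string, string> Macros = new Dictionary<string, string>
        {
            { "@IDENTITY_COL@", "@INT_TYPE@ IDENTITY(1,1)" }, // expands into another macro
            { "@INT_TYPE@",     "INT" },
            { "@NOW@",          "GETDATE()" },
        };

        static string Expand(string sql)
        {
            bool changed = true;
            while (changed)                  // loop until no macro matched
            {
                changed = false;
                foreach (var m in Macros)
                {
                    if (sql.Contains(m.Key))
                    {
                        sql = sql.Replace(m.Key, m.Value);
                        changed = true;      // something expanded; scan again
                    }
                }
            }
            return sql;
        }

        static void Main()
        {
            Console.WriteLine(
                Expand("CREATE TABLE T (id @IDENTITY_COL@, ts DATETIME DEFAULT @NOW@)"));
            // -> CREATE TABLE T (id INT IDENTITY(1,1), ts DATETIME DEFAULT GETDATE())
        }
    }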

Now, the various configurations: Debug, MNDebug, MNRelease, Release, SQL_APPLY_ALL, SQL_APPLY_MSFT, SQL_APPLY_ORACLE, SQL_APPLY_SYBASE, SQL_BUILD_OUTPUT_ALL, SQL_COMPILE, as well as 2 more.

Also, the platforms: Any CPU, Mixed Platforms, Win32.

What drives me nuts is having to configure all of this correctly, choosing the right one out of 12 x 3 = 36 configuration/platform combinations, and having to substitute the database name depending on the type of database: config, main, or gateway. I am thinking the configurations should be reduced to just Debug, Release, and SQL_APPLY. Also, using 0, 1, and 2 for the build types seems so 80s ... Finally, I think the choice of whether to build each of the 3 types of databases for each of the 3 vendors should be configured with just a tic-tac-toe board like:

XOX
OOX
XXX

In this case (rows = MSFT, ORACLE, SYBASE; columns = CONFIG, MAIN, GATEWAY) it would mean: build CONFIG for MSFT, everything for SYBASE, and GATEWAY for all three vendors.
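
Parsing such a board would be trivial. A throwaway sketch of what I have in mind (the row/column ordering is the assumption stated above):

    using System;

    // Sketch: turn a 3x3 X/O board into build flags.
    // Rows: MSFT, ORACLE, SYBASE; columns: CONFIG, MAIN, GATEWAY.
    class BuildBoard
    {
        static readonly string[] Vendors = { "MSFT", "ORACLE", "SYBASE" };
        static readonly string[] DbTypes = { "CONFIG", "MAIN", "GATEWAY" };

        static void Main()
        {
            string[] board = { "XOX", "OOX", "XXX" };   // would be read from a settings file

            for (int row = 0; row < 3; row++)
                for (int col = 0; col < 3; col++)
                    if (board[row][col] == 'X')
                        Console.WriteLine("Build {0} {1}", Vendors[row], DbTypes[col]);
        }
    }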

Still, the overall scheme (a text file, a pre-processor, and many configurations) seems incredibly clunky. It is the year 2010 now, and someone out there is bound to have a very clean and/or creative tool/solution. The only pro is that the existing collection of macros has been well tested.

Have you ever had to write SQL that would work for several vendors? How did you do it?


SqlVars.txt (each of our 30 users copies a template and modifies it to suit their needs):

// This is the default parameters file and should not be changed.
// You can override any of these parameters by copying the appropriate
// section into SqlVars.txt and providing your own information.

//Build types are 0-Config, 1-Main, 2-Gateway
BUILD_TYPE=1

REMOVE_COMMENTS=1

// Login information used when applying to a Microsoft SQL server database
SQL_APPLY_MSFT_version=SQL2005
SQL_APPLY_MSFT_database=msftdb
SQL_APPLY_MSFT_server=ABC
SQL_APPLY_MSFT_user=msftusr
SQL_APPLY_MSFT_password=msftpwd

// Login information used when applying to an Oracle database
SQL_APPLY_ORACLE_version=ORACLE10g
SQL_APPLY_ORACLE_server=oradb
SQL_APPLY_ORACLE_user=orausr
SQL_APPLY_ORACLE_password=orapwd

// Login information used when applying to a Sybase database
SQL_APPLY_SYBASE_version=SYBASE125
SQL_APPLY_SYBASE_database=sybdb
SQL_APPLY_SYBASE_server=sybdb
SQL_APPLY_SYBASE_user=sybusr
SQL_APPLY_SYBASE_password=sybpwd

... (THIS GOES ON)
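
Consuming the file is not the hard part; something along these lines would handle the defaults-plus-overrides layering (a sketch; the file names are assumptions):

    using System;
    using System.Collections.Generic;
    using System.IO;

    // Sketch: load the default parameters, then let the user's SqlVars.txt
    // override them (last writer wins).
    class SqlVars
    {
        static void Load(string path, Dictionary<string, string> vars)
        {
            foreach (string raw in File.ReadAllLines(path))
            {
                string line = raw.Trim();
                if (line.Length == 0 || line.StartsWith("//"))
                    continue;                              // skip blanks and comments
                int eq = line.IndexOf('=');
                if (eq > 0)
                    vars[line.Substring(0, eq)] = line.Substring(eq + 1);
            }
        }

        static void Main()
        {
            var vars = new Dictionary<string, string>();
            Load("SqlVarsDefaults.txt", vars);  // the template (name assumed)
            if (File.Exists("SqlVars.txt"))
                Load("SqlVars.txt", vars);      // per-user overrides
            Console.WriteLine(vars["BUILD_TYPE"]);
        }
    }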
+2  A: 

Have you ever had to write SQL that would work for several vendors? How did you do it?

I used ORMs. Check NHibernate. Since you are on VS2010, you could also check the MS ADO.NET Entity Framework.
An ORM lets you abstract the database to a high degree. There will still be queries that have to be handcrafted for each RDBMS separately, but in my experience that is less than 10% of them.
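
For example, with NHibernate most of the vendor-specific part lives in configuration: swap the dialect and connection string and the same mapped queries run against each RDBMS. A rough sketch (dialect class names vary between NHibernate versions, so treat these as illustrative):

    using NHibernate;
    using NHibernate.Cfg;

    // Sketch: confine the vendor choice to one factory method.
    class SessionFactoryBuilder
    {
        public static ISessionFactory Build(string vendor, string connectionString)
        {
            var cfg = new Configuration();
            switch (vendor)
            {
                case "MSFT":
                    cfg.SetProperty(Environment.Dialect, "NHibernate.Dialect.MsSql2005Dialect");
                    break;
                case "ORACLE":
                    cfg.SetProperty(Environment.Dialect, "NHibernate.Dialect.Oracle10gDialect");
                    break;
                case "SYBASE":
                    cfg.SetProperty(Environment.Dialect, "NHibernate.Dialect.SybaseDialect");
                    break;
            }
            cfg.SetProperty(Environment.ConnectionString, connectionString);
            cfg.AddAssembly(typeof(SessionFactoryBuilder).Assembly);  // load the mappings
            return cfg.BuildSessionFactory();
        }
    }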

For other commercial and open-source ORMs, check the Wikipedia article: List of object-relational mapping software.

Edit: regarding the trading platform comment:

I have never worked on such a project, so I really cannot give any specific advice. As a general suggestion, an ORM can help you consolidate the data access layer and minimize the need for RDBMS-specific code. Most ORMs can also generate the database schema automatically.
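
For example, NHibernate can emit and run the vendor-specific DDL from the same set of mappings through its hbm2ddl tool; a minimal sketch:

    using NHibernate.Cfg;
    using NHibernate.Tool.hbm2ddl;

    // Sketch: generate the schema (tables, constraints) for whatever dialect
    // the Configuration was built with.
    class SchemaBuilder
    {
        public static void CreateSchema(Configuration cfg)
        {
            // script: true -> echo the generated DDL to the console
            // export: true -> also execute it against the configured database
            new SchemaExport(cfg).Create(true, true);
        }
    }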

What the performance penalty would be is impossible for me to tell. You should run some tests with the top ORMs, both commercial and open source, and then decide (Microsoft would probably be interested in working with a company in your market that uses Entity Framework).

Also, do not forget that introducing an ORM into an existing product is hard, so that is another problem you need to take into account.

zendar
Hm ... speed is very much an issue, however. We cannot take a 2x or even 1.3x performance hit, or else our competition will have us for breakfast. I will research this, though.
Hamish Grubijan
Also, the databases themselves get built by this process - not only stored procedures but also tables. I think you are suggesting an alternative to stored procs. Unfortunately that is a no-go given the speed requirements and lots of legacy code, but a good thought.
Hamish Grubijan
What kind of application is this? An enterprise CRUD app like CRM, sales, or inventory, or something that needs a lot of number crunching, like OLAP?
zendar
A trading platform
Hamish Grubijan
A: 

Have you ever had to write SQL that would work for several vendors? How did you do it?

Fundamentally, you do it by adding a layer of abstraction between the data tier and any other tier. Then you have a couple of choices:

  1. Use a smart ORM to build the data tier and account for database idiosyncrasies.
  2. Create a data tier using a provider model: one library for each database system to be supported, plus code that reads a setting out of the config file to determine which provider to load.

It sounds like the fundamental problem in your situation is that the data tier is not encapsulated in a single library and entirely shielded from every other tier. If that is true, the low cohesion and tight coupling are some of the sources of your woes.

You should consider refactoring the current libraries so that all database interaction happens within a single library and is entirely abstracted away from the tiers above it via interfaces. This can be done gradually over multiple release cycles, until eventually every call goes through a database-agnostic interface or abstract class, with factory methods that load the appropriate concrete class for the database in question (a rough sketch follows).
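
A bare-bones sketch of that idea, using ADO.NET's built-in DbProviderFactories so the upper tiers never name a concrete vendor (the config keys here are made up for illustration):

    using System.Configuration;
    using System.Data.Common;

    // Sketch: one database-agnostic entry point. The provider invariant name
    // (e.g. "System.Data.SqlClient") comes out of app.config, so swapping the
    // RDBMS is a configuration change, not a code change.
    class DataAccessFactory
    {
        public static DbConnection CreateConnection()
        {
            string provider = ConfigurationManager.AppSettings["DbProvider"];          // made-up key
            string connStr  = ConfigurationManager.AppSettings["DbConnectionString"];  // made-up key

            DbProviderFactory factory = DbProviderFactories.GetFactory(provider);
            DbConnection conn = factory.CreateConnection();
            conn.ConnectionString = connStr;
            return conn;
        }
    }

Vendor-specific SQL then hides behind the same switch, in one concrete class per RDBMS.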


Thomas