I am designing a system that will allow users to take data from one system and send it to other systems. One of the destination systems has a sophisticated SOA (web services), and the other is a mainframe that accepts flat files as input.
I have created a database with a PublishEvent table and a PublishEventType table. There are also normalized tables specific to each type of event being published.
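For reference, the shape of a PublishEvent row as the service consumes it is roughly this (a simplified sketch - CreatedDate and Processed are stand-ins for whatever bookkeeping columns the real table has):

```csharp
using System;

// Rough shape of a PublishEvent row as the service sees it.
public class PublishEvent
{
    public int PublishEventID { get; set; }
    public int PublishEventTypeID { get; set; }  // FK into PublishEventType
    public DateTime CreatedDate { get; set; }
    public bool Processed { get; set; }          // lets the watcher skip rows it has handled
}
```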
I also have an "interface" table that is a flattened-out version of the normalized data tables. The end users have a process that puts data into the interface table. I am not sure of the exact process - I think it's some kind of reporting application from which they can export results to a SQL table. I then use an SSIS package to take the data out of the interface table, put it into the normalized data structure, and create new row(s) in the PublishEvent table. I use the flat table because the end users seemed very confused when I first showed them the relational tables.
I have a Windows service that watches for new rows in the PublishEvent table. The service is extended with plug-ins (using the MEF framework). Which plug-in is called depends on the value of the PublishEventTypeID field in the PublishEvent row.
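The plug-in contract looks something like this (a sketch - the interface name and members are stand-ins, not my exact code):

```csharp
using System.ComponentModel.Composition;

// Contract every plug-in implements. Each handler advertises the
// PublishEventTypeID it services so the host can dispatch on it.
public interface IPublishEventHandler
{
    int PublishEventTypeID { get; }
    void Handle(PublishEvent evt);
}

// Plug-in for type 1: reads the event's normalized tables and calls the SOA web service.
[Export(typeof(IPublishEventHandler))]
public class SoaWebServiceHandler : IPublishEventHandler
{
    public int PublishEventTypeID { get { return 1; } }

    public void Handle(PublishEvent evt)
    {
        // Load the rows for this event and invoke the web service proxy here.
    }
}
```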
PublishEventTypeID 1 calls the plug-in that reads data from one set of tables and calls the SOA web service. PublishEventTypeID 2 calls the plug-in that reads data from a different set of tables and creates the flat file to be sent to the mainframe.
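The dispatch in the service works roughly like this (again a sketch - this uses the MEF composition API as it later shipped in .NET 4; the CodePlex preview bits that work on 3.5 differ slightly):

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.Linq;

public class PublishEventDispatcher
{
    // MEF fills this with every exported handler found in the plug-in folder.
    [ImportMany]
    public IEnumerable<IPublishEventHandler> Handlers { get; set; }

    public PublishEventDispatcher(string pluginDirectory)
    {
        // Discover plug-in assemblies dropped into the plug-in directory.
        var catalog = new DirectoryCatalog(pluginDirectory);
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);
    }

    // Called by the service for each new PublishEvent row it detects.
    public void Dispatch(PublishEvent evt)
    {
        var handler = Handlers.FirstOrDefault(
            h => h.PublishEventTypeID == evt.PublishEventTypeID);
        if (handler == null)
            throw new InvalidOperationException(
                "No plug-in registered for PublishEventTypeID " + evt.PublishEventTypeID);
        handler.Handle(evt);
    }
}
```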
This seems like I am implementing the "Database as IPC" anti-pattern. Should I change my design to use a messaging-based system? Is the process of putting data into the flat table and then into the normalized tables redundant?
EDIT: This is being developed in .NET 3.5