views: 160
answers: 9

I have a client who has a product-based website with hundreds of static product pages that are generated by Microsoft Access reports and pushed up to the ISP via FTP (it is an old design). We are thinking about getting a little more sophisticated and creating a data-driven website, probably using ASP.NET MVC.

Here's my question. Since this is a very small business (a handful of employees), I'd like to avoid enterprise patterns like web services if I can. How does one push updated product information to the website, batch-style? In a SQL Server environment, you can't just push up a new copy of the database, can you?

Clarification: The client already has a system at his facility where he keeps all of his product information and specifications. I would like to refresh the database at the ISP with this information.

A: 

Why do you need to push anything?

You just need to create a product-management portion of the website and, second, a public-facing portion. Both portions would touch the same SQL Server database.

Nathan Koop
He's not going to host the website or the database. It will be at an ISP. Sorry, I didn't say that.
Robert Harvey
A: 

.NET has the ability to monitor a database and check for updates. Then you can run a query to push the data elsewhere.

Or use SQL to push the data with a trigger on the table(s) in question.
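For the monitoring route, a minimal sketch using SqlDependency (query notifications) might look like this; the connection string and Products table are made up, and Service Broker must be enabled on the database:

```csharp
// Sketch only: watch a hypothetical Products table for changes and react.
using System;
using System.Data.SqlClient;

class ProductWatcher
{
    const string ConnStr =
        "Data Source=.;Initial Catalog=Catalog;Integrated Security=true";

    static void Main()
    {
        SqlDependency.Start(ConnStr);   // open the notification listener
        Subscribe();
        Console.ReadLine();             // keep the process alive
        SqlDependency.Stop(ConnStr);
    }

    static void Subscribe()
    {
        using (var conn = new SqlConnection(ConnStr))
        // Query notifications require an explicit column list and
        // two-part table names.
        using (var cmd = new SqlCommand(
            "SELECT ProductId, Name, Price FROM dbo.Products", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += (sender, e) =>
            {
                Console.WriteLine("Products changed: " + e.Info);
                // ...run the query that pushes the data elsewhere...
                Subscribe();            // notifications fire once; re-subscribe
            };
            conn.Open();
            cmd.ExecuteReader().Dispose();  // executing registers the subscription
        }
    }
}
```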

Is this what you were looking for?

griegs
We already know how to get the data out of his application; we just need a simple way to update the database at the ISP with it.
Robert Harvey
+1  A: 

If you are modernizing, and it is a handful of employees, why would you push the product info out batch style?

I don't know exactly what you mean by "data driven", but why not allow the ASP.NET app to query the SQL Server product catalog database directly? Why generate static pages at all?


UPDATE: OK, I see. The real question is how to update the SQL database running at the ISP.

Cheeso
"data driven" meaning live data, not generated static pages. And generating static pages means you can host those static pages anywhere that can host html, you don't need any dynamic language engines running on the server, nor do you need the database on the server. not to mention its very fast, easy to mirror and easy for search engines to find and traverse.
Soviut
We will query the catalog directly, but it won't be in-house. It will be at the ISP, so we need a simple way to refresh the data on that database from ours.
Robert Harvey
OK, well, this sounds like a job for a bulk upload. Depending on the ISP, you should have access to the SQL Server hosted there. From your corporate network, where the master database is, you can connect to the ISP's hosted SQL Server and do bulk uploads. If you have an .mdf database file, you may be able to just transfer it via FTP or some secure upload. Just ask the ISP - they will have seen this requirement before.
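Something along these lines with SqlBulkCopy, where the connection strings and the Products table are placeholders for whatever you actually have:

```csharp
// Rough sketch: stream the local product table into the ISP-hosted copy.
// Assumes the remote table has no foreign keys pointing at it.
using System.Data.SqlClient;

class BulkPush
{
    static void Main()
    {
        const string localConnStr =
            "Data Source=.;Initial Catalog=Catalog;Integrated Security=true";
        const string remoteConnStr =
            "Data Source=sql.myisp.example;Initial Catalog=Web;User ID=app;Password=secret";

        using (var source = new SqlConnection(localConnStr))
        using (var target = new SqlConnection(remoteConnStr))
        {
            source.Open();
            target.Open();

            // Replace the remote rows wholesale with the fresh local copy.
            using (var truncate = new SqlCommand("TRUNCATE TABLE dbo.Products", target))
                truncate.ExecuteNonQuery();

            using (var select = new SqlCommand("SELECT * FROM dbo.Products", source))
            using (var reader = select.ExecuteReader())
            using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.Products" })
                bulk.WriteToServer(reader);
        }
    }
}
```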
Cheeso
Even if you wanted to keep your app local, you could still have it talk to the SQL Server at the ISP remotely. In most real database installations, the database runs on its own dedicated server that everything from websites to client apps connects to in order to get its information.
Soviut
@Cheeso why keep two copies of the database in the first place?
Soviut
@Soviut - Well, it's not my question, so I don't know for sure, but I am guessing that the master copy is on the corporate network, and it is not directly accessible from the ISP-hosted web app. So the goal is to get the product database from the corp network to the ISP (and the hosted SQL Server there), so that the ASP.NET code running at the ISP can have access to the data. I think. This is pretty standard - even when companies host their own web apps, they keep a master product db and then move snapshots to the web app network periodically.
Cheeso
That's a good guess, Cheeso.
Robert Harvey
+1  A: 

You create an admin panel so the client can edit the data directly on the server. It is perfectly reasonable to have the client keep all their records on the server as long as the server is backed up nightly. Many cloud and virtual services offer easy ways to do replicated backups.

The additional benefit of this model is that more than one user can be adding or updating records at a time, making the workforce a lot more scalable. Likewise, the users can log in from anywhere they have a web browser to add new records, fix mistakes made in old records, etc.

EDIT: This approach assumes you can convince the client to abandon their current data entry system in favor of a centralized web-based management panel. Even if this isn't the case, the SQL database can be hosted on the server and the client's application could be made to talk to that so you're only ever using one database. From the sounds of it, it's a set of Access forms and macros which you should have source access to.

Soviut
If he has his own system, and an admin panel for the website, won't he be doing double-entry?
Robert Harvey
No, the ONLY place the client will enter data will be on the server, never in a local app. What's the point once it's all online and accessible from anywhere?
Soviut
The client is already entering this information into his work systems. Why make him do it again for the website?
Robert Harvey
You could have him enter it using the admin panel *instead* of the work systems. This unifies the interface and centralizes the data.
Soviut
A: 

You can try an ASP.NET Dynamic Data web application.

Jack
We already know how to build the website; we just need a simple way to exchange data between his data entry system and the web server's database located at the ISP.
Robert Harvey
What's his data entry system? Does it use a database? If you try to export it, is the file going to be CSV, an Excel spreadsheet, or what? I do a lot of import/export at work, and what I do is write a simple Windows console app. This approach becomes less ideal when dealing with large record sets. Just an idea.
Jack
+3  A: 

You don't mention what exactly the data source is, but the implication is that it's not already in SQL Server. If that's the case, have a look at SSIS.

If the source data is in SQL Server, then I think you'd want to be looking at either transactional replication or log shipping to sync the two databases.

Mike Powell
An interesting idea. I will consider it. Can it be done over the web?
Robert Harvey
You mean SSIS? I don't actually know a whole lot about it other than that it can be used to transform data to and from SQL Server (in fact, it used to be called Data Transformation Services). If your source data isn't in SQL Server, you'd need some process locally that periodically packages updates into a file and sends them (via FTP?) to your hosting site. Then you'd have an SSIS package on the remote SQL Server that processes the file and applies it to the database.
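The local "package and send" step might look roughly like this; the export path and the FTP endpoint are invented:

```csharp
// Sketch of the local step: upload a previously exported CSV to the host,
// where an SSIS package (or an import page) can pick it up.
using System.IO;
using System.Net;

class FtpPush
{
    static void Main()
    {
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://ftp.myisp.example/imports/products.csv");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");

        byte[] payload = File.ReadAllBytes(@"C:\exports\products.csv");
        using (Stream stream = request.GetRequestStream())
            stream.Write(payload, 0, payload.Length);

        request.GetResponse().Close();  // completes the transfer
    }
}
```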
Mike Powell
I'd just like to add my two cents here and vote for a DTS scenario. I've successfully used it (in a batch file process using dtsrun.exe) to replicate data in certain tables between two remote databases.
Paul
If I were you, I would prefer to have the application consume the data rather than just dumping it into the DB (even if that is easier). You don't want to have to rely on the validity of the batch-dumped file.
cottsak
+1  A: 

Assuming that there is no way to sync the data directly between your legacy system's DB (is it in Access, or is Access just running the reports?) and the SQL Server DB on the website (I'm not aware of any):

The problem with "pushing" the data directly into the SQL Server is that "old" records (those already in the DB) won't be updated but will instead be removed and then recreated. This is a big problem with foreign keys. Plus, I really don't like the idea of giving the client any access to the DB at all.

So considering that, I find it best to write a relatively simple page that takes an uploaded file and updates the database. The file will likely be CSV, possibly XML. After a few iterations of writing these pages over the years, here's what I've come up with:

  1. Show a file upload box.

  2. On the next page load, save the file to a temp location.

  3. Loop through each line (element, in XML) and validate all the data: foreign keys especially, but also business validations. You can also validate that the header row exists, etc. Don't update the database.
     3a. If invalid data exists, save an error message to an array.

  4. At the end of the loop, show the view.
     4a. If there were errors, show the list of error messages and tell the user to re-upload the file.
     4b. If there were no errors, create a link that carries the file location from #2 and a confirmation flag.

  5. After the file location and confirm flag have been submitted, run the loop from #3 again, but with an if (confirmed) {} block that actually makes the updates to the DB. (See the sketch after this list.)
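A rough sketch of that flow as an ASP.NET MVC controller; the CSV layout, the field-count check, and the row update are stand-ins for your real validations and data access:

```csharp
// Sketch only: one loop serves both the validation pass (step 3) and the
// confirmed pass (step 5); it writes to the database only when confirmed.
using System.Collections.Generic;
using System.IO;
using System.Web;
using System.Web.Mvc;

public class ImportController : Controller
{
    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase file)
    {
        // Step 2: save to a temp location so the confirm pass can find it.
        string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        file.SaveAs(tempPath);

        List<string> errors = ProcessFile(tempPath, false);  // step 3
        if (errors.Count > 0)
            return View("Errors", errors);                   // step 4a
        return View("Confirm", (object)tempPath);            // step 4b
    }

    [HttpPost]
    public ActionResult Confirm(string tempPath)
    {
        ProcessFile(tempPath, true);                         // step 5
        return RedirectToAction("Done");
    }

    private List<string> ProcessFile(string path, bool confirmed)
    {
        var errors = new List<string>();
        int lineNo = 0;
        // "File" is qualified because Controller also defines a File() method.
        foreach (string line in System.IO.File.ReadAllLines(path))
        {
            lineNo++;
            string[] fields = line.Split(',');
            if (fields.Length != 3)  // stand-in for FK and business validations
            {
                errors.Add("Line " + lineNo + ": expected 3 fields");  // step 3a
                continue;
            }
            if (confirmed)
            {
                // ...update or insert the product row here...
            }
        }
        return errors;
    }
}
```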

EDIT: I saw your other post. One of the assumptions I made is that the databases won't be the same; i.e., the legacy app will have a table or two (maybe just products), but the new app will have orders, products, categories, etc. This will complicate "just uploading the file".

James S
A: 

You should have a service that regularly updates the data in the target DB. It will probably run on your source data machine (where the Access DB is).

The service can use SSIS or ADO.NET to write the data. You can do this over the web because, I assume, you have TCP/IP access to the server.

Check when the updates run and how long they take. If you can do the updates during the night, you are fine. If not, you should check whether the site is still accessible during the import; that is sometimes not the case.
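A minimal sketch of such a service loop; the schedule, the connection details, and the MERGE upsert (SQL Server 2008 or later) are all assumptions:

```csharp
// Sketch only: push one row per sync as an upsert so existing records are
// updated in place. In practice you'd read the rows from the Access source.
using System;
using System.Data.SqlClient;
using System.Threading;

class SyncService
{
    static void Main()
    {
        // Nightly schedule; a Windows service or Task Scheduler job would
        // normally host this.
        using (new Timer(state => Sync(), null,
                   TimeSpan.Zero, TimeSpan.FromHours(24)))
            Thread.Sleep(Timeout.Infinite);
    }

    static void Sync()
    {
        using (var remote = new SqlConnection(
            "Data Source=sql.myisp.example;Initial Catalog=Web;User ID=app;Password=secret"))
        {
            remote.Open();
            using (var cmd = new SqlCommand(
                @"MERGE dbo.Products AS t
                  USING (SELECT @id AS ProductId, @name AS Name) AS s
                  ON t.ProductId = s.ProductId
                  WHEN MATCHED THEN UPDATE SET Name = s.Name
                  WHEN NOT MATCHED THEN INSERT (ProductId, Name)
                      VALUES (s.ProductId, s.Name);", remote))
            {
                cmd.Parameters.AddWithValue("@id", 1);
                cmd.Parameters.AddWithValue("@name", "Example product");
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```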

Malcolm Frexner
A: 

Use wget to push the new data file to the MVC app; once the data is received by the action, the MVC app invokes the processing/importing of the data (maybe in a worker process if you don't want long-running requests).
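A rough sketch of the receiving action, assuming the file arrives as a raw POST body (e.g. wget --post-file=products.csv http://example.com/import/receive); the worker hand-off is only indicated:

```csharp
// Sketch only: persist the posted body quickly, then let a background
// worker do the actual import so the request stays short.
using System.IO;
using System.Web.Mvc;

public class ImportController : Controller
{
    [HttpPost]
    public ActionResult Receive()
    {
        string path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        byte[] buffer = new byte[8192];
        int read;
        using (var output = System.IO.File.Create(path))
            while ((read = Request.InputStream.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);

        // ...hand 'path' to a worker process/queue for the real import...
        return Content("queued");
    }
}
```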

cottsak