views: 53

answers: 4

We have several SQL Server databases containing measurements from generators that we build. However, this useful data is only accessible to a few engineers, since most of us (including me) are unfamiliar with SQL. Are there any tools that would allow an engineer to extract chosen subsets of the data in order to analyze it in Excel or another environment? The ideal tool would

  1. protect the database from any accidental changes,
  2. require no SQL knowledge to extract data,
  3. be very easy to use, for example with a GUI to select the fields and time range of interest,
  4. allow export of the data values into a file that could be read by Excel,
  5. require no participation/input from the database manager for the extraction task to run, and
  6. be easy for a newbie database manager to set up.

Thanks for any recommendations or suggestions.

A: 

Excel can load the output of a stored procedure directly into a tab. IMO that is the best way: users need no knowledge of SQL, they just invoke a procedure, and there are no extra moving parts besides Excel and your database.
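
A minimal sketch of such a procedure, just to show the shape of it (the procedure, table, and column names are placeholders, not your actual schema):

CREATE PROCEDURE dbo.GetGeneratorMeasurements   -- hypothetical name
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    SET NOCOUNT ON;
    -- read-only SELECT over an assumed Measurements table
    SELECT GeneratorId, MeasuredAt, MeasurementValue
    FROM dbo.Measurements
    WHERE MeasuredAt >= @StartDate
      AND MeasuredAt < @EndDate;
END

In Excel the users would then just run EXEC dbo.GetGeneratorMeasurements with their two dates and get the rows back on a sheet.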

AlexKuznetsov
+1  A: 

I would recommend you build your own solution in Excel. Excel can query your SQL Server database through an ODBC connection. If you do it right, the end user has to do little more than click a "get data" button. Then they have access to all the GUI power of Excel to view the data.
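
The query behind that button can be a plain SELECT run over the ODBC connection, something like this (the table and column names are made up for illustration):

SELECT GeneratorId, MeasuredAt, MeasurementValue
FROM dbo.Measurements            -- hypothetical table
WHERE MeasuredAt >= '2010-01-01'
  AND MeasuredAt < '2011-01-01';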

FrustratedWithFormsDesigner
+3  A: 

First off, I would never let users run their own queries on a production machine. They could run table scans or some other performance killer all day.

We have a similar situation, and we generally create custom stored procedures for the users to "call", and only allow access to a backup server running "almost live" data.

Our users are familiar with Excel, so I create a stored procedure with ample parameters for filtering/customization, and they can easily call it using something like:

EXEC YourProcedureName '01/01/2010','12/31/2010','Y',null,1234

I document exactly what the parameters do, and they generally are good to go from there.
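
For illustration, the procedure behind a call like that might be declared roughly as follows; the parameter names, the Measurements table, and the flag logic are all assumptions, not the actual code:

CREATE PROCEDURE dbo.YourProcedureName
    @StartDate     datetime,
    @EndDate       datetime,
    @ValidatedOnly char(1),             -- 'Y'/'N' toggle (illustrative)
    @GeneratorType varchar(50) = NULL,  -- optional filter, NULL = all types
    @GeneratorId   int
AS
BEGIN
    SET NOCOUNT ON;
    SELECT m.GeneratorId, m.MeasuredAt, m.MeasurementValue
    FROM dbo.Measurements m             -- hypothetical table
    WHERE m.MeasuredAt BETWEEN @StartDate AND @EndDate
      AND m.GeneratorId = @GeneratorId
      AND (@GeneratorType IS NULL OR m.GeneratorType = @GeneratorType)
      AND (@ValidatedOnly = 'N' OR m.IsValidated = 1);  -- made-up flag column
END

The point is that all the filtering lives in the procedure, so the users only ever supply parameter values, never SQL.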

To set up an Excel query you'll need to set up the data sources on the user's PC (Control Panel - Data Sources - ODBC), which will vary slightly depending on your version of Windows.

From within Excel, you need to set up the "query", which is just the EXEC command from above. Depending on the version of Excel, it should be something like: menu - Data - Import External Data - New Database Query. Then choose the data source, connect, skip the table diagram maker, and enter the above SQL. Also, don't try to make one procedure do everything; make different ones based on what they do.

Once the data is on the Excel sheet, our users pull it to other sheets and manipulate it at will.

Some users are a little advanced and "try" to write their own SQL, but that is a pain: I end up debugging and fixing their incorrect queries, and once you do correct a query, they always tinker with it and break it again. Using a stored procedure means that they can't change it, and I can put it with our other procedures in the source code repository.
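
Locking the users' accounts down to execute-only rights is what actually keeps them from changing anything; a sketch of the idea (the role, procedure, and user names are placeholders):

-- role that may only execute the reporting procedures
CREATE ROLE ReportUsers;
GRANT EXECUTE ON dbo.YourProcedureName TO ReportUsers;
-- add each reporting user's database user to the role
EXEC sp_addrolemember 'ReportUsers', 'SomeReportingUser';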

KM
Minor note: Microsoft Query is the Office component that allows you to do this in Excel: http://office.microsoft.com/en-us/excel-help/use-microsoft-query-to-retrieve-external-data-HA010099664.aspx
Conrad Frix
A: 

Depending on your version of SQL Server, I would look at some of the excellent self-service BI tools that ship with the later editions, such as Report Builder. It is like a stripped-down version of Visual Studio with all the complex bits taken out and just the simple reporting bits left in.

If you set up a shared data source that logs into the server with quite low access rights, then the users can build reports but not edit anything.
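
For example, the shared data source could use a login that only has read access; a rough sketch (all names and the password are placeholders):

-- read-only account for the shared Report Builder data source
CREATE LOGIN ReportReader WITH PASSWORD = 'YourStrongPasswordHere1!';
USE YourMeasurementsDb;
CREATE USER ReportReader FOR LOGIN ReportReader;
EXEC sp_addrolemember 'db_datareader', 'ReportReader';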

I would echo the comments by KM that letting the great unwashed run queries on a production system can lead to some interesting results, whether from the wrong query being used, massive table scans, Cartesian joins, etc.

Kevin Ross