I am working on a .NET internal test tool. The tool is currently GUI based. One of the things I want it to be able to do is run in a command-line mode, so we can run it in an automated fashion and have it crunch on some data every day.

We started adding a command-line mode to it, but I am just not happy with it. It feels clumsy and tacked on. I am looking for a more elegant solution that will scale relatively easily as we add more functionality to the app.

One of the thoughts I had is modeled after PowerShell and Exchange Server. The Exchange team apparently built some 800 cmdlets and then built their UI on top of them. That way, everything their UI can do, you can also do via a script using those cmdlets. I really like that approach, to be honest. It's elegant and scales naturally as they add more functionality.

What ideas do you guys have for something like this? Has anyone out there tried the PowerShell route I mentioned? Share your thoughts.

Thanks

+3  A: 

Building on a common API is a great idea. The API methods then become first-class citizens that work equally well from the UI layer or the console.

However, you need to weigh how much this will be used and how far you want to take it. Writing a PowerShell provider is great if you have a large audience or heavy use, but it may be overkill for a smaller audience.

An easy solution is to create a shared class library for all of your business logic, which both the console app and the UI can reference. Don't put any actual logic in either the UI project or the console project; that way you only write it once. It's an easy way to scale without feeling clumsy or tacked on.
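A minimal sketch of that layout might look like this (all names are illustrative, not from the original tool). The `MyTool.Core` library owns the logic; the console project is just a thin shell over it, and a GUI project would call the same class:

```csharp
// Shared class library, e.g. MyTool.Core -- referenced by both the GUI
// project and the console project. All business logic lives here.
namespace MyTool.Core
{
    public class CrunchResult
    {
        public int RecordsProcessed { get; set; }
        public string Summary { get; set; }
    }

    public class DataCruncher
    {
        public CrunchResult Crunch(string inputPath)
        {
            // ... real logic goes here; nothing of substance belongs
            // in the UI or console projects themselves.
            return new CrunchResult { RecordsProcessed = 0, Summary = "ok" };
        }
    }
}

// Console front end: parse arguments, delegate to the shared API, report.
class Program
{
    static int Main(string[] args)
    {
        if (args.Length < 1)
        {
            System.Console.Error.WriteLine("usage: mytool <input-file>");
            return 1;
        }
        var result = new MyTool.Core.DataCruncher().Crunch(args[0]);
        System.Console.WriteLine(result.Summary);
        return 0;
    }
}
```

The console exit code doubles as the automation-friendly success/failure signal for your daily scheduled runs.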

Scott Forsyth
A: 

One of the elegant things about the unix/linux shell is the ability to chain small applets together to build up functionality. Perhaps you could build the core functionality as a daemon/service and use the command line or the GUI to talk to it (as unknown suggested).

Jay
+3  A: 

PowerShell provides a very robust framework for having a CLI interface alongside a GUI. The great thing is that these two very different interfaces can share the same code!

I've done this before and I can tell you the experience was a pleasant one. I cannot say enough positive things about the Cmdlet framework. The architecture of the whole system is quite exquisite and purposeful.

As it relates to your question, I think it always comes down to choosing the right tool for the right job. This cannot be overlooked as I think we as software engineers tend to always want to try the latest and greatest for our current problems. It's part of what makes our job fun! Given the limited insight I have into your project, it does seem like PowerShell could fit the bill from a design choice. However, as I'm sure you're aware, rewriting the GUI to use the PowerShell pipeline will most likely change a massive portion of the existing application (which might be a good thing if budget allows for it).

Overall, if you're wanting a scriptable interface into your application along with a GUI front end, and the added bonus of true code sharing... PowerShell is a good choice. Also, if you end up going this route you could even check out psake to help with your automation.
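To make the idea concrete, a cmdlet is just a small C# class in a library that references System.Management.Automation; the GUI can call the same underlying logic directly. This is a hedged sketch with invented names (`ExportTestReportCommand`, `ReportEngine` are hypothetical, not from any real project):

```csharp
using System.Management.Automation;

// A cmdlet exposed to PowerShell as Export-TestReport. The attribute
// supplies the verb-noun name; the base class handles parameter binding.
[Cmdlet(VerbsData.Export, "TestReport")]
public class ExportTestReportCommand : Cmdlet
{
    [Parameter(Mandatory = true, Position = 0)]
    public string InputPath { get; set; }

    [Parameter]
    public SwitchParameter Detailed { get; set; }

    protected override void ProcessRecord()
    {
        // Call into the same shared business-logic layer the GUI uses
        // (ReportEngine is a hypothetical shared class), then emit
        // objects rather than text so results compose in the pipeline.
        var report = ReportEngine.Generate(InputPath, Detailed.IsPresent);
        WriteObject(report);
    }
}
```

Because cmdlets write objects instead of formatted text, scripts can filter, sort, and chain their output without any parsing, which is where the "scales naturally" quality comes from.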

Scott Saad
A: 

PowerShell scripts let you take advantage of .NET libraries and write scripts that are almost as powerful as the .NET code itself. You can do many powerful things: call external DLLs, use .NET namespaces like System.IO and System.Net, run processes and intercept their output, call web services, etc. The possibilities are endless.

Here I will show you a PowerShell script which assists you in the day-to-day deployment of websites. Every day, we make changes to web projects which need to be deployed to development servers, sometimes to beta servers, and finally to the production server. Using this script, you can automate all the manual work that you would otherwise repeat on your deployment package every time you upload your website to a server. We use this script at Pageflakes every single day for our local development server uploads, beta releases, and final production server releases. All we do is run the script, go to the server, and extract a zip file into the web folder, and that's all. The new version gets deployed within two minutes without any manual work at all, which completely removes any possibility of human error during deployment.

Automating deployment

I have written a PowerShell script that does the following for you:

* Maintains different configuration information for different deployments. For example, different connection strings for development servers and production servers (one or more production servers).
* Creates a deployment folder using the deployment date, time, and version so that you have a separate folder for each deployment and can keep track of things deployed on a day, e.g., 20061214-1.
* Copies only the changed files and some predefined files to the deployment folder. So, you don't deploy the whole website every day.
* Copies the web.config and customizes the <appSettings>, <connectionStrings>, <assemblies>, etc., as per the deployment configuration. For example, you can have different connection strings for different servers.
* Updates all JavaScript files with a version number so that in every deployment, a new file gets downloaded by client browsers.
* Updates default.aspx automatically with the modified script file name.
* Compresses all JavaScript files that get deployed.
* Compresses all static HTML files using an Absolute HTML Optimizer.
* Creates a zip file which contains the deployment package.
* FTPs the zip file to a target server.

After the deployment script runs, all you need to do is extract the zip file on the server and that's all!
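The "dated folder, changed files only, single zip" core of those steps can be sketched as follows. The actual script described above is PowerShell; this is a hypothetical C# rendering for readers of this thread, and every name and path in it is invented (it also assumes .NET with a reference to System.IO.Compression.FileSystem):

```csharp
using System;
using System.IO;
using System.IO.Compression;

class DeployPackager
{
    static void Main()
    {
        // 1. Dated, versioned deployment folder, e.g. deployments/20061214-1.
        string stamp = DateTime.Now.ToString("yyyyMMdd") + "-1";
        string target = Path.Combine("deployments", stamp);
        Directory.CreateDirectory(target);

        // 2. Copy only files changed since the last recorded deployment
        //    (last-deploy.txt is an invented bookkeeping file).
        DateTime lastDeploy = File.Exists("last-deploy.txt")
            ? DateTime.Parse(File.ReadAllText("last-deploy.txt"))
            : DateTime.MinValue;
        foreach (string file in Directory.GetFiles("site", "*", SearchOption.AllDirectories))
        {
            if (File.GetLastWriteTime(file) > lastDeploy)
                File.Copy(file, Path.Combine(target, Path.GetFileName(file)), true);
        }

        // 3. Zip the package; on the server you just extract this one file.
        ZipFile.CreateFromDirectory(target, target + ".zip");
        File.WriteAllText("last-deploy.txt", DateTime.Now.ToString("o"));
    }
}
```

The web.config rewriting, JavaScript versioning, and FTP upload steps would layer on top of this same skeleton.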

You can easily FTP the modified files instead of copying only the zip file, by changing the FTP part at the end of the script.

Prathamesh