views: 162
answers: 4
I have the Target Framework set to 2.0 on my Windows application, yet when I install the app on the server after publishing it through VS 2008, the installer tries to install .NET 3.5 on the server.

I do not want to install 3.5 on my server.

When I copy the files from my local /bin/debug/ folder to the server and double-click the exe, nothing happens. On my local machine, the app runs fine.

How can I make this app run on the server without it needing the .NET 3.5 framework?

+4  A: 

Do any of your dependencies require .NET 3.5? Do you have anything in any config files which might require .NET 3.5?
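As a sketch of what to look for (a hypothetical app.config, not taken from the question):

```xml
<!-- Hypothetical app.config: things worth inspecting when a 2.0 app
     unexpectedly demands 3.5 at install time. -->
<configuration>
  <startup>
    <!-- .NET 2.0, 3.0 and 3.5 all run on CLR 2.0, so this element alone
         will not force a 3.5 install, but it should read v2.0.50727. -->
    <supportedRuntime version="v2.0.50727" />
  </startup>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Any binding or redirect that names a 3.5-only assembly, e.g.
           System.Core, Version=3.5.0.0, would pull in the 3.5 framework. -->
    </assemblyBinding>
  </runtime>
</configuration>
```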

I suggest you take a copy of what you've got for safekeeping, and then cut it down to the very smallest app which demonstrates the problem. In fact, you might want to start from scratch with a "no-op" app and see whether that has the same behaviour.

Jon Skeet
Good point, but no. All dependencies target 2.0.
Picflight
+1  A: 

Check unused references, perhaps? Are you actually getting an error about the 3.5 framework?
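For instance, leftover references in the .csproj can look like this (a hypothetical fragment; the assemblies marked below ship only with 3.0/3.5):

```xml
<!-- Hypothetical .csproj fragment: unused references to 3.0/3.5-only
     assemblies are enough to make the publish wizard demand the newer
     framework, even if the code never touches them. -->
<ItemGroup>
  <Reference Include="System" />
  <Reference Include="System.Core" />     <!-- .NET 3.5 only -->
  <Reference Include="System.Xml.Linq" /> <!-- .NET 3.5 only -->
  <Reference Include="WindowsBase" />     <!-- .NET 3.0 only -->
</ItemGroup>
```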

CodeByMoonlight
this bit me ...
kenny
A: 

Try building the application in release mode and deploying it to the server. You will need to grab the application from the /bin/release folder instead of the /bin/debug folder.

Also, check the target framework under the Application tab of the project properties.
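That setting is also stored in the project file; for a 2.0 build it should look roughly like this (a minimal .csproj fragment):

```xml
<!-- VS 2008 stores the target framework as an MSBuild property
     in the main PropertyGroup of the .csproj. -->
<PropertyGroup>
  <TargetFrameworkVersion>v2.0</TargetFrameworkVersion>
</PropertyGroup>
```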

Michael Kniskern
A: 

If you're using Visual Studio to build your setup project, open the setup project's properties and look through the settings. One setting specifies which .NET version the installer package will demand. You have to set it yourself; it is not inherited from the target framework of your other projects.
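If the app is published with ClickOnce (which "publishing through VS 2008" suggests) rather than an MSI setup project, the equivalent setting is the Prerequisites dialog on the project's Publish tab, stored in the .csproj as bootstrapper packages. A sketch of what to check (package IDs and metadata may differ by VS version and service pack):

```xml
<!-- Hypothetical VS 2008 .csproj fragment: each prerequisite the
     publish bootstrapper will offer to install is listed here.
     Setting Install to false for 3.5 (and true for 2.0) keeps the
     installer from demanding .NET 3.5. -->
<ItemGroup>
  <BootstrapperPackage Include="Microsoft.Net.Framework.2.0">
    <Install>true</Install>
    <ProductName>.NET Framework 2.0</ProductName>
  </BootstrapperPackage>
  <BootstrapperPackage Include="Microsoft.Net.Framework.3.5">
    <Install>false</Install>
    <ProductName>.NET Framework 3.5</ProductName>
  </BootstrapperPackage>
</ItemGroup>
```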

Windows programmer