I am finalizing one of my projects and reviewing the whole codebase for mistakes, bugs, and performance problems. I am using MVC. One Don't I have already caught:

Never put a RenderPartial inside a loop; it will drastically slow down your whole server.

+5  A: 

Have you run your program through FxCop? It has a set of rules for performance.

Jay Riggs
+10  A: 

Never store a WebControl in Session.

Because it holds a reference to its Page object, you end up storing every control on the page in session.

womp
Besides, it's a component that belongs to the page, so if you store it in the Session state and then try to use it in another page, it won't work properly. (Been there, done that...) :)
Guffa
I would like to add avoid Session state or other global state whenever possible. Not so much for performance, but for general maintainability.
JohannesH
+9  A: 

Don't optimize prematurely. :) If the site is not performant, profile the code to determine where to spend your time.

TrueWill
+3  A: 

Don't profile or otherwise judge performance in the debug configuration. The debug configuration isn't intended to be fast, and you may draw conclusions that are wrong (like the idea that partial views/user controls are slow; that is true in the debug configuration but not in release). When you profile to measure performance, use the release configuration so that you can see where the real problems are.

Craig Stuntz
Very, VERY true. Also, don't forget to publish in release mode ;-)
Erik van Brakel
A: 

Only use try/catch blocks when necessary; they do slow down your application.

EDIT for clarity: by "necessary" I mean for catching real errors.

If you can write some proactive code to ensure the error won't be thrown, do it; that will be faster than letting an exception be thrown and then handling it.

Don't use exceptions to control program flow. I don't know who said it first, but I recall the phrase "Exceptions should be exceptional!" They should be reserved for unforeseen issues, things that couldn't be checked before the code executes and throws them.

The worst example I see all too often is something along these lines...

int i = 0;
try
{
    i = int.Parse(txt);
}
catch (Exception x)
{
    // Do nothing; i stays 0
}
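For the example above, a non-throwing sketch using int.TryParse avoids the exception entirely (the wrapper name ParseOrZero is mine):

```csharp
using System;

public static class SafeParse
{
    // int.TryParse reports failure through its return value instead of
    // throwing, so malformed input costs no exception at all.
    public static int ParseOrZero(string txt)
    {
        int i;
        int.TryParse(txt, out i); // i is left as 0 when parsing fails
        return i;
    }
}
```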
Chad
This is of course a specific version of the more general rule "only write code that is necessary; unnecessary code slows down your application". Figuring out which lines are "unnecessary" is the hard part.
Eric Lippert
There is only a cost when an exception is actually thrown. A try catch block will have no performance impact on your code at all.
Charlie
I guess this should be more "only throw exceptions for error handling, never for the common, valid application flow because throwing an exception indeed is expensive."
Alex
@Alex not only is it expensive to throw exceptions for valid flow, it's also very illogical and hard to understand at first glance. One of the things people tend to overlook ;-)
Erik van Brakel
fantastic...clarify the meaning, and it's still getting downvoted with no explanations...
Chad
A: 

In C#, objects are created with new. This alone can be a drawback from a performance perspective: if you create an object inside a loop (meaning a new object is created on each iteration), you can slow down your program.

for (int i = 0; i < 1000; ++i)
{
   Object o = new Object();
   //...
}

Instead, create the instance outside the loop:

Object o = new Object();
for (int i = 0; i < 1000; ++i)
{
   //...
}

Only create an object in a loop if you really have to...

Perhaps doing a bit of C++ would help you understand the mechanics behind this and know when and where to optimize your code. Although C++ is a different language, a lot of it carries over to other languages once you understand the basics of memory management (new, delete, pointers, dynamic arrays/static arrays, etc.).

Partial
This is exactly the kind of performance optimization one should avoid unless it proves to be a problem. Very often, code is just made a lot more complex without any improvement in performance at all. C++ memory management is a completely different story...
Alex
That being said, creating objects is really, really cheap. Ayende profiled this: http://ayende.com/Blog/archive/2008/02/27/Creating-objects--Perf-implications.aspx
Erik van Brakel
@Erik: If his class were actually doing some serious work it would not be so cheap ;) @Alex: For every new there must be a delete somewhere... in C#, the garbage collector does the delete for you. That being said, if you constantly create and destroy objects that allocate a lot of memory or need to do certain jobs... well you will finish by getting a mess! If you understand C++ you will understand this.
Partial
"For every new there must be a delete somewhere..." - wrong. Garbage collection is based on traversing roots, not "deleting" unused objects. The effort to compact the managed heap and promote objects to a new generation depends on how many objects are still rooted, or even worse, pinned. The number of objects being discarded is completely irrelevant if they don't need finalization. And the scientific term for "... well you will finish by getting a mess!" is "memory fragmentation".
Alex
@Alex: If objects in C# were never deleted you would get memory leaks... The garbage collector does delete objects. Its optimizing engine determines the best time to perform a collection (the exact criteria are kept private by Microsoft), based on the allocations being made. When the garbage collector performs a collection, it checks for objects in the managed heap that are no longer being used by the application and reclaims their memory.
Partial
+1  A: 

Most performance problems are due to disk access or calls across a network.

So be careful how, and how often, you access the file system or a database. Do you need to make so many calls across the network, or could you do it in a single call?

One good example:

  • value is stored in session
  • session is configured to use SQL server
  • value is used only once every ten requests
  • on every request the value is read from the database and then written back to it

In this case a better solution may be to write custom code to store and read the value.
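The "single call" idea can be sketched with a fake service that counts round-trips (the FakePriceService type and all its members are hypothetical stand-ins for a real remote service):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical service: each method call stands in for one network round-trip.
public class FakePriceService
{
    public int RoundTrips { get; private set; }

    // One round-trip per item: cheap to write, expensive to run in a loop.
    public decimal GetPrice(int productId)
    {
        RoundTrips++;
        return productId * 1.5m;
    }

    // One round-trip for the whole batch.
    public Dictionary<int, decimal> GetPrices(IEnumerable<int> ids)
    {
        RoundTrips++;
        return ids.ToDictionary(id => id, id => id * 1.5m);
    }
}
```

Calling GetPrice in a loop over ten products costs ten round-trips; one GetPrices call fetches the same data in a single round-trip.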

Shiraz Bhaiji
+2  A: 

Do NOT fiddle around with explicit garbage collection.
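A minimal illustration of what "fiddling" means here (the wrapper class is my own naming; GC.Collect and GC.WaitForPendingFinalizers are the real framework calls to avoid in normal code):

```csharp
using System;

public static class GcAntiPattern
{
    // Don't do this: forcing a full collection promotes still-live
    // short-lived objects to older generations and stalls the app.
    // The CLR's generational collector schedules collections itself.
    public static void DontDoThis()
    {
        GC.Collect();                   // forces a collection of all generations
        GC.WaitForPendingFinalizers();  // blocks until finalizers have run
    }
}
```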

Alex
+1  A: 

Caching can help you improve performance, but be careful: use it only where it makes sense.
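A minimal sketch of "where it makes sense": cache a computed result keyed by its input so repeated requests skip the expensive work (all names here are hypothetical; a real site would use ASP.NET's cache with an expiration policy):

```csharp
using System;
using System.Collections.Generic;

public class MemoizedLookup
{
    private readonly Dictionary<string, string> _cache =
        new Dictionary<string, string>();
    private readonly Func<string, string> _compute; // the expensive operation
    public int ComputeCount { get; private set; }   // how often we paid the cost

    public MemoizedLookup(Func<string, string> compute)
    {
        _compute = compute;
    }

    public string Get(string key)
    {
        string value;
        if (!_cache.TryGetValue(key, out value))
        {
            value = _compute(key);  // pay the cost once per distinct key
            ComputeCount++;
            _cache[key] = value;
        }
        return value;
    }
}
```

The flip side of "only where it makes sense": an unbounded dictionary like this never evicts anything, which is exactly the kind of caching that causes trouble.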

Yassir
A: 

DO use static methods - but only if the method is frequently used.

DON'T mark a variable as static unless you really want the variable's value to be the same across all instances (another developer did this and I had fun debugging why we got odd behavior only when multiple users hit the site). This is not for performance reasons, but just good advice.
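The bug described above can be reproduced in a few lines (the Counter class is a made-up illustration):

```csharp
using System;

public class Counter
{
    public static int SharedCount; // ONE slot shared by every instance
    public int InstanceCount;      // one slot per instance

    public void Increment()
    {
        SharedCount++;
        InstanceCount++;
    }
}
```

Two users each incrementing their "own" counter still bump the same SharedCount, which is precisely the odd multi-user behavior seen on the site.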

Lee Harold