views:

272

answers:

11

Which paradigm is better for the design and analysis of algorithms? Which is faster? I have a course called Design and Analysis of Algorithms at university, and the programs have a time limit. Is OOP slower than procedural programming, or is the time difference not significant?

+4  A: 

IMO algorithms exist separately from the OO vs. PP issue.

Neither OO nor PP is 'slow', in either design time or program performance; they are simply different approaches.

lexu
A: 

For design, analysis and development: definitely OOP. It was invented solely for the benefit of designers and developers. For program runtime execution: sometimes PP is more efficient, but often OOP gets reduced to plain PP by the compiler, making them equivalent.

SF.
A: 

The difference (in execution time) is marginal at best.

Note that there is a more important factor than sheer performance: OOP provides the programmer with better means to organize their code, which results in programs that are well structured, understandable, and more reliable (fewer bugs).

Itay
+4  A: 

I would think that Functional Programming would produce cleaner implementations of algorithms.

Having said that, you shouldn't see much of a difference whatever approach you take. An algorithm can be expressed in any language or development paradigm.

Update: (following comments)

Apparently functional programming does not lend itself to implementing algorithms as well as I thought it might. It has other strengths, and I mostly mentioned it for completeness' sake, as the question only mentioned OOP (object-oriented programming) and PP (procedural programming).

Oded
I disagree. Algorithms are fundamentally about *how* you do something. The standard claim about functional programming is that you specify *what* you want, *not* how you do it. In particular, in-place updates are essential to many algorithms. With functional programming, you basically end up trying to *imply* the algorithm you need and hoping that the compiler/optimiser picks up on the implication.
Steve314
For example, for years, the whole functional programming community completely failed to spot that the normal functional implementation of the Sieve of Eratosthenes (a very simple and well understood algorithm) simply *wasn't* the Sieve of Eratosthenes at all and had entirely the wrong complexity. When thousands of intelligent people fail to spot that a simple algorithm isn't what it claims to be, you know that you don't have a clean or clear way to describe algorithms.
Steve314
For decades (in fact, until today), the whole procedural programming community completely failed to spot that the normal procedural implementation of integer arithmetic (a very simple and well understood arithmetic) simply wasn't integer arithmetic at all and had entirely the wrong results. When tens of millions of intelligent people fail to spot that a simple arithmetic isn't what it claims to be, you know that you don't have a clean or clear way to describe arithmetic. BTW: most Sieves I have seen were wrong, independent of whether they were implemented in a functional, procedural, OO, or logic style.
Jörg W Mittag
I believe that the O'Neill paper is brilliant, but there are plenty of examples of longstanding bugs in imperative algorithms as well.
Jörg W Mittag
@Jorg - WRT integer arithmetic, that's simply wrong. It has always been widely and well understood what was being done "wrong" and why. As for bugs in the imperative world, yes - but it isn't a fundamental requirement to be buggy as it is with functional programming. Functional programming *bans* a fundamental operation - in-place update. No algorithm that requires that can be implemented in a functional language. Copying is a *fundamental* part of the time and space complexity of an algorithm - if you ignore it and pretend it doesn't exist, a clever compiler may protect you - but only to a point.
Steve314
@Jorg - interestingly, integer arithmetic is one of those things that is handled wrong in functional programming too. The assumption that arithmetic operations are constant time is false when the integers don't have a fixed bit-width representation. Not that arbitrary-width integers are unique to functional languages - the same issue equally occurs in Python, for instance.
Steve314
One last point - recently GHC acquired an alternative LLVM-based back-end. That it gives different performance is irrelevant, but the diagnosis of why some cases are much better is interesting - the old GHC back-end fails to spot cases that should be optimised from copying to in-place updates. Unsurprising, really - doing that optimisation reliably is NP-hard or NP-complete (I forget which). The same Haskell code can have different time and space *complexity* depending on whether you use old-GHC, LLVM-GHC, HUGS or whatever. The algorithm itself changes depending on what compiler you use.
Steve314
I have to agree with Steve; functional programming is great for simple multithreading, but implementing algorithms simply (or sometimes at all) is *not* one of its strong points.
BlueRaja - Danny Pflughoeft
@Steve314, @BlueRaja - answer updated.
Oded
@Oded - WRT the update, I agree entirely. In fact had you recommended an impure functional language (as opposed to the functional paradigm) I'd have no complaints. The ML family languages are IMO *very* good for expressing algorithms - the functional tools are very good, but you can still use the imperative toolkit too.
Steve314
A: 

My guess is that the difference is not big enough to worry about, and the time limit should allow using a slower language, since the algorithm used is what's important. The purpose of the time limit should IMO be to get you to avoid using, for example, an O(n^3) algorithm when an O(n log n) one exists.
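
For instance, a rough C++ sketch (the problem and the names are purely illustrative) of the kind of difference the time limit is really testing - two correct ways to check for a duplicate, one O(n^2) and one O(n log n):

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // Illustrative only: the algorithmic choice, not the paradigm,
    // decides whether this passes a time limit.

    // O(n^2): compare every pair.
    bool hasDuplicateQuadratic(const std::vector<int>& v) {
        for (std::size_t i = 0; i < v.size(); ++i)
            for (std::size_t j = i + 1; j < v.size(); ++j)
                if (v[i] == v[j]) return true;
        return false;
    }

    // O(n log n): sort a copy, then look for equal neighbours.
    bool hasDuplicateSorted(std::vector<int> v) {
        std::sort(v.begin(), v.end());
        return std::adjacent_find(v.begin(), v.end()) != v.end();
    }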

Ledhund
A: 

Object oriented programming abstracts many low level details from the programmer. It is designed with the goal

  • to make it easier to write and read (and understand) programs
  • to make programs look closer to the real world (and hence, easier to understand).

Procedural programming does not offer abstractions such as objects, methods, virtual functions, etc.

So, talking about speed: a seasoned expert who knows the internals of how an object-oriented system works can write a program that runs just as fast.

That being said, the speed advantage achieved by using PP instead of OOP will be very marginal. It boils down to which way you can write programs comfortably.


EDIT:

An interesting anecdote comes to mind: in the Microsoft Foundation Classes, message passing from one object to another was implemented using macros that looked like BEGIN_MESSAGE_MAP() and END_MESSAGE_MAP(), and the reason was that it was faster than using virtual functions.

This is one case where the library developers have used OOP, but have knowingly sidestepped a performance bottleneck.
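
To sketch the idea (a toy illustration, not actual MFC code - the class names and message id are made up): a message map is essentially a static table of handler functions searched at dispatch time, as opposed to resolving a handler through a virtual call.

    #include <cstdio>

    // Toy sketch of the two dispatch styles; not real MFC, names invented.

    // Style 1: run-time dispatch through a virtual function.
    struct Window {
        virtual void OnPaint() { std::puts("Window::OnPaint"); }
        virtual ~Window() = default;
    };
    struct MyWindow : Window {
        void OnPaint() override { std::puts("MyWindow::OnPaint"); }
    };

    // Style 2: a message-map-like static table of member-function pointers.
    class MappedWindow {
    public:
        void OnPaint() { std::puts("MappedWindow::OnPaint"); }

        void Dispatch(int message) {
            for (const MapEntry* e = messageMap; e->handler != nullptr; ++e)
                if (e->message == message) { (this->*(e->handler))(); return; }
        }

    private:
        struct MapEntry {
            int message;
            void (MappedWindow::*handler)();
        };
        static const MapEntry messageMap[];
    };

    const MappedWindow::MapEntry MappedWindow::messageMap[] = {
        {1 /* made-up WM_PAINT-like id */, &MappedWindow::OnPaint},
        {0, nullptr},
    };

    int main() {
        MyWindow w;
        Window* p = &w;
        p->OnPaint();   // resolved through the vtable at run time

        MappedWindow m;
        m.Dispatch(1);  // resolved by scanning the static table
    }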

Here Be Wolves
+1  A: 

The weak link is likely to be your knowledge - use whichever language and paradigm you are most comfortable with.

jk
+7  A: 

Object-Oriented programming isn't particularly relevant to algorithms. Procedural programming you will need, but as far as algorithms are concerned, object-oriented programming is just another way to package up procedural programming. You have methods instead of functions and classes instead of records/structs, but the only relevant difference is run-time dispatch, and that's just a declarative way to handle a run-time decision that could have been handled some other way.

Object-Oriented programming is more relevant to the larger scale - design patterns, etc. - whereas algorithms are more relevant to the smaller scale, involving a small number (often just one) of procedures.
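
A tiny sketch of what I mean by "another way to package up procedural programming" (the names are invented for illustration) - the same binary-search loop once as a free function and once as a method, with identical O(log n) cost:

    #include <utility>
    #include <vector>

    // Procedural packaging: a free function over a sorted vector.
    int countLessThan(const std::vector<int>& sorted, int x) {
        int lo = 0, hi = static_cast<int>(sorted.size());
        while (lo < hi) {
            int mid = lo + (hi - lo) / 2;
            if (sorted[mid] < x) lo = mid + 1; else hi = mid;
        }
        return lo;
    }

    // Object-oriented packaging: the same loop behind a method.
    class SortedInts {
    public:
        explicit SortedInts(std::vector<int> data) : data_(std::move(data)) {}
        int countLessThan(int x) const {
            int lo = 0, hi = static_cast<int>(data_.size());
            while (lo < hi) {
                int mid = lo + (hi - lo) / 2;
                if (data_[mid] < x) lo = mid + 1; else hi = mid;
            }
            return lo;
        }
    private:
        std::vector<int> data_;
    };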

Steve314
Absolutely. OO is more about the structure of app design than anything else.
kyoryu
A: 

To make writing code easy and less error-prone, you need a language that supports generics - such as C++ with the STL or Java with the Java Collections Framework. If you are implementing an algorithm against a deadline, you may be able to save time by not providing your algorithm with a nice O-O or generic interface, thus making the code you write yourself entirely procedural.

For run-time efficiency, you would probably be best writing everything in procedural C - see e.g. the examples in "The Practice Of Programming" - but it will take a lot longer to write, and you are more likely to make mistakes. This also assumes that all the building blocks you need are available in their most up-to-date and efficient form in procedural C as well, which is quite an assumption these days. Most likely, making use of the STL or the Java Collections Framework will in practice save you CPU time as well as development time.
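
For example (a minimal sketch, assuming the task is simply counting distinct words on standard input): the generic container supplies the balanced tree, so the code stays short and still runs in O(n log n), where plain C would need a hand-rolled tree or hash table.

    #include <iostream>
    #include <set>
    #include <string>

    // Minimal illustration: count distinct whitespace-separated words.
    // std::set does the O(log n) insert/lookup work that a procedural C
    // version would have to implement by hand.
    int main() {
        std::set<std::string> distinct;
        std::string word;
        while (std::cin >> word)
            distinct.insert(word);
        std::cout << distinct.size() << " distinct words\n";
        return 0;
    }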

As for functional languages, I remember hearing functional programming enthusiasts point out how much easier their languages were to use than the competition, and then observing that those members of the class who chose a functional language were still struggling when those who wrote in Fortran 77 had finished and gone on to draw graphs of the performance of their programs. I see that the claims of the functional programming community have not changed. I do not know if the underlying reality has.

mcdowella
I completely disagree about the "taking longer to write and more likely to make mistakes" statement when referring to C. It is very dependent on what we are talking about. OOP gives some great benefits when writing large user apps with lots of user interaction where the data/input is dynamic and unknown (use of generics). But for many other systems out there, writing in vanilla C is cleaner, easier, faster, and less error prone. Especially in embedded systems.
Awaken
@Awaken - in context I agree. Pedantically, though, much of C is valid C++ of course, so you can write C++ as if it were C, but still have that extra tool available when it's needed. C++ isn't Java - you don't have to wrap everything in a class, you don't get run-time dispatch by default, etc etc.
Steve314
@mcdowella - There are some good solid impure functional languages. Haskell is "pure" and quite good, but I'd argue that Haskell is impure due to monads - and while the idea is very interesting, I've come to dislike monadic pseudo-imperative code that is translated to pure functional only to be translated back to imperative, with two layers of leaky abstraction between the imperative source and the imperative machine. IMO the functional toolkit is extremely good, but you need the imperative toolkit too. Also, functional languages take some getting used to. Familiarity is key.
Steve314
A: 

Steve314 said it well. OOP is more about design patterns and the organization of large applications. It also lets you deal with unknowns better, which is great for user apps. However, for analyzing algorithms, most likely you are going to be thinking functionally about what you want to do. In that case, I'd stick to simpler PP and not try to create a fully OO design when the algorithm is what you care about. I'd want to work with C or Matlab (depending on how math-intensive the algorithm is). Just my opinion on it.

Awaken
A: 

I once adapted the Knuth-Morris-Pratt string search algorithm so that I could have an object that would take a character at a time and return a match/no-match status. It wasn't a straightforward translation.
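
Something along these lines (a from-scratch C++ sketch, not my original code - the class and method names are invented): the failure table is built once from the pattern, and the object keeps the current match state between calls.

    #include <cstddef>
    #include <string>
    #include <vector>

    // Illustrative streaming KMP matcher: feed() takes one character and
    // returns true whenever the pattern ends at that character.
    class StreamingKmpMatcher {
    public:
        explicit StreamingKmpMatcher(const std::string& pattern)
            : pattern_(pattern), failure_(pattern.size(), 0), state_(0) {
            // Standard KMP failure (prefix) function.
            for (std::size_t i = 1; i < pattern_.size(); ++i) {
                std::size_t k = failure_[i - 1];
                while (k > 0 && pattern_[i] != pattern_[k])
                    k = failure_[k - 1];
                if (pattern_[i] == pattern_[k])
                    ++k;
                failure_[i] = k;
            }
        }

        // Consume one character of the text stream.
        bool feed(char c) {
            if (pattern_.empty()) return false;  // nothing to match
            while (state_ > 0 && c != pattern_[state_])
                state_ = failure_[state_ - 1];
            if (c == pattern_[state_])
                ++state_;
            if (state_ == pattern_.size()) {
                state_ = failure_[state_ - 1];   // allow overlapping matches
                return true;
            }
            return false;
        }

    private:
        std::string pattern_;
        std::vector<std::size_t> failure_;
        std::size_t state_;
    };

    // Usage: StreamingKmpMatcher m("abab"); calling m.feed(c) for each
    // incoming character reports whether a match has just completed.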

Mark Ransom