Looking at what it offers, I see several huge advantages:

  • A better approach to bug-free programming. Example: if you have to enable/disable a menu item in an imperative program, you must not only remember the condition under which the item is enabled, but also remember to re-run that piece of code at every moment the condition may have changed. In LP the latter (I suppose) is not necessary (see the sketch after this list).
  • Potentially a great way to write programs that start faster. Since everything is expressed as dependencies, code runs only when it is actually needed. Many pieces of code take a long time during startup not because they are needed right away, but because they will be needed at some point in the future.
  • It also seems like a great way to apply concurrency automatically: if we can track all the dependencies between items, we can in theory see that some branches of the dependency graph can be evaluated in parallel.
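
To make the first point concrete, here is a minimal Prolog sketch (the predicate names are invented for illustration). The rule states once, declaratively, when the item is enabled; the UI simply queries it at render time, so there is no update code to scatter through the program and forget:

    % Hypothetical state facts (in a real program these would be
    % asserted/retracted as the application state changes).
    clipboard_nonempty.        % assume the clipboard currently holds data
    undo_history_nonempty.     % assume there is something to undo

    % The enabling condition lives in exactly one place:
    enabled(paste) :- clipboard_nonempty.
    enabled(undo)  :- undo_history_nonempty.

    % The UI layer just asks:  ?- enabled(paste).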

This is just my speculation, since I have never actually written a program in a logic programming language, but it seems like a very impressive concept. So are there disadvantages, or are my positive points not really true in real life?

Thanks

Max

+3  A: 

Speaking in terms of Prolog, which I've had some experience with, I find that logic programming is suited to certain tasks but very hard to fathom when it comes to debugging an application beyond a certain size. Its paradigm doesn't work for certain problems, or for certain scales of problem.

Prolog is not a general-purpose programming language. It's intended for AI. It has a set purpose - and it doesn't really give attention to other things.

A modern programming language doesn't have a purpose. It just does. It is generic and equally applicable to the majority of programs a typical business encounters. This is a huge advantage. C# knowledge is transferable into many domains; Prolog knowledge just isn't. Writing certain types of application (say, real-time graphics) in a logic programming language would be horrendously painful. The very concept gives me a terrible headache (seriously).

I don't think logic programming has ever been a competitor. It is rightly used in specialist contexts, not generic ones. It isn't fighting for popularity.

I don't know whether F# brings anything new to the party. It seems to be quite popular, although I don't know whether you'd call it strictly logic programming; it seems like a kind of hybrid.

Rushyo
F# isn't logic programming at all. It's impure functional programming - an ML variant heavily based on Objective CAML. There is pattern matching, as in many functional languages, but no unification.
Steve314
Actually (WRT my previous comment) I should say I haven't looked at F# since around 2008, so I could be out of date - it wouldn't surprise me that much if Microsoft added some big extensions.
Steve314
Microsoft tout F# as a solution for logical programming problems. Whether that stands up to scrutiny, you probably have a better idea than I :)
Rushyo
Rushyo: Do they? I'd like to see the context.
Gabe
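
As an aside to Steve314's comments above, the pattern-matching/unification distinction is easy to show in a couple of lines of Prolog: ML-style pattern matching binds variables on one side only, whereas unification solves for unknowns on either side.

    % Unification is bidirectional: unknowns on both sides get solved.
    ?- X = f(Y, b), X = f(a, Z).
    %  X = f(a, b), Y = a, Z = b.

    % Even an "input" argument can be left unknown and solved for:
    ?- append(Xs, [3], [1, 2, 3]).
    %  Xs = [1, 2].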
+2  A: 

AFAIK, Prolog and related logic programming languages never died. They are used quite heavily for some kinds of problems, and they are used more often than you might think in a Domain-Specific-Language kind of way (i.e. to solve a particular problem in an application mostly written in some other language).

As you note in the question, logic programming languages aren't well suited to a lot of problems involving state. But equally, imperative languages aren't well suited to a lot of problems that don't involve state.

To me, the question is a bit like asking why yacc didn't win. It (and its relatives) did win (or at least got a good placing) - but only in the particular sport of parsing. There are other sports with other winners.

EDIT Perhaps a better comparison is SQL. You wouldn't expect it to replace C, but there are plenty of C programs that use SQL to handle their database querying. A Prolog program is basically a database with a more sophisticated query system: Turing-complete, but never meant to act as a general-purpose language.
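
To make that concrete, a tiny sketch (the facts are invented for illustration): facts play the role of table rows, and rules play the role of views or recursive queries.

    % Facts are the "rows":
    parent(tom, bob).
    parent(bob, ann).

    % Rules are the "query system" - here a transitive closure,
    % something plain SQL has historically struggled with:
    ancestor(X, Y) :- parent(X, Y).
    ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).

    % ?- ancestor(tom, ann).   % succeeds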

Steve314
Thanks, Steve314, parsing is a good example. Logic programming fits very well when a grammar is being described.
Maksee
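
A minimal sketch of that grammar fit, using Prolog's DCG notation (the toy grammar is invented): each rule reads almost exactly like the BNF production it implements.

    % Grammar rules (DCG) - compare with the equivalent yacc/BNF:
    sentence    --> noun_phrase, verb_phrase.
    noun_phrase --> [the], noun.
    verb_phrase --> [sleeps].
    noun        --> [cat].
    noun        --> [dog].

    % ?- phrase(sentence, [the, cat, sleeps]).   % succeeds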
+1  A: 

Well, maybe because it doesn't describe what the "hardware" will actually do?

Almost all well-known languages inherit from C, and those languages are all imperative because C was imperative. That was required because C was first conceived as a kind of high-level assembler (I'm obviously being too fuzzy here, but you get the idea): essentially a dialect made of lists of instructions for the processing unit, the hardware.

It might simply be easier to think linearly, the way a processor performs computation (not strictly true today - there is a lot of internal parallelisation for optimisation - but the general idea of a CPU is that it applies your instructions in the order you give them). Given how much trouble multi-threaded programming seems to be for developers used to single-threaded execution, I guess that's true.

However, I'm just guessing here; I'm not a specialist at all.

Klaim
Good point, compiling imperative languages into assembly instructions is more natural than doing the same with a pile of declarations.
Maksee
+3  A: 

Logic inference is hard to understand and hard to implement efficiently, in terms of both run time and memory. Many otherwise simple things (such as side effects) are hard to express in a logic language, either because of its pseudo-nondeterministic execution model (like the built-in backtracking in Prolog) or because of its reliance on unification instead of the simpler functional model of stuff-comes-in, stuff-comes-out.
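
A small illustration of how backtracking and side effects interact (a classic Prolog gotcha): the side effect runs once per solution, and nothing undoes it when execution backtracks past it.

    % write/1 runs on every solution of member/2; the final fail/0
    % forces backtracking, so this prints a, b and c, and then the
    % query as a whole fails - the output is never "rolled back".
    ?- member(X, [a, b, c]), write(X), nl, fail.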

Logic programming is a great match for specific applications, but horrible for the 90% of everyday programming that is basically moving data around and updating some state.

artificialidiot