views:

1185

answers:

13

As time goes by, functional programming seems to be having more and more of an influence on other programming languages. We're starting on Prolog in my AI class, and it seems like there are some things there that would make programming in non-AI fields easier. My question is this: why hasn't logic programming caught on in the same way?

In this topic, it seems that a general consensus was reached that logic programming is useful but must be proven as such. Is there a reason why it's not seen as useful?

Update: Perhaps I should be a bit more clear. I'm not really asking about Prolog. I can see why it wouldn't be a good idea to choose Prolog for most real-world applications.

To give a better example of what I'm talking about, consider list comprehensions/map/filter in Python. These are clearly influenced by functional languages. Why haven't languages such as Python also picked up these kinds of ideas from logic programming languages, the way they have from functional languages?
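
To make that concrete with one more example (a small sketch using the standard append/3 predicate), logic languages let a single relation run in several directions, and I haven't seen mainstream languages absorb anything like that:

    % append/3 is one relation, usable in several modes:
    ?- append([1,2], [3], Xs).        % concatenate: Xs = [1,2,3]
    ?- append(Front, Back, [1,2,3]).  % split: enumerates every Front/Back
                                      % pair on backtracking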

+3  A: 

In my own experience, Prolog is very nice for specific tasks, but as a general-purpose programming language it is simply not as flexible as a lot of other languages.

Brian Rasmussen
+3  A: 

Functional programming is becoming more popular because it has some important benefits when applied to multi-threaded programming, and as we move toward multi-core processors, multi-threaded programming will become more and more important.

Joel Coehoorn
Logic programs can parallelize quite well too.
TrayMan
+2  A: 

Prolog is very nice for specific tasks, but in my 11 years of work experience I have never had a problem where Prolog would be the best solution. The kinds of problems where Prolog would be ideal are simply very rare.

Michael
+6  A: 

They're still teaching Prolog? The percentage of my university courses that Prolog took up was vastly disproportionate to its usefulness in the real world. I've never seen it used anywhere. I'm sure it's popular in some academic circles and in some niche of the market, but for the vast majority of students it's a complete waste of time.

Prolog is not used because its strong points are very (very!) rarely needed, and when they are, the overhead of learning it is just not worth it. Add to that the fact that you can't find a lot of people with the necessary knowledge to support programs written in Prolog, and it's no wonder companies avoid it. Besides, you can achieve the same things in other, far more prevalent languages that come with vastly superior IDEs and other tools.

There are of course cases where Prolog is the way to go, but these form a mind-numbingly small percentage of the cases you'll deal with in the industry.

Manos Dilaverakis
Not every worthwhile piece of software is written by a company.
mquander
We covered it for a few weeks in Programming Languages. Not a whole lot more than "Here's Prolog; isn't it neat? You won't see this again unless you take AI."
Adam Jaskiewicz
It's taught because it's an important learning tool. There are some things that are very easy to express in Prolog-type systems. Likewise, Scheme/Lisp are good teaching tools for recursion and garbage collection.
Bob Cross
@Bob I agree, it's taught basically to blow your mind and show you that there's more to programming than C#/Java style languages.
Outlaw Programmer
@Bob I agree. What annoyed me about Prolog is that, at my university at least, they wasted months on it when there were far more important things. To give you an idea, when I graduated about 7 yrs ago I had no idea what source control was. And I suspect they still do this kind of thing today.
Manos Dilaverakis
@Manos Universities exist to teach you how to think about things. There should have been a software engineering class that taught source code control; did you take one? Prolog and other odd languages will open up your mind more than source code control will.
David Thornley
@David They were far too busy teaching us the waterfall model in the software engineering class. Getting a taste of something different like Prolog was interesting, but my employer/clients benefited far more from me knowing things like source control, than understanding Prolog.
Manos Dilaverakis
Prolog was used at Ericsson for telco switch programming. Erlang started out as a Prolog library, then an extension to Prolog before it became its own language. *When* it became its own language, the first implementation was written in Prolog.
Jörg W Mittag
The current JIT compiler prototype for the PyPy Dynamic Language VM Framework is written in Prolog, because it makes experimentation *way* easier than writing it in RPython or even C.
Jörg W Mittag
That's just two examples of usages of Prolog outside of AI. Also, as others mentioned, the idea of logic programming isn't necessarily intertwined with Prolog.
Jörg W Mittag
@Manos: Waterfall? Ewwww. If Software Engineering didn't teach source code control, either it was a bad course or you're getting rather old. Then again, some of the Mech E students I knew didn't seem to get the idea of a mechanism very well.
David Thornley
@Manos Dilaverakis: "the cases you'll deal with in the industry" -- well, that depends on what you expect to do in your job. Complex problems may need quick prototypes, and I *do* know some developers who will implement them in either Perl or Prolog first, then later in Java if the idea turns out to work as expected. Prolog is a *very* nice prototyping language, so long as you don't need fast number-crunching (and even then, there are Prolog implementations which implement efficient mutable arrays).
Jay
+1  A: 

The fact that you're starting on Prolog in your AI class should be a hint. AI hasn't caught on too well, either.

I remember back in '80 challenging a professor to demonstrate significant uses for AI (ok, "scoffing" would be a more accurate term, but I was younger then). He couldn't do it then, and today, I suspect he's teaching about 1/10th the applications for AI that he was raving about back then.

Maybe the same applies to Prolog. I don't remember the last time I've seen a company looking for Prolog experience. Maybe never, or maybe I saw it and ignored it.

John Saunders
AI is the study of stuff we can't really do yet. If we can do something, and understand it, it isn't AI any more. There never have been and never will be widespread AI applications, by definition. That doesn't make it useless.
David Thornley
I didn't say useless; I said "no significant uses". I'm sure its use might be significant in an academic sense; just not in a commercial sense.
John Saunders
Pattern recognition (by various means, including neural networks) and fuzzy logic have plenty of significant use, if you accept those as AI.
TrayMan
@John: Yup. My point is that the reason there are no commercial applications of AI is that, by the time it's commercial, it ain't AI any more.
David Thornley
So you're saying weak AI isn't AI and strong AI which doesn't exist only exists in academia but had it existed then when it was commercialised, then it wouldn't be strong any more?
Henrik
One very good example of a significant use for AI was the DARPA Grand Challenge. The technologies developed for this challenge all applied a lot of topics that come from AI studies, and will soon (next 5 years) be present in the cars you drive.
Laurent
To reiterate what others have said: there are many technologies which we use every day that came out of AI research. Search engines, grammar checkers, optical character recognition (OCR), navigation (eg, Google Maps), computer opponents in video games, credit card fraud detection....
Barry Brown
Those cases make my point - they are the low-hanging fruit and windfalls off of a very high tree. We were promised an entire orchard.
John Saunders
And we were supposed to be jetting around in flying cars by 1990, too.
Barry Brown
+2  A: 

I found Prolog to be absolutely fascinating when I took it in my AI classes in college, but I can only think of a handful of situations where I'd use it outside of AI today. And even in those situations, I'd rather not use it. It took me a long time to finally "get" Prolog, and when I did, I thought it was great, but I immediately saw its limited range of usefulness.

Still, I highly recommend learning it, if for no other reason than to learn to approach programming in a completely different way. Being able to look at a problem from many different vantage points absolutely makes you a better programmer.

unforgiven3
+2  A: 

The first problem I had with Prolog is that it isn't a logic programming language. It lacks a three-valued logic of "true", "false", and "don't know", conflating the latter two. In other words, its two truth values are actually "can be shown" and "cannot be shown". This gives Prolog real problems with the idea of "not", which is pretty basic to logical reasoning.
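
A tiny illustration (with a made-up parent/2 fact, nothing standard):

    parent(tom, bob).            % the only fact in the program

    ?- parent(tom, alice).       % fails: "cannot be shown", which is not
                                 % the same thing as "proven false"
    ?- \+ parent(tom, alice).    % succeeds: negation as failure treats
                                 % "not provable" as "false"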

In normal logic, it's perfectly reasonable to prove a proposition by disproving its negation, this being called "reductio ad absurdam" (unless I've misspelled it). (Yes, there are people who have tried to reconstruct mathematics without using it, but that's getting a bit esoteric.) This simply doesn't work in Prolog, since there's no distinction between proved false and not proved anything.

Therefore, when I did a class project in Prolog, I got into trouble whenever I thought of it as programming logic. I'd always wind up doing something that required actual negation. Perhaps other people don't do that, but I wound up thinking of it as a pattern-matching language, and then had little difficulty finishing the project.

It's not possible to have a true logic-based language where the programmer can write things and really rely on the results. First-order predicate calculus (i.e., logic with variables, true-or-false functions, "and", "or", "not", "for all", and "there exists") is positively undecidable. (There are reasons why we keep pouring coffee into mathematicians rather than generate all possible theorems mechanically, after all.) There is no way for a programmer to know a priori whether a given proposition will be proved or not, even if the programmer already knows it to be true or false.

Edit: I also forgot the critical necessity of ordering clauses properly. In logic, it doesn't matter in what order you write things down. In Prolog, I kept getting into infinite loops, until I stopped treating it as a logic-based language. Again, it has some nice features as a pattern-matching language, but it isn't logic, and it seemed to me like a one-trick pony of a language. YMMV, but some other people seem to agree with me.
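
To show what I mean (a standard textbook sketch, with made-up edge/2 facts), here are two logically equivalent definitions of graph reachability that behave completely differently under Prolog's depth-first search:

    edge(a, b).
    edge(b, c).

    % This ordering terminates (on acyclic graphs):
    path(X, Y) :- edge(X, Y).
    path(X, Y) :- edge(X, Z), path(Z, Y).

    % Put the recursive goal first and the same "logic" loops forever:
    % path(X, Y) :- path(X, Z), edge(Z, Y).
    % path(X, Y) :- edge(X, Y).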

David Thornley
It's 'reductio ad absurdum'. And Gödel's Incompleteness Theorem, which tells us that the existence of a (dis)proof for a formula is undecidable, is closely related to the termination problem, which hints at why automated theorem proving can in fact work quite well in practice.
TrayMan
Prolog has the cut rule so it doesn't need reductio ad absurdum. Are you using it?
rjh
@rjh: Cut prevents backtracking. It's not in any sense a logical operation. It allows dealing with negation only in some limited cases, and it easily leads to mistakes by changing the program's semantics in a non-obvious way.
TrayMan
The order of clauses matters in Prolog precisely because it is a programming language. It's not and it doesn't try to be an automated theorem prover. Prolog differs from ML style pattern matching by instantiating variables in input terms. This wouldn't make any sense in a functional language.
TrayMan
Right - it's a misnamed programming language. It isn't even a fully declarative language. A true logical programming language would be much more useful, in my opinion.
David Thornley
I suppose I should clarify my above comment. It's probably not possible to have a true logical programming language, but it would be very useful. Something like cold fusion, possibly anti-gravity.
David Thornley
+5  A: 

My impression of plain Prolog is that it's a toy language. That's not to say that logic programming can't be useful. For example, in Twelf it's possible to declare semantics for a simple programming language quite easily and have the declarations act as an interpreter. I've also heard some good things about λProlog.
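
To give the flavor in plain Prolog (a minimal sketch; the eval/2 predicate and term names are invented for illustration), clauses written as semantic rules execute directly as an interpreter:

    % eval(Expr, Value): each clause is a big-step evaluation rule.
    eval(num(N), N).
    eval(plus(A, B), V) :- eval(A, VA), eval(B, VB), V is VA + VB.

    ?- eval(plus(num(1), num(2)), V).   % V = 3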

The problem, I think, with trying to use a logic programming language as a general-purpose language is that some tasks just don't fit the concept very well. I think logic programming features need to be incorporated into a language that also has imperative and functional constructs. There is at least one such language, Oz, but I've yet to try it out.

Edit: There's one idea I've wanted to try for some time: feed a relational database into Prolog as facts and use it for queries instead of SQL. I've a feeling it would be a great improvement over SQL.
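
Roughly what I have in mind (hypothetical employee/3 facts standing in for table rows):

    employee(alice, engineering, 85000).
    employee(bob,   sales,       60000).

    % SQL:    SELECT name FROM employee
    %         WHERE dept = 'engineering' AND salary > 80000;
    ?- employee(Name, engineering, Salary), Salary > 80000.
    % Name = alice, Salary = 85000.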

TrayMan
+2  A: 

This is the standard case of using the right tool for the job. You do see logic programming in certain situations: they're usually called something like rule-based or expert systems.

In your computability theory course, you'll discuss the fact that all of these general purpose languages are effectively equivalent (from a mathematical standpoint). However, prototyping and long-term development are very different domains. So, it's entirely possible that, in a rule-based system, you might work out your ruleset using something like Prolog (if that works well for you) and then later implement the final system on your delivery platform (e.g., Java).

BTW, I always loved the way Prolog's answer to almost everything is "no."

Bob Cross
+4  A: 

When we learned Prolog (just a couple of weeks in a programming languages class, so I'm hardly an expert), the professor also pointed out that, depending on your definitions, logic programming may not actually be programming at all.

When we say programming, we usually refer to something like 'issuing a series of tasks or instructions to the computer'. That's what imperative or functional programming languages do. That's what every "real" programming language does. Prolog, or logic programming, doesn't really do that. It's more like SQL. You get to ask the computer a number of questions, pretty much. It'll answer, to the best of its ability, based on the data you fed it previously, but unlike other programming paradigms, you're not really telling the computer what to do.
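
For example (with made-up likes/2 facts), the same little knowledge base answers whatever question you pose, and you never say how the search should happen:

    likes(alice, prolog).
    likes(bob, prolog).

    ?- likes(alice, prolog).   % a yes/no question
    ?- likes(Who, prolog).     % "who likes prolog?"
                               % Who = alice ; Who = bob.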

It's just very specialized, and not suited for general purpose programming. And the things it is specialized for are not often needed.

Functional programming, on the other hand, is definitely general-purpose programming, and can be used for anything with no major problems. Which is why functional programming is catching on and logic programming isn't. I think... :)

jalf
The type of programming you're calling "real" is generally called "imperative programming". SQL and Prolog are both examples of "declarative programming". http://en.wikipedia.org/wiki/Declarative_programming
rmeador
Nope, I'm not talking about imperative programming specifically (I include functional programming as well). I'm simply pointing out that, depending on how you define it, declarative languages may not be programming at all (there are plenty of debates about whether HTML or SQL are programming languages).
jalf
+15  A: 

When you learn about logic-programming in Computer Science classes using Prolog, the main point is not to make you a proficient Prolog programmer, but rather to open up your mind to alternate forms of programming techniques (data-structures/algorithms) that you'd not have considered before.

To illustrate my point: when I started studying Computer Science, my engineering school required all students to write their software programs in Pascal, but since I graduated I've never used Pascal even once. The skills I learned by picking the right data structures and algorithms, though, I am still using every single day.

Pascal does not show up on my resume as a language I know, but it was instrumental in my training as a software engineer. Its usefulness cannot be measured only by the number of lines of Pascal code currently in production.

When you develop software, you'll realize that, even though you are not writing a single line of Prolog code, you are at times reusing techniques that you may have first learned in these "useless" Prolog or AI classes you attended.

Evaluating the usefulness of a technology (specific programming language, specific software tool/application) is not simply a matter of evaluating its actual level of use, but rather its influence.

If you look at the influence logic-programming has had in the field of expert systems, computer games AI, air traffic control, and probably quite a number of other fields (suggestions anybody?) I don't think it can be said logic-programming has not caught on...

Laurent
+1  A: 

I spent about 4 years of my programming career working on a rule based "Expert System" for provisioning and configuring hardware for telephone exchanges based on customer requirements.

It was very successful, and as far as I know is still in daily use more than 10 years later. But finding programmers who could understand how it worked was a greater task than developing the system itself.

I think this is why the approach has not taken off, because few people have the necessary mindset for logic based programming compared to the number of people that can understand the concepts of procedural and functional programming.

Logic programming languages provide a mechanism for feeding facts and rules of inference into an "inference engine", which is then set in motion to apply the rules to the given facts in order to produce new facts. A particular logic language stands or falls on the strength of its inference engine.
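
In Prolog syntax, that mechanism looks roughly like this (the rack/card predicates here are invented for illustration, not taken from our actual system):

    % Facts about the configuration:
    slots(rack1, 4).
    cards_required(rack1, 6).

    % A rule of inference: derive a new fact from existing ones.
    needs_expansion(Rack) :-
        cards_required(Rack, C),
        slots(Rack, S),
        C > S.

    ?- needs_expansion(R).   % R = rack1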

Prolog's inference engine has a very naive implementation and is very inefficient. The same problem could be solved more efficiently in most procedural languages just by writing a lot of if statements in a loop.

The language we chose for our "Configurator" was DEC's RuleWorks, which is a refinement of the more widely known OPS5 language. It has an inference engine based on the Rete algorithm, which makes it far more efficient than the procedural approach.

Since DEC got swallowed by Compaq, which got swallowed by HP, RuleWorks has become open source and can be obtained from this web page.

It's a shame there is not more interest in such techniques because they can be very effective for solving a variety of otherwise intractable problems.

Noel Walters
+4  A: 

I recently used a pile of logic programming in a game design AI research project (teaser video!), but at least half of my logic-heavy project was functional or imperative Scala code implementing a basic game engine. The point of programming, if I may claim such a thing, is to bring the machine's understanding of what you want into sync with your own -- and most of that syncing requires giving the imperative details, at some level, of how you want something done. Those silly machines are always so literal...

Logic programming, whether it be the traditional, deductive, Prolog-style stuff or the more exotic inductive logic programming or answer set programming flavors, gives massive leverage for some flavors of problems, at the cost of being able to easily communicate imperative knowledge (which is always needed somewhere in real-world apps). Sometimes the concerns of an interactive application make even the slightest hit to your productivity in expressing imperative knowledge unacceptable. Writing an entire game engine in a logic programming style will always be a bad idea (likewise for trying it using Perl-compatible regular expressions, which have the same computational power). In a hybrid language (or in one that lets you embed a logic interpreter easily) you can have the best of both worlds (I used jTrolog to embed Prolog in my Scala engine, forming a multi-paradigm voltron of sorts).

I think logic programming could certainly stand to be more popular and better understood, but, in some sense, pure-logic programming can't do much better than SQL or regexes in terms of "catching on", because its magic comes from taking away your imperative expressiveness (keeping you from getting lost in unimportant details, ideally). This explanation applies almost equally to functional programming. I love logic programming, but only because I have a choice about when to use it. The best way forward seems to be hybrid languages that present that choice in a consistent, well-designed manner.

rndmcnlly