views: 213
answers: 9

Why do most languages seem to offer only fairly basic control structures from a logic point of view? Stuff like if ... then, else, loops, for each, switch statements, etc. The standard list seems quite limited.

Why is there not much more in the way of syntactic sugar for logic? Perhaps something like a proposition engine, where you could feed in an array of premises, or functions that return complicated, self-referential, interdependent functions and results. Something that would let you chain together a complex array of conditions, but represent them in a way that is easy and clear to read in the code.

Premise 1

Premise 2 if and only if Premise 1

Premise 3

Premise 4 if Premise 2 and Premise 3

Premise 5 if and only if Premise 4

etc...

Conclusion

I realize that this kind of logic can be constructed with functions and/or nested conditional statements. But why are there not generally more syntax options for structuring these kinds of logical propositions without ending up with hairy-looking conditional statements that are hard to read and debug?
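For instance, translated directly into boolean assignments (reading each "if" as a forward derivation), the chain above ends up as something like this. This is just a rough Python sketch; the premise_* functions are placeholders for whatever the real checks would be:

    def premise_1():
        return True   # placeholder for some real condition

    def premise_3():
        return True   # placeholder for some real condition

    p1 = premise_1()
    p2 = p1               # premise 2 if and only if premise 1
    p3 = premise_3()
    p4 = p2 and p3        # premise 4 if premise 2 and premise 3
    p5 = p4               # premise 5 if and only if premise 4

    conclusion = p5

It works, but nothing in the syntax itself says "these are premises and this is a conclusion".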

Is there an explanation for the kinds of control structures we typically see in mainstream programming languages? Are there specific control structures you would like to see directly supported by a language's syntax? Does this just add unnecessary complexity to the language?

+11  A: 

Have you looked at Prolog? A Prolog program is basically a set of rules that is turned into one big evaluation engine.

In my personal experience Prolog is a bit too weird and I actually prefer ifs, whiles, and so on, but YMMV.

Brian Rasmussen
I have heard of it but not done anything with it. Will have to explore it.
Gordon Potter
+2  A: 

It's been a long time since my logic class in college, but I would guess it's a mixture of the difficulty of building such constructs into the language versus the frequency with which they'd be used. I can't say I've ever had the need for them (not that I can recall). For those times that you would require something of that ilk, the language designers probably figure you can work out the logic yourself using just the basic structures.

Just my wild guess though.

Scott Vercuski
+5  A: 

Boolean algebra is not difficult, and provides a solution for any conditionals you can think of, plus an infinite number of other variants.

You might as well ask for special syntax for "commonly-used" arithmetic expressions. Who is to say what qualifies as commonly-used? And where do you stop adding special-case syntax?

Adding complexity to a language's parser is not preferable to using constructive expression syntax combined with extensibility through defining functions.
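To illustrate (a minimal sketch, not any particular library's API): the connectives the question asks for are one-liners once you can define your own functions, and ordinary boolean operators do the rest.

    def implies(p, q):
        # "q if p" -- material implication
        return (not p) or q

    def iff(p, q):
        # "p if and only if q"
        return p == q

    # e.g. "premise 2 if and only if premise 1" is just iff(premise_2, premise_1)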

Bill Karwin
This is exactly why. The 'complex array of conditions' listed in the question is expressed perfectly easily in terms of boolean algebra. Making an evaluation 'engine' (something that binds values to conditions) would be pretty trivial in pretty much any environment.
Quintus
I guess I agree. But I was wondering if there are logic patterns that are not easily implemented in boolean algebra. My example is admittedly simple. I guess I would need to crack open the old college logic book to come up with an example. But you are right in that the example would probably be too rare to bother implementing.
Gordon Potter
Sounds like you are looking for a problem then. You have not given an example of when something like this would be useful.
Ed Swangren
+1  A: 

Because computers are binary, all decisions must come down to a 1/0, yes/no, true/false, etc.

To be efficient, the language constructs must reflect this.

Neil N
100111? 1 1011 11011 10.
harpo
No idea why this was downvoted?
Neil N
+1  A: 

Eventually all your code comes down to microcode that is executed one instruction at a time. Until the microcode and the accompanying CPU can describe something more colorful, we are stuck with a very plain language.

Robert
But your compiler can always take complicated, high-level statements and compile them into a form suitable for a processor -- that's part of its job, after all. CPUs also don't "understand" things like objects, classes, etc., but a compiler can suitably translate those high-level constructs into the simpler, low-level instructions understood by a processor.
mipadi
+1  A: 

Because most programming languages don't provide sufficient tools for users to implement such constructs themselves, because it is not seen as an important enough feature for the implementer to provide as an extension, and because it isn't demanded or used enough to be added to the standard.

If you really want it, use a language that provides it, or one that provides the tools to implement it (for instance, Lisp macros).

Brian
+1  A: 

It sounds as though you are describing a rules engine.
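At its simplest, that just means a list of named conditions evaluated against a set of facts. A rough, purely illustrative Python sketch (not any particular product's API):

    # Each rule derives a new fact from the facts established so far.
    rules = [
        ("premise_2", lambda facts: facts["premise_1"]),
        ("premise_4", lambda facts: facts["premise_2"] and facts["premise_3"]),
        ("premise_5", lambda facts: facts["premise_4"]),
    ]

    facts = {"premise_1": True, "premise_3": True}
    for name, condition in rules:
        facts[name] = condition(facts)

    conclusion = facts["premise_5"]

Real engines add pattern matching, priorities, and so on, but the core loop is about that small.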

teabot
Yeah I think so. Thanks for the link.
Gordon Potter
+1  A: 

The basic control structures we use mirror what a processor can do efficiently. Basically, this boils down to simple test-and-branch instructions.

It may seem limiting to you, but many people don't like the idea of writing a simple-looking line of code that requires hundreds or thousands (or millions) of processor cycles to complete. Among these people are systems software folks, who write things like operating systems and compilers. Naturally, most compilers are going to reflect their own writers' concerns.

T.E.D.
+1  A: 

It relates to the concern regarding atomicity: if you can express A, B, C, and D in terms of simpler structures Y and Z, why supply A, B, C, and D at all rather than just Y and Z?

The existing languages reflect 60 years of tension between atomicity and usability. The modern approach is "small language, large libraries" (C#, Java, C++, etc.).

Paul Nathan