views:

357

answers:

10

RDBMSs are based on relational algebra and Codd's relational model. Do we have something similar for programming languages or OOP?

+4  A: 

One such concept may be the Turing machine.

Padmarag
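
For illustration (a toy sketch, not anything from the answer above): a Turing machine is just a transition table driven over a tape. The bit-flipping machine below is a made-up example:

```python
# Toy Turing machine simulator: a transition table maps (state, symbol)
# to (new_state, symbol_to_write, head_move). This example machine
# (hypothetical) inverts a binary string and then halts on the blank.

def run_turing_machine(transitions, tape, state="start", blank="_"):
    """transitions: {(state, symbol): (new_state, write_symbol, move)}"""
    tape = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Flip every bit while moving right; halt when the blank is reached.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip, "1011"))  # 0100
```
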
+1  A: 

As far as I know, formal grammars are used for the description of syntax.

iburlakov
lol. If I understood your English, this would probably be the best answer. Shame I can't vote 1/2 a point for being on the right track.
T.E.D.
Not quite correct. A formal grammar specifies only the syntactic part. The complete semantics of a piece of software can't be specified with a grammar. Relational algebra, by contrast, can be used to fully specify how a DB works.
Remo.D
What exactly is 'not quite correct' about 'Formal grammars is used for description of syntax'? You said exactly the same thing yourself.
EJP
The original question is about relational algebra, which describes the semantics of a database. The OP asked if there was something similar for programming languages and OOP, i.e. something that could describe the semantics of a piece of code, not its syntax.
Remo.D
+1  A: 

Programming languages are the product of applying the following theories:

  • Algorithm theory
  • Theory of computation
  • Automata theory
  • and more
Andrey
+5  A: 

Functional languages like Lisp inherit their basic concepts from Church's lambda calculus (see the Wikipedia article). Regards

Giuseppe Guerrini
I wouldn't necessarily call Lisp functional. It CAN be written in a functional style, but it is more of a multi-paradigm language.
Vatine
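
To make the lambda-calculus connection concrete, here is an illustrative sketch of Church numerals — numbers encoded purely as functions — in Python rather than Lisp, just for familiarity:

```python
# Church numerals: the number n is the function that applies f to x n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of +1 starting at 0."""
    return n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
print(to_int(add(two)(two)))  # 4
```

Arithmetic falls out of function application alone, which is the point: the lambda calculus needs no built-in numbers.
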
+1  A: 

The history section of Wikipedia's Object-oriented programming could be enlightening.

Lex
+1  A: 

The closest analogy I can think of is Gurevich's evolving algebras, which nowadays are better known as Gurevich Abstract State Machines (GASMs).

I had long hoped to see more real applications of the theory once Gurevich joined Microsoft, but it seems that very little has come out. You can check the ASML page on the Microsoft site.

The good point about GASMs is that they closely resemble pseudo-code even though their semantics are formally specified. This means that practitioners can easily grasp them.

After all, I think part of the success of relational algebra is that it is the formal foundation of concepts that can be easily grasped: tables, foreign keys, joins, and so on.

I think we need something similar for the dynamic components of a software system.

Remo.D
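
Since the contrast throughout this thread is with relational algebra, its core operators are simple enough to sketch in a few lines (a toy illustration over lists of dicts, not an RDBMS; the `emp`/`dept` relations are made up):

```python
# Toy relational algebra: select, project, and natural join over
# relations represented as lists of dicts (attribute -> value).

def select(rel, pred):
    """sigma: keep the rows satisfying the predicate."""
    return [row for row in rel if pred(row)]

def project(rel, attrs):
    """pi: keep only the named attributes of each row."""
    return [{a: row[a] for a in attrs} for row in rel]

def join(r, s):
    """Natural join: combine rows that agree on all shared attributes."""
    return [{**a, **b} for a in r for b in s
            if all(a[k] == b[k] for k in a.keys() & b.keys())]

emp  = [{"name": "ann", "dept": 1}, {"name": "bob", "dept": 2}]
dept = [{"dept": 1, "dname": "R&D"}, {"dept": 2, "dname": "Sales"}]

print(project(select(join(emp, dept), lambda r: r["dname"] == "R&D"),
              ["name"]))  # [{'name': 'ann'}]
```

Three composable operators are enough to express the query; that compactness is arguably what a "relational algebra for code" would need to match.
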
+6  A: 

Lisp is based on Lambda Calculus, and is the inspiration for much of what we see in modern languages today.

Von Neumann machines are the foundation of modern computers, which were first programmed in assembly language, then in the FORmula TRANslator (FORTRAN). Then the formal linguistic theory of context-free grammars was applied, and it underlies the syntax of all modern languages.

Computability theory (formal automata) has a hierarchy of machine types that parallels the hierarchy of formal grammars: regular grammar = finite-state machine, context-free grammar = pushdown automaton, context-sensitive grammar = linear-bounded automaton, and unrestricted grammar = Turing machine.

There also is information theory, of two types, Shannon and Kolmogorov, that can be applied to computing.

There are also lesser-known models of computing, such as recursive function theory, register machines, and Post machines.

And don't forget predicate-logic in its various forms.

Added: I forgot to mention discrete math - group theory and lattice theory. Lattices in particular are (IMHO) a nifty concept underlying all Boolean logic and some models of computation, such as denotational semantics.

Mike Dunlavey
*Then the formal linguistic theory of context-free-grammars was applied, and underlies the syntax of all modern languages* - a little like saying that the theory of Western classical music underlies the work of The Beatles. :) It can be used to analyse the results, but that doesn't mean it was consciously applied by the creators. Perhaps surprisingly, many modern languages have initially been implemented with "hand-written" top-down parsers, and the BNF representation comes later, and rarely is it particularly useful by itself. C++ is notorious for being intractable by academic parsing theory.
Daniel Earwicker
@Daniel: You're refining my flip sentence. Thanks.
Mike Dunlavey
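
The grammar/machine correspondence above can be made concrete with its simplest pairing, regular grammar = finite-state machine. A toy example: a DFA for the regular language of binary strings containing an even number of 1s:

```python
# DFA for the regular language "binary strings with an even number of 1s".
# The two states track the parity of 1s seen so far; 'even' is accepting.

TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}

def accepts(word):
    state = "even"                       # start state
    for ch in word:
        state = TRANSITIONS[(state, ch)]
    return state == "even"               # accept iff we end in 'even'

print(accepts("1001"))  # True  (two 1s)
print(accepts("1011"))  # False (three 1s)
```

The machine needs only finitely many states because the language only requires remembering one bit of information (the parity), which is exactly what "regular" means.
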
+1  A: 

If you study programming languages (eg: at a University), there is quite a lot of theory, and not a little math involved.

Examples are:

T.E.D.
+20  A: 
Norman Ramsey
@Norman - Thanks a lot for your informative and thorough answer.
Padmarag
Very comprehensive answer, although it would be nice if you mentioned axiomatic semantics too. I haven't read the book "A Theory of Objects" (http://lucacardelli.name/TheoryOfObjects.html) by two reputable Microsoft researchers, but my gut feeling is that it should cover the theory behind OOP.
Wei Hu
@Wei Hu: I debated over axiomatic semantics, but in the end it feels more like program logic and less like what I would call a "model". This is probably because I've been conditioned to distinguish between "syntactic theories" and "models", a distinction I doubt the OP had in mind. I'd forgotten about the book by Abadi and Cardelli; I will have to revisit it and see what I can add to my answer.
Norman Ramsey
+1: excellent answer as always :)
Juliet
Wow, this answer kind of blew my mind. By the way, Norman, I was the 7777th person to view your profile! Seems like you'd appreciate that.
harpo
@Norman, why do you say "comes with a fairly large collection of 'reduction rules' for simplifying expressions"? We can get by without alpha-conversion, and when we're not reducing inside abstractions, we don't need eta-conversion either, leaving just one: beta-reduction. Are you thinking of different evaluation strategies as being the many "reduction rules"?
profjim
@Profjim: Yes. In my view, the various "structural rules" that determine when and where you can perform beta reduction are what distinguish the highly nondeterministic underlying calculus from typically deterministic programming languages.
Norman Ramsey
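
The beta rule under discussion can be sketched mechanically (a toy AST in Python; it assumes bound names are unique, so naive substitution is capture-free and no alpha-conversion is needed):

```python
# One step of beta reduction on a tiny lambda-calculus AST.
# Terms: ("var", name) | ("lam", name, body) | ("app", fn, arg)
# Assumes bound names are unique, so naive substitution cannot capture.

def subst(term, name, value):
    """body[name := value], structurally."""
    kind = term[0]
    if kind == "var":
        return value if term[1] == name else term
    if kind == "lam":
        return ("lam", term[1], subst(term[2], name, value))
    return ("app", subst(term[1], name, value), subst(term[2], name, value))

def beta(term):
    """Reduce one outermost redex: (\\x. body) arg  ->  body[x := arg]."""
    if term[0] == "app" and term[1][0] == "lam":
        _, (_, x, body), arg = term
        return subst(body, x, arg)
    return term

# (\x. x x) y  ->  y y
redex = ("app", ("lam", "x", ("app", ("var", "x"), ("var", "x"))), ("var", "y"))
print(beta(redex))  # ('app', ('var', 'y'), ('var', 'y'))
```

The rule itself is one line; as the exchange above notes, the interesting part is the strategy deciding *where* it may fire.
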
A: 

There are many dimensions to your question, scattered across the answers.

First of all, to describe the syntax of a language and specify how a parser would work, we use context-free grammars.
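
For instance, the grammar `E -> T ('+' T)* ; T -> int | '(' E ')'` (a made-up toy) maps directly onto a recursive-descent parser, one function per nonterminal:

```python
# Recursive-descent parser for:  E -> T ('+' T)*  ;  T -> int | '(' E ')'
# Each nonterminal becomes one function; the call stack is the parse stack.
# Assumes well-formed input (no error handling, to keep the sketch short).

def parse(tokens):
    pos = 0

    def expr():                # E -> T ('+' T)*
        nonlocal pos
        node = term()
        while pos < len(tokens) and tokens[pos] == "+":
            pos += 1
            node = ("+", node, term())
        return node

    def term():                # T -> int | '(' E ')'
        nonlocal pos
        tok = tokens[pos]
        if tok == "(":
            pos += 1
            node = expr()
            pos += 1           # consume ')'
            return node
        pos += 1
        return int(tok)

    return expr()

print(parse(["1", "+", "(", "2", "+", "3", ")"]))  # ('+', 1, ('+', 2, 3))
```
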

Then you need to assign meanings to the syntax. Formal semantics come in handy; the main players are operational semantics, denotational semantics, and axiomatic semantics.
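
As a sketch of the operational style (a toy in Python, not any textbook's notation): small-step rules for addition expressions transcribe almost literally into a `step` function:

```python
# Small-step operational semantics for expressions: n | ("add", e1, e2).
# Plain ints are values; an ("add", ...) term steps until it is an int.

def step(e):
    """One reduction step, mirroring the usual inference rules."""
    op, e1, e2 = e
    if isinstance(e1, int) and isinstance(e2, int):
        return e1 + e2                 # rule: n1 + n2 -> n
    if isinstance(e1, int):
        return (op, e1, step(e2))      # rule: left is done, reduce the right
    return (op, step(e1), e2)          # rule: reduce the left operand first

def evaluate(e):
    """Iterate step until a value is reached (the transitive closure)."""
    while not isinstance(e, int):
        e = step(e)
    return e

print(evaluate(("add", ("add", 1, 2), 4)))  # 7
```
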

To rule out bad programs, there are type systems.
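
A type system in miniature (an illustrative sketch; the term encoding is made up): a checker that rejects ill-typed programs, such as adding a number to a boolean, before they ever run:

```python
# Minimal type checker for: int literals, bool literals, ("add", e1, e2),
# and ("if", cond, then, else). Ill-typed programs raise TypeError.

def typeof(e):
    if isinstance(e, bool):        # check bool first: bool subclasses int in Python
        return "bool"
    if isinstance(e, int):
        return "int"
    tag = e[0]
    if tag == "add":
        if typeof(e[1]) == typeof(e[2]) == "int":
            return "int"
        raise TypeError("add expects two ints")
    if tag == "if":
        if typeof(e[1]) != "bool":
            raise TypeError("condition must be bool")
        t = typeof(e[2])
        if t != typeof(e[3]):
            raise TypeError("branches must have the same type")
        return t
    raise TypeError("unknown expression")

print(typeof(("if", True, ("add", 1, 2), 0)))  # int
```
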

In the end, all computer programs can be reduced to (or compiled to, if you will) very simple models of computation. Imperative programs map most naturally to Turing machines, and functional programs to the lambda calculus.

If you're learning all this stuff by yourself, I highly recommend http://www.uni-koblenz.de/~laemmel/paradigms0910/, because the lectures are videotaped and put online.

Wei Hu