RDBMS are based on Relational Algebra as well as Codd's Model. Do we have something similar to that for Programming languages or OOP?
Programming languages are the product of applying several theories:
- Algorithm theory
- Theory of computation
- State automata
- and more
Functional languages like Lisp inherit their basic concepts from Church's lambda calculus (see the Wikipedia article). Regards
The history section of Wikipedia's Object-oriented programming could be enlightening.
The closest analogy I can think of is Gurevich's Evolving Algebras, which are nowadays better known as "Gurevich Abstract State Machines" (GASM).
I had long hoped to see more real applications of the theory once Gurevich joined Microsoft, but it seems that very little has come of it. You can check the ASML page on the Microsoft site.
The nice thing about GASMs is that they closely resemble pseudo-code even though their semantics are formally specified. This means that practitioners can grasp them easily.
After all, I think that part of the success of Relational Algebra is that it is the formal foundation of concepts that can be easily grasped, namely tables, foreign keys, joins, etc.
I think we need something similar for the dynamic components of a software system.
Lisp is based on Lambda Calculus, and is the inspiration for much of what we see in modern languages today.
Von Neumann machines are the foundation of modern computers, which were first programmed in assembler language, then in FORmula TRANslator. Then the formal linguistic theory of context-free grammars was applied, and it underlies the syntax of all modern languages.
Computability theory (formal automata) has a hierarchy of machine types that parallels the hierarchy of formal grammars, for example, regular-grammar = finite-state-machine, context-free-grammar = pushdown-automaton, context-sensitive-grammar = linear-bounded-automaton, and unrestricted-grammar = turing-machine.
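To make the regular-grammar/finite-state-machine correspondence concrete, here is a minimal sketch (in Python, with illustrative names of my own) of a DFA that accepts binary strings containing an even number of 1s:

```python
# Minimal DFA sketch: accepts binary strings with an even number of 1s.
# The equivalent regular grammar is S -> 0S | 1T | ε, T -> 0T | 1S.

def accepts_even_ones(s: str) -> bool:
    state = "even"  # start state, which is also the accepting state
    transitions = {
        ("even", "0"): "even",
        ("even", "1"): "odd",
        ("odd", "0"): "odd",
        ("odd", "1"): "even",
    }
    for ch in s:
        state = transitions[(state, ch)]
    return state == "even"
```

The whole machine is just a start state, a transition table, and a set of accepting states; that finite table is exactly what "finite-state" means.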
There also is information theory, of two types, Shannon and Kolmogorov, that can be applied to computing.
There are lesser-known models of computing, such as recursive-function-theory, register-machines, and Post-machines.
And don't forget predicate-logic in its various forms.
Added: I forgot to mention discrete math - group theory and lattice theory. Lattices in particular are (IMHO) a particularly nifty concept underlying all boolean logic, and some models of computation, such as denotational semantics.
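To make the lattice idea concrete, here is a small sketch (using Python frozensets, names of my own) of the powerset lattice, where meet is intersection and join is union, mirroring the structure of boolean AND and OR:

```python
# The subsets of {a, b}, ordered by inclusion, form a lattice:
# meet = intersection (like boolean AND), join = union (like boolean OR).

def meet(x: frozenset, y: frozenset) -> frozenset:
    return x & y   # greatest lower bound under inclusion

def join(x: frozenset, y: frozenset) -> frozenset:
    return x | y   # least upper bound under inclusion

a = frozenset({"a"})
b = frozenset({"b"})
top = frozenset({"a", "b"})   # plays the role of boolean "true"
bottom = frozenset()          # plays the role of boolean "false"
```

The two-element lattice {bottom, top} is exactly boolean logic; richer lattices of this shape are what denotational semantics orders program meanings over.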
If you study programming languages (eg: at a University), there is quite a lot of theory, and not a little math involved.
Examples are:
- Finite State Machines
- Formal Languages (and context-free grammars, described in notations like BNF)
- The construction of LR-style parser tables
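To connect grammars and parsers concretely, here is a hedged sketch: a BNF grammar for parenthesized addition and a hand-written recursive-descent parser for it (all names are illustrative; real parser generators build LR tables instead of recursing by hand):

```python
# BNF:  expr ::= term ("+" term)*
#       term ::= NUMBER | "(" expr ")"
# A recursive-descent parser follows the same grammar top-down,
# one function per nonterminal.

import re

def parse_expr(tokens, pos=0):
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] == "+":
        rhs, pos = parse_term(tokens, pos + 1)
        value += rhs
    return value, pos

def parse_term(tokens, pos):
    if tokens[pos] == "(":
        value, pos = parse_expr(tokens, pos + 1)
        assert tokens[pos] == ")", "expected closing parenthesis"
        return value, pos + 1
    return int(tokens[pos]), pos + 1

def evaluate(text: str) -> int:
    tokens = re.findall(r"\d+|[()+]", text)
    value, pos = parse_expr(tokens)
    assert pos == len(tokens), "trailing input"
    return value
```

For example, evaluate("1+(2+3)") parses and evaluates to 6.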
There are many dimensions to your question, scattered across the other answers.
First of all, to describe the syntax of a language and specify how a parser would work, we use context-free grammars.
Then you need to assign meanings to the syntax. Formal semantics come in handy; the main players are operational semantics, denotational semantics, and axiomatic semantics.
To rule out bad programs, you have type systems.
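Here is a minimal sketch of a type system "ruling out bad programs": a checker for a toy language with ints and bools that rejects ill-typed expressions before they ever run (the tuple encoding and names are illustrative):

```python
# Toy type checker. Expressions are tuples such as
#   ("lit", 3), ("lit", True), ("add", e1, e2), ("if", cond, then, else)
# Ill-typed programs raise TypeError statically, before evaluation.

def typecheck(e):
    tag = e[0]
    if tag == "lit":
        return "bool" if isinstance(e[1], bool) else "int"
    if tag == "add":
        if typecheck(e[1]) == "int" and typecheck(e[2]) == "int":
            return "int"
        raise TypeError("operands of add must be int")
    if tag == "if":
        if typecheck(e[1]) != "bool":
            raise TypeError("condition must be bool")
        t_then, t_else = typecheck(e[2]), typecheck(e[3])
        if t_then != t_else:
            raise TypeError("branches must have the same type")
        return t_then
    raise ValueError("unknown expression")
```

So ("add", ("lit", True), ("lit", 2)) is rejected without being evaluated; that is the whole point of static typing.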
In the end, all computer programs can be reduced to (or compiled to, if you will) very simple models of computation. Imperative programs map most naturally to Turing machines, and functional programs to the lambda calculus.
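To see what "functional programs map to lambda calculus" looks like, here is a sketch of Church numerals in Python, where numbers exist purely as functions (the helper names are my own):

```python
# Church numerals: the number n is encoded as the function that
# applies f to x exactly n times.
#   zero = λf.λx. x          succ = λn.λf.λx. f (n f x)

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    # Decode by applying "add one to an ordinary int" n times to 0.
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
```

Everything here is a one-argument function, which is all the lambda calculus has; arithmetic falls out of function application alone.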
If you're learning all this stuff by yourself, I highly recommend http://www.uni-koblenz.de/~laemmel/paradigms0910/, because the lectures are videotaped and put online.