During my university studies I had to learn a lot about the theory of computation; I studied the subject for three terms. I had a hard time with it, and I have to admit that I have forgotten a lot.
I am wondering whether this is a personal problem, or if we just had to learn a lot of (more or less) useless stuff.
So my question is: What topics in the field of the theory of computation do you think are most important, which parts are worth learning about, and which topics do you use during your normal work?
Personally, I am glad that I learned about the theory of formal languages (especially regular languages => regular expressions, and when they can and cannot be applied) and about the different time (and space) complexities, in particular big-O notation.
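To make the "when regexes can and cannot be applied" point concrete, here is a small sketch (my own illustrative example, not from any course material): a regex handles a flat token easily, but balanced parentheses form a non-regular language, so no single regular expression can recognize them and you need a counter or a stack instead.

```python
import re

# Regular expressions fit regular languages, e.g. a simple
# identifier token (hypothetical pattern for illustration):
token = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")
assert token.fullmatch("my_var2") is not None

# Balanced parentheses are NOT a regular language: recognizing
# them requires unbounded counting, which a DFA cannot do.
def balanced(s: str) -> bool:
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:  # a ")" with no matching "("
                return False
    return depth == 0

print(balanced("(()())"))  # True
print(balanced("(()"))     # False
```

This is exactly the kind of distinction the pumping lemma formalizes, and in my experience it is one of the few theory results that comes up in everyday work.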
But we had to study a lot more, including:
- computability theory
  - the halting problem
  - semidecidable problems
- complexity theory
  - P = NP?
- logic
  - propositional calculus
  - predicate logic
It was interesting to hear about these topics, but I am not sure how necessary it is to study them in depth.
I know this question is subjective and the answers will differ a lot depending on your day-to-day work and personal experience. But I'd like to know about topics that might be more interesting than I remember.