views: 882
answers: 21

Why is theory useful? Do you ever use it in your day-to-day coding? For example, we learned about the halting problem, Turing machines, reductions, etc. A lot of classmates are saying it's abstract and useless and there's no real point to knowing any of it (meaning, you can forget it once the course is over and not lose anything).

+14  A: 

The things I use most:

  • computational complexity to write algorithms that scale gracefully
  • understanding of how memory allocation, paging, and CPU caching work so I can write efficient code
  • understanding of data structures
  • understanding of threading, locking, and associated problems

As to that stuff on Turing machines etc., I think it is important because it defines the constraints under which we all operate. That's important to appreciate.
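
To make the first two bullets concrete, here is a small, purely illustrative sketch (mine, not part of the original answer): the same membership test scales very differently depending on the data structure behind it, which is exactly the kind of thing complexity analysis predicts.

    # Illustrative only: an O(n) lookup in a list vs an (average) O(1) lookup in a set.
    import random
    import time

    def time_lookups(container, probes):
        start = time.perf_counter()
        hits = sum(1 for p in probes if p in container)  # 'in' scans a list, hashes into a set
        return hits, time.perf_counter() - start

    items = list(range(100_000))
    probes = [random.randrange(200_000) for _ in range(10_000)]

    hits_list, t_list = time_lookups(items, probes)       # linear scan per probe
    hits_set, t_set = time_lookups(set(items), probes)    # constant-time hash lookup per probe
    print("list: %.3fs  set: %.3fs  same results: %s" % (t_list, t_set, hits_list == hits_set))

Once the inputs get large, that asymptotic difference dominates any micro-optimization.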

Nick
Complexity is a good thing to know about. It helps with the design part of development - the actual coding is not nearly as important.
Wahnfrieden
A: 

I do not use it on a daily basis. But it gave me a lot of understanding that helps me each day.

Gamecat
+6  A: 

A friend of mine is doing work on a language with some templates. I was asked in to do a little consulting. Part of our discussion was on the template feature, because if the templates were Turing complete, they would have to really consider VM-ish properties and how/if their compiler would support it.

The point of my story is this: automata theory is still taught because it still has relevance. The halting problem still exists and provides a limit to what you can do.

Now, do these things have relevance to a database jockey hammering out C# code? Probably not. But when you start moving to a more advanced level, you'll want to understand your roots & foundations.

Paul Nathan
+2  A: 

I think understanding the fundamental models of computation is useful: sure you never need to be able to translate a Turing machine into a register machine in practice, but learning how to see that two very different problems are really instances of the same concept is a critical skill.

Rob Walker
+3  A: 

Most knowledge is not "practical", but helps you connect dots in ways that you cannot anticipate, or gives you a richer vocabulary for describing more complex ideas.

Christopher
+17  A: 

True story:

When I got my first programming job out of graduate school, the guys that owned the company that I worked for were pilots. A few weeks after I was hired, one of them asked me this question:

There are 106 airports in Arkansas. Could you write a program that would find the shortest route necessary to land at each one of them?

I seriously thought he was quizzing me on my knowledge of the Traveling Salesman Problem and NP-Completeness. But it turns out he wasn't. He didn't know anything about it. He really wanted a program that would find the shortest path. He was surprised when I explained that there were 106-factorial solutions and finding the best one was a well-known computationally intractable problem.

So that's one example.
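
For illustration only (this is not the program from the story, and the coordinates are made up), a nearest-neighbour heuristic shows the practical escape hatch that theory points you toward: you give up on finding the optimal tour among 106! orderings and settle for a decent one in O(n^2) time.

    # Hypothetical sketch: a greedy nearest-neighbour tour over 106 random "airports".
    import math
    import random

    random.seed(1)
    airports = [(random.uniform(0, 300), random.uniform(0, 300)) for _ in range(106)]

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def nearest_neighbour_tour(points):
        unvisited = list(range(1, len(points)))
        tour = [0]                              # start at an arbitrary airport
        while unvisited:
            last = points[tour[-1]]
            nxt = min(unvisited, key=lambda i: dist(last, points[i]))
            unvisited.remove(nxt)
            tour.append(nxt)
        return tour

    tour = nearest_neighbour_tour(airports)
    length = sum(dist(airports[tour[i]], airports[tour[i + 1]]) for i in range(len(tour) - 1))
    print("heuristic tour length: %.1f (no optimality guarantee)" % length)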

Jeffrey L Whitledge
Finding an absolute answer isn't feasible, but there are search algorithms that could give a pretty good "guesstimate". Good example nevertheless.
Matthew Scharley
So, did you tell him N!o? Har har.
MrBoJangles
I had a similar experience in 1996. My boss wanted me to write a spider to query the web server at all IP addresses (4+ billion of them). Not an NP problem, but he still failed to grasp that it would take many, many years for such a program to finish.
Barry Brown
A common example, but not a very good one. You can use Markov chains and similar techniques to find "good enough" solutions.
Wahnfrieden
@Wahnfrieden - Obviously there are lots of ways to attack this problem, but the point is that a computer science education and knowledge of computer science theory does have real world application. If nothing else, it helps you know what to google for, or why you would want to google for something. And you won't look back on the 3.63×10¹³² years that were wasted on the brute force solution and wonder if there might have been a better way.
Jeffrey L Whitledge
The Arkansas airport problem can be approximated as a "Euclidean TSP" (http://en.wikipedia.org/wiki/Travelling_salesman_problem#Euclidean_TSP). There is an approximation algorithm that will provide a solution that is no more than 1/106 longer than the "best" solution.
Jay Elston
A: 

I guess it depends on which field you go into.

jotham
+16  A: 

Sure, it's useful.

Imagine a developer working on a template engine. You know the sort of thing...

Blah blah blah ${MyTemplateString} blah blah blah.

It starts out simple, with a cheeky little regular expression to perform the replacements.
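
Something like this minimal sketch, assuming ${name} placeholders as in the example above (the function and names are made up for illustration):

    # A first-pass template renderer: one regular expression, nothing more.
    import re

    def render(template, values):
        # Replace each ${key} with its value; leave unknown placeholders untouched.
        return re.sub(r"\$\{(\w+)\}",
                      lambda m: str(values.get(m.group(1), m.group(0))),
                      template)

    print(render("Blah blah blah ${MyTemplateString} blah blah blah.",
                 {"MyTemplateString": "hello"}))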

But gradually the templates get a little more fancy, and the developer includes features for templatizing lists and maps of strings. To accomplish that, he writes a simple little grammar and generates a parser.

Getting very crafty, the template engine might eventually include a syntax for conditional logic, to display different blocks of text depending on the values of the arguments.

Someone with a theoretical background in CS would recognize that the template language is slowly becoming Turing complete, and maybe the Interpreter pattern would be a good way to implement it.

Having built an interpreter for the templates, a computer scientist might notice that the majority of templating requests are duplicates, regenerating the same results over and over again. So a cache is developed, and all requests are routed through the cache before performing the expensive transformation.
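
One possible shape for that cache (a sketch, not a prescribed design): key the rendered output on the template plus its arguments and only pay for the expensive transformation on a miss.

    # Hypothetical memoizing wrapper around an expensive render function.
    cache = {}

    def cached_render(template, values, render):
        key = (template, tuple(sorted(values.items())))
        if key not in cache:            # cache miss: do the expensive work once
            cache[key] = render(template, values)
        return cache[key]               # cache hit: reuse the earlier result

Duplicate requests then cost a dictionary lookup instead of a full render.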

Also, some templates are much more complex than others and take a lot longer to render. Maybe someone gets the idea to estimate the execution time of each template before rendering it.

But wait!!! Someone on the team points out that, if the template language really is Turing complete, then the task of estimating the execution time of each rendering operation is an instance of the Halting Problem!! Yikes, don't do that!!!

The thing about theory, in practice, is that all practice is based on theory. Theoretically.

benjismith
+6  A: 

Although I don't directly apply them in day-to-day work, I know that my education in formal computer science has affected my thinking process. I certainly avoid certain mistakes from the outset because I have the lessons learned from the formal approaches instilled in me.

It might seem useless while they're learning it, but I bet your classmates will eventually come across a problem where they'll use what they were taught, or at least the thinking patterns behind it...

Wax on... Wax off... Wax on... Wax off... What does that have to do with Karate, anyways?

Toybuilder
+1  A: 

I found that all I need for daily bliss from the CS theoretical world is the utterance of the mantra "Low coupling and High Cohesion". Roger S. Pressman made it scholarly before Steve McConnell made it fashionable.

Bill
+18  A: 

When I graduated from college, I assumed that I was on par with everyone else: "I have a BS in CS, and so do a lot of other people, and we can all do essentially the same things." I eventually discovered that my assumption was false. I stood out, and my background had a lot to do with it--particularly my degree.

I knew that there was one "slight" difference, in that I had a "B.S." in CS because my college was one of the first (supposedly #2 in 1987) in the nation to receive accreditation for its CS degree program, and I graduated in the second class to have that accreditation. At the time, I did not think that it mattered much.

I had also noticed during high school and in college that I did particularly well at CS--much better than my peers and even better than many of my teachers. I was asked for help a lot, did some tutoring, was asked to help with a research project, and was allowed to do independent study when no one else was. I was happy to be able to help, but I did not think much about the difference.

After college (USAFA), I spent four years in the Air Force, two of which were applying my CS degree. There I noticed that very few of my coworkers had degrees or even training related to computers. The Air Force sent me to five months of certification training, where I again found a lack of degrees or training. But here I started to notice the difference--it became totally obvious that many of the people I encountered did not REALLY know what they were doing, and that included the people with training or degrees. Allow me please to illustrate.

In my Air Force certification training were a total of thirteen people (including me). As Air Force officers or the equivalent, we all had BS degrees. I was in the middle based on age and rank (I was an O-2 amongst six O-1s and six O-3s and above). At the end of this training, the Air Force rubber-stamped us all as equally competent to acquire, build, design, maintain, and operate ANY computer or communication system for ANY part of the Department of Defense.

However, of the thirteen of us, only six had any form of computer-related degree; the other seven had degrees ranging from aeronautics to chemistry/biology to psychology. Of the six of us with CS degrees, I learned that two had never written a program of any kind and had never used a computer more than casually (writing papers, playing games, etc.). I learned that another two of us had written exactly one program on a single computer during their CS degree program. Only one other person and myself had written more than one program or used more than one kind of computer--indeed, we found that we two had written many programs and used many kinds of computers.

Towards the end of our five-month training, our class was assigned a programming project and we were divided into four groups to separately undertake it. Our instructors divided up the class in order to spread the "programming talent" fairly, and they assigned roles of team lead, tech lead, and developer. Each group was given a week to implement (in Ada) a full-screen, text-based user interface (this was 1990) for a flight simulator on top of an instructor-provided flight-mechanics library. I was assigned as tech lead for my team of four.

My team lead (who did not have a computer degree) asked the other three of us to divide up the project into tasks and then assigned a third of them to each of us. I finished my third of the tasks by the middle of that first day, then spent the rest of the day helping my other two teammates, talking to my team lead (BSing ;^), and playing on my computer.

The next morning (day two), my team lead privately informed me that our other two teammates had made no progress (one could not actually write an "if" statement that would compile), and he asked me to take on their work. I finished the entire project by mid-afternoon, and my team spent the rest of the day flying the simulator.

The other guy with the comparable CS degree was also assigned as a tech lead for his team, and they finished by the end of day three. They also began flying their simulator. The other two teams had not finished, or even made significant progress, by the end of the week. We were not allowed to help other teams, so it was left at that.

Meanwhile, by the middle of day three, I had noticed that the flight simulator just seemed to behave "wrong". Since one of my classmates had a degree in aeronautics, I asked him about it. He was mystified, then confessed that he did not actually know what made a plane fly!?! I was dumbfounded! It turns out that his entire degree program was about safety and crash investigations--no real math or science behind flight. On the other hand, I had maybe a minor in aeronautics (remember USAFA?), but we had designed wings and performed real wind tunnel tests. Therefore, I quietly spent the rest of the week rewriting the instructor-provided flight-mechanics library until the simulator flew "right".

Since then, I have spent nearly two decades as a contractor and occasionally as an employee, always doing software development plus related activities (DBA, architect, etc.). I have continued to find more of the same, and eventually I gave up on my youthful assumption.

So, what exactly have I discovered? Not everyone is equal, and that is okay--I am not a better person because I can program effectively, but I am more useful IF that is what you need from me. I learned that my background really mattered: growing up in a family of electricians and electrical engineers, building electronics kits, reading LITERALLY every computer book in the school/public libraries because I did not have access to a real computer, then moving to a new city where my high school did have computers, then getting my own computer as a gift, going to schools that had computers of many different sizes and kinds (PCs to mainframes), getting an accredited degree from a VERY good engineering school, having to write lots of programs in different languages on different kinds of computers, having to write hard programs (like my own virtual machine with a custom assembly language, or a Huffman compression implementation, etc.), having to troubleshoot for myself, building my own computers from parts and installing ALL the software, etc.

Ultimately, I learned that my abilities are built on a foundation of knowing how computers work from the electrical level on up--discrete electronic components, circuitry, subsystems, interfaces, protocols, bits, bytes, processors, devices, drivers, libraries, programs, systems, networks, on up to the massive enterprise-class conglomerates that I routinely work on now. So, when the damn thing misbehaves, I know exactly HOW and WHY. And I know what cannot be done as well as what can. And I know a lot about what has been done, what has been tried, and what is left relatively unexplored.

Most importantly, after learning all that, I have learned that I don't know a damned thing. In the face of all that there is potentially to know, my knowledge is minuscule.

And I am quite content with that. But I recommend that you try.

Rob Williams
I graduated from one of the few accredited software engineering programs in the country. And now that I'm in the work force, I see a huge difference between myself and those who did not receive the quality of education I did. Also, it helps that I've been messing with computers since I was 10.
Kibbee
Good post :) I recently took a couple of math related classes (PDEs, Tensors, Cryptography, image processing, etc) and I'm sure it will pay off.
Nils
+2  A: 

It's not the specific problems that you study that matters, it's the principles that you learn through studying them. I use concepts about algorithms, data structures, programming languages, and operating systems every day at my job. If you work as a programmer you'll make decisions all the time that affect system performance. You need to have a solid foundation in the fundamental abstract concepts in order to make the right choices.

Bill the Lizard
+4  A: 

It's a classic dichotomy between "how" and "what". Your classmates are looking at "how" to program software, and they're very focused on that near-term view; from that perspective, the perspective of implementation, it seems like knowing about things like halting states and Turing machines is unimportant.

"How" is very little of the actual work that you get expected to do with Computer Science, though. In fact, most successful engineers I know would probably put it at less than 20 percent of the actual job. "What" to do is by far more important; and for that, the fundamentals of Computer Science are critical. "What" you want to do relates much more to design than implementation; and good design is... well, let's just call it "non-trivial".

"There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult." - C.A.R. Hoare

Good luck with your studies!

McWafflestix
A: 

Ya, I generally use state diagrams to design the shape and flow of the program. Once it works in theory, I start coding and testing, checking off the states as I go.

I find that they are also a useful tool to explain the behavior of a process to other people.

EvilTeach
Software engineering doesn't have too much to do with most of CS.
Wahnfrieden
+10  A: 

it's the difference between learning algebra and being taught how to use a calculator

if you know algebra, you realize that the same problem may manifest in different forms, and you understand the rules for transforming the problem into a more concise form

if you only know how to use a calculator, you may waste a lot of time punching buttons on a problem that is either (a) already solved, (b) cannot be solved, or (c) is like some other problem (solved or unsolved) that you don't recognize because it's in a different form

pretend, for a moment, that computer science is physics... would the question seem silly?

Steven A. Lowe
A: 

Simple. For example: if I'm using RSACryptoServiceProvider, I'd like to know what it is and why I can trust it.

rafek
A: 

Because C++ templates are actually some kind of lambda calculus. See www.cs.nott.ac.uk/types06/slides/michelbrink_types_06.pdf

Roman Plášil
+5  A: 

At one job I was assigned the task of improving our electrical distribution model's network tracing algorithm, as the one they were using was too slow. The 3-phase network was essentially three n-trees (since loops aren't allowed in electrical networks). The network nodes were in the database, and some of the original team couldn't figure out how to build an in-memory model, so the tracing was done by successive depth SELECTs on the database, filtering on each phase. So tracing a node ten levels from the substation would require at least 10 database queries (the original team members were database whizzes, but lacked a decent background in algorithms).

I wrote a solution that transformed the 3 n-tree networks of nodes from the database into a data structure where each node was stored once in a node structure array and the n-tree relationship was converted to three binary trees using doubly-linked pointers within the array so that the network could be easily traced in either direction.
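
The exact array layout from that project isn't shown here, so the following is only a rough sketch of the general idea: pull all the nodes with a single query, build the parent/child structure in memory, and trace it without any further round trips to the database (the 'rows' input is a stand-in for that one query's result).

    # rows: iterable of (node_id, parent_id, phase) from one SELECT over the whole network.
    from collections import defaultdict

    def build_network(rows):
        children = defaultdict(list)
        parent = {}
        for node_id, parent_id, phase in rows:
            parent[node_id] = parent_id
            if parent_id is not None:
                children[parent_id].append(node_id)
        return children, parent

    def trace_downstream(children, start):
        # Depth-first walk over the in-memory structure: no per-level database queries.
        stack, seen = [start], []
        while stack:
            node = stack.pop()
            seen.append(node)
            stack.extend(children[node])
        return seen

    rows = [(1, None, "A"), (2, 1, "A"), (3, 1, "A"), (4, 2, "A")]
    children, parent = build_network(rows)
    print(trace_downstream(children, 1))    # e.g. [1, 3, 2, 4]; order depends on traversal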

It was at least two orders of magnitude faster, three on really long downstream traces.

The sad thing was that I had to practically teach a class in n-trees, binary trees, pointers, and doubly-linked lists to several of the other programmers who had been trained on databases and VB in order for them to understand the algorithms.

CMPalmer
A: 

I'm studying for my Distributed algorithms course now. There is a chapter about fault tolerance and it contains some proofs on the upper bound for how many processes can fail (or misbehave) so that the distributed algorithm can handle it correctly.

For many problems, the bound on misbehaving processes is just under one third of the total number of processes. This is quite useful in my opinion, because you know that it's pointless to try to develop a better algorithm (under the given assumptions).
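
Assuming the classic Byzantine agreement setting (where agreement needs n >= 3f + 1 processes), that bound works out like this quick, illustrative check:

    # Largest number of Byzantine (arbitrarily misbehaving) processes tolerable out of n.
    def max_byzantine_faults(n):
        return (n - 1) // 3        # biggest f satisfying n >= 3f + 1

    for n in (4, 7, 10, 100):
        print(n, "processes tolerate at most", max_byzantine_faults(n), "faulty ones")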

Roman Plášil
A: 

Even if theoretical courses aren't going to be used directly, they might help you think about problems more clearly.

You don't know what your boss is going to ask you to do; you may have to use something that you thought wouldn't be beneficial, as Jeffrey L Whitledge said.

Loai Najati
A: 

To be honest, I sort of disagree with a lot of the answers here. I wrote my first compiler (for fun; I really have too much coffee/free time) without having taken a course in compilers; basically I just scanned the code of another compiler and followed the pattern. I could write a parser in C off the top of my head, but I don't think I could remember how to draw a pushdown automaton if my life depended on it.

When I decided I wanted to put type inference in my toy (imperative) programming language, I first looked over probably five papers, staring at something called "typed lambda calculus" going what.... the.... **....? At first I tried implementing something with "generic variables" and "nongeneric variables" and had no idea what was going on. Then I scrapped it all and sat there with a notebook figuring out how I could implement it practically with support for all the things I needed (sub-typing, first-class functions, parameterized types, etc.). With a couple days of thinking & writing test programs, I blew away more than a week's worth of trying to figure out the theoretical crap.

Knowing the basics of computing (i.e. how virtual memory works, how filesystems work, threading/scheduling, SMP, data structures) has proved HIGHLY useful. Complexity theory and Big-O stuff have sometimes proved useful (especially for things like RDBMS design). The halting problem and automata/Turing Machine theory? Never.

Robert Fraser
Hmm.. I disagree about your type inference example. It's somewhat straightforward if you look in the right place - it's a unification algorithm that is about 30 lines of code and is easy to understand once you 'get it'. First-class funcs are treated just like parametrized types by the algorithm - it's very elegant. You also can't do sub-typing in the general case with type inference, so whatever you came up with probably worked for your purposes but had cases where it wouldn't work that you don't know about. This might be a case where having more background might have helped you read that paper!
Claudiu
Not to sound mean, though - I agree with you that often, trying it yourself will yield good results. And I think you're right about the halting problem and automata being not too useful, and VMs, filesystems, DBs, etc., being useful - that's where the practical stuff is at.
Claudiu
Most of the papers were about typing with sub-types, but again I didn't understand what they were talking about. The way I did it was if two types need to be unified, the "least common denominator" or whatever was found. So if Int < Num and Float < Num, Int ∩ Int = Int, Int ∩ Num = Num, and Int ∩ Float = Num. Then each unresolved type variable collects a *set* of types. When unifying a type with a type variable, you add the specific type to the set in the variable and then check to see if there are any conflicts.
Robert Fraser