The travelling salesman problem is said to be practically "unsolvable" as the number of nodes increases.
What other programming problems are considered unsolvable?
I find the knapsack problem particularly interesting as it's applicable to so many different domains.
First, there's a big difference between unsolvable problems and problems that are difficult to solve. I know that sounds obvious, but a lot of people label hard problems (such as traveling salesman) as unsolvable when it's better to say that they're not solvable efficiently, i.e., in polynomial time.
The best known unsolvable problem is the halting problem. See the Wikipedia page on the halting problem for more info. One common technique for showing that a problem is unsolvable is to show that it's equivalent to the halting problem.
Getting all three of scope (features), timeframe (schedule), and budget (resources/developers), instead of having to "pick two".
Problems which are undecidable can not be solved with any algorithm.
http://en.wikipedia.org/wiki/Undecidable_problem
NP-complete problems can be solved in polynomial time non-deterministically, so they are indeed solvable; note, though, that quantum computers are not known to solve NP-complete problems in polynomial time either.
Examples of undecidable problems include the halting problem: given an algorithm, decide if the algorithm eventually ends or continues forever (e.g., in an infinite loop). Alan Turing proved that the halting problem is undecidable.
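To get a concrete feel for why halting is hard to predict, here's a small Python sketch (the function is my own illustration, not from the answer above): whether the loop below terminates for every starting value n is the famous open Collatz conjecture, so even this three-line loop resists a general halting analysis.

```python
# Whether this loop halts for EVERY starting n > 0 is the open
# Collatz conjecture -- nobody has proved it one way or the other.
def collatz_steps(n):
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(6))   # 8 steps: 6, 3, 10, 5, 16, 8, 4, 2, 1
print(collatz_steps(27))  # a famously long trajectory for a small input
```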
Creating unbreakable DRM for non-interactive content, where intended users can still use the content ...
As many people have already responded, the ur-problem which is impossible to solve is the halting problem: write a program H which takes a program P as input and answers whether P is going to loop forever or not. Note that our program H may answer correctly for many input programs; the challenge is to write a program that can answer this for all programs.
Why is the halting problem unsolvable? The proof is pretty simple and rather fun:
Assume that we can write the program H which solves the halting problem. (The goal is to show that by assuming this we are going to produce results which are just nonsense, hence our assumption is faulty.) Now, construct a program EVIL which uses H as a subroutine. We can write EVIL like this:

EVIL p = if H p then loop
         else false
A brief explanation is in order. EVIL takes a program, and if that program halts then EVIL loops. If the input program loops forever then EVIL returns false.
The next step is really cool: apply EVIL to itself! The question is, what will the answer of EVIL EVIL be? It clearly depends on whether EVIL is looping forever or not. Let's consider the two alternatives:
Suppose EVIL loops forever. But then when we feed it as input to itself, it must return false, since H detected that it would loop forever. But returning false contradicts what we first said, that EVIL loops forever, so that clearly cannot be the case.
OK, so what if EVIL doesn't loop? Well, when we feed EVIL to itself it will just loop, because H returned true. So we have a contradiction here as well.
Bummer! Both of the alternatives (EVIL loops or it doesn't) lead to contradictions. So our assumption must be faulty: there is no such program H which can decide whether a function halts or not.
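The EVIL construction can be sketched in Python. There is no real halting oracle, of course; `halts` below is purely hypothetical, and the code only demonstrates that any concrete candidate oracle is defeated by the EVIL built from it, using two trivial candidates as examples:

```python
# Sketch of the EVIL construction. `halts` is a hypothetical oracle
# claiming to return True iff the program p() eventually halts.

def make_evil(halts):
    """Build the EVIL program for a claimed halting oracle `halts`."""
    def evil():
        if halts(evil):      # oracle says evil halts...
            while True:      # ...so evil loops forever instead
                pass
        return False         # oracle says evil loops, so evil halts
    return evil

# Candidate oracle 1: "everything halts". The evil built from it
# actually loops forever, so the oracle is wrong (we don't run it!).
evil_loops = make_evil(lambda p: True)

# Candidate oracle 2: "nothing halts". The evil built from it halts,
# contradicting the oracle's answer about it.
evil_halts = make_evil(lambda p: False)
assert evil_halts() is False
print("the 'nothing halts' oracle is wrong about its own EVIL")
```

Whatever oracle you plug in, EVIL does the opposite of what the oracle predicts about it, which is exactly the contradiction in the proof above.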
There are many interesting problems which are uncomputable that derive from the halting problem. Some of my favorites come from compiler technology. For example: finding the smallest equivalent program.
Another one is this: given that you've annotated your program with assertions, write a program which checks if any of the assertions will be violated at runtime. Such a program is sometimes called a model checker, and they can be really useful but they can never be complete, i.e. they sometimes have to answer "I don't know" when the problem gets too hard.
If you're looking for problems that are similar to the travelling salesman problem, then I suggest you google for NP-complete problems. This is a class of problems which doesn't have any known efficient solution, meaning an algorithm which computes the result in polynomial time. The intriguing thing with the class of NP problems is that no one has been able to show that any algorithm for solving them must take exponential time. It's perhaps the biggest unsolved question in computer science.
It should be said, though, that it is often rather easy to find a solution to an NP problem; the tricky business is to find the optimal solution.
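To make the "easy to find a solution, hard to find the best one" point concrete, here's a small Python sketch with a made-up four-city distance table: any ordering of the cities is a valid tour, but finding the optimal tour by brute force means checking (n-1)! permutations, which is why TSP becomes practically unsolvable at scale.

```python
from itertools import permutations

# Toy symmetric TSP instance (distances invented for illustration).
dist = {
    ('A', 'B'): 2, ('A', 'C'): 9, ('A', 'D'): 10,
    ('B', 'C'): 6, ('B', 'D'): 4,
    ('C', 'D'): 8,
}

def d(a, b):
    # Look up the distance in either direction.
    return dist[(a, b)] if (a, b) in dist else dist[(b, a)]

def tour_length(tour):
    # Length of the closed tour visiting the cities in order.
    return sum(d(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

cities = ['A', 'B', 'C', 'D']
any_tour = cities  # any ordering at all is a valid (probably non-optimal) tour
best = min((['A'] + list(p) for p in permutations(cities[1:])), key=tour_length)
print(tour_length(any_tour), tour_length(best))  # 26 vs 23
```

For 4 cities the brute force checks 6 permutations; for 20 cities it would be over 10^17, which is the whole problem.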
Here's a list of other NP-complete problems:

- Knapsack
- Boolean satisfiability (SAT)
- Graph colouring
- Hamiltonian path
- Subset sum

I could go on, but I think most other problems I know are rather theoretical. You'd be better off googling by yourself.
Actually, NP-complete problems, like the travelling salesman problem, are hard to solve in general, but a specific instance (or even most instances in some domain, e.g. compiling Haskell code) may be easy to solve!
There are a lot of problems that are, in theory, incredibly difficult, but because the difficult cases are very rare and rather contrived, they’re actually very easy to solve. Two examples of this that I find particularly interesting are both NP-complete. Type checking in Haskell is one of them: in fact, the general type inference in Haskell is worse than NP-complete: type validation is NP-complete; type inference is NP-hard. But on real code, it’s effectively approximately linear. The other one is a logic problem called 3-SAT. I once attended a great talk by a guy named Daniel Jackson, talking about a formal specification language he’d designed called Alloy. Alloy reduces its specification checking to 3-SAT. Dan explained this saying: “The bad news is, analyzing Alloy specifications is 3-SAT, so it’s exponential and NP-complete. But the good news is that analyzing Alloy specifications is 3-SAT, so we can solve it really quickly.”
— MarkCC
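As an illustration of the quote's point, here is a minimal brute-force 3-SAT solver in Python (the clause encoding is my own, not from Alloy): it is exponential in the number of variables in the worst case, yet it solves small or easy instances instantly.

```python
from itertools import product

# Brute-force 3-SAT. Each clause is a tuple of three literals:
# a positive int i means variable i, a negative int means its negation.
def solve_3sat(clauses, n_vars):
    for bits in product([False, True], repeat=n_vars):
        # A clause is satisfied if at least one literal matches its polarity.
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return bits  # satisfying assignment for variables 1..n_vars
    return None          # unsatisfiable

# (x1 or x2 or not x3) and (not x1 or x3 or x2) and (not x2 or not x3 or x1)
clauses = [(1, 2, -3), (-1, 3, 2), (-2, -3, 1)]
print(solve_3sat(clauses, 3))
```

Real SAT solvers replace the exhaustive loop with clever search and learning, which is exactly why "it's 3-SAT" can be good news in practice.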
Trying to find a dev who is happy with the code they've written!
Given two lists of words (a1, a2, .., an) and (b1, b2, .., bn), find a sequence of indices so that the concatenated words are equal.
E.g. given the lists (1: a, 2: ab, 3: bba) and (1: baa, 2: aa, 3: bb), the sequence (3, 2, 3, 1) is a solution since bba+ab+bba+a = bb+aa+bb+baa.
This problem (known as the Post correspondence problem) looks like there should be an algorithm solving it (we're just talking about finding a sequence so that the strings match; nothing complex like Turing machines is involved), but it's undecidable, just like the halting problem.
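A bounded brute-force search (a sketch; the bound and helper name are mine) can find short solutions like the one above, but no fixed bound works in general, which is exactly why the problem is undecidable rather than merely slow:

```python
from itertools import product

# Bounded search for the word-matching problem: try every index sequence
# up to max_len and check whether the two concatenations agree.
def pcp_search(a, b, max_len=6):
    n = len(a)
    for length in range(1, max_len + 1):
        for seq in product(range(n), repeat=length):
            if ''.join(a[i] for i in seq) == ''.join(b[i] for i in seq):
                return [i + 1 for i in seq]  # 1-based, as in the example
    return None  # "don't know" -- the search was cut off, not exhausted

a = ['a', 'ab', 'bba']
b = ['baa', 'aa', 'bb']
print(pcp_search(a, b))  # finds [3, 2, 3, 1]: bba+ab+bba+a == bb+aa+bb+baa
```

Undecidability means no algorithm can always tell the difference between "no solution exists" and "the shortest solution is just beyond any bound you pick".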
Not every problem is equally solvable in every programming language. For example, I would rather do a regex grep with Perl than AppleScript. And most programming languages, such as PHP, have bugs that are unsolvable until someone on the PHP team fixes the PHP engine.
He might mean that the code gets very, very complicated, and it's not worth completing the project because it takes too much time.