Oppenheimer and the bomb are often invoked to illustrate the limits of what science and technology should do (rather than what they can do). Are there computer science or programming problems that deserve a similar level of moral reflection before they are solved?
The usual example is P = NP, because of the risk a constructive proof would pose to existing encryption schemes.
It should be noted that no ethically-trained software engineer would ever consent to write a DestroyBaghdad procedure. Basic professional ethics would instead require him to write a DestroyCity procedure, to which Baghdad could be given as a parameter. (Nathaniel S Borenstein)
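Tongue in cheek, Borenstein's joke is about parameterization as an engineering virtue: hard-coding a value is poor practice, so the "ethical" engineer generalizes it into an argument. A minimal sketch of the refactoring the joke describes (all names are hypothetical, taken from the quote itself):

```python
# The joke's "unethical" version: the behavior is hard-coded.
def destroy_baghdad():
    """Hard-coded target -- poor engineering, per the joke."""
    return "Baghdad destroyed"

# The joke's "ethical" refactoring: the target becomes a parameter.
def destroy_city(city):
    """Parameterized version; any city can be passed in."""
    return f"{city} destroyed"

# The 'professionally responsible' call site:
print(destroy_city("Baghdad"))
```

The punchline, of course, is that the refactoring changes nothing morally; it only generalizes the capability.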
Are there things you as a computer scientist (or a person) shouldn't do? Yes. Are there open problems in computer science that shouldn't be solved? No. Even if something can be used for evil, it doesn't follow that it shouldn't be studied. Insofar as CS is basically math, its problems (in the math sense) are relatively amoral. That doesn't mean that the uses to which they are put are. That's where the ethics come in.
Designing an email messaging algorithm that can't be detected by a spam filter :)
I think we all (whether we know it or not) have a vested interest in factoring very large numbers continuing to be hard. But I expect massively parallel molecular computing to solve that problem whether we like it or not.
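For context on why that vested interest exists: RSA-style public-key encryption rests on the gap between multiplying two primes (cheap) and recovering them from the product (expensive). A toy sketch, assuming nothing beyond naive trial division; this is not real cryptography, just an illustration of the asymmetry:

```python
def trial_division(n):
    """Return the smallest prime factor of n by brute force.

    Runs in roughly sqrt(n) steps, so adding digits to n
    multiplies the work -- this is the hardness crypto relies on.
    """
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

# A toy "RSA-like" modulus: the product of two small primes.
p, q = 10007, 10009
n = p * q  # multiplying is instant...

# ...and at this tiny size, so is factoring. Real RSA moduli
# are hundreds of digits, far beyond any brute-force approach.
print(trial_division(n))
```

At five-digit primes the factorization is instant; the point is that the cost grows exponentially in the number of digits, which is what molecular or quantum-scale parallelism would threaten.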
the short answer is: no
the longer answer is:
Oppenheimer and the bomb are often invoked by people who would rather the Allies had lost WWII - so I don't put much stock in their opinions
the progress of science is inevitable; things cannot be un-invented
blaming the tool is what children do; adults take responsibility for their own actions
[And drive-by downvoting an answer you don't like is spiteful and cowardly; a downvote is supposed to mean "not helpful", not "I don't like this answer/person". If you disagree, say so and say why; one or both of us might learn something.]
If it were proven that P = NP, would we all lose our jobs, or earn more?
- More things solvable, even on iPhones
- One algorithm could solve anything (once reduced)
It won't ever happen (on silicon binary computers), though, so we can put that one to bed.
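The "one algorithm could solve anything" point is the standard reduction argument: every problem in NP reduces in polynomial time to SAT, so a fast SAT solver would solve them all. A toy sketch of why the search side is the hard part (brute force over assignments is exponential, while checking one candidate is cheap); the CNF encoding here is my own illustrative choice:

```python
from itertools import product

# A CNF formula is a list of clauses; each literal is a pair
# (variable_index, is_positive).

def check(formula, assignment):
    """Polynomial-time verification: is every clause satisfied?"""
    return all(
        any(assignment[v] == pos for v, pos in clause)
        for clause in formula
    )

def brute_force_sat(formula, n_vars):
    """Exponential search: try all 2**n_vars assignments."""
    for bits in product([False, True], repeat=n_vars):
        if check(formula, bits):
            return bits
    return None  # unsatisfiable

# (x0 OR x1) AND (NOT x0 OR x1) -- satisfied whenever x1 is True.
formula = [[(0, True), (1, True)], [(0, False), (1, True)]]
print(brute_force_sat(formula, 2))
```

If P = NP, something fundamentally better than the `brute_force_sat` loop would exist, and via reductions it would speed up every NP problem at once.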
Quantum computing?
"This universe has performed an illegal operation and will now implode"
Strong AI.
While I believe that Strong AI is laughable, there are those who are pouring money and resources into trying to create intelligent computers on par with human beings, with the ultimate goal of essentially creating an artificial life-form with perceived consciousness.
This in itself opens an endless bucket of ethical issues. If humans are capable of creating 'life-forms' exceeding our own abilities, will we really value our own? What steps come after creating beings that are far superior to ourselves? Will we eventually rebuild ourselves and speed up evolution? Will we use this power to exceed the human bounds of knowledge and try to answer life's greatest questions, things we would have to adapt ourselves into understanding?
It's all a bit crazy, but achieving Strong AI would vastly stretch the bounds of our capabilities as human beings. We could create a utopia, but basic human nature suggests that the power would instead create unimaginable destruction.
I think there are some data-mining projects that you probably shouldn't work on.
Anything which is really going to extinguish humanity's last shreds of privacy.
What makes you think that anybody can declare a research area closed? If it's got potential, somebody's going to be working on it. What well-intentioned people can do, by refusing to work on it, is ensure that when a new technology is developed, it will be developed by ill-intentioned people.
Consider the atomic bomb. The Manhattan Project was not the only such program. Germany had one (which, we found out postwar, went wildly astray), and the Japanese had two (one for the Army, one for the Navy; they weren't big on interservice cooperation). The drive to make the A-bomb was based on the belief that Nazi Germany couldn't be allowed to get one first, and that fear was reasonably well-founded at the time.
Well, I can think of some purely malicious programs. Say, "a virus that is able to spread to every connected device and is impossible to eradicate without a complete wipe of every memory device in the computer". Something like the Ultimate Virus. I don't know if it's possible, but I'm sure that nobody should attempt to succeed at this. ;)
I need to emphasize the "no" answer once more.
Any other answer displays a deep misunderstanding of science. There must be no forbidden questions because this would break the whole system. The whole notion that there are questions that should not be explored is inherently anti-scientific.
On the other hand, I don't think (unlike tvanfosson) that CS is necessarily amoral. Questions of strong encryption in particular raise a whole host of moral issues that need to be addressed by software architects (believe me, it's better that way! At the moment, politicians all over the world are trying to address issues they don't understand, with catastrophic and often ridiculous results).
Is this a problem? Well, it might be one since there are dangerous answers. But I still believe that the danger posed by these answers cannot be countered by ignoring the question. Rather, we need to explore even further. Nothing, nothing is more dangerous than lack of knowledge (again, I refer to the abovementioned politicians as just one example).
Now, this has been rather general, but yes, it also applies to computer science. In particular, answering the question of whether P = NP isn't dangerous at all. What may become dangerous is if the answer unexpectedly turned out to be "yes." In that case, we would need to rebuild much of today's IT infrastructure from scratch. But on the other hand, we would gain untapped problem-solving potential.
If the question is about CS, I'm not so much worried about programs that might get loose in the world's computers, at least in the short term.
With my AI background, I'm used to thinking of people's heads as computers. The programs that get loose in those are really scary. Examples are fundamentalism of all kinds.