views: 87
answers: 4

How much trust can I put in a standard computer running Windows? To what certainty can I be sure it will run my code the way I wrote it? How can I be sure that if I declare something like "int j = 5;", j will always be 5? Is there a way to measure trust in a standard x86 computer system? What kind of protections are there to make sure that j = 5?

I'm thinking about critical systems where nothing can be off even by one bit and everything must run exactly the way it was written to run.

+1  A: 

Essentially, none. You should read this article, and consider reevaluating your concept of trust. :-)

Benson
+4  A: 

If "nothing can be off by even a single bit", the only way in general to do this is to have three computers:

  • all must use different hardware
  • all must use different operating systems
  • the application software must have been written by different people
  • the application must be compiled using different compilers

All three computers are given the same inputs and calculate the output. If two or more outputs are the same, this is accepted, otherwise an error condition is flagged.

You can probably see that this full system is never used in practice, although variants are used in avionics and similar critical systems.
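As a rough illustration only, here is a minimal sketch in C of the 2-out-of-3 voting step described above. The function name and the use of plain integer outputs are assumptions made for the example; in a real system the three values would come from separate machines.

    #include <stdio.h>

    /* Illustrative 2-out-of-3 voter: accept a result if at least two of the
     * three independently computed outputs agree, otherwise flag an error.
     * In a real system the three inputs would come from separate computers. */
    int vote(long a, long b, long c, long *out)
    {
        if (a == b || a == c) { *out = a; return 0; }
        if (b == c)           { *out = b; return 0; }
        return -1;  /* no majority: signal an error condition */
    }

    int main(void)
    {
        long result;
        if (vote(42, 42, 41, &result) == 0)
            printf("accepted: %ld\n", result);
        else
            printf("error: outputs disagree\n");
        return 0;
    }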

anon
Do you have any sources for this type of computing? How do they do this in avionics? I would like to read up more on this.
danmine
I was thinking of multiple sensor systems using "voting" - I'm not an avionics expert and so may be wrong about that. The basic concept I describe is well known, though.
anon
I believe they use this style of system in the Space Shuttle, with several computers voting.
Michael
If you read the article I linked, you could be convinced that even this would not be sufficient if you want to be absolutely certain of trustworthiness. In fact, it is impossible to have absolute trust in any computer.
Benson
+1  A: 

Is this about the possibility of someone malicious altering your code to do something you didn't program it to do? Or is it about the possibility of random errors messing up your computation? In the latter case, you probably don't have to worry because the error rates on modern computers are something like 1 in 10^17 - that's less than one per processor per year, and if even that is intolerable you can use error-correcting algorithms to reduce the effective error rate as close to zero as you want (at the cost of needing more time to do a given computation).
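To illustrate the time-for-reliability trade-off, the sketch below (in C) repeats a computation and only accepts a result when two independent runs agree; the compute function is a hypothetical stand-in for whatever work you actually need done, and the retry count is an arbitrary choice for the example.

    #include <stdio.h>

    /* Stand-in for the real computation you want to protect. */
    long compute(long x) { return x * x + 5; }

    /* Run the computation twice and accept the result only when the two
     * runs agree; retry a few times on disagreement.  Each extra run costs
     * time but, assuming independent random errors, sharply reduces the
     * chance that an undetected error slips through. */
    int checked_compute(long x, long *out)
    {
        for (int attempt = 0; attempt < 3; attempt++) {
            long a = compute(x);
            long b = compute(x);
            if (a == b) { *out = a; return 0; }  /* results agree: accept */
        }
        return -1;  /* persistent disagreement: report failure */
    }

    int main(void)
    {
        long r;
        if (checked_compute(7, &r) == 0)
            printf("result: %ld\n", r);
        return 0;
    }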

If it's hackers you're worried about, though... there's really no expectation of security at all. Someone in physical control of a computer can, in principle, modify it to do absolutely anything that can conceivably be done by a computer. They could disassemble your code and alter assembly instructions at will, if they want, to make your program behave however they want it to. We don't usually worry about this in practice, though, because most of us aren't writing anything worth the effort to hack. Those who are, e.g. programmers designing military encryption hardware or nuclear missile control chips, can fall back on tamper-resistant hardware which makes it extremely difficult to alter the code. (That stuff can be expensive, though)

You might want to take a look at a book called "Security Engineering" by Ross Anderson, which describes some of this tamper-resistant hardware, and in general how people who really need to secure their code can do so.

David Zaslavsky
+1  A: 

Something as simple as "j=5" cannot be absolutely trusted.

There are many ways you could get a single-bit error:

  • CPUs do have errata.
  • Hard drives could return errors as they age
  • Memory can be corrupted by cosmic rays
  • Loose/dirty/etc. connections anywhere in the system.
  • Drivers do corrupt memory (including code pages).
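One common defensive trick against exactly this kind of single-bit corruption is to store a value alongside its bitwise complement and check the pair before every use. The sketch below (in C) is only illustrative; the checked_int type and its helper functions are made up for the example, and this detects corruption rather than correcting it.

    #include <stdint.h>
    #include <stdio.h>

    /* Keep a value together with its bitwise complement so that a
     * single-bit corruption of either copy is detected on read. */
    struct checked_int {
        int32_t value;
        int32_t inverse;   /* always stored as ~value */
    };

    static void checked_set(struct checked_int *c, int32_t v)
    {
        c->value   = v;
        c->inverse = ~v;
    }

    static int checked_get(const struct checked_int *c, int32_t *out)
    {
        if (c->inverse != ~c->value)
            return -1;        /* corruption detected */
        *out = c->value;
        return 0;
    }

    int main(void)
    {
        struct checked_int j;
        int32_t v;
        checked_set(&j, 5);
        if (checked_get(&j, &v) == 0)
            printf("j = %d\n", v);
        else
            printf("memory corruption detected\n");
        return 0;
    }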
Michael