Flipping a bit
How does a computer flip a bit (change a 0 to a 1) by mistake? Is it due to physical flaws in the chip? I was recently involved in a discussion about an algorithm that was essentially treated as correct because it had only a 1/2^80 chance of being wrong; the reasoning for accepting such an algorithm for computing purposes was that a computer makes a hardware error roughly once every 2^50 bits anyway. If anyone is familiar with the more technical details of this, I'd appreciate your input.
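For context, the two figures quoted above can be compared directly. A quick sketch in Python, taking the 2^-80 algorithmic error chance and the assumed one-error-per-2^50-bits hardware rate at face value:

```python
import math

# Figures from the discussion (the hardware rate is an assumption,
# not a measured value): the algorithm errs with probability 2^-80,
# while hardware flips a bit roughly once per 2^50 bits processed.
algo_error = 2.0 ** -80
hw_error_per_bit = 2.0 ** -50

# How many times likelier is a hardware fault than an algorithmic error?
ratio = hw_error_per_bit / algo_error
print(f"a hardware fault is 2^{math.log2(ratio):.0f} "
      f"(~{ratio:.2e}x) more likely than an algorithmic error")
```

On these numbers, a random hardware bit flip is 2^30 (about a billion) times more likely than the algorithm itself being wrong, which is the intuition behind treating the algorithm as correct in practice.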