bit decay

from Jargon File (4.4.4, 14 Aug 2003)
bit decay
 n.

   See {bit rot}. People with a physics background tend to prefer this
   variant for the analogy with particle decay. See also {computron},
   {quantum bogodynamics}.
    
from The Free On-line Dictionary of Computing (8 July 2008)
bit rot
alpha particle
bit decay

   <jargon> A hypothetical disease the existence of which has
   been deduced from the observation that unused programs or
   features will often stop working after sufficient time has
   passed, even if "nothing has changed".  The theory explains
   that bits decay as if they were radioactive.  As time passes,
   the contents of a file or the code in a program will become
   increasingly garbled.

   People with a physics background tend to prefer the variant
   "bit decay" for the analogy with particle decay.

   There actually are physical processes that produce such
   effects (alpha particles generated by trace radionuclides in
   ceramic chip packages, for example, can change the contents of
   a computer memory unpredictably, and various kinds of subtle
   media failures can corrupt files in mass storage), but they
   are quite rare (and computers are built with {error detection}
   circuitry to compensate for them).  The notion long favoured
   among hackers that {cosmic rays} are among the causes of such
   events turns out to be a myth.
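
   To make the compensation concrete, here is a minimal Python
   sketch of single-bit error detection with a parity bit.  It only
   illustrates the idea, not how real memory {error detection}
   circuitry is built, and the names (parity, stored_byte) are
   invented for the example.

      import random

      def parity(byte):
          # 1 if the byte has an odd number of set bits, else 0.
          return bin(byte).count("1") % 2

      # Store a byte together with its parity bit.
      stored_byte = 0b10110010
      stored_parity = parity(stored_byte)

      # Simulate an "alpha particle" flipping one random bit.
      corrupted = stored_byte ^ (1 << random.randrange(8))

      # On read-back, a parity mismatch reveals that a bit has
      # decayed; a single parity bit cannot say which bit, nor
      # correct it.
      if parity(corrupted) != stored_parity:
          print("bit rot detected")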

   Bit rot is the notional cause of {software rot}.

   See also {computron}, {quantum bogodynamics}.

   [{Jargon File}]

   (1998-03-15)
    
