bit pattern

from The Free On-line Dictionary of Computing (8 July 2008)
bit pattern

   <data> A sequence of {bits} in a memory, a communications
   channel, or some other device.  The term is used to contrast
   this with some higher-level interpretation of the bits, such
   as an integer or an {image}.  A {bit string} is similar but
   suggests an arbitrary, as opposed to predetermined, length.

   (1998-09-27)
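
   The contrast between a bit pattern and its interpretation can
   be illustrated by reading one 32-bit pattern in two ways.  The
   C sketch below (the value 0x42280000 and the variable names are
   only illustrative, not part of the entry) prints the same bits
   first as an unsigned integer and then as an IEEE-754 float.

   #include <stdio.h>
   #include <stdint.h>
   #include <string.h>

   int main(void)
   {
       uint32_t bits = 0x42280000u;   /* one arbitrary 32-bit pattern */
       float f;

       /* Copy the bytes unchanged so the same pattern receives a
          second interpretation, this time as a single-precision float. */
       memcpy(&f, &bits, sizeof f);

       printf("as unsigned integer: %lu\n", (unsigned long)bits);
       printf("as float:            %g\n", f);   /* 42 on IEEE-754 systems */
       return 0;
   }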