From Jargon File (4.4.4, 14 Aug 2003):
hexadecimal
n.
Base 16. Coined in the early 1960s to replace earlier sexadecimal,
which was too racy and amusing for stuffy IBM, and later adopted by
the rest of the industry.
Actually, neither term is etymologically pure. If we take binary to be
paradigmatic, the most etymologically correct term for base 10, for
example, is `denary', which comes from `deni' (ten at a time, ten
each), a Latin distributive number; the corresponding term for base-16
would be something like `sendenary'. "Decimal" comes from the
combining root of decem, Latin for 10. If we wish to create a truly
analogous word for base 16, we should start with sedecim, Latin for
16. Ergo, sedecimal is the word that would have been created by a
Latin scholar. The `sexa-' prefix is Latin but incorrect in this
context, and `hexa-' is Greek. The word octal is similarly incorrect;
a correct form would be `octaval' (to go with decimal), or `octonary'
(to go with binary). If anyone ever implements a base-3 computer,
computer scientists will be faced with the unprecedented dilemma of a
choice between two correct forms; both ternary and trinary have a
claim to this throne.
From The Free On-line Dictionary of Computing (8 July 2008):
hexadecimal
sexadecimal
<mathematics> (Or "hex") {Base} 16. A number representation
using the digits 0-9, with their usual meaning, plus the
letters A-F (or a-f) to represent hexadecimal digits with
values of (decimal) 10 to 15. The right-most digit counts
ones, the next counts multiples of 16, then 16^2 = 256, etc.
For example, hexadecimal BEAD is decimal 48813:
    digit     weight          value
    B = 11    16^3 = 4096     11*4096 = 45056
    E = 14    16^2 =  256     14* 256 =  3584
    A = 10    16^1 =   16     10*  16 =   160
    D = 13    16^0 =    1     13*   1 =    13
                                        -----
                              BEAD    = 48813
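A minimal C sketch of the same positional arithmetic (the helper name
hex_digit_value is purely illustrative, not part of any standard library):

    #include <stdio.h>

    /* Map a hexadecimal digit character to its value 0-15. */
    static int hex_digit_value(char c)
    {
        if (c >= '0' && c <= '9') return c - '0';
        if (c >= 'A' && c <= 'F') return c - 'A' + 10;
        if (c >= 'a' && c <= 'f') return c - 'a' + 10;
        return -1;                       /* not a hexadecimal digit */
    }

    int main(void)
    {
        const char *hex = "BEAD";
        unsigned long value = 0;

        /* Multiply the running total by the base (16) and add the next
           digit; this accumulates the same weights as the table above:
           11*4096 + 14*256 + 10*16 + 13*1 = 48813. */
        for (const char *p = hex; *p != '\0'; p++)
            value = value * 16 + hex_digit_value(*p);

        printf("hexadecimal %s = decimal %lu\n", hex, value);  /* 48813 */
        return 0;
    }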
There are many conventions for distinguishing hexadecimal
numbers from decimal or other bases in programs. In {C} for
example, the prefix "0x" is used, e.g. 0x694A11.
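For instance, a minimal C fragment using that convention (0x694A11 is just
the arbitrary constant from above):

    #include <stdio.h>

    int main(void)
    {
        int n = 0x694A11;      /* hexadecimal literal, written with the "0x" prefix */
        printf("%d\n", n);     /* decimal form:     6900241  */
        printf("%#x\n", n);    /* hexadecimal form: 0x694a11 */
        return 0;
    }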
Hexadecimal is more succinct than {binary} for representing
{bit-masks}, machine addresses, and other low-level constants,
but it is still reasonably easy to split a hex number into
different bit positions, e.g. the top 16 bits of a 32-bit word
are the first four hex digits.
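A minimal C sketch of that last point (0xDEADBEEF is only an illustrative
32-bit constant):

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        uint32_t word = 0xDEADBEEF;      /* 32-bit value: eight hex digits */
        uint32_t top  = word >> 16;      /* top 16 bits -> 0xDEAD */
        uint32_t low  = word & 0xFFFF;   /* low 16 bits -> 0xBEEF, via a bit-mask */

        /* The top half is exactly the first four hex digits of the word. */
        printf("word = %08" PRIX32 "\n", word);
        printf("top  = %04" PRIX32 "\n", top);
        printf("low  = %04" PRIX32 "\n", low);
        return 0;
    }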
The term was coined in the early 1960s to replace earlier
"sexadecimal", which was too racy and amusing for stuffy
{IBM}, and later adopted by the rest of the industry.
Actually, neither term is etymologically pure. If we take
"binary" to be paradigmatic, the most etymologically correct
term for base ten, for example, is "denary", which comes from
"deni" (ten at a time, ten each), a Latin "distributive"
number; the corresponding term for base sixteen would be
something like "sendenary". "Decimal" is from an ordinal
number; the corresponding prefix for six would imply something
like "sextidecimal". The "sexa-" prefix is Latin but
incorrect in this context, and "hexa-" is Greek. The word
{octal} is similarly incorrect; a correct form would be
"octaval" (to go with decimal), or "octonary" (to go with
binary). If anyone ever implements a base three computer,
computer scientists will be faced with the unprecedented
dilemma of a choice between two *correct* forms; both
"ternary" and "trinary" have a claim to this throne.
[{Jargon File}]
(1996-03-09)