Noun
binary-coded decimal (plural binary-coded decimals)
(computing) A decimal number encoded in such a way that each digit is represented by its own binary sequence, simplifying conversion from binary to decimal.
One nibble corresponds to one digit in hexadecimal and holds one digit or a sign code in binary-coded decimal.
Consequently, a system based on binary-coded decimal representations of decimal fractions avoids errors in representing and calculating such values.
From there, it is converted by the decoder unit into a decimal number (usually binary-coded decimal), and then shown on the display panel.
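The encoding described above can be sketched in a few lines of Python: each decimal digit is packed into its own 4-bit nibble, so the hexadecimal rendering of the result reads the same as the original decimal number. This is an illustrative sketch of packed BCD for non-negative integers (no sign nibble), not a definitive implementation.

```python
def to_bcd(n: int) -> int:
    """Encode a non-negative integer as packed BCD,
    one 4-bit nibble per decimal digit."""
    bcd, shift = 0, 0
    while True:
        bcd |= (n % 10) << shift  # place the low decimal digit in the next nibble
        n //= 10
        shift += 4
        if n == 0:
            return bcd

def from_bcd(bcd: int) -> int:
    """Decode packed BCD back to an ordinary integer."""
    n, place = 0, 1
    while bcd:
        n += (bcd & 0xF) * place  # read one nibble as one decimal digit
        bcd >>= 4
        place *= 10
    return n

# The hex form of the BCD value mirrors the decimal digits:
print(hex(to_bcd(1995)))        # → 0x1995
print(from_bcd(to_bcd(1995)))   # → 1995
```

Because conversion works digit by digit, no multiplication or division by powers of two is needed to drive a decimal display, which is why decoder units of the kind mentioned above favour this representation.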