Archived posting to the Leica Users Group, 1999/10/13
>In fact "Digital" means using only the digits 0 and 1, these
>numbers producing the simplest scale (binary scale) that uses
>positional notation where the value of each digit is multiplied
>by a power of the base depending on its position.

This is also incorrect. The digits in digital encoding are only required to be individual, discrete numerals. The dictionary I quoted earlier gave the Arabic sequence from 0 to 9 because base-10 decimal systems are, generally speaking, the most common for numerical work, but a representation could use any number base. Computer systems generally use binary representations because it is so easy to represent a '1' and a '0' with an on-off switch and a little bit of electricity. The number manipulations themselves, though, are often carried out on decimal digits, hence the term "binary-coded decimal" (BCD). How you represent the numbers is independent of the encoding of the information; a short illustration follows below.

Godfrey
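A minimal Python sketch of that last point: the same value written out in several positional bases, plus a binary-coded-decimal rendering. The helper to_base and the example value 1999 are illustrative choices, not anything from the thread.

    # The same value rendered in several positional bases, showing that the
    # choice of base (the representation) is separate from the value encoded.
    def to_base(n: int, base: int) -> str:
        """Render a non-negative integer in the given base (2..16)."""
        digits = "0123456789abcdef"
        if n == 0:
            return "0"
        out = []
        while n:
            n, r = divmod(n, base)
            out.append(digits[r])
        return "".join(reversed(out))

    value = 1999
    for base in (2, 8, 10, 16):
        print(f"base {base:>2}: {to_base(value, base)}")
    # base  2: 11111001111
    # base  8: 3717
    # base 10: 1999
    # base 16: 7cf

    # Binary-coded decimal: each decimal digit stored as its own 4-bit group,
    # i.e. decimal digits carried on top of a binary representation.
    bcd = " ".join(format(int(d), "04b") for d in str(value))
    print(f"BCD    : {bcd}")  # 0001 1001 1001 1001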