When engineers of my advancing years talk about raw ADC outputs, they tend to call them counts. That word choice is not entirely a matter of habit or stubbornness. It reflects how those systems were actually experienced and debugged at the time. In early systems, the output of an analog-to-digital converter was not an abstract value retrieved from memory. It was a parallel word on a backplane or ribbon cable. If something was misbehaving, you clipped a logic analyzer onto the bus and watched the bits change. As you turned a knob or varied an input voltage, the pattern on the analyzer stepped up or down in integer increments. Those steps were the quantization bins of the converter made visible. What you saw were counts in the most literal sense. The converter was counting which bin the input landed in, and the hardware was showing you that count.
That perspective shaped how engineers thought and spoke. Noise was something that caused the least significant bits to flicker, so it was described as a certain number of counts RMS. Offset was a shift of the entire transfer function by some number of counts. Linearity errors were discussed in terms of missing counts or codes that repeated. None of that language required or even encouraged attaching physical units at that stage. The numbers were understood to be dimensionless indices produced by a quantizer. The mapping to volts, pressure, or mass lived elsewhere, usually in a calibration constant applied later.
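Expressed in code, the habit looks something like the following sketch. The sample values and the expected code here are purely illustrative, not taken from any real converter: with a constant input applied, the offset is how far the mean code sits from where it should, and the noise is the spread of the codes, both stated in counts with no physical units attached.

```python
import statistics

# Hypothetical raw ADC codes captured with a constant input applied.
# These values are illustrative, not from any real device.
codes = [2047, 2048, 2048, 2049, 2047, 2048, 2050, 2048, 2047, 2048]

expected_code = 2048  # the code the constant input "should" produce

mean_code = statistics.fmean(codes)
offset_counts = mean_code - expected_code     # offset, in counts
noise_counts_rms = statistics.pstdev(codes)   # noise, in counts RMS

print(f"offset: {offset_counts:+.2f} counts")
print(f"noise:  {noise_counts_rms:.2f} counts RMS")
```

Nothing in that calculation knows or cares what a count is worth in volts; the calibration constant that supplies the units lives elsewhere, exactly as described above.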
The notion of an LSB followed the same logic. An LSB was not primarily a bit weight in some abstract binary representation. It was the width of one quantization bin, the range of input values that produced the same output count. One count corresponded to one LSB step, and adjacent counts corresponded to adjacent bins. That interpretation was natural when you were watching the codes advance one at a time on a logic analyzer display.
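As a minimal numeric sketch of that interpretation (the 12-bit resolution and 3.3 V reference below are assumed for illustration, not drawn from any particular part), one LSB is the full-scale range divided by the number of codes, and each count names one bin of that width:

```python
bits = 12
full_scale_volts = 3.3  # assumed reference voltage, for illustration only

# One LSB is the width of one quantization bin.
lsb_volts = full_scale_volts / (1 << bits)

# Every input voltage inside this bin produces the same output count.
count = 1000
bin_low = count * lsb_volts
bin_high = (count + 1) * lsb_volts

print(f"1 LSB = {lsb_volts * 1e3:.3f} mV")
print(f"count {count} covers {bin_low:.4f} V to {bin_high:.4f} V")
```

One count, one bin: stepping the input across a bin boundary is exactly the one-code advance you would watch on the logic analyzer.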
Now, direct contact with the converter output has faded. ADCs moved inside microcontrollers and system-on-chip devices. The raw codes were no longer visible on a bus but were read from registers or memory buffers. In imaging systems especially, those integers quickly became pixels, array elements, and statistics. In that context, the term data number emerged. It fit a world in which the integer was treated primarily as data to be processed, averaged, and displayed, rather than as the immediate output of a measuring device. Data number is redundant. Once a quantity is a number, it is already data. Adding the word data does not clarify its meaning or origin; it merely reflects a shift in perspective away from measurement and toward representation.
Both terms suppress physical units, and both require an external scale factor to recover meaning. A count by itself is no more a volt or an electron than a data number is. The difference lies in what the term invites you to think about. Counts point back to the act of quantization and to the hardware that produced the number. Data number points forward to the pipeline that will consume it.
If you spent years watching ADC outputs on a logic analyzer, it is hard not to hear “counts” as the more honest description. It names what was actually observed. The number was not yet data in any richer sense. It was simply the converter telling you, one integer at a time, where the input landed.
I am bilingual on this point, but I prefer counts to DNs.