The same command run via SSH reveals that the numbers are kilobytes. Did you use 1000 (i.e. 10^3) or 1024 (i.e. 2^10) to convert from bytes to kilobytes?
Sorry if I’m a bit dull-witted on the matter, but if I understand you correctly you’re saying one Megabyte equals 1000^2 (i.e. 1000 squared = one million) bytes? So 10^3 is used everywhere and 2^10 is never used?
(I’m a bit confused because a subscript two is usually used to indicate notation in base two, i.e. 1000₂ = 8₁₀.)
Thanks for the pointer, but unfortunately this doesn’t answer my question: under GNU/Linux and many other operating systems, 2^10 is used as the factor between kilo, mega, giga, … when talking about memory (both physical and virtual), but 10^3 is used when talking about disk space and network traffic. This is because memory is usually manufactured in powers of two, so a DRAM chip sold as, say, one gigabyte actually holds more, namely 2^30 bytes (one gibibyte).
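To make the gap concrete, here is a small sketch comparing a nominal “1 GB” under the decimal (SI) and binary (IEC) readings; the values follow directly from the two factors:

```python
# Decimal (SI) reading: 1 GB = 10^9 bytes (factor 1000 per prefix step)
si_gb = 10**9

# Binary (IEC) reading: 1 GiB = 2^30 bytes (factor 1024 per prefix step)
iec_gib = 2**30

print(iec_gib)               # 1073741824
print(iec_gib - si_gb)       # 73741824 bytes difference
print(iec_gib / si_gb)       # 1.073741824, i.e. about 7.4 % larger
```

So at the “giga” level the two conventions already diverge by roughly 7.4 %, which is why it matters which factor a tool used.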
Flash chips are usually organized in blocks whose size is a power of two as well, so assuming a physical size that is a power of two is reasonable here, too. On the other hand, a percentage of those blocks is typically reserved for replacing bad blocks. Neither a factor of 1000 nor of 1024 gives me a power of two to better than 99 % accuracy, so I thought I’d rather ask than assume.
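The check I tried can be sketched like this: multiply the reported kilobyte figure by each candidate factor and test whether the result is a power of two (the sample value below is hypothetical, not taken from my device):

```python
def is_power_of_two(n: int) -> bool:
    # A positive integer is a power of two iff it has exactly one set bit.
    return n > 0 and n & (n - 1) == 0

reported_kb = 1048576  # hypothetical figure as printed by the tool

for factor in (1000, 1024):
    total_bytes = reported_kb * factor
    print(factor, total_bytes, is_power_of_two(total_bytes))
```

In my case neither factor produced a power of two, which is consistent with some blocks being reserved and is exactly why I can’t settle the question by inspection alone.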