The author decidedly has expert syndrome -- they deny both the history of and rationale behind memory-unit nomenclature. Memory measurements evolved around the binary organizational patterns used in computing architectures. A proud French pedant might agree with the decimal normalization of memory units discussed here, since it aligns more closely with the metric system and may have benefits for laypeople, but it fails to account for how memory is partitioned in historic and modern computing.
It’s not them denying it, it’s the LLM that generated this slop.
All they had to say was that KiB et al. were introduced in 1998, and that adoption has been slow.
And not “but a kilobyte can be 1000,” as if it’s an effort issue.
They are managed by different standards organizations. One doesn't like the other encroaching on its turf. "kilo" has only one official meaning as a base-10 scalar.
What are you talking about? The article literally fully explains the rationale, as well as the history. It's not "denying" anything. Seems entirely reasonable and balanced to me.
They are definitely denying the importance of 2-fold partitioning in computing architectures. VM_PAGE_SIZE is not defined with the value of '10000' for good reason (in many operating systems it is set to '16384').
That's why I said "usually acceptable depending on the context". In spoken language I also don't like the awkward and unusual pronunciation of "kibi". But I'll still prefer to write in KiB, especially if I document something.
Also, if you open a major Linux distro's task manager, you may be surprised to see that values are often shown in decimal units whenever the "i" is missing from the prefix. Many utilities avoid the ambiguous prefixes "KB", "MB"... and use "KiB", "MiB"... instead.
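To make the distinction concrete, here is a minimal sketch (names and formatting choices are mine, not from any tool mentioned in the thread) showing the same byte count rendered with decimal (SI) prefixes versus binary (IEC) prefixes:

```python
# Sketch: render a byte count with decimal (SI, factor 1000) or
# binary (IEC, factor 1024) prefixes. Illustrative only.

SI = ("B", "kB", "MB", "GB", "TB")
IEC = ("B", "KiB", "MiB", "GiB", "TiB")

def human(n: float, base: int, prefixes: tuple) -> str:
    """Scale n down by `base` until it fits a prefix, then format it."""
    for prefix in prefixes:
        if abs(n) < base or prefix == prefixes[-1]:
            return f"{n:.4g} {prefix}"
        n /= base

page = 16384  # a common VM page size, i.e. 2**14
print(human(page, 1000, SI))   # decimal reading: "16.38 kB"
print(human(page, 1024, IEC))  # binary reading:  "16 KiB"
```

The point of the example: a power-of-two quantity like a page size comes out as a round number only under the binary prefixes, which is exactly why the "i" matters in a task manager's display.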
No they're not? They very specifically address it.
Why do you keep insisting the author is denying something when the author clearly acknowledges every single thing you're complaining about?