S'ok, I did mention that was slanted towards the geeks.
Anyhow, binary is how computers store information. You see, computers use electrical charges which are either present (a 1) or absent (a 0). Each charge or non-charge is called a "bit." Obviously "on" and "off" alone have a very limited variety of uses, so these bits were grouped into units of 8. Each group of 8 is called a "byte."
Think about it like this: a penny is the base unit of American currency, but for anything practical you deal in bigger coins. A nickel is just 5 pennies bundled together, the same way a byte is just 8 bits bundled together.
So really, the actual "smallest piece of information you can have in a computer" is a "bit." But the smallest "useful" piece of information in a computer is a "byte."
So from the beginning, computers were wrapped up in the powers of 2: a bit has 2 possible values, and even a byte's value is worked out by treating each of its 8 bit positions as a power of 2 (1, 2, 4, 8, all the way up to 128). There was no real reason to deviate from this.
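If you want to see that in action, here's a quick Python sketch (the variable names are just mine, for illustration) showing how 8 bits add up to one byte's value:

```python
# A byte is 8 bits; each bit position stands for a power of 2.
# From right to left the positions are worth 1, 2, 4, 8, 16, 32, 64, 128.
bits = [1, 0, 1, 0, 1, 0, 1, 0]   # the byte 10101010, written left to right

# Add up 2**position for every position holding a 1.
value = sum(bit * 2**exp for exp, bit in enumerate(reversed(bits)))
print(value)   # 170
print(2**8)    # 256 -- the number of distinct values one byte can hold
```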
And RAM (where computers keep information while they're on, as opposed to hard disks, where they keep information when they're off) was designed with this in mind: the smallest piece of information is 8 bits, and data grows in powers of 2.
When RAM sizes got up pretty high, people decided they ought to start using prefixes everyone already understood: kilo, mega, etc. But the closest a power of 2 gets to an actual kilo (1000) is 2^10 = 1024, because of the whole "data grows by powers of 2" thing. So in computing, "kilo" meant 1024, and that was used for a long time.
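To make that concrete, here's the arithmetic in a few lines of Python, showing why 1024 is as close to a true kilo as the powers of 2 allow:

```python
# Powers of 2 around 1000 -- 1024 is as close to "kilo" as you can get.
for exp in (9, 10, 11):
    print(f"2**{exp} = {2**exp}")
# 2**9  = 512
# 2**10 = 1024   <- the computer "kilo"
# 2**11 = 2048
```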
Then, once we reached the "mega" and "giga" sizes, manufacturers realized that if a drive held 60,000,000 bytes they could sell it as 60 "mega" (after all, that's what mega means anywhere else) and not have to come up with the extra 2,914,560 bytes that a true 60 megabytes requires (62,914,560). So corporate greed left the computer world with two meanings for kilo, forever: 1024 or 1000, whichever is more... convenient.
And from that moment on, consumers everywhere have been duped... buying a 60 GB hard drive that only formats to about 55.9 GB, and so on and so forth.
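Here's that 60 GB example worked out in Python, just to show where the "missing" space goes (nothing is actually missing; it's the same bytes counted two ways):

```python
# A "60 GB" drive as advertised: 60 billion bytes (decimal giga).
advertised_bytes = 60 * 10**9

# The same drive as the OS reports it: binary gigabytes (2**30 bytes each).
binary_gb = advertised_bytes / 2**30
print(round(binary_gb, 1))   # 55.9 -- hence a "60 GB" drive formatting short
```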
The bandwidth measurement difference here, though, is actually the result of a completely different issue.
Back in the day, modems were these nasty things that required you to pick up a telephone, dial a number, wait for some nasty screeching sounds, and then (quite literally) set the handset down on the modem, which was like an electronic ear and mouth.
And they were measured in "baud" (strictly symbols per second, but at those speeds one symbol carried one bit, so baud and bits per second came out to the same number) because they were so damned slow nobody figured it would ever matter anyhow. So a 300 baud modem would transfer 300 bits (about 37 bytes) per second. (That's roughly 2 kilobytes a minute, for those keeping track.)
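If you want to check that math, here it is in Python:

```python
# 300 bits per second, 8 bits to the byte.
bits_per_sec = 300
bytes_per_sec = bits_per_sec / 8     # 37.5 bytes every second
bytes_per_min = bytes_per_sec * 60   # 2250 bytes every minute
print(bytes_per_min / 1024)          # ~2.2 -- about 2 "computer kilobytes" a minute
```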
So from the very beginning, bandwidth was measured in bits. Since networking was mostly academic and governmental back then, using a name that would make sense in the long run mattered less than using a name that was accurate.
It's stayed that way for the same reason... selling a 256k line sounds a hell of a lot better than selling a 32K line, doesn't it? Just marketing and greed...
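And in case it's not obvious that those two are the exact same line, here's the conversion in Python:

```python
# One line, two labels: kilobits vs. kilobytes (8 bits to the byte).
line_kilobits = 256
line_kilobytes = line_kilobits / 8
print(f"{line_kilobits} kilobits/sec = {line_kilobytes:.0f} kilobytes/sec")
# 256 kilobits/sec = 32 kilobytes/sec -- same speed, bigger-sounding number
```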
*shakes head* Huh? Whaaa???
:lol-sign: