Don't you hate how some hard drive and DVD-R manufacturers use 1000 for everything instead of 1024?
For those of you who don't know what I'm talking about: in the binary convention that operating systems use, a kilobyte is 1024 bytes, 1024 kilobytes make a megabyte, and 1024 megabytes make a gigabyte. Manufacturers like to use 1000 for all of those values. The difference? You're being shortchanged 24 bytes out of every kilobyte, which might not seem like a lot, but once it compounds up through the units, you can be short several gigs.
Example: if you have a "100" gigabyte hard drive, that's 100 x 1000^3 actual bytes. Divide by the real size of a gigabyte and you get 100 x 1000^3 / 1024^3, which works out to about 93.1 gigabytes. You lose almost 7 gigabytes just because they use a smaller unit. That way, they can advertise more storage than your computer will ever report.
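If you want to check the math yourself, here's a quick sketch (the function name is just something I made up) that converts a manufacturer's decimal gigabytes into the binary gigabytes your OS reports:

```python
def advertised_to_binary_gb(advertised_gb: float) -> float:
    """Convert decimal GB (1000^3 bytes) to binary GB (1024^3 bytes)."""
    total_bytes = advertised_gb * 1000**3   # what the box actually holds
    return total_bytes / 1024**3            # what the OS will show you

for size in (100, 250, 500):
    real = advertised_to_binary_gb(size)
    print(f'"{size} GB" drive -> about {real:.1f} GB as reported by the OS')
```

Run it and you'll see the gap grows with drive size: the bigger the drive, the more "missing" space.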
Thoughtful replies are welcome.