I'm very conscious of notation conventions when expressing data rates. For whatever reason, I've settled on "Mbps" for "megabits per second" and "MB/s" for "megabytes per second." This works fine for my own purposes, but the proliferation of conventions can lead to disastrous misunderstandings. Just this week, I ran jobs (encoding H.264 video) that took two days on three computers. If, when estimating how long the jobs would take, I'd mistaken someone's "MBPS" notation for "megabytes per second" when they really meant "megabits per second," I'd have been waiting an extra two weeks: bits versus bytes is a factor of eight, which turns a two-day estimate into more than two weeks of actual run time.
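Here's a minimal sketch of that arithmetic. The two-day figure is from my own jobs; everything else is illustrative. Misreading a bits-per-second rate as bytes per second inflates the assumed throughput by 8x, so the work takes 8x longer than estimated:

```python
BITS_PER_BYTE = 8

estimated_days = 2.0                           # expected time if "MBPS" meant MB/s
actual_days = estimated_days * BITS_PER_BYTE   # real time if it meant Mb/s

extra_days = actual_days - estimated_days
print(f"Estimated: {estimated_days:.0f} days, actual: {actual_days:.0f} days")
print(f"Extra wait: {extra_days:.0f} days (~{extra_days / 7:.0f} weeks)")
# Estimated: 2 days, actual: 16 days
# Extra wait: 14 days (~2 weeks)
```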
Engineer Lee Goldberg has written a nice treatise on the state of data rate notation.
For my part, I'll try to start using Goldberg's "old-school" electrical-engineering conventions, which are invulnerable to misinterpretation:
- xbits/s (with x = k, M, or G for kilo-, mega-, and giga-)
- xbytes/s
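As a small illustration of the convention (my own sketch, not anything from Goldberg's article), here's a helper that renders a raw bits-per-second figure in the unambiguous "xbits/s" / "xbytes/s" style:

```python
def format_rate(bits_per_s: float, as_bytes: bool = False) -> str:
    """Render a data rate as e.g. '12.5 Mbytes/s' or '100 Mbits/s'."""
    value = bits_per_s / 8 if as_bytes else bits_per_s
    unit = "bytes/s" if as_bytes else "bits/s"
    # Pick the largest SI prefix that keeps the number >= 1.
    for prefix, scale in (("G", 1e9), ("M", 1e6), ("k", 1e3)):
        if value >= scale:
            return f"{value / scale:g} {prefix}{unit}"
    return f"{value:g} {unit}"

print(format_rate(100e6))                 # 100 Mbits/s
print(format_rate(100e6, as_bytes=True))  # 12.5 Mbytes/s
```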