When talking about anything technology- or engineering-related, especially when multiple companies are involved, standards are an absolute requirement to ensure that software and hardware play nicely together. These standards are often filled with technical jargon and field-specific terminology, and rightfully so: they need to be very specific about what they're standardizing. A blueprint that does not indicate its scale may yield a product far off from what it's supposed to be. Software, computers, and consumer devices are no different; however, there is an extra element thrown into the scenario: the consumers. This is especially true with the explosive growth of the internet and the high-tech devices floating around in every purse and pocket. Blueprints and industry standards are generally only viewed by, and useful to, the engineers themselves. Lately, though, standards are being marketed to the mass population to help sell products and services (all this jargon makes some people glaze over and just assume the company must know what it's talking about). How many boxes for wifi products sitting in retail stores boast '802.11N standard, get x-times the speed and range!'? What isn't mentioned very often is that the increased range only applies if you use compatible hardware from the same manufacturer, since the N standard still isn't a finalized technical standard and is still changing.
This rant, however, is not about hardware manufacturers upselling their products with technology buzzwords that confuse consumers; this rant is dedicated to internet service providers, specifically Verizon and Comcast. They get to exploit a widely misinterpreted tidbit of terminology: the units for data, i.e., bits and bytes. Many IT-inclined people are well aware of the definitions of bits and bytes, their differences, and their connection to bitrates and bandwidth. For those of you who don't know this IT jargon, here is a quick computer lesson.
* Bit - The smallest unit of data. A single value that is either a '0' or a '1' (on or off, etc.).
* Byte - 8 bits grouped together; in ASCII, one byte forms a letter (essentially).
* A = 01000001
* B = 01000010
* C = 01000011
Each letter is created by combining 8 bits in a specific arrangement. So, when you read a word on the screen, the computer is actually storing a long string of 0's and 1's in a specific order to represent that word. If the word is "Hello", the string in the computer is "01001000 01100101 01101100 01101100 01101111": 40 bits of data to represent 5 letters.
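The decoding above can be sketched in a few lines of Python, using only builtins; the bit string is the one from the "Hello" example:

```python
# The 8-bit groups from the "Hello" example, exactly as written above.
bits = "01001000 01100101 01101100 01101100 01101111"

# int(group, 2) parses a group of 8 bits as a binary number,
# and chr() maps that number to its ASCII character.
word = "".join(chr(int(group, 2)) for group in bits.split())

print(word)                        # Hello
print(len(bits.replace(" ", "")))  # 40 -- bits used for 5 letters
```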
* Mbps - Million bits per second; 2Mbps is 2 million bits of information transferred in a second.
* MB/s - Million bytes per second; 2MB/s is 2 million bytes of information transferred in a second.
* 1Mbps = 0.125MB/s
* 1MB/s = 8Mbps
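The two conversions above are just a factor of 8 in either direction; a quick sketch (the function names are my own):

```python
def mbps_to_mb_s(mbps):
    """Megabits per second -> megabytes per second (8 bits per byte)."""
    return mbps / 8

def mb_s_to_mbps(mb_s):
    """Megabytes per second -> megabits per second."""
    return mb_s * 8

print(mbps_to_mb_s(1))  # 0.125
print(mb_s_to_mbps(1))  # 8
```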
With this in mind: network speeds are all listed in bits, the smaller of the two; file sizes are listed in bytes, the larger of the two. Therefore, a connection speed of 1Mbps does NOT mean you can download a 1MB file in a second; it takes 8 seconds instead. The same applies to the connection speeds that, e.g., Comcast provides; they offer 5Mbps and 12Mbps speeds in one area, and I'll use the faster of the two in this illustration. 12Mbps means literally 12 million bits per second. Blazingly fast, right? Well... at the 8-to-1 ratio, that works out to 1.5 million bytes of actual data per second. On top of that there is a network limitation of roughly an 80% utilization cap, as the other 20% is consumed by background communication and protocol overhead just to send the information, so knock that down to about 1.2 million bytes per second (1.2MB/s).
Another limitation with cable is shared access: you don't get that 1.2MB/s to yourself; a group of people share a chunk of the overall trunked connection. (A trunk is a term for combining multiple lines together.) So, even if the cable trunk supports speeds 10x faster than their fastest offered connection (your cable modem will hard-enforce the speed that you are "allowed" to get), with 20 people connected to that trunked line your speed ranges between 1.2MB/s and 0.6MB/s, all depending on how much your neighbors are using their internet. This limitation does not apply quite as much to fiber connections such as Verizon's FiOS, where you get a dedicated connection of up to 100Mbps to their data center (that's where the trunking occurs, but it's generally at much higher speeds than cable's trunking).
So, here's a breakdown for the two common internet service technologies. With a 12Mbps connection at the best possible speed, it would take someone 2.5 seconds to download a 3MB song and 84 seconds to download a 100MB video; much longer than the presumed 'couple-second download time' that most people expect.
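Those download times follow from the same arithmetic; a sketch assuming the 80% utilization figure from above (the function name is my own):

```python
def download_seconds(file_mb, link_mbps, utilization=0.8):
    """Seconds to download file_mb megabytes over a link_mbps link,
    dividing by 8 (bits -> bytes) and applying the overhead cap."""
    effective_mb_s = link_mbps / 8 * utilization  # 12Mbps -> 1.2MB/s
    return file_mb / effective_mb_s

print(download_seconds(3, 12))    # 2.5 -- the 3MB song
print(download_seconds(100, 12))  # ~83.3 -- the 100MB video (roughly 84s)
```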
This leads directly into the main element of this rant, which (aside from Comcast's marketing BS) is Verizon's marketing BS. In a recent radio commercial, Verizon states that with a 20Mbps download and upload speed, you can "upload a 20 meg photo gallery in a second". For this reason alone, someone needs to bitch slap whoever approved that advertisement. Based on the method for calculating actual network speeds (Z / 8 * 0.8, where Z is the bitrate): 20 / 8 = 2.5, and 2.5 * 0.8 = 2.0MB/s. At a 2.0MB/s transfer speed it would take a full 10 seconds to upload or download a 20MB file, not even remotely close to their claim. If, however, they were talking about a 20Mbit file (which would be absurd, because no one uses bits to indicate file size), then it would take 1.25 seconds to transfer, much closer to their claim. But that would actually be only a 2.5MB file under that premise. [If anyone happens to have a recording of that commercial, please post it..]
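Checking Verizon's claim with the same Z / 8 * 0.8 formula:

```python
link_mbps = 20
effective_mb_s = link_mbps / 8 * 0.8  # Z / 8 * 0.8 = 2.0MB/s

print(20 / effective_mb_s)      # 10.0 -- seconds for a 20MB gallery, not 1
print(20 / (link_mbps * 0.8))   # 1.25 -- seconds if "20 meg" meant megabits
print(20 / 8)                   # 2.5  -- megabytes actually in a 20Mbit file
```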
A direct analogy: it's like going to a car dealer and buying a car on the premise that it does 0-100 in 5 seconds and has a top speed of 200, only to realize after the fact that the asshole who told you that was talking about km/h, even though you both live in a country that uses MPH for speed.