What does an Internet speed of 4 Mbit/s mean?

This speed refers to the link between you and your ISP. It does not guarantee that you can get that speed from any place on the Internet.

Let's take an example where you upload a file from your desktop to a server in London:

  1. Data is on your PC.
  2. Data leaves via your local LAN to the default gateway (most likely at 100 Mbit/s or 1 Gbit/s if you have a wired network).
  3. Data arrives at the modem and is uploaded at 4 Mbit/s to your ISP. If that is a global ISP then it will be uploaded to their local data center.
  4. Data is then routed in an unspecified way to the server in London.

Step 4 is intentionally vague. The routing can change if lines go down, if inter-ISP connections change, or if lines are overloaded and traffic is deliberately rerouted. It was intentionally built to be this flexible; if you want more detail on why, look up ARPANET and the Cold War.

... and my Internet speed is 4 Mbps, will I upload this file in 1 second irrespective of the server's actual physical location (be it Australia or New York or any other location in the world)?

Assuming the 4Mb/sec is the slowest link in the path to the destination: Yes.
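If it helps to see the "weakest link" idea as numbers, here is a minimal Python sketch (the hop names and speeds are made up for illustration): the end-to-end throughput is simply the smallest speed on the path.

    # Hypothetical link speeds along the path, in Mbit/s.
    links_mbit = {
        "PC -> router (wired LAN)": 1000,      # 1 Gbit/s
        "modem -> ISP (your uplink)": 4,       # the 4 Mbit/s you pay for
        "ISP backbone -> London server": 10000,
    }

    # The whole path can only go as fast as its slowest link.
    bottleneck = min(links_mbit.values())
    print(f"Effective throughput: {bottleneck} Mbit/s ({bottleneck / 8} MB/s)")
    # -> Effective throughput: 4 Mbit/s (0.5 MB/s)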

It might help if you think of these two analogies:

  1. Build a chain with links of different thickness. The chain is only as strong as its weakest link. Or think of a set of pipes: your local pipe is 4 cm wide. Flow through it will not go faster if it is connected to a bigger pipe, but it can slow down if it has to go through a thinner pipe (e.g. if the server in London is on a 33600 bps modem).
  2. As for routing: you do not set up a full path to the destination. It is more like posting a letter: if it is for a local house, put it in their mailbox; otherwise put it in the postbox. You do not care how the mail flows internally, as long as it arrives. IP routing is similar.

Bits vs Bytes

Bit = A Single 1 or 0

  • = _

Byte = 8 1's or 0's

  • = _ _ _ _ _ _ _ _

  • To get [Bytes per Second] (or megabytes, gigabytes, etc.), simply take ___ bits and divide by 8.
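As a quick sketch of that divide-by-8 rule, using the 4 Mbit/s figure from the question:

    # A speed quoted in bits per second, divided by 8, gives bytes per second.
    def bits_to_bytes(bits_per_second):
        return bits_per_second / 8

    print(bits_to_bytes(4_000_000))  # 4 Mbit/s -> 500000.0 bytes/s (0.5 MB/s)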

Storage is measured in bytes, why?

  • Bytes are [Data] because a Byte, being 8 1's and 0's, adds up to make [A Single Letter]. Letters are information to a computer, but a single bit means nothing until you have 8 bits.
  • Bytes are 8 bits.
  • 1 MegaByte is 1000 KiloBytes, 1000 MegaBytes make a GigaByte, etc. (metric prefixes).
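To see the "one byte is a single letter" point concretely, here is a tiny sketch that prints the 8 bits behind the letter "A" (assuming the usual ASCII encoding):

    # 'A' is stored as the byte 01000001 (decimal 65 in ASCII).
    letter = "A"
    bits = format(ord(letter), "08b")
    print(bits)       # -> 01000001
    print(len(bits))  # -> 8 (one byte)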

==>> Important Update <<==

For those of you trying to correct my Bytes:

Please visit Wikipedia /wiki/Mebibyte

Thank you.

End Of UPDATE

Data Transfer is measured in BITS, why?

Because the smallest piece of information you can send is a 1 or a 0 (on or off). So if you turn on a flashlight, that's "On", that's a 1, and if you turn it off, that's "Off" or 0. This is how computers talk to each other: by pulsing 1's and 0's at each other.

But how fast do they pulse at each other in a second?

Well, that would be how many bits per second?

So we say "Bits per second".


I'm assuming the file size is 4 megabits, even though file sizes are usually measured in bytes (8 bits). This means the file is 4,000,000 bits large.

If the connection between you and the receiving party is 4 Mbit/s (4,000,000 bits per second) exactly, without speed changes during the transfer, the transfer will take exactly 1 second to complete. The total time between you starting the transfer and it actually completing may be larger due to the latency between you and the recipient.
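As a sketch of that arithmetic (the latency figure here is an assumption, just to show that it adds on top of the 1 second):

    # 4,000,000 bits over a 4,000,000 bit/s link takes 1 second of pure transfer.
    file_size_bits = 4_000_000
    link_speed_bps = 4_000_000
    latency_seconds = 0.150  # example round-trip delay to a far-away server

    transfer_time = file_size_bits / link_speed_bps
    print(f"Raw transfer time: {transfer_time:.1f} s")                             # 1.0 s
    print(f"Including latency: at least {transfer_time + latency_seconds:.2f} s")  # 1.15 s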

When using a site like speedtest.net, you test the transfer speed between your computer and one of their test servers (They show a little map indicating the position of the server). The result of this test depends heavily on intermediate networks, since your final speed will be that of the slowest link in the chain.