Transfer 10 TB of files from USA to UK datacenter

Solution 1:

Ship hard drives across the ocean instead.

At 11 Mbps with full utilization, you're looking at just shy of 90 days to transfer 10 TB.


11 Mbps = 1.375 MBps = 116.015 GB/day.

10240 GB / 116.015 GB/day = ~88.3 days.
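The arithmetic above can be sanity-checked with a one-liner, assuming binary units throughout (11 Mbps taken as 11 × 1024² bits/s, 10 TB as 10 × 1024⁴ bytes, matching the figures above):

```shell
awk 'BEGIN {
  bytes = 10 * 1024^4        # 10 TB
  bps   = 11 * 1024^2 / 8    # 11 Mbps in bytes/s (1.375 MB/s)
  printf "%.1f days\n", bytes / bps / 86400
}'
```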

Solution 2:

I'd say rsync. At 11 MB/s you're looking at 10-14 days, and even if the transfer gets interrupted, rsync can easily pick up where it left off.
(Note the units: 11 MB/s is eight times faster than the 11 Mbps quoted in the question.)

At 11 Mbps, though, I'd ship the hard disks as suggested above :)


Solution 3:

Rsync of course.

You can resume at any point after an interruption, with no pain at all.


Solution 4:

Never underestimate the bandwidth of a station wagon full of tapes

-- Trad.

In your case, disks or tapes sent by courier, but the principle still applies. If you're not concerned about latency, this will be vastly cheaper than the network bandwidth needed to transfer 10 TB of data in any reasonable length of time.
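The principle is easy to put numbers on. Assuming, hypothetically, a 48-hour door-to-door courier carrying all 10 TB on disk, the effective bandwidth works out to:

```shell
# Effective bandwidth of couriered disks, assuming a hypothetical
# 48-hour transit time for the full 10 TB (binary units, as above).
awk 'BEGIN {
  bits = 10 * 1024^4 * 8     # 10 TB in bits
  secs = 48 * 3600           # 48-hour transit
  printf "%.0f Mbps\n", bits / secs / 1e6
}'
```

That is roughly 46 times the 11 Mbps link, before you even parallelize by shipping more than one box.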


Solution 5:

You should use rsync. With -z it compresses the data in transit, and its delta-transfer algorithm avoids resending blocks that already exist at the destination. It can also resume partial transfers, which is very important for any transfer this large.

With compression, it's likely you won't actually send 10 TB over the wire; if the data is mostly logs, text, and the like, the compressed stream could well be under 1 TB, perhaps far less.

There are tools that compress better than rsync's built-in zlib and find redundancy over much longer ranges, such as lrzip.

Some kinds of data don't compress well and contain no literal duplicates (already-compressed media such as video, for example). For those, FTP and rsync end up doing roughly the same amount of work.