Moving a large number of files (~100,000)

Perhaps consider using a pure command-line method to transfer very large numbers of files; you will undoubtedly find the process substantially faster than using a GUI.

There are many different ways to accomplish this, but the following worked quickly, safely and efficiently on my system:

find . -maxdepth 1 -type f -print0 | xargs -0 mv -t <destination>

Some explanation for this command:

  1. Your input directory is the '.' character, and for this particular command you need to be in that directory.
  2. Your output directory is the <destination> in my example. Obviously modify this to suit your own needs, and leave out the brackets.
  3. As a bonus, this syntax handles filenames with spaces (thanks to -print0 and xargs -0) :)
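
Before committing to the move, it can be worth previewing exactly what will run. A minimal sketch: prepending echo makes xargs print each mv invocation instead of executing it (<destination> is still your target directory):

find . -maxdepth 1 -type f -print0 | xargs -0 echo mv -t <destination>

Once the output looks right, drop the echo and run the real command.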

Endless permutations are possible, but this should work well and far more efficiently than the GUI. One permutation, for example: if you wanted to move only PDF files, you could run the following (note that -maxdepth is a global option, so it belongs before tests like -iname):

find . -maxdepth 1 -type f -iname "*.pdf" -print0 | xargs -0 mv -t <destination>

Use of xargs opens up many possibilities, particularly when moving such a large number of files.
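
For example, a minimal sketch of batching with the standard -n flag: limiting each mv invocation to at most 1,000 files keeps individual commands short and makes partial progress easier to follow if something fails midway:

find . -maxdepth 1 -type f -print0 | xargs -0 -n 1000 mv -t <destination>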

Potential Problems:

There are at least two potential pitfalls to ponder; thanks to the commenters below for these thoughts:

  1. Your destination directory could be corrupt, in a subsequently unreachable location, mistyped, etc. mv will still move the files there! Be careful here...
  2. If the -t (--target-directory) option is missing and the destination is actually a file, you will move one file and fail on the rest. mv has two uses: rename source to destination, or move source into a directory. Again, be careful... A simple guard is sketched below.
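
One way to guard against both pitfalls is to verify the destination before piping anything into mv. A minimal script sketch, assuming a hypothetical target directory /srv/sorted:

test -d /srv/sorted || { echo "destination is not a directory" >&2; exit 1; }
find . -maxdepth 1 -type f -print0 | xargs -0 mv -t /srv/sorted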

I had a similar experience before; it is normal when dealing with a large number of files. I had a large collection of PDF datasheets (electronic parts).

GUI tools check for file details and metadata (icon/thumbnail, size, ...), which becomes a big deal in such a case. Even in icon view and without thumbnails, they will freeze, as most of them are not designed for such an extreme case. A GUI tool tries to load presentation icons for every file and folder in the directory, even items that are not visible in the current screen portion. Sorting is also part of the problem, and there is no way to avoid it.

  • I ended up splitting the files into separate folders based on brand/model, fewer than 10,000 each. Maybe you can use the date (as most people do with photos/scans) or the first letter(s) (as in the Ubuntu package repository); a sketch follows this list.
  • It is easier to use CLI tools instead, as they show only what you have requested. You can use locate for a quick search instead of find.
  • For the move operation, use mv in a terminal (GUI tools are slow because they try to update the view periodically).

    If source and destination are on the same partition, the command only changes pointers in the filesystem index. If not, it becomes a dual operation (copy & delete), which is expensive.
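
A minimal sketch of the first-letter split mentioned above, assuming bash and a flat folder of PDFs (the datasheets/ path is hypothetical):

cd datasheets || exit 1
shopt -s nullglob                   # skip the loop entirely if no PDFs match
for f in *.pdf; do
  d=${f:0:1}                        # first character of the filename
  mkdir -p -- "$d" && mv -- "$f" "$d/"
done

Within the same partition these moves are just renames, so even 100,000 files finish quickly.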

There is one more case where I can help: if you copy those files multiple times and they are not updated in between. As I found when sharing my collection with friends, each time I tried to copy it, it took a decade. (This is most useful with many small files.)

  • Create a single package, or a few packages, like a zip with no/low compression. Copying it will be faster, so let DMA do its job. (A sketch follows.)
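
A minimal sketch, assuming a hypothetical datasheets/ folder (zip's -0 means store only, no compression):

zip -0 -r datasheets.zip datasheets/

One large sequential file transfers far faster than thousands of small ones; unpack with unzip datasheets.zip on the other side.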

If you are looking for a solution that gives you the benefits of command-line operations combined with GUI-like feel and flexibility, I recommend mc (Midnight Commander).

(Screenshot: mc's two-pane view)

It is an ncurses-based visual file manager: you get a two-pane view of your files plus a menu, and mouse use is possible, even over ssh. You can browse around your filesystem, inspect files with the built-in viewer, filter by criteria on the fly, and have the copy or move operations performed on the command line.

It's a clone of the DOS program Norton Commander, which was popular in the mid-eighties. It works well for me whenever the GUI starts to become unreliable, and it's ideal for your purpose.
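
If mc is not already installed, it is packaged in the standard repositories; a sketch assuming a Debian/Ubuntu system:

sudo apt install mc
mc

Inside mc, Tab switches panes, F5 copies, F6 moves, and F10 quits.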