Argument list too long for ls

Your error message "argument list too long" comes from the * of ls *.txt: the shell expands the wildcard into one argument per matching file before it ever runs ls.

This limit is a safety measure for both binary programs and your kernel. See "ARG_MAX, maximum length of arguments for a new process" for more information about how it is used and computed.
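For a quick look at the numbers on your own machine, here is a small sketch (assuming a Linux system; the second command needs GNU xargs):

getconf ARG_MAX                  # kernel limit, in bytes, for arguments plus environment
xargs --show-limits </dev/null   # GNU xargs: shows how much of that budget is already used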

There is no such limit on pipe size. So you can simply issue this command:

find -type f -name '*.txt'  | wc -l

NB: On modern Linux, weird characters in filenames (like newlines) are escaped by tools like ls or find when they print them, but they still come through literally when the names are expanded from *. If you are on an old Unix, you'll need this command:

find . -type f -name '*.txt' -exec echo \; | wc -l

NB2: I was wondering how one can create a file with a newline in its name. It's not that hard, once you know the trick:

touch "hello
world"

The limit depends mainly on your version of the Linux kernel.

You should be able to see the limit for your system by running

getconf ARG_MAX

which tells you the maximum number of bytes a command line can have after being expanded by the shell.

In Linux < 2.6.23, the limit is usually 128 KB.

In Linux >= 2.6.25, the limit is either 128 KB, or 1/4 of your stack size (see ulimit -s), whichever is larger.

See the execve(2) man page for all the details.
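A hedged sketch of how those numbers fit together on a given machine (bash assumed, and ulimit -s must report a number rather than "unlimited"):

getconf ARG_MAX                      # kernel limit for arguments plus environment, in bytes
echo $(( $(ulimit -s) * 1024 / 4 ))  # a quarter of the stack: ulimit -s reports kilobytes
printf '%s\0' *.txt | wc -c          # rough byte size of the expanded *.txt list (printf is a builtin, so this pipeline doesn't hit the limit itself)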


Unfortunately, piping ls *.txt isn't going to fix the problem, because the limit is in the operating system, not the shell.

The shell expands the *.txt, then tries to call

exec("ls", "a.txt", "b.txt", ...)

and you have so many files matching *.txt that you're exceeding the 128 KB limit.

You'll have to do something like

find . -maxdepth 1 -name "*.txt" | wc -l

instead.

(And see Shawn J. Goff's comments below about file names that contain newlines.)
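Since the limit applies to execve() rather than to the shell itself, counting with shell builtins alone also sidesteps it. A sketch assuming bash, where printf and set are builtins:

printf '%s\n' *.txt | wc -l   # simple, but over-counts names that contain newlines

set -- *.txt                  # load the matches as positional parameters
echo "$#"                     # exact count, whatever characters the names contain;
                              # caveat: with no matches, the literal pattern counts as 1 unless nullglob is set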


Another workaround:

ls | grep -c '\.txt$'

Even though ls produces more output than ls *.txt produces (or attempts to produce), it doesn't run into the "argument list too long" problem, because you're not passing any arguments to ls. Note that grep takes a regular expression rather than a file matching pattern.
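To make that distinction concrete (the file names are purely illustrative):

ls | grep -c '\.txt$'   # counts names ending in ".txt"
ls | grep -c '.txt$'    # unescaped dot matches any character, so a name like "notes_txt" is also counted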

You might want to use:

ls -U | grep -c '\.txt$'

(assuming your version of ls supports this option). This tells ls not to sort its output, which could save both time and memory -- and in this case the order doesn't matter, since you're just counting files. The resources spent sorting the output are usually not significant, but in this case we already know you have a very large number of *.txt files.
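If you are curious whether the sort actually costs anything on your system, a rough, purely illustrative comparison is:

time ls > /dev/null      # sorts every name before printing
time ls -U > /dev/null   # prints names in directory order, without sorting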

And you should consider reorganizing your files so you don't have so many in a single directory. This may or may not be feasible.

Tags: arguments, ls