E2BIG
Linux / POSIX · process error

Argument List Too Long

What this means

The combined size of command-line arguments and environment variables passed to execve() exceeds the kernel limit ARG_MAX (on Linux, one quarter of the stack rlimit, typically 2 MiB). This is most often encountered when a shell glob expands to thousands of file paths.
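The effective limit on a given system can be queried with getconf; the exact value varies with the stack rlimit:

```shell
# Print the maximum combined size in bytes of argv + environ for exec()
getconf ARG_MAX
```

On a typical Linux system with the default 8 MiB stack limit this prints 2097152.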

Why it happens
  1. A shell glob like rm /tmp/dir/* expands to more files than fit in ARG_MAX.
  2. The environment block is very large due to many or very long environment variables.
  3. A script constructs a very long command string programmatically.
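The second cause is easy to demonstrate: since the environment counts against the same limit, inflating it makes even a trivial command fail. A sketch (the variable names and sizes are arbitrary; run it in a subshell so your own environment is untouched):

```shell
# Sketch: inflate the environment well past ARG_MAX, then exec anything.
# The subshell keeps the bloated variables out of your interactive shell.
(
  for i in $(seq 80); do
    export "BIGVAR$i=$(printf 'x%.0s' $(seq 100000))"   # ~100 KB each, ~8 MB total
  done
  /bin/true    # exec fails with E2BIG before /bin/true ever runs
)
```

The shell reports "Argument list too long" even though /bin/true itself was invoked with no arguments at all.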
How to reproduce

Attempt to delete thousands of files with a single shell glob:

trigger — this will error
$ rm /tmp/large-dir/*
bash: /bin/rm: Argument list too long
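The same failure can be reproduced without a pre-existing large directory by passing an oversized argument list directly (a sketch; the ~10 MB total comfortably exceeds ARG_MAX on any common configuration):

```shell
# Build 10,000 arguments of 1,000 bytes each (~10 MB) and exec a command;
# execve() fails with E2BIG before /bin/true ever runs
long=$(printf 'x%.0s' $(seq 1000))
/bin/true $(yes "$long" | head -n 10000)
```

Bash reports the failure exactly as in the rm example: "Argument list too long".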

Fix

Use find with -exec or xargs to process in batches

When: operating on directories with very many files

# find -exec ... + batches arguments automatically (use \; instead of +
# to run the command once per file)
find /tmp/large-dir -type f -exec rm {} +

# xargs also batches; -print0 / -0 handle filenames containing
# spaces or newlines safely
find /tmp/large-dir -type f -print0 | xargs -0 rm

Why this works

Both find's -exec ... + and xargs split the file list into batches that fit within ARG_MAX, invoking the command as many times as needed.
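The batching is easy to observe with xargs's -n flag, which caps the number of arguments per invocation:

```shell
# xargs splits five inputs into batches of at most two,
# running echo once per batch
seq 1 5 | xargs -n 2 echo
# → 1 2
# → 3 4
# → 5
```

Without -n, xargs picks batch sizes automatically based on the system's ARG_MAX, which is what makes `find ... | xargs rm` safe for arbitrarily large directories.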

Sources
Linux Programmer's Manual, errno(3)

xargs(1) manual

Content generated with AI assistance and reviewed for accuracy. Found an error? hello@errcodes.dev
