MemoryError
Python · CRITICAL · Resource error · HIGH confidence

Operation ran out of memory

What this means

Raised when an operation runs out of memory. This indicates that the Python process cannot allocate the necessary memory from the operating system to create an object.

Why it happens
  1. Attempting to load a very large file entirely into memory.
  2. Creating a massive data structure, such as a list or dictionary with billions of elements.
  3. A memory leak in a C extension, or in application logic where objects are never garbage collected.
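When the culprit is a data structure built eagerly, switching to a lazy generator often avoids the large allocation entirely. A minimal sketch (the numbers here are illustrative, not from this page):

```python
import sys

# A generator expression computes items on demand instead of storing them all.
# The generator object itself stays a fixed, tiny size no matter how many
# items it will eventually yield.
squares = (n * n for n in range(10**6))
print(sys.getsizeof(squares))  # a few hundred bytes at most

total = sum(squares)  # still iterates a million items, but one at a time
```

The equivalent list comprehension, `[n * n for n in range(10**6)]`, would materialize all million results up front.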
How to reproduce

This error is triggered by attempting to create a data structure that is too large for the available RAM.

trigger — this will error
# This will try to allocate a list with a huge number of elements
# Warning: This can make your system unresponsive.
try:
    huge_list = [0] * (10**9) # A list of a billion zeros
except MemoryError:
    print("Failed to allocate memory for the huge list.")

expected output

Failed to allocate memory for the huge list.
(Or the process might be killed by the OS before an exception is caught)

Fix 1

Process data in chunks or streams

WHEN Working with large files or datasets that do not fit in memory.

# Instead of reading the whole file, read it line by line
try:
    with open("very_large_file.txt", "r") as f:
        for line in f:
            # process the line
            pass
except MemoryError:
    print("This is unlikely to happen with streaming.")

Why this works

By processing a file line-by-line or a dataset in chunks, you only need to hold a small portion of the data in memory at any given time.
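For binary files, or wherever line boundaries don't apply, the same idea works with fixed-size chunks. A sketch (the helper name and chunk size are arbitrary choices, not from this page):

```python
def read_in_chunks(path, chunk_size=64 * 1024):
    """Yield successive chunks of a file, keeping at most chunk_size bytes resident."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk
```

Each iteration holds only one chunk in memory, so peak usage is bounded by `chunk_size` regardless of the file's total size.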

Fix 2

Use more memory-efficient data structures

WHEN You are storing a large amount of numerical data.

import numpy as np
# A NumPy array of int8 stores one byte per element (~1 GB here), versus
# roughly 8 bytes of pointer alone per element in an equivalent Python list.
try:
    huge_array = np.zeros(10**9, dtype=np.int8)
except MemoryError:
    print("Allocation failed, but it required far less memory than a list would.")

Why this works

Libraries like NumPy store data in compact, contiguous blocks of memory, using far less space per element than a standard Python list of objects.
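The same compact-layout idea is available in the standard library via the `array` module, if pulling in NumPy is undesirable. A rough comparison (the sizes printed are CPython-specific approximations):

```python
import sys
from array import array

n = 100_000
as_list = [0] * n                # a list stores a pointer per element (8 bytes on 64-bit)
as_array = array("i", [0] * n)   # an array stores raw 4-byte C ints, contiguously

# The array is roughly half the size of the list's pointer table alone,
# before even counting the int objects the list's pointers refer to.
print(sys.getsizeof(as_list), sys.getsizeof(as_array))
```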

Code examples
Trigger (python)
try:
    huge = [0] * (10**9)  # MemoryError on low-RAM systems
except MemoryError:
    print("Not enough memory")
Handle with try/except (python)
try:
    data = load_entire_file(path)
except MemoryError:
    data = stream_file(path)  # fallback to streaming
Avoid with streaming (python)
with open("large.csv") as f:
    for line in f:  # reads one line at a time
        process(line)
What not to do

Catching `MemoryError` and trying to continue

If you've run out of memory, the process is in an unstable state: further allocations are likely to fail too, including the ones needed to handle the error itself. The safest course is usually to log the error and terminate.
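One way to follow that advice, sketched with a hypothetical `allocate_or_die` helper (the function name and logging setup are illustrative, not from this page):

```python
import logging
import sys

log = logging.getLogger(__name__)

def allocate_or_die(n):
    """Attempt an allocation; on MemoryError, log and terminate rather than
    continuing in an unstable, out-of-memory process."""
    try:
        return [0] * n
    except MemoryError:
        log.critical("Out of memory allocating %d elements; exiting.", n)
        sys.exit(1)
```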

Sources
Official documentation ↗

cpython/Objects/exceptions.c

Content generated with AI assistance and reviewed for accuracy. Found an error? hello@errcodes.dev
