Operation ran out of memory
Raised when an operation runs out of memory. This indicates that the Python process cannot allocate the necessary memory from the operating system to create an object.
- Attempting to load a very large file entirely into memory.
- Creating a massive data structure, like a list or dictionary with billions of elements.
- A memory leak in a C extension, or application logic that keeps references alive so objects are never garbage collected.
This error is triggered by attempting to create a data structure that is too large for the available RAM.
# This will try to allocate a list with a huge number of elements.
# Warning: this can make your system unresponsive.
try:
    huge_list = [0] * (10**9)  # a list of a billion zeros
except MemoryError:
    print("Failed to allocate memory for the huge list.")
Expected output
Failed to allocate memory for the huge list.
(Or the process might be killed by the OS before the exception can be caught.)
Fix 1
Process data in chunks or streams
WHEN Working with large files or datasets that do not fit in memory.
# Instead of reading the whole file into memory, read it line by line
try:
    with open("very_large_file.txt", "r") as f:
        for line in f:
            # process the line
            pass
except MemoryError:
    print("This is unlikely to happen with streaming.")
Why this works
By processing a file line-by-line or a dataset in chunks, you only need to hold a small portion of the data in memory at any given time.
Fix 2
Use more memory-efficient data structures
WHEN You are storing a large amount of numerical data.
import numpy as np

# A NumPy array is far more memory-efficient than a Python list for numbers
try:
    huge_array = np.zeros(10**9, dtype=np.int8)  # ~1 GB, vs ~8 GB of pointers for a list
except MemoryError:
    print("Failed to allocate memory, even with the compact representation.")
Why this works
Libraries like NumPy store data in compact, contiguous blocks of memory, using far less space per element than a standard Python list of objects.
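The size difference is easy to demonstrate with only the standard library: the `array` module uses the same contiguous-storage idea as NumPy. The exact byte counts below vary by platform, so only the comparison is asserted:

```python
import sys
from array import array

n = 1_000_000
as_list = [0] * n                # a list stores n pointers to int objects
as_array = array("b", bytes(n))  # signed 8-bit integers, 1 byte per element

list_bytes = sys.getsizeof(as_list)
array_bytes = sys.getsizeof(as_array)

# The list's pointer table alone (typically 8 bytes per element on a
# 64-bit build) dwarfs the packed 1-byte-per-element buffer.
print(list_bytes > array_bytes)
```

Note that `sys.getsizeof` on a list measures only the pointer table, not the objects it points to, so the true gap is even larger than this comparison shows.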
try:
    huge = [0] * (10**9)  # MemoryError on low-RAM systems
except MemoryError:
    print("Not enough memory")

try:
    data = load_entire_file(path)
except MemoryError:
    data = stream_file(path)  # fallback to streaming

with open("large.csv") as f:
    for line in f:  # reads one line at a time
        process(line)

✕ Catching `MemoryError` and trying to continue
If you've run out of memory, the system is in a highly unstable state. It's unlikely you can do anything useful, and further allocations will fail. The best course of action is usually to log the error and terminate.
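A minimal sketch of that advice; the `handle_oom` helper name and the deliberately oversized allocation are illustrative, not part of any standard API:

```python
import logging
import sys

logging.basicConfig(level=logging.ERROR)

def handle_oom(exc: MemoryError) -> int:
    # Keep the handler tiny: avoid building large objects here,
    # since further allocations may also fail.
    logging.error("Out of memory: %s", exc)
    return 1  # non-zero exit code for the caller

def main() -> None:
    try:
        work = [0] * (10**12)  # an allocation expected to fail
    except MemoryError as exc:
        sys.exit(handle_oom(exc))  # log and terminate, don't recover
```

The handler does the minimum possible work before exiting, since any nontrivial recovery attempt risks triggering another allocation failure.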
cpython/Objects/exceptions.c
Content generated with AI assistance and reviewed for accuracy. Found an error? hello@errcodes.dev