Lazy evaluation is a key technique in efficient programming: values are computed only when they are actually needed. Historically, Python's built-in collections (lists, tuples) have been "greedy" (eager): they create and store all of their elements in memory up front. As data volumes grew and streaming workloads became common, lazy evaluation became a necessity.
Problem: eager computation wastes memory and time whenever results could be produced incrementally — for example, when filtering or transforming long collections, or when streaming through files.
Solution: Python provides many tools for lazy computation: generators and iterators, standard built-ins (map, filter, zip, enumerate), and the itertools module. All of them return not fully built collections but "lazy" objects that yield results one value at a time.
Code example:
result = map(lambda x: x * x, range(100))  # returns a lazy map iterator, not a list
for y in result:
    print(y)  # each value is computed on demand

import itertools

inf = itertools.count(1)  # infinite lazy counter: 1, 2, 3, ...
for i in inf:
    if i > 3:
        break
    print(i)  # prints 1, 2, 3
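Because the itertools objects above are lazy, they compose well: a minimal sketch of slicing a finite window out of an infinite stream with itertools.islice, without ever materializing the stream.

```python
import itertools

# islice takes the first n items of an infinite iterator lazily;
# squares are computed only as they are requested.
squares = map(lambda x: x * x, itertools.count(1))
first_five = list(itertools.islice(squares, 5))
print(first_five)  # [1, 4, 9, 16, 25]
```

This pattern (infinite source + islice) is the idiomatic way to take "the first n" from a lazy pipeline.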
Key features:
Do the map/filter functions always return a list in Python 3?
No, in Python 3 these functions return iterators, not lists. To obtain a list, you need to wrap the result in list().
x = map(int, ['1', '2'])  # <map object at 0x...>
list(x)  # [1, 2]
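A related gotcha worth knowing for the same question: a map iterator can be consumed only once, as this short sketch shows.

```python
x = map(int, ['1', '2'])
print(list(x))  # [1, 2]
print(list(x))  # [] -- the iterator is already exhausted
```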
Can you get the length of the result of map without converting it to a list?
No: an iterator does not know in advance how many elements it will produce until it has been fully consumed. You have to materialize it with list() (or otherwise exhaust it), which defeats the laziness.
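A quick demonstration: calling len() on a map object raises TypeError, and getting a length requires materializing the iterator.

```python
x = map(int, ['1', '2', '3'])

try:
    len(x)  # map objects do not support len()
except TypeError:
    print("map objects have no len()")

# Materializing gives a length, but consumes the iterator in the process:
print(len(list(x)))  # 3
```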
Is the range function in Python 3 greedy or lazy?
Lazy: range creates a range object that computes elements on demand, without storing the whole sequence. Unlike map, however, it does support len(), indexing, and fast membership tests, because each element can be derived arithmetically.
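This is easy to verify: a range over a trillion elements is created instantly and supports length, indexing, and membership without storing anything.

```python
r = range(10**12)   # created instantly; no elements are stored
print(len(r))       # 1000000000000
print(r[10])        # 10 -- computed on demand, not looked up
print(999 in r)     # True -- an O(1) arithmetic check, not a scan
```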
Scenario: a script processes a huge CSV file by building a list of all its lines via list(open(f)). On a large file the server "dies" from lack of memory.
Pros: the code is simple, and the resulting list supports len(), indexing, and multiple passes over the data.
Cons: the entire file is held in memory at once, so memory usage grows linearly with file size and a large enough file crashes the process; the file handle is also never closed explicitly.
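A minimal sketch of the eager anti-pattern, using a small temporary stand-in file (the file name and contents here are illustrative assumptions; in the real scenario the file is huge):

```python
import os
import tempfile

# Create a small stand-in CSV (in the real scenario this file is huge).
fd, path = tempfile.mkstemp(suffix='.csv')
os.close(fd)
with open(path, 'w') as f:
    f.write('a,1\nb,2\nc,3\n')

# Eager anti-pattern: list(open(path)) materializes every line at once,
# so memory usage is proportional to the file size.
lines = list(open(path))
print(len(lines))  # 3

os.remove(path)
```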
Fix: the code processes the file lazily — iterating over its lines with for line in open(f) (better: inside a with block so the file is closed), or chaining map/filter over it without creating intermediate collections.
Pros: memory usage stays constant regardless of file size, and processing starts immediately instead of waiting for the whole file to load.
Cons: the data can be traversed only once, len() and indexing are unavailable, and debugging is harder because values exist only transiently.
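The lazy version can be sketched as a generator pipeline over the same stand-in file (file name and contents are illustrative assumptions): each line is read, split, and filtered on demand, with no intermediate list.

```python
import os
import tempfile

# Stand-in CSV (in the real scenario this file is huge).
fd, path = tempfile.mkstemp(suffix='.csv')
os.close(fd)
with open(path, 'w') as f:
    f.write('a,1\nb,2\nc,3\n')

# Lazy pipeline: lines are read, split, and filtered one at a time;
# only the final filtered result is ever materialized.
with open(path) as f:
    rows = (line.rstrip('\n').split(',') for line in f)
    big = [row for row in rows if int(row[1]) >= 2]

print(big)  # [['b', '2'], ['c', '3']]
os.remove(path)
```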