
Python | Generators

Generators are a powerful feature of Python that allow you to lazily generate a sequence of values, instead of creating them all at once and storing them in memory. This makes them an ideal tool for working with large datasets, or for situations where you only need to access a subset of the data at any given time.

To define a generator in Python, you use the yield keyword instead of return. When the function reaches a yield statement, it hands the value back to the caller and pauses, saving its execution state. The next time a value is requested (by the next iteration of a loop, or by a call to next()), execution resumes right after the yield and continues generating values.
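Here is a minimal sketch of that pause-and-resume behaviour, using a throwaway greeter() generator invented purely for illustration:

def greeter():
    print("running up to the first yield")
    yield "hello"
    print("resumed after the first yield")
    yield "world"

g = greeter()    # creates the generator; none of the body runs yet
print(next(g))   # prints the first message, then "hello"
print(next(g))   # prints the second message, then "world"

Calling greeter() by itself executes nothing; each next() call runs just enough of the body to reach the next yield.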

Here’s an example of a simple generator that generates the first n Fibonacci numbers:

def fib(n):
    a, b = 0, 1
    for i in range(n):
        yield a
        a, b = b, a + b

You can use this generator like any other iterable object in Python, by iterating over its values using a for loop or by calling the built-in next() function:

for i in fib(10):
    print(i)

This will print the first 10 Fibonacci numbers: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34.
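You can also drive the same generator by hand with the built-in next() function, which makes the pausing explicit:

gen = fib(10)
print(next(gen))  # 0: runs the body up to the first yield
print(next(gen))  # 1: resumes after the yield and loops once more
print(next(gen))  # 1

Once the generator is exhausted, a further call to next() raises StopIteration, which is how a for loop knows when to stop.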

One important feature of generators is that they are “lazy”: they don’t produce any values until you actually request them. This is especially useful when the data is too large to hold in memory at once. For example, you might have a large text file that you want to process line by line, without loading the entire file into memory. You can use a generator to read the file one line at a time:

def read_file(filename):
    with open(filename) as f:
        for line in f:
            yield line

You can use this generator to read a large file one line at a time, like this:

for line in read_file('large_file.txt'):
    process_line(line)

This will process each line of the file one at a time, without loading the entire file into memory at once.
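As a small sketch (the file name 'large_file.txt' is just a placeholder), you could count the lines of such a file while keeping only one line in memory at a time:

def count_lines(filename):
    count = 0
    for line in read_file(filename):  # pulls one line per iteration
        count += 1
    return count

print(count_lines('large_file.txt'))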

In addition to the yield keyword, there are a few other important features of generators to be aware of. One is that generators can be composed together, using the yield from syntax. This allows you to write complex generators that are built up from smaller ones. For example, you could define a generator that iterates over the contents of multiple files:

def read_files(filenames):
    for filename in filenames:
        yield from read_file(filename)

This generator uses yield from to delegate to read_file() for each file, yielding its lines one at a time.
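You could use it like this (the file names, like process_line() earlier, are placeholders for illustration):

for line in read_files(['notes1.txt', 'notes2.txt', 'notes3.txt']):
    process_line(line)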

Another important feature of generators is that they combine well with other Python features, like list comprehensions and generator expressions, and they can be passed directly to built-in functions that accept iterables. For example, you can compute the sum of the first n Fibonacci numbers by passing the fib() generator straight to sum():

sum(fib(n))

This consumes the values produced by fib(n) one at a time, and the built-in sum() function adds them up without ever building a full list in memory.
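For instance, with n = 10 the generator yields 0, 1, 1, 2, 3, 5, 8, 13, 21, 34:

n = 10
total = sum(fib(n))
print(total)  # 88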

In conclusion, generators are a powerful feature of Python that let you lazily produce a sequence of values. They are ideal for working with large datasets, or for situations where you only need a subset of the data at any given time. By using the yield keyword, you can define generators that pause and resume their execution state, and by using features like yield from and generator expressions, you can compose them into concise, memory-efficient pipelines.

This same article is also available on my Medium post at

https://medium.com/@vipinc.007/python-generators-6b90ebfccb10
