Coroutines

Coroutines provide the ability to maintain multiple stacks and to transfer control back and forth between them. The mechanism in Python is provided through generator functions. We will assume Python 2.5 or later so that yield can appear as both a statement and an expression.

Generator Functions

A generator function is a function that contains yield. Such functions cannot return a value with return (a bare return is allowed, and simply terminates the generator). The yield keyword denotes a point at which execution will be suspended. Execution will subsequently resume after the yield with all state (local variables, etc.) maintained. (This is the notion of multiple independent stacks.)
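As a minimal illustration of this suspend-and-resume behaviour (using the builtin next(), available from Python 2.6; the counter function here is just an illustrative example, not part of the text):

```python
def counter():
    # Execution suspends at each yield; the local n survives between calls.
    n = 0
    while True:
        yield n
        n += 1

c = counter()
print(next(c))   # 0
print(next(c))   # 1 -- resumed after the yield; n was preserved
```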

The yield can also be used to specify a “return” value for the generator, and it can be used to accept input. We start with the older, simpler usage of yield as a statement specifying a “return” value.

Producers

Generator functions with a yield statement can be thought of as a “producer” of data. Here is an example which produces a generator of the Fibonacci numbers:

>>> def fib(max=100):
...     n_prev = 0
...     yield n_prev
...     n = 1
...     while n < max:
...         yield n
...         _n = n_prev + n
...         n_prev = n
...         n = _n

Note that fib itself is not the generator: calling it returns the generator. Thus, we use it like this:

>>> f = fib()
>>> f.next()
0
>>> f.next()
1
>>> f.next()
1
>>> f.next()
2
>>> f.next()
3
>>> f.next()
5

The generator can also be used as an iterator:

>>> [n for n in fib()]
[0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]

In both these cases, there is no notion of “input” or “passing an argument” to the generator.

Note

The function fib() takes one optional argument max that is an (exclusive) upper bound on the numbers generated. One should think of fib() as a factory function that returns a customized generator. The generator in this example takes no arguments or inputs.
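For example, passing a smaller max yields a shorter sequence. The sketch below repeats the fib definition from above so that it is self-contained:

```python
def fib(max=100):
    # Same generator function as above: yields Fibonacci numbers below max.
    n_prev = 0
    yield n_prev
    n = 1
    while n < max:
        yield n
        _n = n_prev + n
        n_prev = n
        n = _n

print(list(fib(10)))    # [0, 1, 1, 2, 3, 5, 8]
print(list(fib(100)))   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
```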

Consumers

Now we use generator functions as consumers. Here we will use one to sum up values into a global list (which is mutated).

>>> result = [0]
>>> def sum_it():
...     while True:
...         x = (yield)
...         result[0] += x

Here yield is being used as an expression: it “has a value” that is being assigned here to x. To use this we must first start the generator (sometimes called “priming” it) by calling generator.next(). After this, one may call generator.send() to send the value. The argument of generator.send() will be the result of the yield expression.

>>> f = sum_it()
>>> f.next()                # Prime the generator
>>> for n in xrange(100):
...     f.send(n)
>>> result
[4950]
  1. The first call to generator.next() starts the function running up until the first yield. If this has an “argument”, then generator.next() will return this (but we will cover this later). Here there is no argument.
  2. The argument passed to generator.send() is then fed into the function as the value of the yield expression. In this sense, the resulting generator can “consume” the arguments (in this case, accumulating them in the global result list).
  3. If there were an argument to the yield, then generator.send() would have returned this (more below).
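Forgetting to prime the generator is a common mistake: calling send() with a non-None value on a just-started generator raises a TypeError. A quick check (sum_it is re-defined here to take the result list as a parameter, so the block is self-contained):

```python
def sum_it(result):
    # Consumer: accumulates sent values into the (mutable) result list.
    while True:
        x = (yield)
        result[0] += x

result = [0]
f = sum_it(result)
try:
    f.send(1)            # sending before priming is an error
except TypeError as e:
    print(e)             # can't send non-None value to a just-started generator

next(f)                  # prime it; after this, sending works
f.send(1)
print(result)            # [1]
```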

Coroutines

Now we come to the point: we can define a generator function that both produces and consumes. Furthermore, several such functions can transfer control back and forth among themselves, passing arguments. Here is a complete example. We will define a function main that we start up. This will then transfer control to the function f, passing arguments and receiving results.

>>> def main(f):
...     print("main0: Starting")
...     x = 1
...     print("main1: Sending %r to f..." % (x,))
...     res = f(x)
...     x = 2
...     print("main2: Got %r. Sending %r to f..." % (res,x))
...     res = f(x)
...     x = 3
...     print("main3: Got %r" % (res,))
...     print("main4: Done.")

Now, if f can be implemented as a simple function, then we can use this without coroutines:

>>> def f(x):
...     return ("a%s" % (x,))
>>> main(f)
main0: Starting
main1: Sending 1 to f...
main2: Got 'a1'. Sending 2 to f...
main3: Got 'a2'
main4: Done.

But what if f is a state machine? In this case, it may be useful to write it as a coroutine. Thus:

>>> def f_gen():
...     x = yield "This value is returned by the first next() call"
...     x = yield ("a%r" % (x,))
...     x = yield ("b%r" % (x,))
...     x = yield ("c%r" % (x,))
...     x = yield ("d%r" % (x,))

The point here is that we are changing the first character of the response depending on how many times the function has been called. The internal state is maintained by the position within the function (the program counter, if you like). Note that local variables can also be used as part of the state.

To use this, we must not forget to “prime” the generator: the first generator.next() call runs the function up to the first yield, and we do not use the value it returns. We then pass the generator.send() method in place of the function:

>>> f = f_gen()
>>> f.next()           # Prime the generator
'This value is returned by the first next() call'
>>> main(f.send)
main0: Starting
main1: Sending 1 to f...
main2: Got 'a1'. Sending 2 to f...
main3: Got 'b2'
main4: Done.

This is a bit messy, and can be cleaned up by using a decorator:

>>> def coroutine(f_gen):
...     def wrapper(*v, **kw):
...         cr = f_gen(*v, **kw)
...         cr.next()
...         return cr.send
...     return wrapper
>>> @coroutine
... def f_gen(prefix=""):
...     # Put initialization code here.
...     x = (yield)        # No significant result here.
...     x = yield ("%sa%r" % (prefix, x,))
...     x = yield ("%sb%r" % (prefix, x,))
...     x = yield ("%sc%r" % (prefix, x,))
...     x = yield ("%sd%r" % (prefix, x,))
>>> main(f_gen("pre_"))
main0: Starting
main1: Sending 1 to f...
main2: Got 'pre_a1'. Sending 2 to f...
main3: Got 'pre_b2'
main4: Done.
  1. Note that the arguments of f_gen are used to set the initial state of the generator. Think of the code before the first yield as initialization.
  2. There is still one throw-away return value of the first yield. Think of this as the starting line.
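To illustrate the earlier remark that local variables are also part of the maintained state, here is a sketch of a coroutine whose replies are numbered by a local counter (tagger is a hypothetical name; priming is done manually rather than with the decorator so the block stands alone):

```python
def tagger():
    # Replies are numbered using the local i: local variables persist
    # across yields just like the resume position does.
    i = 0
    x = yield            # prime: throw-away first yield
    while True:
        i += 1
        x = yield "%d:%r" % (i, x)

t = tagger()
next(t)                  # prime the generator
print(t.send("a"))       # 1:'a'
print(t.send("b"))       # 2:'b'
```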

Defunctionalization

Here we show a somewhat more complicated example. Suppose we have two functions. One is a solver that takes an initial state x and a function g and implements a somewhat complicated algorithm. However, the code computing the objective function is structured as an algorithm that calls a function get_new_x(x, g_x). Coroutines allow us to link the two by letting the solver be implemented as a state machine.

Here is a mock solver:

>>> def solver(x, g, tol=1e-12):
...     """Dummy solver that halves x until abs(g(x)) < tol"""
...     g_x = g(x)
...     while tol < abs(g_x):
...         x /= 2
...         g_x = g(x)
...     return x

Here is a mock routine computing the objective function:

>>> def main(tol=1e-12):
...     x = 1.0
...     g_x = x**2
...     while tol < abs(g_x):
...         x = get_new_x(x, g_x)
...         g_x = x**2
...     return x

To link these two with a pair of coroutines seems to require “stackful coroutines”: the ability to yield from a nested function call (i.e., to have solver and main yield from nested calls to custom g and get_new_x functions). Unfortunately, Python does not support this without quite a bit of manipulation (it is sometimes said that Python supports only semi-coroutines).
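Since the loop state of this particular mock solver is trivial, it can be hand-rewritten as a semi-coroutine that the main routine drives through generator.send(). The following is only a sketch of that idea (solver_gen and run_main are hypothetical names; run_main is the mock objective-function loop parameterized over get_new_x):

```python
def solver_gen(tol=1e-12):
    # Hand-defunctionalized version of the mock solver: the loop state
    # (just x here) lives in the suspended generator frame.
    x, g_x = yield               # prime: wait for the first (x, g_x) pair
    while tol < abs(g_x):
        x, g_x = yield x / 2.0   # propose x/2, wait for its g value

def run_main(get_new_x, tol=1e-12):
    # The mock objective-function loop, with get_new_x as a parameter.
    x = 1.0
    g_x = x**2
    while tol < abs(g_x):
        x = get_new_x(x, g_x)
        g_x = x**2
    return x

s = solver_gen()
next(s)                          # prime the solver coroutine
x = run_main(lambda x, g_x: s.send((x, g_x)))
print(x)                         # 2**-20, the first x with x**2 < 1e-12
```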

Another solution is provided by making g and get_new_x the same function. In this way state can be shared.

Todo

Elaborate..

Exceptions and Cleanup

Todo

Complete the discussion about exceptions and cleanup.