invoking yield for a generator in another function

I don't understand the whole thing either (what does the main_hook caller look like?), but I would say: raise a StopNow exception when you should stop, just like a generator raises StopIteration when it is finished.

Here is how I understood the problem, as well as what I would do.

class StopNow(Exception):
    pass

def main_hook(self, f):
    got_stop_now_exc = False
    while not got_stop_now_exc and self.shouldContinue():
        # do some preparations
        try:
            f(self)
        except StopNow:
            got_stop_now_exc = True

        # do some compulsory tear down, exception or not

def stop_and_do_stuff():
    raise StopNow()

def my_f():
    if needed:  # "needed" stands for whatever stop condition applies
        stop_and_do_stuff()

def the_main_hook_caller():
    while i_should:
        managerthingie.main_hook(my_f)
        do_stuff()

I believe I should also add an answer from the other point of view: instead of explaining how you could achieve what you seem to be trying to do, explaining why yield could not possibly work here.

When a function contains the yield keyword, it is deeply modified. It is still a callable, but no longer a normal function: it becomes a factory that returns an iterator.
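
For example (a minimal sketch, with illustrative names), calling such a function does not run its body at all: it only builds an iterator, and the body runs lazily as next() is called.

def factory_demo():
    print "entered body",   # not printed until the first next() call
    yield 42

it = factory_demo()   # nothing printed yet: the call only built an iterator
print it.next(),      # now prints "entered body" followed by 42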

From the caller's point of view there is no difference between the three implementations below (except that the yield one is so much simpler).

##########################################
print "Function iterator using yield",

def gen():
    for x in range(0, 10):
        yield x

f = gen()
try:
    while True:
        print f.next(),
except StopIteration:
    pass

for x in gen():
    print x,

print

#########################################
print "Class iterator defining iter and next",

class gen2(object):

    def __init__(self):
        self.index = 0
        self.limit = 10

    def __iter__(self):
        return self

    def next(self):
        if self.index >= self.limit:
            raise StopIteration
        self.index += 1
        return self.index - 1


f = gen2()
try:
    while True:
        print f.next(),
except StopIteration:
    pass

for x in gen2():
    print x,
print

#########################################
print "Function iterator using iter() and sentinel",
def gen3():
    def g3():
        if g3.index is None:
            g3.index = 0
        g3.index += 1
        return g3.index - 1

    g3.index = None
    # iter(callable, sentinel) keeps calling g3() until it returns the sentinel value 10
    return iter(g3, 10)

f = gen3()
try:
    while True:
        print f.next(),
except StopIteration:
    pass

for x in gen3():
    print x,
print

Then you should understand that yield is not so much about control flow as about keeping the call context inside local variables. Once that is understood, you have to decide whether the API of main_hook really wants to provide an iterator to its caller. If so, and if f may loop, then f should also be an iterator (and there should be a loop around the calls to f(), as below).

def main_hook(self,f):
    while (self.shouldContinue()):
        #do some preparations
        for v in f(self):
            yield v
        #do some tear down
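
A hypothetical caller for this generator version (assuming my_f has been rewritten as a generator; do_stuff and managerthingie are placeholders carried over from the earlier snippet): since main_hook itself is now a generator, the caller drives it with a for loop instead of a single call.

def the_main_hook_caller():
    # my_f is assumed to be a generator here
    for v in managerthingie.main_hook(my_f):
        do_stuff(v)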

But you should not care whether f() has to call inner functions g(), etc. That is completely irrelevant: you provide a library, and it is your user's problem to call it with an appropriate iterable. If you believe your library's users won't be able to do that, you will have to change the overall design.
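
For instance, a user-side f that delegates to an inner generator g (both names purely illustrative) only has to re-yield g's values; main_hook never needs to know about g.

def g(self):
    yield "from g"

def my_f(self):
    yield "from f"
    for v in g(self):   # Python 2 has no "yield from", so re-yield by hand
        yield v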

Hope it helps.


My previous answer describes how to do this in Python 2, which is very ugly. But now I have run across PEP 380: Syntax for Delegating to a Subgenerator. That does exactly what you ask. The only problem is that it requires Python 3, but that shouldn't really be a problem.

Here's how it works:

def worker():
    yield 1
    yield 2
    return 3

def main():
    yield 0
    value = yield from worker()
    print('returned %d' % value)
    yield 4

for m in main():
    print('generator yields %d' % m)

The result of this is:

generator yields 0
generator yields 1
generator yields 2
returned 3
generator yields 4

Exceptions are passed through the way you would expect.
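
For example (a minimal sketch, not part of the code above), an exception thrown into the outer generator with throw() is forwarded by yield from into the subgenerator, which can catch it:

def worker():
    try:
        yield 1
    except ValueError:
        yield 99            # the ValueError was delivered inside worker()

def main():
    yield from worker()

g = main()
print(next(g))              # prints 1
print(g.throw(ValueError))  # throw() is delegated to worker(): prints 99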