Load multiple models in TensorFlow

This should be a comment on the most up-voted answer, but I do not have enough reputation for that.

Anyway, if you (anyone who searched and got to this point) are still having trouble with the solution provided by lpp and you are using Keras, check the following quote from GitHub:

This is because Keras shares a global session if no default TF session is provided.

When model1 is created, it is on graph1. When model1 loads weights, the weights are on the Keras global session, which is associated with graph1.

When model2 is created, it is on graph2. When model2 loads weights, the global session does not know about graph2.

The solution below may help:

import tensorflow as tf
from keras import backend as K
from keras.models import model_from_json

graph1 = tf.Graph()
with graph1.as_default():
    session1 = tf.Session()
    with session1.as_default():
        # Build the model from its architecture file, then load its
        # weights, all inside graph1/session1.
        with open('model1_arch.json') as arch_file:
            model1 = model_from_json(arch_file.read())
        model1.load_weights('model1_weights.h5')
        # K.get_session() is session1 here

# do the same for graph2, session2, model2
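The effect of this graph/session pairing can be seen without the Keras model files. Below is a minimal, self-contained sketch using the `tensorflow.compat.v1` API (on TF 1.x, plain `import tensorflow as tf` works the same way); it shows that a tensor can only be run in the session bound to its graph:

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: `import tensorflow as tf`
tf.disable_v2_behavior()

# Build one small graph/session pair per "model", as in the answer above.
graph1 = tf.Graph()
with graph1.as_default():
    session1 = tf.Session()
    y1 = tf.constant([1.0, 2.0]) * 2.0

graph2 = tf.Graph()
with graph2.as_default():
    session2 = tf.Session()
    y2 = tf.constant([3.0, 4.0]) * 2.0

out1 = session1.run(y1)  # works: y1 lives on graph1
out2 = session2.run(y2)  # works: y2 lives on graph2

# Running y1 in session2 fails, because session2 is bound to graph2.
try:
    session2.run(y1)
    mismatch_error = False
except Exception:
    mismatch_error = True
```

This is exactly why each model must load its weights with its own graph and session made default, as in the snippet above.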

Yes, there is: use separate graphs.

g1 = tf.Graph()
g2 = tf.Graph()

with g1.as_default():
    cnn1 = CNN(..., restore_file='snapshot-model1-10000', ...)
with g2.as_default():
    cnn2 = CNN(..., restore_file='snapshot-model2-10000', ...)
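Because each graph has its own namespace, the two CNNs can even use identical variable names without colliding. A minimal sketch of that isolation, using the `tensorflow.compat.v1` API (`CNN` and the snapshot files above are the asker's own code and are not needed here):

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: `import tensorflow as tf`
tf.disable_v2_behavior()

g1 = tf.Graph()
g2 = tf.Graph()

with g1.as_default():
    w1 = tf.get_variable('w', initializer=1.0)  # name 'w:0' in g1
    sess1 = tf.Session(graph=g1)
    sess1.run(tf.global_variables_initializer())

with g2.as_default():
    # The same name is fine: this variable lives in a different graph.
    w2 = tf.get_variable('w', initializer=2.0)  # name 'w:0' in g2
    sess2 = tf.Session(graph=g2)
    sess2.run(tf.global_variables_initializer())

val1, val2 = sess1.run(w1), sess2.run(w2)
```

Each checkpoint then restores into its own graph with no renaming needed.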

EDIT:

If you want them in the same graph, you'll have to rename some variables. One idea is to have each CNN in a separate scope and let the saver handle the variables in that scope, e.g.:

saver = tf.train.Saver(tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='model1'))

and in the CNN wrap all your construction in that scope:

with tf.variable_scope('model1'):
    ...
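Putting the two pieces together, here is a minimal sketch (with hypothetical variable names) of two models built in separate scopes within one graph, each with a saver that only handles its own variables:

```python
import tensorflow.compat.v1 as tf  # on TF 1.x: `import tensorflow as tf`
tf.disable_v2_behavior()

with tf.variable_scope('model1'):
    tf.get_variable('w', initializer=1.0)   # graph name 'model1/w'
with tf.variable_scope('model2'):
    tf.get_variable('w', initializer=2.0)   # graph name 'model2/w'

# Each saver sees only the variables in its scope, so the two
# checkpoints stay independent even though both live in one graph.
model1_vars = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='model1')
model2_vars = tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='model2')
saver1 = tf.train.Saver(model1_vars)
saver2 = tf.train.Saver(model2_vars)

names1 = [v.name for v in model1_vars]
names2 = [v.name for v in model2_vars]
```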

EDIT2:

Another idea is renaming the variables the saver manages (since I assume you want to use your saved checkpoints without retraining everything). Saving allows different variable names in the graph and in the checkpoint; have a look at the documentation for `tf.train.Saver` initialization.
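Concretely, you can pass a `{name_in_checkpoint: variable}` dict to `tf.train.Saver`, which is the documented way to restore a checkpoint into variables whose graph names differ. A sketch with hypothetical names, saving an unscoped variable and restoring it under a new scope:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf  # on TF 1.x: `import tensorflow as tf`
tf.disable_v2_behavior()

ckpt = os.path.join(tempfile.mkdtemp(), 'model.ckpt')

# Save a variable under its original, unscoped name 'w'.
with tf.Graph().as_default():
    w = tf.get_variable('w', initializer=3.0)
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        tf.train.Saver().save(sess, ckpt)

# Restore it into a variable that now lives under the 'model1' scope,
# by mapping the checkpoint name to the new variable.
with tf.Graph().as_default():
    with tf.variable_scope('model1'):
        w_scoped = tf.get_variable('w', initializer=0.0)  # graph name 'model1/w'
    saver = tf.train.Saver({'w': w_scoped})  # checkpoint name -> variable
    with tf.Session() as sess:
        saver.restore(sess, ckpt)
        restored = sess.run(w_scoped)
```

This lets two checkpoints trained with clashing variable names both load into one graph, each under its own scope.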

Tags:

Tensorflow