Manually changing learning_rate in tf.train.AdamOptimizer

Yes, the optimizer is created only once:

tf.train.AdamOptimizer(learning_rate=myLearnRate)

It remembers the learning rate you passed in (in fact, it creates a constant tensor from it if you pass a plain Python float), so your later changes to myLearnRate don't affect it.
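For example, here is a minimal, hypothetical sketch with a toy loss (not taken from the question) showing that the float is baked in when the train op is built, so reassigning the Python variable afterwards does nothing:

import tensorflow as tf

w = tf.Variable(2.0)
loss = tf.square(w)                 # toy loss, just to have something to minimize

myLearnRate = 0.1
trainStep = tf.train.AdamOptimizer(learning_rate=myLearnRate).minimize(loss)

myLearnRate = 0.0001                # no effect: the graph already holds its own rate tensor

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(trainStep)
    print(sess.run(w))              # w moves by roughly 0.1 (Adam's first step ~ lr), not 0.0001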

Yes, you can create a placeholder for the learning rate and feed it through session.run()'s feed_dict, if you really want to. But, as you said, it's pretty uncommon and probably means you are approaching your original problem in the wrong way.
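If you do go that route, the toy example above would change to something like this (again just a hypothetical sketch, reusing the loss defined above; the next answer shows the same idea in a fuller training setup):

lr = tf.placeholder(tf.float32, shape=[])            # learning rate now comes from feed_dict
trainStep = tf.train.AdamOptimizer(learning_rate=lr).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(trainStep, feed_dict={lr: 0.1})          # first step with one rate
    sess.run(trainStep, feed_dict={lr: 0.0001})       # later step with a different rate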


The short answer is no, your new learning rate is not applied. TF builds the graph up front, when you define your ops; the session only executes it, so changing a Python variable afterwards will not translate into a change in the graph at run time. You can, however, feed a new learning rate into your graph pretty easily:

import tensorflow as tf

# Use a placeholder in the graph for your user-defined learning rate instead
learning_rate = tf.placeholder(tf.float32)
# ... build input, target and trainLoss as before ...
trainStep = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(trainLoss)

applied_rate = 0.001  # a plain Python number; we will update it every training step
with tf.Session() as session:
    session.run(tf.global_variables_initializer())
    # first train step, feeding our applied rate into the graph
    session.run(trainStep, feed_dict={input: someData,
                                      target: someTarget,
                                      learning_rate: applied_rate})
    applied_rate *= 0.1  # update the rate we feed to the graph
    # second train step, now with the smaller rate
    session.run(trainStep, feed_dict={input: someData,
                                      target: someTarget,
                                      learning_rate: applied_rate})
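
The two hard-coded steps generalize to an ordinary training loop, since applied_rate is just a Python number you can update with whatever schedule you like. A sketch, meant to run inside the same with tf.Session() block above (num_steps and the decay schedule are arbitrary placeholders):

    applied_rate = 0.001
    for step in range(num_steps):           # num_steps is a placeholder for your loop length
        session.run(trainStep, feed_dict={input: someData,
                                          target: someTarget,
                                          learning_rate: applied_rate})
        if (step + 1) % 1000 == 0:          # arbitrary example schedule: shrink the rate every 1000 steps
            applied_rate *= 0.1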