Keras: change learning rate

You can change the learning rate during training with a LearningRateScheduler callback:

from keras.callbacks import LearningRateScheduler

# This is a sample of a scheduler I used in the past
def lr_scheduler(epoch, lr):
    decay_rate = 0.85
    decay_step = 1
    # multiply the current learning rate by decay_rate every decay_step epochs
    if epoch % decay_step == 0 and epoch:
        return lr * decay_rate
    return lr

Apply the scheduler to your model:

callbacks = [LearningRateScheduler(lr_scheduler, verbose=1)]

model = build_model(pretrained_model=ka.InceptionV3, input_shape=(224, 224, 3))
history = model.fit(train, callbacks=callbacks, epochs=EPOCHS, verbose=1)
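
For reference, LearningRateScheduler calls the schedule function at the start of every epoch with the epoch index and the current learning rate, then sets the optimizer's learning rate to whatever the function returns; with verbose=1 it also prints the value it applies.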

You can change the learning rate as follows:

from keras import backend as K
K.set_value(model.optimizer.learning_rate, 0.001)

Included in your complete example, it looks as follows:

from keras.models import Sequential
from keras.layers import Dense
from keras import backend as K
import keras
import numpy as np

model = Sequential()

model.add(Dense(1, input_shape=(10,)))

optimizer = keras.optimizers.Adam(lr=0.01)
model.compile(loss='mse', optimizer=optimizer)

print("Learning rate before first fit:", model.optimizer.learning_rate.numpy())

model.fit(np.random.randn(50,10), np.random.randn(50), epochs=50, verbose=0)

# Change learning rate to 0.001 and train for 50 more epochs
K.set_value(model.optimizer.learning_rate, 0.001)
print("Learning rate before second fit:", model.optimizer.learning_rate.numpy())

model.fit(np.random.randn(50, 10),
          np.random.randn(50),
          initial_epoch=50,
          epochs=100,  # epochs is the final epoch index, so this trains epochs 50-99
          verbose=0)

I've just tested this with keras 2.3.1. Not sure why the approach didn't seem to work for you.


There is another way: find the variable that holds the learning rate and assign it a new value.

import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(0.001)
optimizer.learning_rate.assign(0.01)
print(optimizer.learning_rate)

output:

<tf.Variable 'learning_rate:0' shape=() dtype=float32, numpy=0.01>
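
If you want the change to take effect partway through training rather than before calling fit, the same assign call can go inside a custom callback. The following is a minimal sketch of that idea; the SetLearningRate class, its at_epoch argument, and the epoch-50 switch point are made-up names for illustration, not part of the Keras API:

import numpy as np
import tensorflow as tf

class SetLearningRate(tf.keras.callbacks.Callback):
    """Assigns a new learning rate once a given epoch is reached."""
    def __init__(self, new_lr, at_epoch):
        super().__init__()
        self.new_lr = new_lr
        self.at_epoch = at_epoch

    def on_epoch_begin(self, epoch, logs=None):
        if epoch == self.at_epoch:
            # optimizer.learning_rate is a tf.Variable, so assign() changes it in place
            self.model.optimizer.learning_rate.assign(self.new_lr)
            print(f"Epoch {epoch}: learning rate set to {self.new_lr}")

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(1, input_shape=(10,)))
model.compile(loss='mse', optimizer=tf.keras.optimizers.Adam(0.01))

# train at 0.01 for the first 50 epochs, then at 0.001 for the rest
model.fit(np.random.randn(50, 10), np.random.randn(50),
          epochs=100, verbose=0,
          callbacks=[SetLearningRate(0.001, at_epoch=50)])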

You should set it on the optimizer that you pass to compile:

optimizer = keras.optimizers.Adam(lr=0.01)
model.compile(loss='mse',
              optimizer=optimizer,
              metrics=['categorical_accuracy'])

Looking at your comment, if you want to change the learning rate after training has begun, you need to use a scheduler: link

Edit with your code and scheduler:

from keras.models import Sequential
from keras.layers import Dense
import keras
import numpy as np

def lr_scheduler(epoch, lr):
    # keep the initial learning rate for the first 50 epochs, then drop to 0.001
    if epoch > 50:
        return 0.001
    return lr

model = Sequential()

model.add(Dense(1, input_shape=(10,)))

optimizer = keras.optimizers.Adam(lr=0.01)
model.compile(loss='mse',
              optimizer=optimizer)

callbacks = [keras.callbacks.LearningRateScheduler(lr_scheduler, verbose=1)]

model.fit(np.random.randn(50,10), np.random.randn(50), epochs=100, callbacks=callbacks)
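
Note that the epoch argument passed to the scheduler is zero-based, so with this schedule the learning rate stays at 0.01 up to and including epoch 50 and switches to 0.001 from epoch 51 onwards; with verbose=1 the callback prints the value it sets each epoch so you can confirm the switch.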