How do you use Keras LeakyReLU in Python?

All advanced activations in Keras, including LeakyReLU, are available as layers rather than as activation functions; therefore, you should use them as such:

from keras.layers import LeakyReLU

# instead of cnn_model.add(Activation('relu'))
# use
cnn_model.add(LeakyReLU(alpha=0.1))
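
For context, here is a minimal end-to-end sketch of this pattern, assuming the TF2-era tensorflow.keras API; the layer sizes and input shape are illustrative:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, LeakyReLU, Flatten, Dense

# Conv2D gets no built-in activation; LeakyReLU follows as its own layer
cnn_model = Sequential([
    Conv2D(32, (3, 3), input_shape=(28, 28, 1)),
    LeakyReLU(alpha=0.1),
    Flatten(),
    Dense(10, activation='softmax'),
])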

Sometimes you just want a drop-in replacement for a built-in activation function, without having to add an extra activation layer just for that purpose.

For that, you can use the fact that the activation argument can be a callable object.

import tensorflow as tf

lrelu = lambda x: tf.keras.activations.relu(x, alpha=0.1)
model.add(Conv2D(..., activation=lrelu, ...))
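
Spelled out as a self-contained sketch (again assuming tensorflow.keras; the layer sizes and input shape are illustrative):

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D

lrelu = lambda x: tf.keras.activations.relu(x, alpha=0.1)

model = Sequential([
    Conv2D(32, (3, 3), activation=lrelu, input_shape=(28, 28, 1)),
])

# Note: the lambda is a custom callable, so reloading a saved version of
# this model requires passing it via custom_objects (see below)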

Since a Layer is also a callable object, you could also simply use

model.add(Conv2D(..., activation=tf.keras.layers.LeakyReLU(alpha=0.1), ...))

which now works in TF2. This is a better solution, as it avoids the need to pass custom_objects during loading, as @ChristophorusReyhan mentioned.
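
To illustrate the difference, a sketch of the save/load round trip under that claim (the file name is illustrative):

import tensorflow as tf
from tensorflow.keras.models import Sequential, load_model
from tensorflow.keras.layers import Conv2D

model = Sequential([
    Conv2D(32, (3, 3),
           activation=tf.keras.layers.LeakyReLU(alpha=0.1),
           input_shape=(28, 28, 1)),
])
model.save('model.h5')

# LeakyReLU is a built-in layer, so no custom_objects mapping is needed
restored = load_model('model.h5')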