How to pass a parameter to Scikit-Learn Keras model function

Passing a parameter to the build_fn model can be done by passing keyword arguments to __init__(); they will in turn be passed to model_build_fn directly. For example, calling KerasClassifier(myparam=10) will result in model_build_fn(myparam=10).
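A minimal sketch of that forwarding (myparam and the 8 input features are hypothetical; KerasClassifier here is the classic wrapper from keras.wrappers.scikit_learn):

from keras.models import Sequential
from keras.layers import Dense
from keras.wrappers.scikit_learn import KerasClassifier

def model_build_fn(myparam=10):  # give it a default value (see the docs quote below)
    nn = Sequential()
    nn.add(Dense(myparam, input_dim=8, activation='relu'))
    nn.add(Dense(1, activation='sigmoid'))
    nn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return nn

# myparam=10 is forwarded from the wrapper's constructor to model_build_fn
clf = KerasClassifier(build_fn=model_build_fn, myparam=10, epochs=5, verbose=0)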

Here's a fuller example:

from keras.wrappers.scikit_learn import KerasRegressor

class MyMultiOutputKerasRegressor(KerasRegressor):

    # initializing
    def __init__(self, **kwargs):
        KerasRegressor.__init__(self, **kwargs)

    # simpler fit method: duplicate y for the three model outputs
    def fit(self, X, y, **kwargs):
        return KerasRegressor.fit(self, X, [y]*3, **kwargs)

(...)

def get_quantile_reg_rpf_nn(layers_shape=[50, 100, 200, 100, 50], inDim=4, outDim=1, act='relu'):
    # do model stuff...

(...) then initialize the Keras regressor:

base_model = MyMultiOutputKerasRegressor(build_fn=get_quantile_reg_rpf_nn,
                                         layers_shape=[50, 100, 200, 100, 50], inDim=4,
                                         outDim=1, act='relu', epochs=numEpochs,
                                         batch_size=batch_size, verbose=0)
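A hedged usage sketch (X_train and y_train are hypothetical arrays with inDim=4 features and a single regression target; the overridden fit then duplicates y for the network's three outputs):

import numpy as np

X_train = np.random.rand(100, 4)  # 100 samples, inDim=4 features
y_train = np.random.rand(100)     # single regression target

base_model.fit(X_train, y_train)  # internally calls KerasRegressor.fit(X, [y]*3)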

The previous answer no longer works.

An alternative is to have create_model return a function, since KerasClassifier's build_fn expects a callable:

from keras.models import Sequential
from keras.layers import Dense

def create_model(input_dim=None):
    def model():
        # create model
        nn = Sequential()
        nn.add(Dense(12, input_dim=input_dim, kernel_initializer='uniform', activation='relu'))
        nn.add(Dense(6, kernel_initializer='uniform', activation='relu'))
        nn.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
        # Compile model
        nn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
        return nn

    return model
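Since create_model(...) now returns a zero-argument builder, you pass its result (not the function itself) as build_fn. A minimal usage sketch, assuming your data has 8 input features:

from keras.wrappers.scikit_learn import KerasClassifier

clf = KerasClassifier(build_fn=create_model(input_dim=8),
                      epochs=25, batch_size=32, verbose=0)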

Or even better, according to the documentation:

sk_params takes both model parameters and fitting parameters. Legal model parameters are the arguments of build_fn. Note that like all other estimators in scikit-learn, build_fn should provide default values for its arguments, so that you could create the estimator without passing any values to sk_params.

So you can define your function like this:

def create_model(number_of_features=10): # 10 is the *default value*
    # create model
    nn = Sequential()
    nn.add(Dense(12, input_dim=number_of_features, kernel_initializer='uniform', activation='relu'))
    nn.add(Dense(6, kernel_initializer='uniform', activation='relu'))
    nn.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
    # Compile model
    nn.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
    return nn

And create a wrapper:

KerasClassifier(build_fn=create_model, number_of_features=20, epochs=25, batch_size=1000, ...)
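Because sk_params accepts both model and fitting parameters, the same wrapper plugs straight into scikit-learn tooling such as GridSearchCV. A hedged sketch (the grid values are arbitrary, and X, y stand for your training data):

from sklearn.model_selection import GridSearchCV

clf = KerasClassifier(build_fn=create_model, number_of_features=20, verbose=0)

# fitting parameters go in the grid just like model parameters
# (number_of_features could be tuned the same way)
param_grid = {'epochs': [25, 50], 'batch_size': [500, 1000]}

grid = GridSearchCV(clf, param_grid=param_grid, cv=3)
# grid.fit(X, y)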

You can also add an input_dim keyword argument to the KerasClassifier constructor; it will be forwarded to create_model, provided create_model accepts input_dim (with a default value) as an argument:

model = KerasClassifier(build_fn=create_model, input_dim=5, epochs=150, batch_size=10, verbose=0)