Keras multiple binary outputs

To optimize for multiple independent binary classification problems (as opposed to a single multi-class problem, where you would use categorical_crossentropy) with Keras, you could do the following (here I take the example of 2 independent binary outputs, but you can extend it to as many as needed):

    from keras.layers import Input, Dense, Lambda
    from keras.models import Model
    from keras import optimizers

    inputs = Input(shape=(input_shape,))
    hidden = Dense(2048, activation='relu')(inputs)
    hidden = Dense(2048, activation='relu')(hidden)
    output = Dense(units=2, activation='sigmoid')(hidden)

Here you split the output using Keras's Lambda layer:

    output_1 = Lambda(lambda x: x[...,:1])(output)
    output_2 = Lambda(lambda x: x[...,1:])(output)

    adad = optimizers.Adadelta()

Your model's output then becomes a list of the independent outputs:

    model = Model(inputs, [output_1, output_2])

You compile the model with one loss function per output, given as a list. (In fact, if you pass only a single loss function, I believe it will be applied to all the outputs independently.)

    model.compile(optimizer=adad, loss=['binary_crossentropy','binary_crossentropy'])
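Since the model has two outputs, `fit` expects a matching list of two target arrays. Here is a minimal training sketch; the data names and shapes (`x_train`, `y1_train`, `y2_train`, 1000 samples) are hypothetical and only for illustration:

    import numpy as np

    # Hypothetical data: 1000 samples, each with two independent binary labels
    x_train = np.random.random((1000, input_shape))
    y1_train = np.random.randint(0, 2, size=(1000, 1))
    y2_train = np.random.randint(0, 2, size=(1000, 1))

    # Each target array matches one of the two model outputs
    model.fit(x_train, [y1_train, y2_train], epochs=10, batch_size=32)

    # Equivalently, a single loss string is applied to both outputs:
    # model.compile(optimizer=adad, loss='binary_crossentropy')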

I know this is an old question, but I believe the accepted answer is incorrect and the most upvoted answer is workable but not optimal. The original poster's method is the correct way to solve this problem. The output is 200 independent probabilities from 0 to 1, so the output layer should be a dense layer with 200 neurons and a sigmoid activation. It's not a categorical_crossentropy problem, because the outputs are not 200 mutually exclusive categories, and there's no reason to split the output with a Lambda layer when a single dense layer will do. Here's the same approach written with the Keras Sequential API:

    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential()
    model.add(Dense(2048, input_dim=n_input, activation='relu'))
    model.add(Dense(2048, activation='relu'))  # input_dim is only needed on the first layer
    model.add(Dense(200, activation='sigmoid'))  # 200 independent probabilities
    model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
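As a usage sketch (with hypothetical data names and shapes, just to show the expected label format): the targets are a 2-D array with one 0/1 column per label, and the predicted probabilities can be thresholded at 0.5 to recover the 200 independent binary decisions.

    import numpy as np

    # Hypothetical data: 1000 samples, n_input features, 200 independent 0/1 labels
    x_train = np.random.random((1000, n_input))
    y_train = np.random.randint(0, 2, size=(1000, 200))

    model.fit(x_train, y_train, epochs=10, batch_size=32)

    # Each output neuron is an independent probability; threshold to get binary predictions
    probs = model.predict(x_train)        # shape (1000, 200), values in [0, 1]
    preds = (probs > 0.5).astype(int)     # 200 independent binary decisions per sample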