TensorFlow error: logits and labels must be same size

Thanks for sharing your code as a Gist. There are two changes that are necessary to make the shapes agree:

  1. The line:

    fc1 = tf.reshape(pool5, [-1, wd1Shape[0]])
    

    ...is responsible for the erroneous 49 in the batch dimension. The input is 1 x 7 x 7 x 256, which is 12544 elements in total, so reshaping it to [-1, 256] (because wd1Shape[0] is 256) produces a 49 x 256 tensor. One possible replacement is the following:

    pool5Shape = pool5.get_shape().as_list()
    fc1 = tf.reshape(pool5, [-1, pool5Shape[1] * pool5Shape[2] * pool5Shape[3]])
    

    ...which will give fc1 the shape 1 x 12544.

  2. After making this change, the size of the 'wd1' weight matrix (256 x 4096) doesn't match the number of nodes in fc1. You could change the definition of this matrix as follows:

        'wd1': tf.Variable(tf.random_normal([12544, 4096])),
    

    ...although you may want to modify the other weights, or perform additional pooling to reduce the size of this matrix. Both changes are shown together in the sketch after this list.
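
Putting the two fixes together, here is a minimal sketch assuming the TF 1.x API used in the question; pool5 is re-created as a placeholder purely for illustration, and the 4096-unit width is kept from the 'wd1' definition above:

    import tensorflow as tf

    # Hypothetical stand-in for the question's pool5 output: batch x 7 x 7 x 256.
    pool5 = tf.placeholder(tf.float32, [None, 7, 7, 256])

    # Flatten each example, keeping the batch dimension as -1.
    pool5Shape = pool5.get_shape().as_list()                   # [None, 7, 7, 256]
    flat_size = pool5Shape[1] * pool5Shape[2] * pool5Shape[3]  # 7 * 7 * 256 = 12544
    fc1 = tf.reshape(pool5, [-1, flat_size])                   # shape: [batch, 12544]

    # The first fully connected weight matrix must match the flattened size.
    wd1 = tf.Variable(tf.random_normal([flat_size, 4096]))
    fc1 = tf.nn.relu(tf.matmul(fc1, wd1))                      # shape: [batch, 4096]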


I had a similar issue when using model.fit(...). It turned out my output size was defined as 2 while I was using "binary_crossentropy" as the loss function, when it should have been defined as 1.
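
For reference, a minimal tf.keras sketch of that fix; the layer sizes and input shape here are illustrative placeholders, not taken from the original model:

    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(20,)),
        # binary_crossentropy expects a single sigmoid output per example;
        # use 2 units with softmax and (sparse_)categorical_crossentropy instead.
        tf.keras.layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])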
