Making predictions with a TensorFlow model

I followed the given MNIST tutorials and was able to train a model and evaluate its accuracy. However, the tutorials don't show how to make predictions with the trained model. I'm not interested in accuracy; I just want to use the model to predict a new example and see all the results (labels) in the output, each with its assigned score (sorted or not).


In the "Deep MNIST for Experts" example, see this line:

We can now implement our regression model. It only takes one line! We multiply the vectorized input images x by the weight matrix W, add the bias b, and compute the softmax probabilities that are assigned to each class.

y = tf.nn.softmax(tf.matmul(x,W) + b)

Just pull on node y and you'll have what you want.

# Evaluate the softmax output y for a single input image.
feed_dict = {x: [your_image]}
classification = sess.run(y, feed_dict)
print(classification)

This applies to just about any model you create - you'll have computed the prediction probabilities as one of the last steps before computing the loss.
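Since the question asks for every label together with its score, here is a minimal sketch along those lines. It assumes the x, y, and sess variables from the beginner tutorial and a your_image array of shape (784,); the sorting is optional:

# Evaluate the softmax output for one image; the result has shape (1, 10).
scores = sess.run(y, feed_dict={x: [your_image]})[0]

# Pair each digit label with its probability and print them, highest score first.
for label, score in sorted(enumerate(scores), key=lambda pair: pair[1], reverse=True):
    print(label, score)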


As @dga suggested, you need to run your new data instance through your already-trained model.

Here is an example:

Assume you went through the first tutorial and calculated the accuracy of your model (the model is this: y = tf.nn.softmax(tf.matmul(x, W) + b)). Now you take your model and apply a new data point to it. In the following code I compute the prediction vector, take the position of its maximum value, show the image, and print the predicted digit.

from matplotlib import pyplot as plt
from random import randint

# Pick a random test image (randint is inclusive, so subtract 1 from the upper bound).
num = randint(0, mnist.test.images.shape[0] - 1)
img = mnist.test.images[num]

# Feed the image through the model and take the index of the highest softmax score.
classification = sess.run(tf.argmax(y, 1), feed_dict={x: [img]})

plt.imshow(img.reshape(28, 28), cmap=plt.cm.binary)
plt.show()
print('NN predicted', classification[0])

TensorFlow 2.0 compatible answer: Suppose you have built a Keras model as shown below:

from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

Then train and evaluate the model using the code below:

model.fit(train_images, train_labels, epochs=10)
test_loss, test_acc = model.evaluate(test_images, test_labels, verbose=2)

After that, if you want to predict the class of a particular image, you can do it using the code below (note that model.predict expects a batch, so a single image needs a leading batch dimension; see the sketch after the snippet):

predictions_single = model.predict(img)
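A minimal sketch of that step, assuming img is one 28x28 image taken from test_images and that numpy is available; these names are assumptions, not part of the original answer:

import numpy as np

img_batch = np.expand_dims(img, 0)             # shape (1, 28, 28): a batch containing one image
predictions_single = model.predict(img_batch)  # shape (1, 10): one softmax score per digit class

print(predictions_single[0])                   # all ten class scores for this image
print(np.argmax(predictions_single[0]))        # the predicted digit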

If you want to predict the classes of a set of images, you can use the code below:

predictions = model.predict(new_images)

where new_images is an array of images.
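If you then want one predicted label per image rather than the raw score vectors, the usual numpy idiom is an argmax over the class axis (again an assumption on my part, not part of the original answer):

import numpy as np

predicted_labels = np.argmax(predictions, axis=1)  # one predicted digit per input image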

For more information, refer to this TensorFlow tutorial.


The question is specifically about the Google MNIST tutorial, which defines a predictor but doesn't apply it. Using guidance from Jonathan Hui's TensorFlow Estimator blog post, here is code which exactly fits the Google tutorial and does predictions:

import numpy as np
from matplotlib import pyplot as plt

images = mnist.test.images[0:10]

# Input function that feeds the ten test images to the Estimator, one pass, in order.
predict_input_fn = tf.estimator.inputs.numpy_input_fn(
      x={"x": images},
      num_epochs=1,
      shuffle=False)

# predict() returns a generator of dicts; 'probabilities' holds the softmax scores per class.
for image, p in zip(images, mnist_classifier.predict(input_fn=predict_input_fn)):
    print(np.argmax(p['probabilities']))
    plt.imshow(image.reshape(28, 28), cmap=plt.cm.binary)
    plt.show()