Why is the accuracy for my Keras model always 0 when training?
I'm pretty new to Keras. I have built a simple network to try it out:
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

data = np.genfromtxt("./kerastests/mydata.csv", delimiter=';')

x_target = data[:, 29]                           # column used as the target
x_training = np.delete(data, 6, axis=1)          # drop the string date column
x_training = np.delete(x_training, 28, axis=1)   # drop the target column

model = Sequential()
model.add(Dense(20, activation='relu', input_dim=x_training.shape[1]))
model.add(Dense(10, activation='relu'))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mean_squared_error', metrics=['accuracy'])
model.fit(x_training, x_target)
From my source data, I have removed 2 columns, as you can see. One is a column that came with dates in a string format (besides it, the dataset has separate columns for the day, the month, and the year, so I don't need that column), and the other is the column I use as the target for the model.
When I train this model I get this output:
Epoch 1/10
32/816 [>.............................] - ETA: 23s - loss: 13541942.0000 - acc: 0.0000e+00
800/816 [============================>.] - ETA: 0s - loss: 11575466.0400 - acc: 0.0000e+00
816/816 [==============================] - 1s - loss: 11536905.2353 - acc: 0.0000e+00
Epoch 2/10
32/816 [>.............................] - ETA: 0s - loss: 6794785.0000 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5381360.4314 - acc: 0.0000e+00
Epoch 3/10
32/816 [>.............................] - ETA: 0s - loss: 6235184.0000 - acc: 0.0000e+00
800/816 [============================>.] - ETA: 0s - loss: 5199512.8700 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5192977.4216 - acc: 0.0000e+00
Epoch 4/10
32/816 [>.............................] - ETA: 0s - loss: 4680165.5000 - acc: 0.0000e+00
736/816 [==========================>...] - ETA: 0s - loss: 5050110.3043 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5168771.5490 - acc: 0.0000e+00
Epoch 5/10
32/816 [>.............................] - ETA: 0s - loss: 5932391.0000 - acc: 0.0000e+00
768/816 [===========================>..] - ETA: 0s - loss: 5198882.9167 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5159585.9020 - acc: 0.0000e+00
Epoch 6/10
32/816 [>.............................] - ETA: 0s - loss: 4488318.0000 - acc: 0.0000e+00
768/816 [===========================>..] - ETA: 0s - loss: 5144843.8333 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5151492.1765 - acc: 0.0000e+00
Epoch 7/10
32/816 [>.............................] - ETA: 0s - loss: 6920405.0000 - acc: 0.0000e+00
800/816 [============================>.] - ETA: 0s - loss: 5139358.5000 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5169839.2941 - acc: 0.0000e+00
Epoch 8/10
32/816 [>.............................] - ETA: 0s - loss: 3973038.7500 - acc: 0.0000e+00
672/816 [=======================>......] - ETA: 0s - loss: 5183285.3690 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5141417.0000 - acc: 0.0000e+00
Epoch 9/10
32/816 [>.............................] - ETA: 0s - loss: 4969548.5000 - acc: 0.0000e+00
768/816 [===========================>..] - ETA: 0s - loss: 5126550.1667 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5136524.5098 - acc: 0.0000e+00
Epoch 10/10
32/816 [>.............................] - ETA: 0s - loss: 6334703.5000 - acc: 0.0000e+00
768/816 [===========================>..] - ETA: 0s - loss: 5197778.8229 - acc: 0.0000e+00
816/816 [==============================] - 0s - loss: 5141391.2059 - acc: 0.0000e+00
Why is this happening? My data is a time series. I know that people do not usually use Dense
layers for time series, but it is just a test. What really puzzles me is that the accuracy is always 0. And, in other tests, the loss even gets to a NaN value.
Could anybody help here?
Solution 1:
Your model seems to correspond to a regression model for the following reasons:
• You are using linear (the default) as the activation function in the output layer (and relu in the layer before).
• Your loss is loss='mean_squared_error'.
However, the metric that you use, metrics=['accuracy'], corresponds to a classification problem. If you want to do regression, remove metrics=['accuracy']. That is, use
model.compile(optimizer='adam', loss='mean_squared_error')
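For instance, a minimal corrected version of the model from the question (just a sketch, assuming the same x_training and x_target arrays built above) would be:
# Sketch: same architecture as in the question, compiled without a classification metric
model = Sequential()
model.add(Dense(20, activation='relu', input_dim=x_training.shape[1]))
model.add(Dense(10, activation='relu'))
model.add(Dense(1))  # linear output, suitable for regression
model.compile(optimizer='adam', loss='mean_squared_error')
model.fit(x_training, x_target)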
Here is a list of Keras metrics for regression and classification (taken from this blog post):
Keras Regression Metrics
• Mean Squared Error: mean_squared_error, MSE or mse
• Mean Absolute Error: mean_absolute_error, MAE, mae
• Mean Absolute Percentage Error: mean_absolute_percentage_error, MAPE, mape
• Cosine Proximity: cosine_proximity, cosine
Keras Classification Metrics
• Binary Accuracy: binary_accuracy, acc
• Categorical Accuracy: categorical_accuracy, acc
• Sparse Categorical Accuracy: sparse_categorical_accuracy
• Top k Categorical Accuracy: top_k_categorical_accuracy (requires you to specify a k parameter)
• Sparse Top k Categorical Accuracy: sparse_top_k_categorical_accuracy (requires you to specify a k parameter)
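Any of these names can be passed in the metrics list of model.compile. A short sketch (the choice of 'mae' and 'mape' here is just an example):
model.compile(optimizer='adam',
              loss='mean_squared_error',
              metrics=['mae', 'mape'])  # report MAE and MAPE alongside the MSE loss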
Solution 2:
Add the following to track a metric during training. Note that compile() returns None; the History object is returned by fit():
model.compile(optimizer='adam', loss='mean_squared_error', metrics=['mean_squared_error'])
# OR
model.compile(optimizer='adam', loss='mean_absolute_error', metrics=['mean_absolute_error'])

history = model.fit(x_training, x_target)
history.history.keys()
history.history
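If you then want to plot the tracked metric, here is a minimal sketch with matplotlib (assuming the first compile call above, so the recorded keys are 'loss' and 'mean_squared_error'; printing history.history.keys() shows the exact names for your Keras version):
import matplotlib.pyplot as plt

# Plot the per-epoch training curves recorded in the History object
plt.plot(history.history['loss'], label='loss')
plt.plot(history.history['mean_squared_error'], label='mean_squared_error')
plt.xlabel('epoch')
plt.legend()
plt.show()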