Keras: Prediction

Source: https://machinelearningmastery.com/how-to-make-classification-and-regression-predictions-for-deep-learning-models-in-keras/

Once we have chosen and fit a deep learning model in Keras, we can use it to make predictions on new data instances.

There is some confusion among beginners about how exactly to do this. Questions like the following come up often:

  • How do I make predictions with the model I have trained in Keras?

In this tutorial, we will discover exactly how to make classification and regression predictions with a finalized deep learning model built with the Keras Python library.

After completing this tutorial, you will know:

  • How to finalize a model so it is ready to be used to make predictions.
  • How to make class and probability predictions for classification problems in Keras.
  • How to make regression predictions in Keras.


Finalize the Model

Before we can make predictions, we must first train a final model.

We may have trained models using k-fold cross validation or a train/test split of the available data. This is done to give us an estimate of the model's skill on out-of-sample data, e.g. new data.

Having served their purpose, these models can now be discarded.

We must now train a final model on all of the available data. You can learn more about how to train a final model in:

   How to Train a Final Machine Learning Model


Classification Predictions

A classification problem is one where the model learns a mapping between input features and an output feature that is a label, such as "spam" and "not spam".

If developing a neural network model in Keras is new to you, see:

* [[Keras: Develop Your First Neural Network in Python Step-By-Step]]

Below is an example of a finalized neural network model in Keras developed for a simple two-class (binary) classification problem.

# example of training a final classification model
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_blobs
from sklearn.preprocessing import MinMaxScaler
# generate 2d classification dataset
X, y = make_blobs(n_samples=100, centers=2, n_features=2, random_state=1)
scalar = MinMaxScaler()
scalar.fit(X)
X = scalar.transform(X)
# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X, y, epochs=200, verbose=0)

After finalizing, you may want to save the model to file, e.g. via the Keras API. Once saved, you can load the model any time and use it to make predictions. For an example of this, see the post:

   Save and Load Your Keras Deep Learning Models
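
As a rough sketch of what that looks like with the Keras model saving API (the filename final_model.h5 here is just an assumed example, and saving to HDF5 requires the h5py package):

# save the finalized model to a file (assumed filename)
model.save('final_model.h5')
# ...later, load the saved model and use it to make predictions
from keras.models import load_model
model = load_model('final_model.h5')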

For simplicity, we will skip this step for the examples in this tutorial.

There are two types of classification predictions we may wish to make with our finalized model; they are class predictions and probability predictions.

Class Predictions

A class prediction is: given the finalized model and one or more new data instances, predict the class for each data instance.

We do not know the outcome classes for the new data. That is why we need the model in the first place.

We can predict the class for new data instances using our finalized classification model in Keras using the predict_classes() function. Note that this function is only available on Sequential models, not those models developed using the functional API.
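
If your model was built with the functional API instead, a common workaround (a suggestion of mine, not something covered in this tutorial) is to call predict() and threshold the sigmoid outputs at 0.5 to obtain class labels:

# binary classification with a functional API model: derive class labels
# by thresholding the predicted probabilities at 0.5
probs = model.predict(Xnew)
ynew = (probs > 0.5).astype('int32')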

For example, suppose we have one or more data instances in an array called Xnew. This array can be passed to the predict_classes() function on our model to predict the class value for each instance in the array.

Xnew = [[...], [...]]
ynew = model.predict_classes(Xnew)

Let’s make this concrete with an example:

# example making new class predictions for a classification problem
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_blobs
from sklearn.preprocessing import MinMaxScaler
# generate 2d classification dataset
X, y = make_blobs(n_samples=100, centers=2, n_features=2, random_state=1)
scalar = MinMaxScaler()
scalar.fit(X)
X = scalar.transform(X)
# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X, y, epochs=500, verbose=0)
# new instances where we do not know the answer
Xnew, _ = make_blobs(n_samples=3, centers=2, n_features=2, random_state=1)
Xnew = scalar.transform(Xnew)
# make a prediction
ynew = model.predict_classes(Xnew)
# show the inputs and predicted outputs
for i in range(len(Xnew)):
	print("X=%s, Predicted=%s" % (Xnew[i], ynew[i]))

Running the example predicts the class for the three new data instances, then prints the data and the predictions together.

X=[0.89337759 0.65864154], Predicted=[0]
X=[0.29097707 0.12978982], Predicted=[1]
X=[0.78082614 0.75391697], Predicted=[0]

If you had just one new data instance, you could provide this as an instance wrapped in an array to the predict_classes() function; for example:

# example making new class prediction for a classification problem
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_blobs
from sklearn.preprocessing import MinMaxScaler
from numpy import array
# generate 2d classification dataset
X, y = make_blobs(n_samples=100, centers=2, n_features=2, random_state=1)
scalar = MinMaxScaler()
scalar.fit(X)
X = scalar.transform(X)
# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X, y, epochs=500, verbose=0)
# new instance where we do not know the answer
Xnew = array([[0.89337759, 0.65864154]])
# make a prediction
ynew = model.predict_classes(Xnew)
# show the inputs and predicted outputs
print("X=%s, Predicted=%s" % (Xnew[0], ynew[0]))

Running the example prints the single instance and the predicted class.

X=[0.89337759 0.65864154], Predicted=[0]

A Note on Class Labels

Note that when you prepared your data, you will have mapped the class values from your domain (such as strings) to integer values. You may have used a LabelEncoder.

This LabelEncoder can be used to convert the integers back into string values via the inverse_transform() function.
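
A minimal sketch of that round trip, assuming string labels such as "spam" and "not spam":

from sklearn.preprocessing import LabelEncoder
# map string labels to integers before training
labels = ['spam', 'not spam', 'spam']
encoder = LabelEncoder()
y = encoder.fit_transform(labels)        # array([1, 0, 1])
# map integer predictions back to the original strings
decoded = encoder.inverse_transform(y)   # ['spam', 'not spam', 'spam']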

For this reason, you may want to save (pickle) the LabelEncoder used to encode your y values when fitting your final model.

Probability Predictions

Another type of prediction you may wish to make is the probability of the data instance belonging to each class.

This is called a probability prediction where, given a new instance, the model returns the probability for each outcome class as a value between 0 and 1.

You can make these types of predictions in Keras by calling the predict_proba() function; for example:

Xnew = [[...], [...]]
ynew = model.predict_proba(Xnew)

In the case of a two-class (binary) classification problem, the sigmoid activation function is often used in the output layer. The predicted probability is taken as the likelihood of the observation belonging to class 1, or inverted (1 – probability) to give the probability for class 0.

In the case of a multi-class classification problem, the softmax activation function is often used on the output layer and the likelihood of the observation for each class is returned as a vector.
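
As a small illustration of both cases (a sketch, not part of the original tutorial), assuming yhat holds the array of probabilities returned by the model for a batch of instances:

from numpy import argmax
# binary case: the sigmoid output is the probability of class 1,
# and the probability of class 0 is its complement
prob_class1 = yhat[0][0]
prob_class0 = 1.0 - prob_class1
# multi-class case: the softmax output is a vector of per-class
# probabilities, and the most likely class is its argmax
best_class = argmax(yhat[0])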

The example below makes a probability prediction for each example in the Xnew array of data instances.

# example making new probability predictions for a classification problem
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_blobs
from sklearn.preprocessing import MinMaxScaler
# generate 2d classification dataset
X, y = make_blobs(n_samples=100, centers=2, n_features=2, random_state=1)
scalar = MinMaxScaler()
scalar.fit(X)
X = scalar.transform(X)
# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X, y, epochs=500, verbose=0)
# new instances where we do not know the answer
Xnew, _ = make_blobs(n_samples=3, centers=2, n_features=2, random_state=1)
Xnew = scalar.transform(Xnew)
# make a prediction
ynew = model.predict_proba(Xnew)
# show the inputs and predicted outputs
for i in range(len(Xnew)):
	print("X=%s, Predicted=%s" % (Xnew[i], ynew[i]))

Running the example makes the probability predictions and then prints the input data instances and the probability of each instance belonging to class 1.

X=[0.89337759 0.65864154], Predicted=[0.0087348]
X=[0.29097707 0.12978982], Predicted=[0.82020265]
X=[0.78082614 0.75391697], Predicted=[0.00693122]


This can be helpful in your application if you want to present the probabilities to the user for expert interpretation.

Regression Predictions

Regression is a supervised learning problem where, given input examples, the model learns a mapping to suitable output quantities, such as “0.1” and “0.2”, etc.

Below is an example of a finalized Keras model for regression.

# example of training a final regression model
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_regression
from sklearn.preprocessing import MinMaxScaler
# generate regression dataset
X, y = make_regression(n_samples=100, n_features=2, noise=0.1, random_state=1)
scalarX, scalarY = MinMaxScaler(), MinMaxScaler()
scalarX.fit(X)
scalarY.fit(y.reshape(100,1))
X = scalarX.transform(X)
y = scalarY.transform(y.reshape(100,1))
# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='linear'))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=1000, verbose=0)

We can predict quantities with the finalized regression model by calling the predict() function on the finalized model.

The predict() function takes an array of one or more data instances.

The example below demonstrates how to make regression predictions on multiple data instances with an unknown expected outcome.

# example of making predictions for a regression problem
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_regression
from sklearn.preprocessing import MinMaxScaler
# generate regression dataset
X, y = make_regression(n_samples=100, n_features=2, noise=0.1, random_state=1)
scalarX, scalarY = MinMaxScaler(), MinMaxScaler()
scalarX.fit(X)
scalarY.fit(y.reshape(100,1))
X = scalarX.transform(X)
y = scalarY.transform(y.reshape(100,1))
# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='linear'))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=1000, verbose=0)
# new instances where we do not know the answer
Xnew, a = make_regression(n_samples=3, n_features=2, noise=0.1, random_state=1)
Xnew = scalarX.transform(Xnew)
# make a prediction
ynew = model.predict(Xnew)
# show the inputs and predicted outputs
for i in range(len(Xnew)):
	print("X=%s, Predicted=%s" % (Xnew[i], ynew[i]))

Running the example makes multiple predictions, then prints the inputs and predictions side by side for review.

X=[0.29466096 0.30317302], Predicted=[0.17097184]
X=[0.39445118 0.79390858], Predicted=[0.7475489]
X=[0.02884127 0.6208843 ], Predicted=[0.43370453]

The same function can be used to make a prediction for a single data instance, as long as it is suitably wrapped in a surrounding list or array.

For example:

# example of making predictions for a regression problem
from keras.models import Sequential
from keras.layers import Dense
from sklearn.datasets import make_regression
from sklearn.preprocessing import MinMaxScaler
from numpy import array
# generate regression dataset
X, y = make_regression(n_samples=100, n_features=2, noise=0.1, random_state=1)
scalarX, scalarY = MinMaxScaler(), MinMaxScaler()
scalarX.fit(X)
scalarY.fit(y.reshape(100,1))
X = scalarX.transform(X)
y = scalarY.transform(y.reshape(100,1))
# define and fit the final model
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(4, activation='relu'))
model.add(Dense(1, activation='linear'))
model.compile(loss='mse', optimizer='adam')
model.fit(X, y, epochs=1000, verbose=0)
# new instance where we do not know the answer
Xnew = array([[0.29466096, 0.30317302]])
# make a prediction
ynew = model.predict(Xnew)
# show the inputs and predicted outputs
print("X=%s, Predicted=%s" % (Xnew[0], ynew[0]))

Running the example makes a single prediction and prints the data instance and prediction for review.

X=[0.29466096 0.30317302], Predicted=[0.17333156]
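
Note that the predictions are on the scaled [0, 1] range because y was transformed with scalarY before training. If you need predictions in the original units of the target, you can invert the scaling (a small addition, not shown in the original example):

# undo the MinMaxScaler applied to y to recover the original units
ynew_original = scalarY.inverse_transform(ynew)
print("Predicted (original scale)=%s" % ynew_original[0])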

Further Reading

This section provides more resources on the topic if you are looking to go deeper.

   How to Train a Final Machine Learning Model
   Save and Load Your Keras Deep Learning Models
   Develop Your First Neural Network in Python With Keras Step-By-Step
   The 5 Step Life-Cycle for Long Short-Term Memory Models in Keras
   How to Make Predictions with Long Short-Term Memory Models in Keras

Summary

In this tutorial, you discovered how you can make classification and regression predictions with a finalized deep learning model with the Keras Python library.

Specifically, you learned:

  • How to finalize a model in order to make it ready for making predictions.
  • How to make class and probability predictions for classification problems in Keras.
  • How to make regression predictions in Keras.


References


Interesting Links