- This post is for absolute beginners in deep learning. It helps you get started with Keras, a Python-based deep learning package. Keras runs on top of backends such as Theano, TensorFlow, and CNTK.
- In this post, I will explain how to set up Keras on your computer with TensorFlow as the backend, and how to run a simple multilayer perceptron (MLP) using Keras.
Step 0: Before starting, make sure that you have installed
python 2.7 (preferred) or python 3,
scipy, numpy, matplotlib
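If any of these are missing, they can usually be installed with pip, for example:
$ sudo pip install numpy scipy matplotlib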
Step 1: Now install TensorFlow. If you have a CUDA-capable GPU in your computer, you can install the GPU version of TensorFlow; otherwise install the CPU version.
Instructions to install TensorFlow: LINK
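With pip, the installation is typically a single command (the package names below assume a pip-based install; the GPU version additionally requires CUDA and cuDNN to be set up):
$ sudo pip install tensorflow        # CPU-only version
$ sudo pip install tensorflow-gpu    # GPU version (needs CUDA and cuDNN)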
You can check whether the packages are installed by simply importing them:
$ python
>>> import tensorflow
>>> print(tensorflow.__version__)
>>> import numpy
>>> print(numpy.__version__)
>>> import scipy
>>> print(scipy.__version__)
Step 2: Installing Keras on Ubuntu
$ sudo pip install keras
$ python
>>> import keras
Using TensorFlow backend.
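You can also confirm the installed Keras version and which backend is active from the same interpreter session; a quick check, assuming a standard Keras 2.x installation:
>>> print(keras.__version__)
>>> from keras import backend as K
>>> print(K.backend())   # should print 'tensorflow'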
Example: Digit classification using a multilayer perceptron (MLP) on the MNIST dataset.
#########Digit classification using MLP on MNIST dataset###
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import RMSprop
import matplotlib.pyplot as plt
batch_size = 128
num_classes = 10
epochs = 20
# the data, shuffled and split between train and test sets
(x_train, y_train), (x_test, y_test) = mnist.load_data()
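# flatten the 28x28 images into 784-dimensional vectors and scale pixel values to [0, 1]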
x_train = x_train.reshape(60000, 784)
x_test = x_test.reshape(10000, 784)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')
# convert class vectors to binary class matrices
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
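# build the MLP: two hidden layers of 512 ReLU units with dropout, softmax over the 10 digit classes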
model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,)))
model.add(Dropout(0.2))
model.add(Dense(512, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(num_classes, activation='softmax'))
model.summary()
model.compile(loss='categorical_crossentropy',
              optimizer=RMSprop(),
              metrics=['accuracy'])
history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,
                    validation_data=(x_test, y_test))
score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])
print(history.history.keys())
# summarize history for accuracy
plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
# summarize history for loss
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'test'], loc='upper left')
plt.show()
Output: model.summary() gives the details of the network architecture and the number of trainable and non-trainable parameters.
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_4 (Dense)              (None, 512)               401920
_________________________________________________________________
dropout_3 (Dropout)          (None, 512)               0
_________________________________________________________________
dense_5 (Dense)              (None, 512)               262656
_________________________________________________________________
dropout_4 (Dropout)          (None, 512)               0
_________________________________________________________________
dense_6 (Dense)              (None, 10)                5130
=================================================================
Total params: 669,706
Trainable params: 669,706
Non-trainable params: 0
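As a sanity check, each Dense layer's parameter count is simply inputs × units + units (one bias per unit), so the numbers in the summary can be reproduced with a few lines of Python:
# Dense layer parameters = inputs * units + units (biases)
print(784 * 512 + 512)           # 401920 -> dense_4
print(512 * 512 + 512)           # 262656 -> dense_5
print(512 * 10 + 10)             # 5130   -> dense_6
print(401920 + 262656 + 5130)    # 669706 total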
Training and testing accuracy vs. epoch are shown in the following figure.
Training and testing loss vs. epoch are shown in the following figure.
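Once training has finished, the fitted model can also be used to classify individual digits. A minimal sketch, assuming the variables from the script above (model, x_test, y_test) are still in scope:
import numpy as np
sample = x_test[:1]                    # one flattened, scaled test image, shape (1, 784)
probs = model.predict(sample)          # softmax probabilities, shape (1, 10)
print('Predicted digit:', np.argmax(probs, axis=1)[0])
print('Actual digit:', np.argmax(y_test[:1], axis=1)[0])   # y_test is one-hot encoded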