[Keras] Develop Neural Network With Keras Step-By-Step


A simple walkthrough of training a four-layer fully connected network.

Ref: http://machinelearningmastery.com/tutorial-first-neural-network-python-keras/

1. Load Data

Dataset: the Pima Indians Diabetes Data Set.

Download: Data download --> save it as pima-indians-diabetes.csv

from keras.models import Sequential
from keras.layers import Dense
import numpy
# 1. fix the random seed for reproducibility
seed = 7
numpy.random.seed(seed)
# 2. load the pima indians dataset into a 2-D array
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
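
As a quick sanity check (optional; not part of the original tutorial), confirm the loaded array has the expected shape of 768 rows and 9 columns (8 features plus the class label):

print(dataset.shape)     # expect (768, 9)
print(X.shape, Y.shape)  # expect (768, 8) and (768,)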

2. Define Model

Goal: a four-layer fully connected network.

Approach: use a Sequential model and add layers one at a time. Each Dense call takes:

    • Argument 1: the number of units in the layer    [layer 1 (input): 8 --> layer 2: 12 --> layer 3: 8 --> layer 4 (output): 1]
    • Argument 2: the weight initialization           [uniform distribution over 0-0.05]
    • Argument 3: the activation function             [layers 2 and 3: relu; layer 4: sigmoid]
# create model (Keras 1 API; in Keras 2, `init` was renamed to `kernel_initializer`)
model = Sequential()
model.add(Dense(12, input_dim=8, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='relu'))
model.add(Dense(1, init='uniform', activation='sigmoid'))
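
To double-check the architecture, print the layer shapes and parameter counts; the totals in the comments are simple arithmetic from the layer sizes:

model.summary()
# Dense(12): 8*12 + 12 = 108 parameters
# Dense(8):  12*8 + 8  = 104 parameters
# Dense(1):   8*1 + 1  =   9 parameters --> 221 trainable parameters in total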

3. Compile Model

Our goal: find a good set of weights w for making predictions. Compiling configures that search: the loss function to minimize and the optimizer used to update the weights.

# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

Finally, because it is a classification problem, we will collect and report the classification accuracy as the metric.

Alternatively, see <6. Save & load model> below: you can also load a previously saved model and continue training it.
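
If you need control over the optimizer's hyperparameters, pass an optimizer object instead of the string name. A minimal sketch (the learning rate shown is just Adam's default; adjust as needed):

from keras.optimizers import Adam
# `lr` is the Keras 1/2 argument name; newer versions call it `learning_rate`
model.compile(loss='binary_crossentropy', optimizer=Adam(lr=0.001), metrics=['accuracy'])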

4. Fit Model

Start the supervised training:

    • nb_epoch: the number of passes over the entire training set
    • batch_size: the number of samples processed before the weights are updated
# Fit the model (in Keras 2, `nb_epoch` was renamed to `epochs`)
history_callback = model.fit(X, Y, nb_epoch=150, batch_size=10)
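
To watch generalization during training, fit can hold out a fraction of the data as a validation set. A minimal sketch (the 20% split is an arbitrary choice, not from the original tutorial):

# hold out the last 20% of the rows; val_loss / val_acc are then reported per epoch
history_callback = model.fit(X, Y, nb_epoch=150, batch_size=10, validation_split=0.2)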

5. Evaluate Model

After training, use model.evaluate(...) to report accuracy statistics:

# evaluate the model
scores = model.evaluate(X, Y)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

Result: acc: 78.91%. The full training log:

Epoch 1/150
768/768 [==============================] - 0s - loss: 0.6826 - acc: 0.6328
Epoch 2/150
768/768 [==============================] - 0s - loss: 0.6590 - acc: 0.6510
Epoch 3/150
768/768 [==============================] - 0s - loss: 0.6475 - acc: 0.6549
Epoch 4/150
768/768 [==============================] - 0s - loss: 0.6416 - acc: 0.6615
Epoch 5/150
768/768 [==============================] - 0s - loss: 0.6216 - acc: 0.6745
Epoch 6/150
768/768 [==============================] - 0s - loss: 0.6128 - acc: 0.6680
Epoch 7/150
768/768 [==============================] - 0s - loss: 0.6018 - acc: 0.6927
Epoch 8/150
768/768 [==============================] - 0s - loss: 0.5962 - acc: 0.6927
Epoch 9/150
768/768 [==============================] - 0s - loss: 0.5991 - acc: 0.6953
Epoch 10/150
768/768 [==============================] - 0s - loss: 0.5920 - acc: 0.6927
Epoch 11/150
768/768 [==============================] - 0s - loss: 0.5905 - acc: 0.6979
Epoch 12/150
768/768 [==============================] - 0s - loss: 0.5883 - acc: 0.6901
Epoch 13/150
768/768 [==============================] - 0s - loss: 0.5870 - acc: 0.6953
Epoch 14/150
768/768 [==============================] - 0s - loss: 0.5869 - acc: 0.6836
Epoch 15/150
768/768 [==============================] - 0s - loss: 0.5815 - acc: 0.6953
Epoch 16/150
768/768 [==============================] - 0s - loss: 0.5779 - acc: 0.6966
Epoch 17/150
768/768 [==============================] - 0s - loss: 0.5809 - acc: 0.6849
Epoch 18/150
768/768 [==============================] - 0s - loss: 0.5818 - acc: 0.6953
Epoch 19/150
768/768 [==============================] - 0s - loss: 0.5814 - acc: 0.6901
Epoch 20/150
768/768 [==============================] - 0s - loss: 0.5748 - acc: 0.7096
Epoch 21/150
768/768 [==============================] - 0s - loss: 0.5758 - acc: 0.7005
Epoch 22/150
768/768 [==============================] - 0s - loss: 0.5739 - acc: 0.7135
Epoch 23/150
768/768 [==============================] - 0s - loss: 0.5736 - acc: 0.6927
Epoch 24/150
768/768 [==============================] - 0s - loss: 0.5750 - acc: 0.6940
Epoch 25/150
768/768 [==============================] - 0s - loss: 0.5734 - acc: 0.7031
Epoch 26/150
768/768 [==============================] - 0s - loss: 0.5683 - acc: 0.7083
Epoch 27/150
768/768 [==============================] - 0s - loss: 0.5688 - acc: 0.7018
Epoch 28/150
768/768 [==============================] - 0s - loss: 0.5714 - acc: 0.7070
Epoch 29/150
768/768 [==============================] - 0s - loss: 0.5621 - acc: 0.7188
Epoch 30/150
768/768 [==============================] - 0s - loss: 0.5647 - acc: 0.7122
Epoch 31/150
768/768 [==============================] - 0s - loss: 0.5630 - acc: 0.7135
Epoch 32/150
768/768 [==============================] - 0s - loss: 0.5613 - acc: 0.7214
Epoch 33/150
768/768 [==============================] - 0s - loss: 0.5594 - acc: 0.7188
Epoch 34/150
768/768 [==============================] - 0s - loss: 0.5598 - acc: 0.7187
Epoch 35/150
768/768 [==============================] - 0s - loss: 0.5624 - acc: 0.7187
Epoch 36/150
768/768 [==============================] - 0s - loss: 0.5615 - acc: 0.7201
Epoch 37/150
768/768 [==============================] - 0s - loss: 0.5544 - acc: 0.7214
Epoch 38/150
768/768 [==============================] - 0s - loss: 0.5529 - acc: 0.7135
Epoch 39/150
768/768 [==============================] - 0s - loss: 0.5550 - acc: 0.7227
Epoch 40/150
768/768 [==============================] - 0s - loss: 0.5574 - acc: 0.7331
Epoch 41/150
768/768 [==============================] - 0s - loss: 0.5561 - acc: 0.7357
Epoch 42/150
768/768 [==============================] - 0s - loss: 0.5459 - acc: 0.7370
Epoch 43/150
768/768 [==============================] - 0s - loss: 0.5481 - acc: 0.7240
Epoch 44/150
768/768 [==============================] - 0s - loss: 0.5409 - acc: 0.7331
Epoch 45/150
768/768 [==============================] - 0s - loss: 0.5438 - acc: 0.7422
Epoch 46/150
768/768 [==============================] - 0s - loss: 0.5360 - acc: 0.7344
Epoch 47/150
768/768 [==============================] - 0s - loss: 0.5393 - acc: 0.7357
Epoch 48/150
768/768 [==============================] - 0s - loss: 0.5360 - acc: 0.7435
Epoch 49/150
768/768 [==============================] - 0s - loss: 0.5407 - acc: 0.7370
Epoch 50/150
768/768 [==============================] - 0s - loss: 0.5473 - acc: 0.7344
Epoch 51/150
768/768 [==============================] - 0s - loss: 0.5287 - acc: 0.7448
Epoch 52/150
768/768 [==============================] - 0s - loss: 0.5283 - acc: 0.7539
Epoch 53/150
768/768 [==============================] - 0s - loss: 0.5308 - acc: 0.7396
Epoch 54/150
768/768 [==============================] - 0s - loss: 0.5274 - acc: 0.7448
Epoch 55/150
768/768 [==============================] - 0s - loss: 0.5241 - acc: 0.7539
Epoch 56/150
768/768 [==============================] - 0s - loss: 0.5262 - acc: 0.7526
Epoch 57/150
768/768 [==============================] - 0s - loss: 0.5272 - acc: 0.7422
Epoch 58/150
768/768 [==============================] - 0s - loss: 0.5262 - acc: 0.7539
Epoch 59/150
768/768 [==============================] - 0s - loss: 0.5224 - acc: 0.7604
Epoch 60/150
768/768 [==============================] - 0s - loss: 0.5200 - acc: 0.7513
Epoch 61/150
768/768 [==============================] - 0s - loss: 0.5158 - acc: 0.7578
Epoch 62/150
768/768 [==============================] - 0s - loss: 0.5162 - acc: 0.7513
Epoch 63/150
768/768 [==============================] - 0s - loss: 0.5097 - acc: 0.7552
Epoch 64/150
768/768 [==============================] - 0s - loss: 0.5134 - acc: 0.7487
Epoch 65/150
768/768 [==============================] - 0s - loss: 0.5112 - acc: 0.7435
Epoch 66/150
768/768 [==============================] - 0s - loss: 0.5141 - acc: 0.7656
Epoch 67/150
768/768 [==============================] - 0s - loss: 0.5082 - acc: 0.7539
Epoch 68/150
768/768 [==============================] - 0s - loss: 0.5101 - acc: 0.7643
Epoch 69/150
768/768 [==============================] - 0s - loss: 0.5136 - acc: 0.7409
Epoch 70/150
768/768 [==============================] - 0s - loss: 0.5182 - acc: 0.7474
Epoch 71/150
768/768 [==============================] - 0s - loss: 0.5185 - acc: 0.7370
Epoch 72/150
768/768 [==============================] - 0s - loss: 0.5073 - acc: 0.7539
Epoch 73/150
768/768 [==============================] - 0s - loss: 0.4982 - acc: 0.7682
Epoch 74/150
768/768 [==============================] - 0s - loss: 0.4967 - acc: 0.7591
Epoch 75/150
768/768 [==============================] - 0s - loss: 0.5070 - acc: 0.7617
Epoch 76/150
768/768 [==============================] - 0s - loss: 0.5025 - acc: 0.7526
Epoch 77/150
768/768 [==============================] - 0s - loss: 0.4991 - acc: 0.7604
Epoch 78/150
768/768 [==============================] - 0s - loss: 0.4923 - acc: 0.7656
Epoch 79/150
768/768 [==============================] - 0s - loss: 0.4998 - acc: 0.7695
Epoch 80/150
768/768 [==============================] - 0s - loss: 0.5004 - acc: 0.7526
Epoch 81/150
768/768 [==============================] - 0s - loss: 0.5043 - acc: 0.7552
Epoch 82/150
768/768 [==============================] - 0s - loss: 0.5002 - acc: 0.7656
Epoch 83/150
768/768 [==============================] - 0s - loss: 0.4932 - acc: 0.7617
Epoch 84/150
768/768 [==============================] - 0s - loss: 0.4971 - acc: 0.7604
Epoch 85/150
768/768 [==============================] - 0s - loss: 0.5007 - acc: 0.7513
Epoch 86/150
768/768 [==============================] - 0s - loss: 0.4889 - acc: 0.7656
Epoch 87/150
768/768 [==============================] - 0s - loss: 0.4953 - acc: 0.7591
Epoch 88/150
768/768 [==============================] - 0s - loss: 0.4910 - acc: 0.7669
Epoch 89/150
768/768 [==============================] - 0s - loss: 0.4897 - acc: 0.7604
Epoch 90/150
768/768 [==============================] - 0s - loss: 0.4867 - acc: 0.7643
Epoch 91/150
768/768 [==============================] - 0s - loss: 0.4915 - acc: 0.7669
Epoch 92/150
768/768 [==============================] - 0s - loss: 0.4907 - acc: 0.7630
Epoch 93/150
768/768 [==============================] - 0s - loss: 0.4912 - acc: 0.7604
Epoch 94/150
768/768 [==============================] - 0s - loss: 0.4851 - acc: 0.7630
Epoch 95/150
768/768 [==============================] - 0s - loss: 0.4821 - acc: 0.7682
Epoch 96/150
768/768 [==============================] - 0s - loss: 0.4835 - acc: 0.7669
Epoch 97/150
768/768 [==============================] - 0s - loss: 0.4738 - acc: 0.7773
Epoch 98/150
768/768 [==============================] - 0s - loss: 0.5008 - acc: 0.7474
Epoch 99/150
768/768 [==============================] - 0s - loss: 0.4841 - acc: 0.7682
Epoch 100/150
768/768 [==============================] - 0s - loss: 0.4816 - acc: 0.7669
Epoch 101/150
768/768 [==============================] - 0s - loss: 0.4843 - acc: 0.7695
Epoch 102/150
768/768 [==============================] - 0s - loss: 0.4753 - acc: 0.7891
Epoch 103/150
768/768 [==============================] - 0s - loss: 0.4841 - acc: 0.7630
Epoch 104/150
768/768 [==============================] - 0s - loss: 0.4836 - acc: 0.7786
Epoch 105/150
768/768 [==============================] - 0s - loss: 0.4809 - acc: 0.7708
Epoch 106/150
768/768 [==============================] - 0s - loss: 0.4792 - acc: 0.7786
Epoch 107/150
768/768 [==============================] - 0s - loss: 0.4831 - acc: 0.7734
Epoch 108/150
768/768 [==============================] - 0s - loss: 0.4783 - acc: 0.7852
Epoch 109/150
768/768 [==============================] - 0s - loss: 0.4784 - acc: 0.7708
Epoch 110/150
768/768 [==============================] - 0s - loss: 0.4803 - acc: 0.7682
Epoch 111/150
768/768 [==============================] - 0s - loss: 0.4704 - acc: 0.7734
Epoch 112/150
768/768 [==============================] - 0s - loss: 0.4752 - acc: 0.7878
Epoch 113/150
768/768 [==============================] - 0s - loss: 0.4776 - acc: 0.7760
Epoch 114/150
768/768 [==============================] - 0s - loss: 0.4849 - acc: 0.7604
Epoch 115/150
768/768 [==============================] - 0s - loss: 0.4773 - acc: 0.7682
Epoch 116/150
768/768 [==============================] - 0s - loss: 0.4712 - acc: 0.7773
Epoch 117/150
768/768 [==============================] - 0s - loss: 0.4675 - acc: 0.7786
Epoch 118/150
768/768 [==============================] - 0s - loss: 0.4660 - acc: 0.7839
Epoch 119/150
768/768 [==============================] - 0s - loss: 0.4702 - acc: 0.7891
Epoch 120/150
768/768 [==============================] - 0s - loss: 0.4699 - acc: 0.7852
Epoch 121/150
768/768 [==============================] - 0s - loss: 0.4786 - acc: 0.7852
Epoch 122/150
768/768 [==============================] - 0s - loss: 0.4745 - acc: 0.7786
Epoch 123/150
768/768 [==============================] - 0s - loss: 0.4684 - acc: 0.7839
Epoch 124/150
768/768 [==============================] - 0s - loss: 0.4709 - acc: 0.7760
Epoch 125/150
768/768 [==============================] - 0s - loss: 0.4699 - acc: 0.7747
Epoch 126/150
768/768 [==============================] - 0s - loss: 0.4649 - acc: 0.7747
Epoch 127/150
768/768 [==============================] - 0s - loss: 0.4709 - acc: 0.7708
Epoch 128/150
768/768 [==============================] - 0s - loss: 0.4573 - acc: 0.7982
Epoch 129/150
768/768 [==============================] - 0s - loss: 0.4646 - acc: 0.7943
Epoch 130/150
768/768 [==============================] - 0s - loss: 0.4775 - acc: 0.7773
Epoch 131/150
768/768 [==============================] - 0s - loss: 0.4613 - acc: 0.7799
Epoch 132/150
768/768 [==============================] - 0s - loss: 0.4608 - acc: 0.7799
Epoch 133/150
768/768 [==============================] - 0s - loss: 0.4737 - acc: 0.7826
Epoch 134/150
768/768 [==============================] - 0s - loss: 0.4711 - acc: 0.7773
Epoch 135/150
768/768 [==============================] - 0s - loss: 0.4665 - acc: 0.7839
Epoch 136/150
768/768 [==============================] - 0s - loss: 0.4579 - acc: 0.7969
Epoch 137/150
768/768 [==============================] - 0s - loss: 0.4621 - acc: 0.7917
Epoch 138/150
768/768 [==============================] - 0s - loss: 0.4684 - acc: 0.7760
Epoch 139/150
768/768 [==============================] - 0s - loss: 0.4597 - acc: 0.7839
Epoch 140/150
768/768 [==============================] - 0s - loss: 0.4593 - acc: 0.7799
Epoch 141/150
768/768 [==============================] - 0s - loss: 0.4624 - acc: 0.7799
Epoch 142/150
768/768 [==============================] - 0s - loss: 0.4609 - acc: 0.7786
Epoch 143/150
768/768 [==============================] - 0s - loss: 0.4648 - acc: 0.7826
Epoch 144/150
768/768 [==============================] - 0s - loss: 0.4541 - acc: 0.8060
Epoch 145/150
768/768 [==============================] - 0s - loss: 0.4597 - acc: 0.7852
Epoch 146/150
768/768 [==============================] - 0s - loss: 0.4639 - acc: 0.7891
Epoch 147/150
768/768 [==============================] - 0s - loss: 0.4548 - acc: 0.7865
Epoch 148/150
768/768 [==============================] - 0s - loss: 0.4659 - acc: 0.7786
Epoch 149/150
768/768 [==============================] - 0s - loss: 0.4596 - acc: 0.7799
Epoch 150/150
768/768 [==============================] - 0s - loss: 0.4615 - acc: 0.7773
32/768 [>.............................] - ETA: 0s
acc: 78.91%
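
Note that 78.91% is accuracy on the same data the network was trained on, so it says nothing about generalization. A sketch of a proper held-out evaluation, assuming scikit-learn is installed (the 33% test fraction is arbitrary):

from sklearn.model_selection import train_test_split
# split once, train on one part, evaluate on the other
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.33, random_state=seed)
model.fit(X_train, Y_train, nb_epoch=150, batch_size=10)
scores = model.evaluate(X_test, Y_test)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))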


6. Save & load model

Analyzing the log: see How to log Keras loss output to a file. The fit call above returned a History callback whose .history dict stores the per-epoch metrics:

loss_history = history_callback.history["loss"]
acc_history = history_callback.history["acc"]
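
To keep these curves for later analysis, one option (a plain-NumPy sketch; the file names are arbitrary) is to dump each list to a text file:

import numpy
numpy.savetxt("loss_history.txt", numpy.array(loss_history), delimiter=",")
numpy.savetxt("acc_history.txt", numpy.array(acc_history), delimiter=",")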

Save and Load Your Keras Deep Learning Models

    • Architecture: model.json (either JSON or YAML format)
    • Weights: model.h5
from keras.models import model_from_json, model_from_yaml

# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later... load JSON and create model
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

# serialize model to YAML
model_yaml = model.to_yaml()
with open("model.yaml", "w") as yaml_file:
    yaml_file.write(model_yaml)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")

# later... load YAML and create model
yaml_file = open('model.yaml', 'r')
loaded_model_yaml = yaml_file.read()
yaml_file.close()
loaded_model = model_from_yaml(loaded_model_yaml)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")

7. Make Predictions

Fetch new data via numpy.loadtxt(...) and place it in X. model.predict returns the sigmoid probabilities, which are rounded to get class labels:

# calculate predictions
predictions = model.predict(X)
# round predictions
rounded = [int(round(x[0])) for x in predictions]
print(rounded)
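
For example, scoring a single record (the feature values below are made up for illustration; any row with the same 8 columns works):

# hypothetical patient record, same column order as the training data
new_sample = numpy.array([[6, 148, 72, 35, 0, 33.6, 0.627, 50]])
probability = model.predict(new_sample)[0][0]   # sigmoid output in [0, 1]
print("P(diabetes) = %.3f --> class %d" % (probability, int(round(probability))))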

Extra credit: a Multilayer Perceptron (map the architecture figure to the code)

[figure: multilayer perceptron architecture]

Code: a Multilayer Perceptron

import numpy as np
np.random.seed(1337)  # for reproducibility

from keras.datasets import mnist  # downloaded automatically on first use
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import RMSprop
from keras.utils import np_utils

batch_size = 128  # number of images used in each optimization step
nb_classes = 10   # one class per digit
nb_epoch = 12     # number of times the whole data is used to learn

(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Flatten the data: the dense network takes 1-D vectors as input, not 2-D images
X_train = X_train.reshape(60000, 784)
X_test = X_test.reshape(10000, 784)

# Make the values floats in [0;1] instead of ints in [0;255] --> [normalization]
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
X_train /= 255
X_test /= 255

# Display the shapes to check if everything is OK
print(X_train.shape[0], 'train samples')
print(X_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices (i.e. one-hot vectors)
Y_train = np_utils.to_categorical(y_train, nb_classes)
Y_test = np_utils.to_categorical(y_test, nb_classes)

# Define the model architecture
model = Sequential()
########################################################################################
model.add(Dense(512, input_shape=(784,)))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(512))
model.add(Activation('relu'))
model.add(Dropout(0.2))
model.add(Dense(10))              # last layer, one output per class (one-hot)
model.add(Activation('softmax'))  # we want a score similar to a probability for each class
########################################################################################
# Use RMSprop (an adaptive variant of stochastic gradient descent) for the optimization,
# see http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf
# and http://cs231n.github.io/neural-networks-3/#ada
rms = RMSprop()
# The function to optimize is the cross entropy between the true label and the output (softmax) of the model
model.compile(loss='categorical_crossentropy', optimizer=rms, metrics=["accuracy"])

# Make the model learn --> [Training]
model.fit(X_train, Y_train,
          batch_size=batch_size, nb_epoch=nb_epoch,
          verbose=2,
          validation_data=(X_test, Y_test))

# Evaluate how the model does on the test set
score = model.evaluate(X_test, Y_test, verbose=0)
print('Test score:', score[0])
print('Test accuracy:', score[1])