Training a Vehicle Classifier with VGG16

from keras.applications import VGG16

# Load the VGG16 convolutional base pre-trained on ImageNet,
# without its densely connected classifier (include_top=False)
conv_base = VGG16(weights='imagenet',
                  include_top=False,
                  input_shape=(150,150,3))
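VGG16 has five 2×2 max-pooling stages, each halving the spatial resolution (with floor division), which is why a 150×150 input leaves the conv base as a 4×4×512 feature map. A quick sanity check in plain Python:

```python
# Each of VGG16's five max-pool layers halves the spatial size (floor division)
size = 150
for _ in range(5):
    size //= 2
print(size)  # 4 -> matches the (None, 4, 4, 512) shape in the summary
```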

import os 
import numpy as np 
from keras.preprocessing.image import ImageDataGenerator
base_dir = '/Users/chenyin/Documents/深度学习/car'
train_dir = os.path.join(base_dir,'train')
validation_dir = os.path.join(base_dir,'val')
test_dir = os.path.join(base_dir,'test')
from keras import models
from keras import layers
from keras import optimizers

Add the classification head

model = models.Sequential()
model.add(conv_base)
model.add(layers.Flatten())
model.add(layers.Dense(256, activation='relu'))
model.add(layers.Dense(10, activation='softmax'))  # 10 vehicle classes: softmax, not sigmoid
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
vgg16 (Model)                (None, 4, 4, 512)         14714688  
_________________________________________________________________
flatten (Flatten)            (None, 8192)              0         
_________________________________________________________________
dense (Dense)                (None, 256)               2097408   
_________________________________________________________________
dense_1 (Dense)              (None, 10)                2570      
=================================================================
Total params: 16,814,666
Trainable params: 16,814,666
Non-trainable params: 0
_________________________________________________________________
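The parameter counts in the summary can be verified by hand: flattening the 4×4×512 feature map gives 8192 inputs, and each Dense layer contributes weights plus biases. A plain-Python check:

```python
conv_base_params = 14714688                 # VGG16 conv base, from the summary
flat = 4 * 4 * 512                          # flattened feature-map size
dense = flat * 256 + 256                    # weights + biases of Dense(256)
dense_1 = 256 * 10 + 10                     # weights + biases of Dense(10)
print(flat, dense, dense_1)                 # 8192 2097408 2570
print(conv_base_params + dense + dense_1)   # 16814666 total params
```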
len(model.trainable_weights)
30
conv_base.trainable = False
len(model.trainable_weights)
4
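The drop from 30 to 4 trainable weight tensors is easy to account for: VGG16's 13 convolutional layers each carry a kernel and a bias (26 tensors), the two Dense layers add 4 more, and freezing the conv base leaves only those 4. As arithmetic:

```python
conv_layers = 13           # convolutional layers in VGG16
dense_layers = 2           # the classification head added above
print(conv_layers * 2 + dense_layers * 2)  # 30 weight tensors before freezing
print(dense_layers * 2)                    # 4 after conv_base.trainable = False
```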
# Augment the training images; validation/test data are only rescaled
train_datagen = ImageDataGenerator(
    rescale=1./255,
    rotation_range=40,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode='nearest'
)
test_datagen = ImageDataGenerator(rescale=1./255)
train_generator = train_datagen.flow_from_directory(
    train_dir,
    target_size=(150,150),
    batch_size=20,
    class_mode='sparse'   # integer class indices, to match sparse_categorical_crossentropy
)
validation_generator = test_datagen.flow_from_directory(
    validation_dir,
    target_size=(150,150),
    batch_size=20,
    class_mode='sparse'
)
Found 1400 images belonging to 10 classes.
Found 200 images belonging to 10 classes.
model.compile(loss='sparse_categorical_crossentropy',
    optimizer=optimizers.RMSprop(lr=2e-5),
    metrics=['acc'])
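Note that with 1400 training images and a batch size of 20, one pass over the data is only 70 batches, so steps_per_epoch=100 below makes the generator wrap around within an epoch (the random augmentation means the repeated images are not identical); likewise validation_steps=50 far exceeds the 10 batches that cover the 200 validation images. The arithmetic:

```python
batch_size = 20
train_batches = 1400 // batch_size   # 70 batches cover the training set once
val_batches = 200 // batch_size      # 10 batches cover the validation set once
print(train_batches, val_batches)    # 70 10
```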
history = model.fit(
    train_generator,
    steps_per_epoch=100,
    epochs=30,
    validation_data=validation_generator,
    validation_steps=50
)
Epoch 1/30
WARNING:tensorflow:AutoGraph could not transform <function Model.make_train_function.<locals>.train_function at 0x7fdfaeef3710> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: Bad argument number for Name: 4, expecting 3
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
WARNING:tensorflow:AutoGraph could not transform <function Model.make_test_function.<locals>.test_function at 0x7fdfaed7aa70> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: Bad argument number for Name: 4, expecting 3
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
100/100 [==============================] - 199s 2s/step - loss: 2.2270 - acc: 0.2045 - val_loss: 2.0976 - val_acc: 0.3400
Epoch 2/30
100/100 [==============================] - 194s 2s/step - loss: 1.9879 - acc: 0.3965 - val_loss: 1.8054 - val_acc: 0.5260
Epoch 3/30
100/100 [==============================] - 198s 2s/step - loss: 1.7472 - acc: 0.5060 - val_loss: 1.5587 - val_acc: 0.5880
Epoch 4/30
100/100 [==============================] - 178s 2s/step - loss: 1.5686 - acc: 0.5600 - val_loss: 1.3564 - val_acc: 0.6710
Epoch 5/30
100/100 [==============================] - 179s 2s/step - loss: 1.4308 - acc: 0.5940 - val_loss: 1.2241 - val_acc: 0.6850
Epoch 6/30
100/100 [==============================] - 185s 2s/step - loss: 1.3169 - acc: 0.6170 - val_loss: 1.0816 - val_acc: 0.7150
Epoch 7/30
100/100 [==============================] - 179s 2s/step - loss: 1.2277 - acc: 0.6530 - val_loss: 1.0170 - val_acc: 0.7210
Epoch 8/30
100/100 [==============================] - 178s 2s/step - loss: 1.1479 - acc: 0.6790 - val_loss: 0.9509 - val_acc: 0.7530
Epoch 9/30
100/100 [==============================] - 179s 2s/step - loss: 1.0915 - acc: 0.6845 - val_loss: 0.8450 - val_acc: 0.7820
Epoch 10/30
100/100 [==============================] - 177s 2s/step - loss: 1.0540 - acc: 0.6915 - val_loss: 0.8223 - val_acc: 0.7720
Epoch 11/30
100/100 [==============================] - 182s 2s/step - loss: 0.9687 - acc: 0.7200 - val_loss: 0.7857 - val_acc: 0.7700
Epoch 12/30
100/100 [==============================] - 179s 2s/step - loss: 0.9658 - acc: 0.7180 - val_loss: 0.7357 - val_acc: 0.7820
Epoch 13/30
100/100 [==============================] - 178s 2s/step - loss: 0.9198 - acc: 0.7295 - val_loss: 0.7101 - val_acc: 0.7780
Epoch 14/30
100/100 [==============================] - 181s 2s/step - loss: 0.8924 - acc: 0.7425 - val_loss: 0.6926 - val_acc: 0.7940
Epoch 15/30
100/100 [==============================] - 179s 2s/step - loss: 0.8465 - acc: 0.7480 - val_loss: 0.6716 - val_acc: 0.7920
Epoch 16/30
100/100 [==============================] - 178s 2s/step - loss: 0.8407 - acc: 0.7435 - val_loss: 0.6581 - val_acc: 0.7780
Epoch 17/30
100/100 [==============================] - 176s 2s/step - loss: 0.8111 - acc: 0.7490 - val_loss: 0.5969 - val_acc: 0.7950
Epoch 18/30
100/100 [==============================] - 182s 2s/step - loss: 0.7752 - acc: 0.7705 - val_loss: 0.5955 - val_acc: 0.8000
Epoch 19/30
100/100 [==============================] - 176s 2s/step - loss: 0.7615 - acc: 0.7735 - val_loss: 0.5915 - val_acc: 0.8010
Epoch 20/30
100/100 [==============================] - 178s 2s/step - loss: 0.7400 - acc: 0.7790 - val_loss: 0.5745 - val_acc: 0.8060
Epoch 21/30
100/100 [==============================] - 175s 2s/step - loss: 0.7564 - acc: 0.7735 - val_loss: 0.5657 - val_acc: 0.7950
Epoch 22/30
100/100 [==============================] - 180s 2s/step - loss: 0.7097 - acc: 0.7865 - val_loss: 0.5864 - val_acc: 0.7930
Epoch 23/30
100/100 [==============================] - 179s 2s/step - loss: 0.7047 - acc: 0.7930 - val_loss: 0.5537 - val_acc: 0.8150
Epoch 24/30
100/100 [==============================] - 175s 2s/step - loss: 0.6993 - acc: 0.7830 - val_loss: 0.5388 - val_acc: 0.7970
Epoch 25/30
100/100 [==============================] - 178s 2s/step - loss: 0.6872 - acc: 0.7900 - val_loss: 0.5653 - val_acc: 0.8100
Epoch 26/30
100/100 [==============================] - 180s 2s/step - loss: 0.6430 - acc: 0.8040 - val_loss: 0.5413 - val_acc: 0.7990
Epoch 27/30
100/100 [==============================] - 176s 2s/step - loss: 0.6613 - acc: 0.8005 - val_loss: 0.5245 - val_acc: 0.8290
Epoch 28/30
100/100 [==============================] - 177s 2s/step - loss: 0.6185 - acc: 0.8040 - val_loss: 0.5202 - val_acc: 0.8300
Epoch 29/30
100/100 [==============================] - 178s 2s/step - loss: 0.6425 - acc: 0.8085 - val_loss: 0.5097 - val_acc: 0.8150
Epoch 30/30
100/100 [==============================] - 178s 2s/step - loss: 0.6132 - acc: 0.8145 - val_loss: 0.5098 - val_acc: 0.8410
model.save("./car_Vgg16.h5")  # save the trained model
import matplotlib.pyplot as plt

acc = history.history['acc']
val_acc = history.history['val_acc']
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(1, len(acc) + 1)
plt.plot(epochs, acc, 'bo', label='Training acc') 
plt.plot(epochs, val_acc, 'b', label='Validation acc') 
plt.title('Training and validation accuracy') 
plt.legend()

plt.figure()

plt.plot(epochs, loss, 'bo', label='Training loss') 
plt.plot(epochs, val_loss, 'b', label='Validation loss') 
plt.title('Training and validation loss') 
plt.legend()

plt.show()
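The raw per-epoch curves can be noisy; a common trick is to also plot an exponentially smoothed version. A minimal sketch in plain Python (the `smooth_curve` helper and the `factor` value are typical choices, not part of the code above):

```python
def smooth_curve(points, factor=0.8):
    """Exponential moving average: blend each point with the running average."""
    smoothed = []
    for p in points:
        if smoothed:
            smoothed.append(smoothed[-1] * factor + p * (1 - factor))
        else:
            smoothed.append(p)
    return smoothed

# e.g. plt.plot(epochs, smooth_curve(val_acc), 'b', label='Smoothed validation acc')
```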
test_list = os.listdir(test_dir)
# train_generator.class_indices.keys() gives the class-label order
labels = ['SUV', 'bus', 'family sedan', 'fire engine', 'heavy truck', 'jeep', 'minibus', 'racing car', 'taxi', 'truck']
from keras.preprocessing import image
img_path = os.path.join(test_dir, test_list[180])      # pick one test image
img = image.load_img(img_path, target_size=(150,150))
img.show()
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)    # add the batch dimension
x = x / 255.0                    # same rescaling as the training generator
labels[model.predict_classes(x)[0]]  # predict_classes is Sequential-only (removed in TF >= 2.6)
'jeep'
len(test_list)
200
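`predict_classes` is just an argmax over the softmax output, so the mapping to a label name can be reproduced without it (the probability vector here is hypothetical, for illustration only):

```python
labels = ['SUV', 'bus', 'family sedan', 'fire engine', 'heavy truck',
          'jeep', 'minibus', 'racing car', 'taxi', 'truck']
probs = [0.01, 0.02, 0.03, 0.01, 0.02, 0.80, 0.04, 0.03, 0.02, 0.02]  # hypothetical softmax output
pred = max(range(len(probs)), key=probs.__getitem__)  # index of the largest probability
print(labels[pred])  # jeep
```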
