1 Answer

The following example will help you understand how to use DenseNet121 with CIFAR-100. Note that I am using Keras from within TensorFlow (tf.keras).
import tensorflow as tf
from tensorflow.keras.applications import DenseNet121
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
# load the CIFAR-100 data, split between train and test sets
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar100.load_data()
print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
# create the base pre-trained model
base_model = DenseNet121(weights='imagenet', include_top=False)
# add a global spatial average pooling layer
x = base_model.output
x = GlobalAveragePooling2D()(x)
# let's add a fully-connected layer
x = Dense(1024, activation='relu')(x)
# and a logits layer for the 100 CIFAR-100 classes
# (no activation here, since the loss below uses from_logits=True)
predictions = Dense(100)(x)
# this is the model we will train
model = Model(inputs=base_model.input, outputs=predictions)
# first: train only the top layers (which were randomly initialized)
# i.e. freeze all convolutional layers
for layer in base_model.layers:
    layer.trainable = False
# compile the model (should be done *after* setting layers to non-trainable)
loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
model.compile(optimizer='rmsprop', loss=loss, metrics=['accuracy'])
# train the model on the new data for a few epochs
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test), verbose=1, batch_size=128)
You can also fine-tune the model: in the code above I trained it with the original base_model weights kept frozen (the base_model weights themselves were never updated). For fine-tuning, you can unfreeze some of those layers and train again, as sketched below. I also recommend reading about ImageDataGenerator, which augments the images and can give better test accuracy.
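A minimal fine-tuning sketch, continuing from the code above. The number of layers to unfreeze, the choice of Adam, and the learning rate are illustrative assumptions, not values from the original answer:

# after the initial training above, unfreeze the last 50 layers of the
# base model (the cutoff of 50 is just an illustrative choice)
for layer in base_model.layers[-50:]:
    layer.trainable = True

# recompile after changing `trainable`, with a low learning rate so the
# pre-trained weights are only adjusted gently
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss=loss, metrics=['accuracy'])

# train again for a few epochs
model.fit(x_train, y_train, epochs=5,
          validation_data=(x_test, y_test), verbose=1, batch_size=128)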
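And a minimal sketch of augmentation with ImageDataGenerator; the specific augmentation settings here are illustrative assumptions to tune for your data:

from tensorflow.keras.preprocessing.image import ImageDataGenerator

# light augmentations suitable for 32x32 CIFAR images
datagen = ImageDataGenerator(rotation_range=15,
                             width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)

# feed augmented batches to the model instead of the raw arrays
model.fit(datagen.flow(x_train, y_train, batch_size=128),
          epochs=5, validation_data=(x_test, y_test), verbose=1)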