
MobileNet dropout change lost after loading

哆啦的時光機 2023-02-22 17:05:21
I'm working on a transfer-learning problem. When I create a new model from just MobileNet, I set a dropout:

    base_model = MobileNet(weights='imagenet', include_top=False, input_shape=(200, 200, 3), dropout=.15)
    x = base_model.output
    x = GlobalAveragePooling2D()(x)
    x = Dense(10, activation='softmax')(x)

I save checkpoints during training with a model_checkpoint_callback. As I train, I watch for where overfitting sets in and adjust the number of frozen layers and the learning rate. When I save the loaded model again, can I also adjust the dropout? I saw this answer, but MobileNet has no actual Dropout layers, so this

    for layer in model.layers:
        if hasattr(layer, 'rate'):
            print(layer.name)
            layer.rate = 0.5

does nothing.

1 Answer

UYOU


In the past, you had to clone the model for a new dropout rate to take effect. I haven't tried it recently.


    # This code allows you to change the dropout
    # Load model from .json

    model.load_weights(filenameToModelWeights)  # load weights

    model.layers[-2].rate = 0.04  # layers[-2] is my dropout layer; rate is the dropout attribute
    model = keras.models.clone_model(model)  # without cloning, the new rate is never used; weights are re-initialized here
    model.load_weights(filenameToModelWeights)  # load the weights again

    model.predict(x)
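To see why the clone step matters, here is a minimal sketch on a hypothetical toy model (standing in for the real checkpointed model, whose dropout also sits at layers[-2]): changing the rate attribute only takes hold because clone_model rebuilds each layer from its config.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Hypothetical toy model; its Dropout layer sits at layers[-2]
inp = layers.Input(shape=(4,))
h = layers.Dense(8)(inp)
h = layers.Dropout(0.15)(h)
out = layers.Dense(2)(h)
model = models.Model(inp, out)

model.layers[-2].rate = 0.04        # change the attribute in place
model = models.clone_model(model)   # rebuild from configs; weights are re-initialized
# model.load_weights(...)           # in practice, restore the trained weights here

print(model.layers[-2].rate)        # the cloned model keeps the new rate
```

The clone picks up the new rate because Dropout's get_config() reads the current value of the attribute.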

Credit to http://www.gergltd.com/home/2018/03/changing-dropout-on-the-fly-during-training-time-test-time-in-keras/


If the model has no dropout layers to begin with, as with Keras's pretrained MobileNet, you have to add them yourself. Here is one way you could do it.


To add a single layer:


    def insert_single_layer_in_keras(model, layer_name, new_layer):
        layers = [l for l in model.layers]

        x = layers[0].output
        for i in range(1, len(layers)):
            x = layers[i](x)
            # add the new layer right after the named layer
            if layers[i].name == layer_name:
                x = new_layer(x)

        new_model = Model(inputs=layers[0].input, outputs=x)
        return new_model
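A quick sanity check of the helper, as a sketch on a hypothetical three-layer toy model rather than MobileNet (the helper is repeated here so the snippet runs standalone; the names dense_a and dense_b are made up):

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

# The helper from above, repeated so this snippet is self-contained
def insert_single_layer_in_keras(model, layer_name, new_layer):
    layers = [l for l in model.layers]
    x = layers[0].output
    for i in range(1, len(layers)):
        x = layers[i](x)
        if layers[i].name == layer_name:
            x = new_layer(x)
    return Model(inputs=layers[0].input, outputs=x)

# Toy model with hypothetical layer names
inp = Input(shape=(4,))
h = Dense(8, name="dense_a")(inp)
out = Dense(2, name="dense_b")(h)
model = Model(inp, out)

new_model = insert_single_layer_in_keras(model, "dense_a", Dropout(0.5))
print([l.name for l in new_model.layers])  # the Dropout now sits between dense_a and dense_b
```

Because the original layers are reused (only the graph is rebuilt), the trained weights of the surrounding layers are untouched.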


To add layers systematically:


    def insert_layers_in_model(model, layer_common_name, new_layer):
        import re

        layers = [l for l in model.layers]
        x = layers[0].output
        layer_config = new_layer.get_config()
        base_name = layer_config['name']
        layer_class = type(new_layer)  # use the argument, not a global
        for i in range(1, len(layers)):
            x = layers[i](x)
            match = re.match(".+" + layer_common_name + "+", layers[i].name)
            # add a copy of the new layer after every matching layer
            if match:
                layer_config['name'] = base_name + "_" + str(i)  # no duplicate names; could be done differently
                layer_copy = layer_class.from_config(layer_config)
                x = layer_copy(x)

        new_model = Model(inputs=layers[0].input, outputs=x)
        return new_model
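The filter in insert_layers_in_model hinges on that re.match pattern, which matches any layer name containing layer_common_name after at least one leading character. A quick check with hypothetical MobileNet-style layer names:

```python
import re

layer_common_name = "bn"
names = ["input_1", "conv1", "conv1_bn", "conv_pw_13_bn", "conv_pw_13_relu"]

# re.match anchors at the start; ".+bn+" requires "bn" somewhere after position 0
matches = [n for n in names if re.match(".+" + layer_common_name + "+", n)]
print(matches)  # ['conv1_bn', 'conv_pw_13_bn']
```

So only the batchnorm layers are selected, and a fresh copy of the new layer is inserted after each one.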

Run it like this:


    import tensorflow as tf
    from tensorflow.keras.applications.mobilenet import MobileNet
    from tensorflow.keras.layers import Dropout
    from tensorflow.keras.models import Model

    base_model = MobileNet(weights='imagenet', include_top=False, input_shape=(192, 192, 3), dropout=.15)

    dropout_layer = Dropout(0.5)

    # add a single layer after the last batchnorm
    mobile_net_with_dropout = insert_single_layer_in_keras(base_model, "conv_pw_13_bn", dropout_layer)

    # systematically add layers after every batchnorm layer
    mobile_net_with_multi_dropout = insert_layers_in_model(base_model, "bn", dropout_layer)

By the way, you should absolutely experiment, but you're unlikely to want extra regularization on top of batchnorm for a network as small as MobileNet.


Answered 2023-02-22