I'm trying to implement an elastic net, and I want to use a custom loss function in Keras. My model looks like this:
from keras.layers import Input, BatchNormalization, Flatten, Dense
from keras.models import Model

def nn_weather_model():
    ip_weather = Input(shape=(30, 38, 5))
    x_weather = BatchNormalization(name='weather1')(ip_weather)
    x_weather = Flatten()(x_weather)
    Dense100_1 = Dense(100, activation='relu', name='weather2')(x_weather)
    Dense100_2 = Dense(100, activation='relu', name='weather3')(Dense100_1)
    Dense18 = Dense(18, activation='linear', name='weather5')(Dense100_2)
    model_weather = Model(inputs=[ip_weather], outputs=[Dense18])
    model = model_weather
    ip = ip_weather
    op = Dense18
    return model, ip, op
My loss function is:
from keras import backend as K

def cost_function():
    def cost(y_true, y_pred):
        # mean squared error plus the two penalty terms
        return K.mean(K.square(y_pred - y_true)) + L1 + L2
    return cost
It is MSE + L1 + L2, where L1 and L2 are:
weight1 = model.layers[3].get_weights()[0]
weight2 = model.layers[4].get_weights()[0]
weight3 = model.layers[5].get_weights()[0]
L1 = Calculate_L1(weight1, weight2, weight3)
L2 = Calculate_L2(weight1, weight2, weight3)
I use the Calculate_L1 function to compute the total over the weights of dense1, dense2, and dense3, and Calculate_L2 does the same thing.
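(A minimal sketch of what these helpers might look like, assuming the standard absolute-value and squared penalties; the question does not show their actual code:

import numpy as np

def Calculate_L1(*weights):
    # L1 penalty: sum of the absolute values of every weight entry
    return sum(np.sum(np.abs(w)) for w in weights)

def Calculate_L2(*weights):
    # L2 penalty: sum of the squares of every weight entry
    return sum(np.sum(np.square(w)) for w in weights)

Since get_weights() returns NumPy arrays, both helpers produce a plain Python float.)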
When I train with

RB_model.compile(loss=cost_function(), optimizer='RMSprop')

the L1 and L2 variables are not updated on each batch. So I tried computing them in a callback at the start of each batch, like this:
from keras.callbacks import Callback

class update_L1L2weight(Callback):
    def __init__(self):
        super(update_L1L2weight, self).__init__()

    def on_batch_begin(self, batch, logs=None):
        weight1 = model.layers[3].get_weights()[0]
        weight2 = model.layers[4].get_weights()[0]
        weight3 = model.layers[5].get_weights()[0]
        L1 = Calculate_L1(weight1, weight2, weight3)
        L2 = Calculate_L2(weight1, weight2, weight3)
How can I use a callback to compute L1 and L2 at the beginning of each batch and pass the L1 and L2 variables into the loss function?
Answer:
You can simply use Keras's built-in weight regularization to regularize each layer, by passing a regularizer to the layer's kernel_regularizer argument. For example:
from keras import regularizers

model.add(Dense(..., kernel_regularizer=regularizers.l2(0.1)))
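Applied to the model from the question, the elastic-net penalty (L1 + L2) can be attached to each Dense layer with regularizers.l1_l2; the coefficients below are placeholders, not tuned values:

from keras import regularizers
from keras.layers import Input, BatchNormalization, Flatten, Dense
from keras.models import Model

def nn_weather_model():
    ip_weather = Input(shape=(30, 38, 5))
    x_weather = BatchNormalization(name='weather1')(ip_weather)
    x_weather = Flatten()(x_weather)
    # l1_l2 combines both penalties, which is exactly the elastic-net term
    Dense100_1 = Dense(100, activation='relu', name='weather2',
                       kernel_regularizer=regularizers.l1_l2(l1=0.01, l2=0.01))(x_weather)
    Dense100_2 = Dense(100, activation='relu', name='weather3',
                       kernel_regularizer=regularizers.l1_l2(l1=0.01, l2=0.01))(Dense100_1)
    Dense18 = Dense(18, activation='linear', name='weather5',
                    kernel_regularizer=regularizers.l1_l2(l1=0.01, l2=0.01))(Dense100_2)
    return Model(inputs=[ip_weather], outputs=[Dense18]), ip_weather, Dense18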
These regularizers create a loss tensor that is added to the loss function, as implemented in the Keras source code:
# Add regularization penalties
# and other layer-specific losses.
for loss_tensor in self.losses:
    total_loss += loss_tensor
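Because these penalty tensors are built from the weight variables themselves, they are re-evaluated from the current weights on every batch, so no callback is needed. With the penalties attached to the layers you can compile with a plain MSE loss; x_train, y_train, and the fit settings below are placeholders:

model, ip, op = nn_weather_model()
model.compile(loss='mse', optimizer='RMSprop')
model.fit(x_train, y_train, batch_size=32, epochs=10)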