Keras variational autoencoder stopped working, even though the model summary looks fine?

I've been using the variational autoencoder below for a while, but it recently stopped working after an environment reset. I've gone through the TensorFlow documentation and the recent changelogs (though I can't seem to find a good release history going back more than a month) and found no changes to the basic definitions of the functions involved. On top of that, the error I get is very terse and neither descriptive nor helpful:

AssertionError: in user code:

    AssertionError: Could not compute output Tensor("sequential/dense_1/Sigmoid:0", shape=(None, 110), dtype=float32)

The error only points at the last line and offers no further description. Yet when I look at the model summary, the model appears to recognize all the layers and build them correctly.

vae.summary()
Model: "functional_1"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            [(None, 110)]        0
__________________________________________________________________________________________________
dense_2 (Dense)                 (None, 64)           7104        input_1[0][0]
__________________________________________________________________________________________________
dense_3 (Dense)                 (None, 2)            130         dense_2[0][0]
__________________________________________________________________________________________________
dense_4 (Dense)                 (None, 2)            130         dense_2[0][0]
__________________________________________________________________________________________________
kl_divergence_layer (KLDivergen [(None, 2), (None, 2 0           dense_3[0][0]
                                                                 dense_4[0][0]
__________________________________________________________________________________________________
lambda (Lambda)                 (None, 2)            0           kl_divergence_layer[0][1]
__________________________________________________________________________________________________
input_2 (InputLayer)            [(None, 2)]          0
__________________________________________________________________________________________________
multiply (Multiply)             (None, 2)            0           lambda[0][0]
                                                                 input_2[0][0]
__________________________________________________________________________________________________
add (Add)                       (None, 2)            0           kl_divergence_layer[0][0]
                                                                 multiply[0][0]
__________________________________________________________________________________________________
sequential (Sequential)         (None, 110)          7342        add[0][0]
==================================================================================================
Total params: 14,706
Trainable params: 14,706
Non-trainable params: 0
__________________________________________________________________________________________________

So I have no idea where this is going wrong. Any thoughts? Below is the variational autoencoder together with sample data.

import pandas as pd
from sklearn.datasets import make_blobs
from sklearn.preprocessing import MinMaxScaler
import keras.backend as K
import tensorflow as tf
from keras.layers import Input, Dense, Lambda, Layer, Add, Multiply
from keras.models import Model, Sequential
from keras.callbacks import EarlyStopping, LearningRateScheduler
from keras.objectives import binary_crossentropy

x, labels = make_blobs(n_samples=150000, n_features=110,
                       centers=16, cluster_std=4.0)
scaler = MinMaxScaler()
x = scaler.fit_transform(x)
x = pd.DataFrame(x)

train = x.sample(frac=0.8)
train_indexs = train.index.values
test = x[~x.index.isin(train_indexs)]
print(train.shape, test.shape)

def nll(y_true, y_pred):
    """ Negative log likelihood (Bernoulli). """
    return K.sum(K.binary_crossentropy(y_true, y_pred), axis=-1)

class KLDivergenceLayer(Layer):
    """ Identity transform layer that adds KL divergence
    to the final model loss.
    """
    def __init__(self, *args, **kwargs):
        self.is_placeholder = True
        super(KLDivergenceLayer, self).__init__(*args, **kwargs)

    def call(self, inputs):
        mu, log_var = inputs
        # changing from sum to mean
        kl_batch = - .5 * K.sum(1 + log_var -
                                K.square(mu) -
                                K.exp(log_var), axis=-1)
        self.add_loss(K.mean(kl_batch), inputs=inputs)
        return inputs

#################### VAE ##############
latent_dim = 2

decoder = Sequential([
    Dense(64, input_dim=latent_dim, activation='relu'),
    Dense(x.shape[1], activation='sigmoid')
])

data = Input(shape=(x.shape[1],))
h = Dense(64, activation='relu')(data)
z_mu = Dense(latent_dim)(h)
z_log_var = Dense(latent_dim)(h)

# applies KLDivergence here?
z_mu, z_log_var = KLDivergenceLayer()([z_mu, z_log_var])

# generates random samples
z_sigma = Lambda(lambda t: K.exp(.5 * t))(z_log_var)
eps = Input(tensor=K.random_normal(stddev=1.0,
                                   shape=(K.shape(data)[0], latent_dim)))
z_eps = Multiply()([z_sigma, eps])
z = Add()([z_mu, z_eps])

# apply decoder
x_pred = decoder(z)

# defines final model
vae = Model(inputs=[data, eps], outputs=x_pred)
vae.compile(optimizer='rmsprop', loss=nll)

# runs model
vae.fit(train, train, shuffle=True, epochs=1000,
        batch_size=512, validation_data=(test, test),
        callbacks=[EarlyStopping(patience=50)])

Answer:

The problem is with your eps layer: it is not a model input, so the functional graph cannot be traced from inputs to outputs, which is why you get "Could not compute output Tensor".

You can replace it with a plain Lambda layer, like this:

eps = Lambda(lambda t: K.random_normal(stddev=1.0, shape=(K.shape(t)[0], latent_dim)))(z_log_var)

You can find a working notebook here: https://colab.research.google.com/drive/17ugeVin4yOlSD3fBvX4jQNJ2WT7nmlnz?usp=sharing
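For context, here is a minimal sketch of the corrected wiring with that fix applied, assuming TF 2.x with tf.keras. The KLDivergenceLayer and the nll loss are omitted for brevity (a plain binary_crossentropy stands in), and the model now has a single input, so vae.fit(train, train) needs no second tensor.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Lambda, Add, Multiply
from tensorflow.keras.models import Model, Sequential

latent_dim = 2
n_features = 110  # matches the question's data; adjust to your input width

decoder = Sequential([
    Dense(64, input_dim=latent_dim, activation='relu'),
    Dense(n_features, activation='sigmoid'),
])

data = Input(shape=(n_features,))
h = Dense(64, activation='relu')(data)
z_mu = Dense(latent_dim)(h)
z_log_var = Dense(latent_dim)(h)

z_sigma = Lambda(lambda t: tf.exp(.5 * t))(z_log_var)
# The noise now comes from a Lambda over an existing tensor,
# not from a second Input, so the graph is traceable end to end.
eps = Lambda(lambda t: tf.random.normal(shape=(tf.shape(t)[0], latent_dim)))(z_log_var)
z_eps = Multiply()([z_sigma, eps])
z = Add()([z_mu, z_eps])
x_pred = decoder(z)

vae = Model(inputs=data, outputs=x_pred)  # single input: no eps tensor to feed
vae.compile(optimizer='rmsprop', loss='binary_crossentropy')

out = vae.predict(np.random.rand(8, n_features).astype('float32'))
# out has shape (8, 110): one reconstruction per input row
```

Since the noise tensor's batch dimension is derived from z_log_var via tf.shape, the same model works for any batch size at both training and prediction time.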

