Concatenating layers with shapes (None, 512) and (18577, 4) in Keras

In Keras, how can I concatenate two layers where one has shape (None, 512) and the other (18577, 4)? I tried using Concatenate:

concat_layer = Concatenate()([z1, agp])

but this throws the following error:

ValueError: A `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 512), (18577, 4)]

The model looks roughly like this:

a1 = Convolution2D(32, filter_dim, activation='linear', padding='same',
                   kernel_regularizer=regularizers.l2(reg))(input_img)
b1 = BatchNormalization()(a1)
c1 = PReLU()(b1)
d1 = Convolution2D(32, filter_dim, activation='linear',
                   kernel_regularizer=regularizers.l2(reg))(c1)
e1 = BatchNormalization()(d1)
f1 = PReLU()(e1)
g1 = MaxPooling2D(pool_size=(2, 2))(f1)
h1 = Dropout(0.2)(g1)

i1 = Convolution2D(64, filter_dim, activation='linear', padding='same',
                   kernel_regularizer=regularizers.l2(reg))(h1)
j1 = BatchNormalization()(i1)
k1 = PReLU()(j1)
l1 = Convolution2D(64, filter_dim, activation='linear',
                   kernel_regularizer=regularizers.l2(reg))(k1)
m1 = BatchNormalization()(l1)
n1 = PReLU()(m1)
o1 = MaxPooling2D(pool_size=(2, 2))(n1)
p1 = Dropout(0.2)(o1)

q1 = Convolution2D(128, filter_dim, activation='linear', padding='same',
                   kernel_regularizer=regularizers.l2(reg))(p1)
s1 = BatchNormalization()(q1)
t1 = PReLU()(s1)
u1 = Convolution2D(128, filter_dim, activation='linear',
                   kernel_regularizer=regularizers.l2(reg))(t1)
v1 = BatchNormalization()(u1)
w1 = PReLU()(v1)
x1 = MaxPooling2D(pool_size=(3, 3))(w1)
y1 = Dropout(0.2)(x1)
z1 = Flatten()(y1)

agp = tf.convert_to_tensor(agp, np.float32)
z1 = Concatenate(axis=1)([z1, agp])  # <- this is the line that raises the ValueError

a2 = Dense(128, activation='linear', kernel_regularizer=regularizers.l2(reg))(z1)
b2 = BatchNormalization()(a2)
c2 = PReLU()(b2)
d2 = Dropout(0.2)(c2)
e2 = Dense(32, activation='linear', kernel_regularizer=regularizers.l2(reg))(d2)
f2 = BatchNormalization()(e2)
g2 = PReLU()(f2)
h2 = Dropout(0.3)(g2)

My input images have shape (32, 32, 3). I want to concatenate z1 (None, 512) with agp (18577, 4).


Answer:

#!/usr/bin/env python


def create_model(nb_classes, input_shape):
    """Create a NN model."""
    # from keras.layers import Dropout
    from keras.layers import Activation, Input
    from keras.layers import Dense, Concatenate
    from keras.models import Model

    input_ = Input(shape=input_shape)
    x = input_

    # Branch in two directions - this can be more
    # complex, of course
    x1 = Dense(512, activation='relu')(x)
    x2 = Dense(4, activation='relu')(x)

    # And this is how you use concatenation
    x = Concatenate(axis=-1)([x1, x2])

    # And then finish it
    x = Dense(nb_classes, activation='softmax')(x)
    model = Model(inputs=input_, outputs=x)
    return model


model = create_model(10, (512, ))
print(model.summary())
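Because both Dense branches are built from the same Input, their batch dimensions are both None and match automatically, so Concatenate(axis=-1) simply joins the 512 and 4 features into a (None, 516) tensor.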

The output is as follows:

Using TensorFlow backend.
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
input_1 (InputLayer)             (None, 512)           0
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 512)           262656      input_1[0][0]
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 4)             2052        input_1[0][0]
____________________________________________________________________________________________________
concatenate_1 (Concatenate)      (None, 516)           0           dense_1[0][0]
                                                                   dense_2[0][0]
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 10)            5170        concatenate_1[0][0]
====================================================================================================
Total params: 269,878
Trainable params: 269,878
Non-trainable params: 0
____________________________________________________________________________________________________
None
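In the original model, the error comes from tf.convert_to_tensor(agp, np.float32): the resulting tensor has a fixed first dimension of 18577 where Keras expects the batch axis (None). Since agp holds one 4-vector per training sample, it can instead enter the graph as a second Input, so both branches share the batch axis. The following is a minimal sketch of that idea, not the original model; the names X_images and y and the 10-class head are hypothetical stand-ins:

from keras.layers import Input, Convolution2D, MaxPooling2D, Flatten, Dense, Concatenate
from keras.models import Model

# Image branch: a trimmed-down stand-in for the convolutional stack above.
img_in = Input(shape=(32, 32, 3))
x = Convolution2D(32, (3, 3), activation='relu', padding='same')(img_in)
x = MaxPooling2D(pool_size=(2, 2))(x)
x = Flatten()(x)                       # shape (None, 8192)

# Auxiliary branch: one 4-vector per sample, fed as a second model input
# instead of converting agp to a constant tensor.
agp_in = Input(shape=(4,))             # shape (None, 4)

# Both tensors now share the None batch axis, so Concatenate works.
merged = Concatenate(axis=-1)([x, agp_in])
out = Dense(10, activation='softmax')(merged)  # hypothetical 10-class head

model = Model(inputs=[img_in, agp_in], outputs=out)
model.compile(optimizer='adam', loss='categorical_crossentropy')

# Training then pairs each image with its row of agp:
# model.fit([X_images, agp], y, batch_size=32, epochs=10)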
