I've been trying to train a convolutional LSTM model on the MNIST dataset to broaden my experience in model development. I can't get past the error in the title. Any help or hints would be greatly appreciated!
I know the default value for strides is (1, 1), but I'm not sure why it ends up being set to 2.
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense, Dropout, LSTM, CuDNNLSTM, TimeDistributed, Reshape
from keras.utils import to_categorical
from keras.layers.convolutional import Conv2D, Conv3D
from keras.layers.pooling import MaxPooling2D, MaxPool3D
from keras.layers.core import Flatten

def prep_pixels(train, test):
    # convert integers to floats
    train_norm = train.astype('float32')
    test_norm = test.astype('float32')
    # normalize to the 0-1 range
    train_norm = train_norm / 255.0
    test_norm = test_norm / 255.0
    # return the normalized images
    return train_norm, test_norm

mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()

x_train = x_train.reshape((x_train.shape[0], 28, 28, 1))
x_test = x_test.reshape((x_test.shape[0], 28, 28, 1))

y_train = to_categorical(y_train)
y_test = to_categorical(y_test)

x_train, x_test = prep_pixels(x_train, x_test)

model = Sequential()
model.add(TimeDistributed(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1))))
model.add(TimeDistributed(MaxPooling2D((2, 2))))
model.add(TimeDistributed(Flatten()))
model.add(LSTM(32, activation='relu', return_sequences=True))
model.add(Dropout(0.2))
model.add(Dense(10, activation='softmax'))

opt = tf.keras.optimizers.Adam(lr=1e-3, decay=1e-5)
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])

model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
Error:
model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))
strides = _get_sequence(strides, n, channel_index, "strides")
ValueError: strides should be of length 1, 1 or 3 but was 2
Answer:
It looks like you haven't created a windowed dataset for your ConvLSTM, so you may want to do something like this before calling model.fit:
d_train = tf.keras.preprocessing.sequence.TimeseriesGenerator(x_train, y_train, length=5, batch_size=64)  # window size = 5
d_test = tf.keras.preprocessing.sequence.TimeseriesGenerator(x_test, y_test, length=5)

model.fit(d_train, epochs=1, validation_data=d_test)
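As a quick sanity check (a minimal sketch; the expected shapes assume the length=5 and batch_size=64 settings above), you can inspect the first batch the generator yields and confirm the windows now carry a time dimension:

# The generator yields (windows, targets) pairs; each window stacks 5
# consecutive images, giving shape (batch, time, height, width, channels).
x_batch, y_batch = d_train[0]
print(x_batch.shape)  # expected: (64, 5, 28, 28, 1)
print(y_batch.shape)  # expected: (64, 10)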
To stay consistent with your loss function, you need to disable return sequences (or add another layer that does not return sequences):
model.add(tf.keras.layers.LSTM(32, activation='relu', return_sequences=False))
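Putting both fixes together, a minimal end-to-end sketch might look like the following. Note that placing input_shape=(5, 28, 28, 1) on the TimeDistributed wrapper (rather than on the inner Conv2D) and reusing the window length of 5 from the generator above are my assumptions, not something stated in the original answer:

import tensorflow as tf

# Assumes d_train and d_test were built with TimeseriesGenerator(..., length=5)
model = tf.keras.Sequential()
# input_shape now includes the window (time) dimension: (timesteps, H, W, C)
model.add(tf.keras.layers.TimeDistributed(
    tf.keras.layers.Conv2D(32, (3, 3), activation='relu'),
    input_shape=(5, 28, 28, 1)))
model.add(tf.keras.layers.TimeDistributed(tf.keras.layers.MaxPooling2D((2, 2))))
model.add(tf.keras.layers.TimeDistributed(tf.keras.layers.Flatten()))
# return_sequences=False collapses the time dimension, so the final output is
# (batch, 10), matching the one-hot targets of categorical_crossentropy
model.add(tf.keras.layers.LSTM(32, activation='relu', return_sequences=False))
model.add(tf.keras.layers.Dropout(0.2))
model.add(tf.keras.layers.Dense(10, activation='softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              metrics=['accuracy'])
model.fit(d_train, epochs=1, validation_data=d_test)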