You can try wrapping `K.batch_flatten()` in a `Lambda` layer. The output shape of `K.batch_flatten()` is determined dynamically at run time.
```python
model.add(Lambda(lambda x: K.batch_flatten(x)))
model.summary()
```

```
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d_5 (Conv2D)            (None, 4, None, 32)       4128
_________________________________________________________________
batch_normalization_3 (Batch (None, 4, None, 32)       128
_________________________________________________________________
leaky_re_lu_3 (LeakyReLU)    (None, 4, None, 32)       0
_________________________________________________________________
conv2d_6 (Conv2D)            (None, 1, None, 1)        65
_________________________________________________________________
activation_3 (Activation)    (None, 1, None, 1)        0
_________________________________________________________________
lambda_5 (Lambda)            (None, None)              0
=================================================================
Total params: 4,321
Trainable params: 4,257
Non-trainable params: 64
_________________________________________________________________
```

```python
X = np.random.rand(32, 4, 256, 1)
print(model.predict(X).shape)
# (32, 256)

X = np.random.rand(32, 4, 64, 1)
print(model.predict(X).shape)
# (32, 64)
```
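To see why the output width tracks the input width, note that `K.batch_flatten()` keeps the batch axis and collapses every remaining axis into one. A minimal sketch of that semantics in plain NumPy (the `batch_flatten` helper here is my own illustration, not the Keras function):

```python
import numpy as np

def batch_flatten(x):
    # Keep axis 0 (the batch) and flatten everything else,
    # mirroring what K.batch_flatten does to a tensor.
    return x.reshape(x.shape[0], -1)

# Shape of the final conv layer's output for a width-256 input:
X = np.random.rand(32, 1, 256, 1)
print(batch_flatten(X).shape)
# (32, 256)

# A narrower input just yields a narrower flattened output:
X = np.random.rand(32, 1, 64, 1)
print(batch_flatten(X).shape)
# (32, 64)
```

Because the flattened size is `1 * width * 1` here, the model's output width follows the input width, which is why the `Lambda` layer's static output shape can only be reported as `(None, None)`.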



