The problem is in how the attention layer is initialized and how its arguments are passed. You should specify the number of units when you instantiate the attention layer at that point, and change the way the arguments are passed:
context_vector, attention_weights = Attention(32)(lstm, state_h)
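
For context, a minimal sketch of how this call fits into the full model, assuming an Attention class implemented as a Bahdanau-style layer (dense layers W1, W2 and V, which is consistent with the 16,481 parameters reported below). The vocabulary size, sequence length, LSTM sizes, output activation, and the use of the first BiLSTM's states as initial_state of the second are inferred from the summary and may differ from your actual code:

    import tensorflow as tf
    from tensorflow.keras.layers import (Input, Embedding, LSTM, Bidirectional,
                                         Concatenate, Dense, Layer)
    from tensorflow.keras.models import Model


    class Attention(Layer):
        """Bahdanau-style attention: score = V(tanh(W1(values) + W2(query)))."""

        def __init__(self, units):
            super().__init__()
            self.W1 = Dense(units)
            self.W2 = Dense(units)
            self.V = Dense(1)

        def call(self, values, query):
            # query: (batch, hidden) -> (batch, 1, hidden) so it broadcasts
            # over the time axis of values: (batch, time, hidden).
            query_with_time_axis = tf.expand_dims(query, 1)
            score = self.V(tf.nn.tanh(
                self.W1(values) + self.W2(query_with_time_axis)))
            attention_weights = tf.nn.softmax(score, axis=1)
            context_vector = tf.reduce_sum(attention_weights * values, axis=1)
            return context_vector, attention_weights


    # Hypothetical hyperparameters inferred from the summary below.
    vocab_size, max_len, embedding_dim, lstm_units = 250, 200, 128, 128

    sequence_input = Input(shape=(max_len,))
    embedded = Embedding(vocab_size, embedding_dim)(sequence_input)

    # First BiLSTM returns the full sequence plus its four states.
    lstm_0, f_h0, f_c0, b_h0, b_c0 = Bidirectional(
        LSTM(lstm_units, return_sequences=True, return_state=True),
        name="bi_lstm_0")(embedded)

    # Second BiLSTM, initialised from the first layer's states.
    lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(
        LSTM(lstm_units, return_sequences=True, return_state=True))(
            lstm_0, initial_state=[f_h0, f_c0, b_h0, b_c0])

    # Query for the attention layer: concatenated forward/backward hidden states.
    state_h = Concatenate()([forward_h, backward_h])

    # The fix: give the layer its unit count at construction time, then call it
    # on the sequence output and the concatenated hidden state.
    context_vector, attention_weights = Attention(32)(lstm, state_h)

    # Assumed binary-classification head.
    output = Dense(1, activation="sigmoid")(context_vector)

    model = Model(inputs=sequence_input, outputs=output)
    print(model.summary())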
Result:
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
input_1 (InputLayer)            (None, 200)          0
__________________________________________________________________________________________________
embedding (Embedding)           (None, 200, 128)     32000       input_1[0][0]
__________________________________________________________________________________________________
bi_lstm_0 (Bidirectional)       [(None, 200, 256), ( 263168      embedding[0][0]
__________________________________________________________________________________________________
bidirectional (Bidirectional)   [(None, 200, 256), ( 394240      bi_lstm_0[0][0]
                                                                 bi_lstm_0[0][1]
                                                                 bi_lstm_0[0][2]
                                                                 bi_lstm_0[0][3]
                                                                 bi_lstm_0[0][4]
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, 256)           0          bidirectional[0][1]
                                                                 bidirectional[0][3]
__________________________________________________________________________________________________
attention (Attention)           [(None, 256), (None,  16481      bidirectional[0][0]
                                                                 concatenate[0][0]
__________________________________________________________________________________________________
dense_3 (Dense)                 (None, 1)             257        attention[0][0]
==================================================================================================
Total params: 706,146
Trainable params: 706,146
Non-trainable params: 0
__________________________________________________________________________________________________
None



