
Reshaping data for an LSTM, and feeding Dense-layer output into an LSTM



For the first question: I am doing the same thing and do not get any error, so please share your error.

Note: I will give the examples using the functional API, which allows a little more freedom (personal opinion).

from keras.layers import Dense, Flatten, LSTM, Activation
from keras.layers import Dropout, RepeatVector, TimeDistributed
from keras import Input, Model

seq_length = 15
input_dims = 10
output_dims = 8
n_hidden = 10

model1_inputs = Input(shape=(seq_length, input_dims,))
model1_outputs = Input(shape=(output_dims,))

net1 = LSTM(n_hidden, return_sequences=True)(model1_inputs)
net1 = LSTM(n_hidden, return_sequences=False)(net1)
net1 = Dense(output_dims, activation='relu')(net1)
model1_outputs = net1

model1 = Model(inputs=model1_inputs, outputs=model1_outputs, name='model1')

## Fit the model
model1.summary()

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_11 (InputLayer)        (None, 15, 10)            0
_________________________________________________________________
lstm_8 (LSTM)                (None, 15, 10)            840
_________________________________________________________________
lstm_9 (LSTM)                (None, 10)                840
_________________________________________________________________
dense_9 (Dense)              (None, 8)                 88
_________________________________________________________________
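Since the question is about getting the data into the right shape, here is a minimal NumPy sketch (the names n_samples and flat are my own, not from the answer) of arranging flat samples into the (batch, seq_length, input_dims) layout that Input(shape=(seq_length, input_dims,)) expects:

```python
import numpy as np

seq_length = 15
input_dims = 10
n_samples = 4  # hypothetical batch size for illustration

# Flat data, e.g. as read from a file: one long vector per dataset.
flat = np.arange(n_samples * seq_length * input_dims, dtype=float)

# Reshape into (batch, seq_length, input_dims) for the LSTM.
X = flat.reshape(n_samples, seq_length, input_dims)
print(X.shape)  # (4, 15, 10)
```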

For the second question, there are two approaches:

  1. If you are sending data with no sequence dimension, i.e. of dims (batch, input_dims), then you can use RepeatVector, which repeats the same output n_steps times; these repeats are nothing but the rolling steps fed to the LSTM.


seq_length = 15
input_dims = 16
output_dims = 8
n_hidden = 20
lstm_dims = 10

model1_inputs = Input(shape=(input_dims,))
model1_outputs = Input(shape=(output_dims,))

net1 = Dense(n_hidden)(model1_inputs)
net1 = Dense(n_hidden)(net1)
net1 = RepeatVector(3)(net1)
net1 = LSTM(lstm_dims, return_sequences=True)(net1)
net1 = LSTM(lstm_dims, return_sequences=False)(net1)
net1 = Dense(output_dims, activation='relu')(net1)
model1_outputs = net1

model1 = Model(inputs=model1_inputs, outputs=model1_outputs, name='model1')

## Fit the model
model1.summary()

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_13 (InputLayer)        (None, 16)                0
_________________________________________________________________
dense_13 (Dense)             (None, 20)                340
_________________________________________________________________
dense_14 (Dense)             (None, 20)                420
_________________________________________________________________
repeat_vector_2 (RepeatVecto (None, 3, 20)             0
_________________________________________________________________
lstm_14 (LSTM)               (None, 3, 10)             1240
_________________________________________________________________
lstm_15 (LSTM)               (None, 10)                840
_________________________________________________________________
dense_15 (Dense)             (None, 8)                 88
=================================================================
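What RepeatVector does can be sketched in plain NumPy (an illustration of the shape transform only, not Keras's implementation):

```python
import numpy as np

def repeat_vector(x, n):
    # x: (batch, features) -> (batch, n, features),
    # with the identical vector copied at every timestep.
    return np.repeat(x[:, np.newaxis, :], n, axis=1)

x = np.arange(6, dtype=float).reshape(2, 3)  # (batch=2, features=3)
y = repeat_vector(x, 4)
print(y.shape)                      # (2, 4, 3)
print(bool(np.all(y[0, 0] == y[0, 3])))  # True: same vector at every step
```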
  2. If you want to send a sequence of dims (seq_len, input_dims), then you can use TimeDistributed, which applies a Dense layer with the same weights across the entire sequence.


seq_length = 15
input_dims = 10
output_dims = 8
n_hidden = 10
lstm_dims = 6

model1_inputs = Input(shape=(seq_length, input_dims,))
model1_outputs = Input(shape=(output_dims,))

net1 = TimeDistributed(Dense(n_hidden))(model1_inputs)
net1 = LSTM(output_dims, return_sequences=True)(net1)
net1 = LSTM(output_dims, return_sequences=False)(net1)
net1 = Dense(output_dims, activation='relu')(net1)
model1_outputs = net1

model1 = Model(inputs=model1_inputs, outputs=model1_outputs, name='model1')

## Fit the model
model1.summary()

_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_17 (InputLayer)        (None, 15, 10)            0
_________________________________________________________________
time_distributed_3 (TimeDist (None, 15, 10)            110
_________________________________________________________________
lstm_18 (LSTM)               (None, 15, 8)             608
_________________________________________________________________
lstm_19 (LSTM)               (None, 8)                 544
_________________________________________________________________
dense_19 (Dense)             (None, 8)                 72
=================================================================
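The effect of TimeDistributed(Dense(...)) can likewise be sketched in NumPy (an illustration with hypothetical weights W and b, assuming a plain affine map per timestep): the same weights are applied at every timestep, which is why the layer above adds only 10*10 + 10 = 110 parameters regardless of seq_length.

```python
import numpy as np

def time_distributed_dense(x, W, b):
    # x: (batch, seq_len, input_dims); W: (input_dims, n_hidden)
    # Matmul broadcasts over the time axis, so the SAME W and b
    # are applied at every timestep.
    return x @ W + b

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 15, 10))
W = rng.normal(size=(10, 10))
b = np.zeros(10)

out = time_distributed_dense(x, W, b)
print(out.shape)  # (2, 15, 10)

# Applying the dense map to a single timestep gives the same result:
step0 = x[:, 0, :] @ W + b
print(bool(np.allclose(out[:, 0, :], step0)))  # True
```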

Note: here I stacked two LSTM layers; the first is given return_sequences=True, so it returns the output at every time step, which is consumed by the second layer, and the second returns the output only at the last time_step.
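The difference the note describes can be illustrated with a toy recurrence (a stand-in running average, not a real LSTM cell; only the output shapes matter here):

```python
import numpy as np

def toy_rnn(x, return_sequences):
    # x: (seq_len, dims); the hidden state is a running average,
    # purely to demonstrate the two output shapes.
    h = np.zeros(x.shape[1])
    hs = []
    for t in range(x.shape[0]):
        h = 0.5 * h + 0.5 * x[t]
        hs.append(h)
    # True -> one output per timestep; False -> last timestep only.
    return np.stack(hs) if return_sequences else hs[-1]

x = np.ones((15, 10))
print(toy_rnn(x, True).shape)   # (15, 10): output at every time step
print(toy_rnn(x, False).shape)  # (10,): only the final output
```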


