Locations of the code modifications
Autoformer.py
Autoformer_EncDec.py
tsmoothie/smoother.py at master · cerlymarco/tsmoothie · GitHub
tsmoothie/Basic Smoothing.ipynb at master · cerlymarco/tsmoothie · GitHub
Question on KalmanSmoother usage · Issue #7 · cerlymarco/tsmoothie · GitHub
Replacing Autoformer's Decomp (series decomposition) block with FEDformer's MOE decomposition block does give a clear improvement; start digging into it (see the sketch below).
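For reference, a minimal sketch of the idea, not FEDformer's exact code: the class names (MovingAvg, MOEDecomp) and the linear softmax gate are my own simplification, and odd kernel sizes are assumed. Several moving-average experts with different kernels each produce a candidate trend, and a data-dependent softmax mixes them; this drops in where Autoformer_EncDec.py currently calls its single-kernel series_decomp.

```python
import torch
import torch.nn as nn

class MovingAvg(nn.Module):
    """Moving average to extract the trend; assumes an odd kernel_size."""
    def __init__(self, kernel_size):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1, padding=0)

    def forward(self, x):                      # x: [batch, seq_len, channels]
        # Pad both ends so the output keeps the original sequence length.
        pad = (self.kernel_size - 1) // 2
        front = x[:, 0:1, :].repeat(1, pad, 1)
        end = x[:, -1:, :].repeat(1, pad, 1)
        x = torch.cat([front, x, end], dim=1)
        return self.avg(x.permute(0, 2, 1)).permute(0, 2, 1)

class MOEDecomp(nn.Module):
    """Mixture-of-experts decomposition: several moving averages combined
    by softmax weights, replacing the single-kernel series_decomp."""
    def __init__(self, kernel_sizes, d_model):
        super().__init__()
        self.experts = nn.ModuleList(MovingAvg(k) for k in kernel_sizes)
        self.gate = nn.Linear(d_model, len(kernel_sizes))

    def forward(self, x):                      # x: [batch, seq_len, d_model]
        trends = torch.stack([m(x) for m in self.experts], dim=-1)  # [B, L, D, K]
        weights = torch.softmax(self.gate(x), dim=-1).unsqueeze(-2) # [B, L, 1, K]
        trend = (trends * weights).sum(-1)     # weighted trend component
        return x - trend, trend                # (seasonal, trend)
```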
Also look further into the tsmoothie module (usage sketch below), plus some of the open-source time-series processing code on GitHub.
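A minimal tsmoothie usage sketch based on its README and the notebook linked above; the synthetic random-walk data and the noise settings here are just placeholders:

```python
import numpy as np
from tsmoothie.smoother import KalmanSmoother

np.random.seed(0)
# One noisy random-walk series, shape (n_series, n_timesteps).
series = np.random.randn(1, 200).cumsum(axis=1)

# Kalman smoothing with a level + trend state-space model.
smoother = KalmanSmoother(component='level_trend',
                          component_noise={'level': 0.1, 'trend': 0.1})
smoother.smooth(series)

smoothed = smoother.smooth_data                       # smoothed series, shape (1, 200)
low, up = smoother.get_intervals('kalman_interval')   # interval name per the README
```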
Code from ICML, ICLR, WWW, and AAAI papers, as compiled by 时序人 (the time-series community account).
Wu is currently looking into the denoising diffusion module; go over the theory and study it (a forward-process sketch follows the links).
Taking a different path: Denoising Diffusion Probabilistic Models, a model that recovers images/audio from noise - 知乎 (zhihu.com)
[Paper reading] 117 Denoising Diffusion Probabilistic Models - bilibili
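The core of DDPM is the closed-form forward (noising) process q(x_t | x_0) = N(sqrt(abar_t) x_0, (1 - abar_t) I), with a network trained to predict the added noise. A minimal sketch, using the linear beta schedule from the paper:

```python
import torch

T = 1000
betas = torch.linspace(1e-4, 0.02, T)        # linear beta schedule from the DDPM paper
alphas = 1.0 - betas
alpha_bar = torch.cumprod(alphas, dim=0)     # cumulative product: abar_t

def q_sample(x0, t, noise=None):
    """Sample x_t ~ q(x_t | x_0) in closed form, without iterating t steps."""
    if noise is None:
        noise = torch.randn_like(x0)
    abar = alpha_bar[t].view(-1, *([1] * (x0.dim() - 1)))  # broadcast over batch
    return abar.sqrt() * x0 + (1.0 - abar).sqrt() * noise

# Training: a network eps_theta(x_t, t) regresses `noise` with an MSE loss.
```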
Consider adding the SDAE (stacked denoising autoencoder) module from Liu Shuo's (刘硕) thesis to Autoformer.
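A generic one-layer denoising-autoencoder sketch to anchor the idea; the module in the thesis may differ, and the layer sizes and noise level here are placeholder assumptions:

```python
import torch
import torch.nn as nn

class DenoisingAE(nn.Module):
    """One denoising-autoencoder layer: corrupt the input, reconstruct the clean input.
    Generic sketch only; the SDAE in the thesis may differ."""
    def __init__(self, d_in, d_hidden, noise_std=0.1):
        super().__init__()
        self.noise_std = noise_std
        self.encoder = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU())
        self.decoder = nn.Linear(d_hidden, d_in)

    def forward(self, x):
        # Additive Gaussian corruption during training only.
        corrupted = x + self.noise_std * torch.randn_like(x) if self.training else x
        z = self.encoder(corrupted)
        return self.decoder(z), z   # reconstruction (for the DAE loss) and the code

# Stacking: pretrain layer by layer on reconstruction, then use the top code as a
# denoised representation, e.g. as a front-end before Autoformer's embedding.
```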
GitHub - Zero-coder/MEDIUM_NoteBook: Repository containing notebooks of my posts on Medium
Time Series generation with VAE LSTM [post]|[code]
The ideas and theory in the mixture-of-experts papers are worth reading, in particular the MoE layer (see the sketch after these links).
Trends of sparsity in machine learning: MoE and sparse attention mechanisms - zenRRan's blog - CSDN
OUTRAGEOUSLY LARGE NEURAL NETWORKS - Buyi_Shizi's blog - CSDN
Original paper:
Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
GitHub - davidmrau/mixture-of-experts: PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
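A simplified top-k gated MoE layer after Shazeer et al. (2017); this sketch drops the paper's noisy gating and load-balancing loss, and uses a dense dispatch loop for clarity (real implementations, like the repo above, dispatch sparsely):

```python
import torch
import torch.nn as nn

class TopKMoE(nn.Module):
    """Sparsely-gated MoE: a gate picks the top-k experts per input and mixes them."""
    def __init__(self, d_model, n_experts=4, k=2):
        super().__init__()
        self.k = k
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts))
        self.gate = nn.Linear(d_model, n_experts)

    def forward(self, x):                          # x: [batch, d_model]
        logits = self.gate(x)                      # [batch, n_experts]
        topv, topi = logits.topk(self.k, dim=-1)   # keep only the top-k gate logits
        weights = torch.softmax(topv, dim=-1)      # renormalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):                 # dense loop for clarity
            for e, expert in enumerate(self.experts):
                mask = topi[:, slot] == e          # inputs routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out
```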