PP: A dual-stage attention-based recurrent neural network for time series prediction
Problem: time series prediction
The nonlinear autoregressive exogenous (NARX) model predicts the current value of a target time series from its own previous values together with the current and past values of multiple driving (exogenous) series.
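Written out, the standard NARX formulation the paper builds on looks roughly as follows (generic notation; the paper's symbols may differ slightly): given the past target values y_1, ..., y_{T-1} and the n driving series stacked into vectors x_1, ..., x_T, learn a nonlinear mapping F for the next target value.

```latex
\hat{y}_T = F\bigl(y_1, \dots, y_{T-1},\ \mathbf{x}_1, \dots, \mathbf{x}_T\bigr),
\qquad \mathbf{x}_t \in \mathbb{R}^n
```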
However, few NARX models can capture the long-term temporal dependencies appropriately and select the relevant driving series to make a prediction.
Two issues:
1. capturing long-term temporal dependencies
2. selecting the relevant driving series to make a prediction
The paper proposes a dual-stage attention-based RNN (DA-RNN) to address these two issues.
1. First stage (encoder): an input attention mechanism extracts the relevant driving series at each time step (see the sketch below).
2. Second stage (decoder): a temporal attention mechanism selects relevant encoder hidden states across all time steps (sketched after the contribution notes).
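A minimal PyTorch sketch of the first stage. It follows the paper's idea of scoring each driving series against the previous encoder LSTM state and re-weighting the inputs, but the score network is collapsed into a single linear layer and all class, variable, and dimension names are my own, so treat it as an illustration rather than the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InputAttentionEncoder(nn.Module):
    """Stage 1 (sketch): input attention over the n driving series, then an LSTM step."""

    def __init__(self, n_series, T, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # score for series k at step t, built from (h_{t-1}, s_{t-1}, whole series x^k);
        # the paper uses v^T tanh(W[h;s] + U x^k), collapsed here into one linear layer
        self.attn = nn.Linear(2 * hidden_size + T, 1)
        self.lstm = nn.LSTMCell(n_series, hidden_size)

    def forward(self, x):                          # x: (batch, T, n_series)
        batch, T, n = x.shape
        h = x.new_zeros(batch, self.hidden_size)   # encoder hidden state h_{t-1}
        s = x.new_zeros(batch, self.hidden_size)   # encoder cell state s_{t-1}
        series = x.permute(0, 2, 1)                # (batch, n, T): row k = whole k-th driving series
        encoded = []
        for t in range(T):
            hs = torch.cat([h, s], dim=1).unsqueeze(1).expand(batch, n, -1)
            e = self.attn(torch.cat([hs, series], dim=2)).squeeze(-1)   # (batch, n) scores
            alpha = F.softmax(e, dim=1)            # attention weights over the driving series
            x_tilde = alpha * x[:, t, :]           # adaptively re-weight the inputs at time t
            h, s = self.lstm(x_tilde, (h, s))
            encoded.append(h)
        return torch.stack(encoded, dim=1)         # (batch, T, hidden_size) encoder states
```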
Related background: attention-based encoder-decoder networks for time series prediction, typically built on LSTM or GRU units.
One problem with encoder-decoder networks is that their performance deteriorates rapidly as the length of the input sequence increases.
Contribution: the two-stage attention mechanism, i.e. input attention over the driving series and temporal attention over all time steps.
The input attention selects the relevant driving series at each time step.
The temporal attention captures long-term temporal information by attending to the relevant encoder hidden states (see the sketch below).
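A matching sketch of the second stage, the temporal attention decoder. As above, the score function is simplified to one linear layer and the names are hypothetical; it attends over the encoder hidden states produced by the previous sketch and emits a one-step-ahead prediction.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalAttentionDecoder(nn.Module):
    """Stage 2 (sketch): temporal attention over encoder states, then one-step prediction."""

    def __init__(self, enc_hidden, dec_hidden, T):
        super().__init__()
        self.T = T
        # score for encoder state h_i, built from (d_{t-1}, s'_{t-1}, h_i); again one linear layer
        self.attn = nn.Linear(2 * dec_hidden + enc_hidden, 1)
        self.tilde = nn.Linear(enc_hidden + 1, 1)   # mix y_{t-1} with the context vector
        self.lstm = nn.LSTMCell(1, dec_hidden)
        self.out = nn.Linear(dec_hidden + enc_hidden, 1)

    def forward(self, enc_h, y_hist):   # enc_h: (batch, T, enc_hidden), y_hist: (batch, T-1)
        batch = enc_h.size(0)
        d = enc_h.new_zeros(batch, self.lstm.hidden_size)   # decoder hidden state
        s = enc_h.new_zeros(batch, self.lstm.hidden_size)   # decoder cell state
        context = enc_h.new_zeros(batch, enc_h.size(2))
        for t in range(self.T - 1):
            ds = torch.cat([d, s], dim=1).unsqueeze(1).expand(-1, self.T, -1)
            beta = F.softmax(self.attn(torch.cat([ds, enc_h], dim=2)).squeeze(-1), dim=1)
            context = torch.bmm(beta.unsqueeze(1), enc_h).squeeze(1)    # weighted sum of encoder states
            y_tilde = self.tilde(torch.cat([y_hist[:, t:t + 1], context], dim=1))
            d, s = self.lstm(y_tilde, (d, s))
        return self.out(torch.cat([d, context], dim=1))                 # one-step-ahead prediction
```

A hypothetical end-to-end pass would be `TemporalAttentionDecoder(64, 64, T)(InputAttentionEncoder(n, T, 64)(x), y_hist)`; the paper combines the final decoder state and context through two affine maps, which the single `out` layer here only approximates.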
Supplementary knowledge:
1. What is a driving series? A driving (exogenous) series is an external input time series whose current and past values are used, alongside the target series' own history, to predict the target; for example, when predicting a stock index, the prices of individual constituent stocks can serve as driving series.