[Repost] Chaotic Time-Series Prediction
Original article: https://cn.mathworks.com/help/fuzzy/examples/chaotic-time-series-prediction.html?requestedDomain=www.mathworks.com
This example shows how to do chaotic time series prediction using ANFIS.
Time Series Data
The data is generated from the Mackey-Glass time-delay differential equation which is defined by
dx(t)/dt = 0.2x(t-tau)/(1+x(t-tau)^10) - 0.1x(t)
When x(0) = 1.2 and tau = 17, we have a non-periodic and non-convergent time series that is very sensitive to initial conditions. (We assume x(t) = 0 when t < 0.)
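The sample series used in this example is loaded from mgdata.dat below. If you want to regenerate a similar series yourself, a simple fixed-step Euler integration of the delay equation is enough. The following is a minimal sketch, not the method used to produce mgdata.dat (the shipped file was presumably generated with a higher-order solver); the step size h and the variable names are my own choices.
% Hypothetical sketch: integrate the Mackey-Glass equation with forward Euler.
% x(t) = 0 for t < 0, x(0) = 1.2, tau = 17, step size h = 0.1 (assumed).
h = 0.1; tau = 17; T = 1200;
n = round(T/h); d = round(tau/h);
x = zeros(n+1, 1); x(1) = 1.2;
for k = 1:n
    if k > d
        x_lag = x(k-d);   % x(t - tau)
    else
        x_lag = 0;        % x(t) = 0 for t < 0
    end
    x(k+1) = x(k) + h*(0.2*x_lag/(1 + x_lag^10) - 0.1*x(k));
end
t = (0:n)'*h;
% Sampling every 1/h steps (i.e., at integer seconds) gives a series comparable to mgdata.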
load mgdata.dat
a = mgdata;
time = a(:, 1);
x_t = a(:, 2);
plot(time, x_t);
xlabel('Time (sec)','fontsize',10); ylabel('x(t)','fontsize',10);
title('Mackey-Glass Chaotic Time Series','fontsize',10);

Preprocessing the Data
Now we want to build an ANFIS that can predict x(t+6) from the past values of this time series, that is, x(t-18), x(t-12), x(t-6), and x(t). Therefore the training data format is
[x(t-18), x(t-12), x(t-6), x(t); x(t+6)]
From t = 118 to 1117, we collect 1000 data pairs of the above format. The first 500 are used for training while the others are used for checking. The plot shows the segment of the time series where data pairs were extracted from. The first 100 data points are ignored to avoid the transient portion of the data.
trn_data = zeros(500, 5);
chk_data = zeros(500, 5);

% prepare training data
trn_data(:, 1) = x_t(101:600);
trn_data(:, 2) = x_t(107:606);
trn_data(:, 3) = x_t(113:612);
trn_data(:, 4) = x_t(119:618);
trn_data(:, 5) = x_t(125:624);

% prepare checking data
chk_data(:, 1) = x_t(601:1100);
chk_data(:, 2) = x_t(607:1106);
chk_data(:, 3) = x_t(613:1112);
chk_data(:, 4) = x_t(619:1118);
chk_data(:, 5) = x_t(625:1124);

index = 119:1118; % ts starts with t = 0
plot(time(index), x_t(index));
xlabel('Time (sec)','fontsize',10); ylabel('x(t)','fontsize',10);
title('Mackey-Glass Chaotic Time Series','fontsize',10);

Building the ANFIS Model
We use GENFIS1 to generate an initial FIS matrix from training data. The command is quite simple since default values for MF number (2) and MF type ('gbellmf') are used:
fismat = genfis1(trn_data);
The initial MFs for training are shown in the following plots.
for input_index=1:4,
subplot(2,2,input_index)
[x,y]=plotmf(fismat,'input',input_index);
plot(x,y)
axis([-inf inf 0 1.2]);
xlabel(['Input ' int2str(input_index)],'fontsize',10);
end
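As noted above, genfis1 is called with its defaults. In the legacy interface the same initial FIS can be requested explicitly; the argument list below is my recollection of that interface, so treat it as an assumption. The comments also spell out the parameter count discussed in the next paragraph.
% Equivalent explicit call (assumed legacy signature): 2 MFs per input, 'gbellmf' type.
fismat = genfis1(trn_data, 2, 'gbellmf');
% Parameter bookkeeping for this structure:
%   nonlinear (premise) parameters: 4 inputs * 2 MFs * 3 gbellmf parameters = 24
%   linear (consequent) parameters: 2^4 rules * (4 inputs + 1 constant)     = 80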

There are 2^4 = 16 rules in the generated FIS matrix, and the number of fitting parameters is 104: 24 nonlinear parameters and 80 linear parameters. This is a reasonable balance between the number of fitting parameters and the number of training data points (500). The ANFIS command looks like this:
[trn_fismat,trn_error] = anfis(trn_data, fismat,[],[],chk_data)
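The call above captures only the trained FIS and the training error; the checking error plotted later comes from the saved results loaded below. If you run the training yourself, the legacy anfis interface can, to the best of my recollection (treat the exact output list as an assumption), also return the step sizes, the FIS snapshot at the minimum checking error, and the checking error:
% Assumed legacy output list: trained FIS, training RMSE per epoch, step sizes,
% FIS at minimum checking error, checking RMSE per epoch.
[trn_fismat, trn_error, step_size, chk_fismat, chk_error] = ...
    anfis(trn_data, fismat, [], [], chk_data);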
To save time, we will load the training results directly.
After ten epochs of training, the final MFs are shown in the plots. Note that these MFs do not change drastically after training. Evidently most of the fitting is done by the linear parameters, while the nonlinear parameters mainly provide fine-tuning for further improvement.
% load training results
load mganfis

% plot final MFs on x, y, z, u
for input_index=1:4,
subplot(2,2,input_index)
[x,y]=plotmf(trn_fismat,'input',input_index);
plot(x,y)
axis([-inf inf 0 1.2]);
xlabel(['Input ' int2str(input_index)],'fontsize',10);
end

Error Curves
This plot displays the error curves for both the training and checking data. Note that the training error is higher than the checking error. This phenomenon is not uncommon in ANFIS learning or in nonlinear regression in general; it may indicate that training has not yet converged.
% error curves plot
close all;
epoch_n = 10;
plot([trn_error chk_error]);
hold on; plot([trn_error chk_error], 'o'); hold off;
xlabel('Epochs','fontsize',10);
ylabel('RMSE (Root Mean Squared Error)','fontsize',10);
title('Error Curves','fontsize',10);
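If the error curves suggest that training has not converged, the simplest remedy is to train for more epochs via the training-options argument. The option vector layout used below ([epochs, error goal, initial step size, step decrease rate, step increase rate]) is from my memory of the legacy anfis interface and should be treated as an assumption:
% Hypothetical continued training for 50 epochs (assumed option-vector layout).
trn_opt = [50 0 0.01 0.9 1.1];
[trn_fismat2, trn_error2, step_size2, chk_fismat2, chk_error2] = ...
    anfis(trn_data, fismat, trn_opt, [], chk_data);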

Comparison
This plot shows the original time series together with the series predicted by ANFIS. The difference is so small that the two curves are indistinguishable by visual inspection, which is why you probably see only the ANFIS prediction curve. The prediction errors must be viewed on a different scale.
input = [trn_data(:, 1:4); chk_data(:, 1:4)];
anfis_output = evalfis(input, trn_fismat);
index = 125:1124;
plot(time(index), [x_t(index) anfis_output]);
xlabel('Time (sec)','fontsize',10);

Prediction Errors of ANFIS
The ANFIS prediction error is shown here. Note that the scale is about a hundredth of that of the previous plot. Remember that we ran only 10 epochs of training in this case; better performance is expected with more extensive training.
diff = x_t(index)-anfis_output;
plot(time(index), diff);
xlabel('Time (sec)','fontsize',10);
title('ANFIS Prediction Errors','fontsize',10);
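To put a number on the error plot, the RMSE can be computed separately over the training and checking portions of the series. This is a small sketch using the variables already defined above; the split at row 500 mirrors how trn_data and chk_data were built.
% diff has 1000 rows: the first 500 come from the training pairs,
% the last 500 from the checking pairs.
trn_rmse = sqrt(mean(diff(1:500).^2));
chk_rmse = sqrt(mean(diff(501:1000).^2));
fprintf('Training RMSE: %g, Checking RMSE: %g\n', trn_rmse, chk_rmse);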
