An Evolving Cascade System Based on A Set Of Neo Fuzzy Nodes


📝 Original Info

  • Title: An Evolving Cascade System Based on A Set Of Neo Fuzzy Nodes
  • ArXiv ID: 1610.06484
  • Date: 2016-10-21
  • Authors: Zhengbing Hu, Yevgeniy V. Bodyanskiy, Oleksii K. Tyshchenko and Olena O. Boiko

📝 Abstract

Neo-fuzzy elements are used as nodes for an evolving cascade system. The proposed system can tune both its parameters and architecture in an online mode. It can be used for solving a wide range of Data Mining tasks (namely time series forecasting). The evolving cascade system with neo-fuzzy nodes can process rather large data sets with high speed and effectiveness.

💡 Deep Analysis

Figure 1

📄 Full Content

The task of time series forecasting (data sequence forecasting) is well studied nowadays. Many mathematical methods of varying complexity can be used to solve it: spectral analysis, exponential smoothing, regression, advanced intellectual systems, etc. In many real-world cases, however, the analyzed time series are nonstationary and nonlinear, and usually contain unknown behavior trends as well as stochastic or chaotic components. This complicates time series forecasting and makes the above-mentioned systems less effective.

To solve this problem, nonlinear models based on mathematical methods of Computational Intelligence [1][2][3] can be used. It should be especially mentioned that neuro-fuzzy systems [4][5][6] are widely used for this type of tasks due to their approximating and extrapolating properties, results’ interpretability, and learning abilities. The most appropriate choice for non-stationary data processing is evolving connectionist systems [7][8][9][10]. These systems adjust not only their synaptic weights and parameters of membership functions, but also their architectures.

There are many evolving systems that are able to process data sets in an online mode. Most of them are based on multilayer neuro-fuzzy systems. The Takagi-Sugeno-Kang (TSK) fuzzy systems [11][12] and adaptive neuro-fuzzy inference systems (ANFIS) are the most popular and effective systems used to solve such tasks. But in some cases (e.g. when the size of a data set is not sufficient for training) they cannot rapidly tune their parameters, so their effectiveness can decrease.

The first solution to this problem is to decompose the initial task into a set of simpler tasks, so that the obtained system can solve a problem with the data set at hand regardless of its size. One of the most studied approaches based on this principle is the Group Method of Data Handling (GMDH) [13][14]. But in the case of online data processing, GMDH systems are not sufficiently effective. This problem can be solved by an evolving cascade model that tunes both its parameters and its architecture in an online mode.

Generally speaking, one can use different types of neurons or other more complicated systems as nodes in an evolving cascade system. For example, a compartmental R-neuron was introduced as a node of a cascade system [15,16]. If the data set to be processed is large enough, it seems reasonable to use neo-fuzzy neurons [17][18][19]. The neo-fuzzy neuron is capable of finding a global minimum of a learning criterion in an online mode; it also has a high learning speed and good approximating properties, and it is attractive from the viewpoint of computational simplicity.

The remainder of this paper is organized as follows: Section 2 describes an architecture of the evolving cascade system. Section 3 describes an architecture of the neo-fuzzy neuron as a node of the evolving cascade system. Section 4 presents several synthetic and real-world applications to be solved with the help of the proposed evolving cascade system. Conclusions and future work are given in the final section.

An architecture of the evolving cascade model is shown in Fig. 1.

The best outputs $\hat{y}^{*}(k)$ selected by the selection block are fed to a single neuron of the second layer, which forms its output signal $\hat{y}^{[2]}(k)$. This signal and the selected signals are then processed by the following cascade layers in the same manner.

Neo-fuzzy neurons were proposed by T. Yamakawa and co-authors [17][18][19]. The advantages of this neuron are good approximating properties, computational simplicity, a high learning speed, and the ability to find a global minimum of a learning criterion in an online mode. The architecture of the neo-fuzzy neuron as a node of the evolving cascade system is shown in Fig. 2.

An input signal is fed to the node's input. The first layer of each nonlinear synapse contains $h$ membership functions. In [20], it was proposed to use B-splines as membership functions for the neo-fuzzy neuron, since B-splines provide higher approximation quality. A B-spline of the $q$-th order has the form

$$\mu_{iA}^{q}(x_A)=\begin{cases}
1, & \text{if } x_A\in[c_{iA},\,c_{i+1,A}),\\
0, & \text{otherwise},
\end{cases}\quad\text{for } q=1,$$

$$\mu_{iA}^{q}(x_A)=\frac{x_A-c_{iA}}{c_{i+q-1,A}-c_{iA}}\,\mu_{iA}^{q-1}(x_A)+\frac{c_{i+q,A}-x_A}{c_{i+q,A}-c_{i+1,A}}\,\mu_{i+1,A}^{q-1}(x_A)\quad\text{for } q>1,\ i=1,\dots,h,$$

and analogously for the second input:

$$\mu_{iB}^{q}(x_B)=\begin{cases}
1, & \text{if } x_B\in[c_{iB},\,c_{i+1,B}),\\
0, & \text{otherwise},
\end{cases}\quad\text{for } q=1,$$

$$\mu_{iB}^{q}(x_B)=\frac{x_B-c_{iB}}{c_{i+q-1,B}-c_{iB}}\,\mu_{iB}^{q-1}(x_B)+\frac{c_{i+q,B}-x_B}{c_{i+q,B}-c_{i+1,B}}\,\mu_{i+1,B}^{q-1}(x_B)\quad\text{for } q>1,\ i=1,\dots,h,$$

where $c_{iA}$, $c_{iB}$ are parameters that define the centers of the membership functions. It should be noticed that when $q=2$ one gets traditional triangular membership functions, when $q=4$ one gets cubic splines, etc. The B-splines meet the unity partitioning conditions

$$\sum_{i=1}^{h}\mu_{iA}^{q}(x_A)=1,\qquad \sum_{i=1}^{h}\mu_{iB}^{q}(x_B)=1,$$

which makes it possible to simplify the node's architecture by excluding a normalization layer.
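As a small sketch of how such membership functions behave, the code below evaluates triangular memberships (the $q=2$ B-spline case mentioned above) over $h$ centers and checks the unity-partition property. The uniformly spaced grid and the function name are illustrative assumptions, not part of the paper.

```python
import numpy as np

def triangular_memberships(x, centers):
    """Membership levels of x for triangular (q = 2 B-spline) functions
    with the given sorted centers; at most two levels are non-zero."""
    h = len(centers)
    mu = np.zeros(h)
    # Clamp x into the covered interval so the levels still sum to one.
    x = min(max(x, centers[0]), centers[-1])
    j = np.searchsorted(centers, x, side="right") - 1
    if j >= h - 1:                       # x sits at the right edge
        mu[h - 1] = 1.0
        return mu
    left, right = centers[j], centers[j + 1]
    mu[j] = (right - x) / (right - left)
    mu[j + 1] = (x - left) / (right - left)
    return mu

centers = np.linspace(0.0, 1.0, 5)       # h = 5 uniformly spaced centers
mu = triangular_memberships(0.3, centers)
print(mu.sum())                          # unity partition: sums to 1.0
```

Because only two neighboring memberships are ever active, evaluating the first layer costs O(log h) per input, which is one reason the neo-fuzzy neuron is computationally cheap.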

So, the elements of the first layer compute the membership levels $\mu_{iA}(x_A)$ and $\mu_{iB}(x_B)$ of the node's two inputs.

The second layer contains synaptic weights $w_{iA}$, $w_{iB}$ that are adjusted during the learning process. The third layer is formed by two summation units that compute the sums of the second-layer output signals for each nonlinear synapse:

$$f_A(x_A)=\sum_{i=1}^{h} w_{iA}\,\mu_{iA}(x_A),\qquad f_B(x_B)=\sum_{i=1}^{h} w_{iB}\,\mu_{iB}(x_B).$$

Another summation unit sums up these two signals in order to produce the output signal

$$\hat{y}=f_A(x_A)+f_B(x_B)=\sum_{i=1}^{h} w_{iA}\,\mu_{iA}(x_A)+\sum_{i=1}^{h} w_{iB}\,\mu_{iB}(x_B). \tag{1}$$

The expression (1) can be written in the vector form

$$\hat{y}=w^{T}\mu,$$

where $w=(w_{1A},\dots,w_{hA},w_{1B},\dots,w_{hB})^{T}$ is the vector of synaptic weights and $\mu=(\mu_{1A}(x_A),\dots,\mu_{hA}(x_A),\mu_{1B}(x_B),\dots,\mu_{hB}(x_B))^{T}$ is the vector of membership levels.
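In code, this vector form is just a dot product over the concatenated membership levels of the node's two inputs. The sketch below is a minimal forward pass; triangular memberships stand in for the paper's B-splines, and all names are illustrative.

```python
import numpy as np

def memberships(x, centers):
    """Triangular membership levels (q = 2 stand-in for the B-splines)."""
    h = len(centers)
    mu = np.zeros(h)
    x = min(max(x, centers[0]), centers[-1])
    j = min(np.searchsorted(centers, x, side="right") - 1, h - 2)
    left, right = centers[j], centers[j + 1]
    mu[j] = (right - x) / (right - left)
    mu[j + 1] = (x - left) / (right - left)
    return mu

def neo_fuzzy_output(x_a, x_b, w, centers_a, centers_b):
    """y_hat = w^T mu over the concatenated membership vectors."""
    mu = np.concatenate([memberships(x_a, centers_a),
                         memberships(x_b, centers_b)])
    return w @ mu, mu

# Illustrative call: h = 5 memberships per synapse, all weights 0.5,
# so y_hat = 0.5 * (1 + 1) = 1.0 by the unity-partition property.
w = np.full(10, 0.5)
grid = np.linspace(0.0, 1.0, 5)
y_hat, mu = neo_fuzzy_output(0.3, 0.7, w, grid, grid)
```

Since $\hat{y}$ is linear in $w$, the learning criterion is quadratic in the weights, which is what makes a global minimum reachable online.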

To learn the neo-fuzzy neuron, one can use the adaptive procedure proposed in [21,22], which possesses both filtering and tracking properties.
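The cited procedure [21,22] is not reproduced in this excerpt. As a hedged stand-in, the sketch below uses a normalized one-step (Kaczmarz-Widrow-Hoff type) update, which is a standard way to tune a model of the form $\hat{y}=w^{T}\mu$ online; it is an illustration of the idea, not the authors' exact algorithm.

```python
import numpy as np

def online_update(w, mu, y, eps=1e-9):
    """One normalized one-step update for y_hat = w^T mu.
    eps guards against division by zero for an all-zero mu."""
    error = y - w @ mu
    return w + error * mu / (mu @ mu + eps)

# Illustrative run: recover a known weight vector from streaming samples.
rng = np.random.default_rng(0)
w_true = np.array([2.0, 1.0])
w = np.zeros(2)
for _ in range(200):
    mu = rng.random(2)             # stand-in membership vector
    w = online_update(w, mu, w_true @ mu)
print(np.allclose(w, w_true, atol=1e-3))   # prints True
```

Each step projects the weight vector onto the hyperplane consistent with the newest sample, so the quadratic criterion keeps decreasing without storing past data, which fits the online setting the paper targets.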


Reference

This content is AI-processed based on open access ArXiv data.
