

Regularized Variational Bayesian Learning of Echo State Networks with Delay&Sum Readout


D. Shutin, C. Zechner, R.S. Kulkarni, H.V. Poor

Neural Computation, vol. 24, no. 4, pp. 967-995

In this work, a variational Bayesian framework for efficient training of echo state networks (ESNs) with automatic regularization and delay&sum (D&S) readout adaptation is proposed. The algorithm is based on classical batch learning of ESNs. By treating the network echo states as fixed basis functions parameterized with delay parameters, we propose a variational Bayesian ESN training scheme. The variational approach allows for a seamless combination of sparse Bayesian learning ideas and a variational Bayesian space-alternating generalized expectation-maximization (VB-SAGE) algorithm for estimating parameters of superimposed signals. While the former method realizes automatic regularization of ESNs, which also determines which echo states and input signals are relevant for "explaining" the desired signal, the latter method provides a basis for joint estimation of D&S readout parameters. The proposed training algorithm can naturally be extended to ESNs with fixed filter neurons. It also generalizes the recently proposed expectation-maximization-based D&S readout adaptation method. The proposed algorithm was tested on synthetic data prediction tasks as well as on dynamic handwritten character recognition.


% Autogenerated BibTeX entry
@Article { ShuEtal:2012:IFA_3909,
    author={D. Shutin and C. Zechner and R.S. Kulkarni and H.V. Poor},
    title={{Regularized Variational Bayesian Learning of Echo State
	  Networks with Delay\&Sum Readout}},
    journal={Neural Computation},
    volume={24},
    number={4},
    pages={967--995},
    year={2012}
}