Many dynamical systems, both natural and artificial, are stimulated by time-dependent external signals, somehow processing the information contained therein. We demonstrate how to quantify the different modes in which information can be processed by such systems and combine them to define the computational capacity of a dynamical system. This capacity is bounded by the number of linearly independent state variables of the dynamical system, and equals it if the system obeys the fading memory condition. It can be interpreted as the total number of linearly independent functions of its stimuli the system can compute. Our theory combines concepts from machine learning (reservoir computing), system modeling, stochastic processes, and functional analysis. We illustrate our theory by numerical simulations for the logistic map, a recurrent neural network, and a two-dimensional reaction-diffusion system, uncovering universal trade-offs between the non-linearity of the computation and the system's short-term memory.
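The capacity measure described in the abstract can be illustrated with a minimal sketch: drive a small dynamical system with i.i.d. input, fit an optimal linear readout to a family of target functions (here, delayed copies of the input), and score each target by the fraction of its variance the state reconstructs. All names, sizes, and parameters below (the tanh reservoir, `N = 50`, spectral radius 0.9) are illustrative assumptions, not the paper's exact setup; summed capacity is bounded by the number of state variables, as the abstract states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small echo-state-style reservoir driven by i.i.d. input u(t).
N = 50          # number of state variables (upper-bounds the total capacity)
T = 10_000      # simulation length
washout = 100   # initial transient discarded before fitting

W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1: fading memory
w_in = rng.uniform(-0.5, 0.5, N)

u = rng.uniform(-1.0, 1.0, T)
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

X = x[washout:]

def capacity(target):
    """Fraction of the target's power reconstructed by the optimal linear
    readout of the state: C = 1 - MSE / power, which lies in [0, 1]."""
    y = target[washout:]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    err = y - X @ w
    return 1.0 - np.mean(err**2) / np.mean(y**2)

# Linear memory capacities: how well the state recalls u(t - k) for each lag k.
# (np.roll wraps the first k samples, but they fall inside the washout.)
mc = [capacity(np.roll(u, k)) for k in range(1, 30)]
total_linear_mc = sum(mc)
print(f"summed linear memory capacity: {total_linear_mc:.2f} (bounded by N = {N})")
```

Extending the target family beyond delayed inputs (e.g. to products of delays) probes the non-linear modes of processing, and the trade-off the abstract mentions appears as linear memory capacity shrinking when non-linear capacity grows.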

},
doi = {10.1038/srep00514},
url = {http://www.nature.com/srep/2012/120719/srep00514/full/srep00514.html},
author = {Dambre, J. and Verstraeten, D. and Schrauwen, B. and Massar, S.}
}