
Local self-adaptation mechanisms for large-scale neural system building

Michael Ortiz and Alexander Gepperth, "Local self-adaptation mechanisms for large-scale neural system building", Proceedings of the 2nd International Conference on Cognitive Neurodynamics, 2009.

Abstract

For integrating neural networks into large systems, dynamical stability and parameter settings are key issues, especially for popular recurrent network models such as dynamic neural fields. In neural circuits, homeostatic plasticity appears to counter these problems. Here we present a set of gradient adaptation rules that autonomously regulate the strength of synaptic input and the parameters of the transfer function for each neuron individually. In this way, the average membrane potentials and firing rates, as well as the variances of the firing rates, are actively maintained at specified levels. A particular focus of this contribution is clarifying the time scales at which these mechanisms should operate. The benefit of such self-adaptation is a significant reduction in the number of free parameters, as well as the ability to connect a neural field to almost arbitrary inputs, since dynamical stability is actively maintained. We consider these two properties crucial, since they will significantly facilitate the construction of large neural systems.
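The abstract describes per-neuron gradient rules that hold the mean membrane potential, the mean firing rate, and the firing-rate variance at target levels by adjusting the synaptic input strength and the transfer-function parameters. The sketch below is a minimal illustration of this kind of homeostatic regulation, not the paper's actual rules: the sigmoid transfer function, the input model, and all target values and learning rates are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# All targets and rates below are illustrative assumptions,
# not values taken from the paper.
u_target = 0.5   # desired average membrane potential
r_target = 0.2   # desired average firing rate
v_target = 0.01  # desired firing-rate variance
alpha = 1e-2     # smoothing rate for the running statistics
eta = 1e-4       # adaptation rate (much slower than the dynamics)

n = 100                  # number of neurons
gain = np.ones(n)        # per-neuron strength of synaptic input
theta = np.zeros(n)      # transfer-function threshold
beta = np.ones(n)        # transfer-function slope parameter

u_avg = np.full(n, u_target)   # running mean of the membrane potential
r_avg = np.full(n, r_target)   # running mean of the firing rate
r_var = np.full(n, v_target)   # running variance of the firing rate

def transfer(u, theta, beta):
    """Sigmoid transfer function with per-neuron threshold and slope."""
    return 1.0 / (1.0 + np.exp(-(u - theta) / beta))

for step in range(50000):
    x = rng.normal(1.0, 0.3, size=n)   # stand-in afferent input
    u = gain * x                       # membrane potential
    r = transfer(u, theta, beta)       # firing rate

    # Slow running estimates of the regulated statistics.
    u_avg += alpha * (u - u_avg)
    r_avg += alpha * (r - r_avg)
    r_var += alpha * ((r - r_avg) ** 2 - r_var)

    # Homeostatic gradient steps: each parameter moves so as to reduce
    # the deviation of "its" statistic from the target level.
    gain += eta * (u_target - u_avg) * x   # input scaling controls the mean potential
    theta -= eta * (r_target - r_avg)      # lower threshold -> higher mean rate
    beta -= eta * (v_target - r_var)       # larger beta = shallower slope -> lower rate variance
    beta = np.clip(beta, 0.05, None)       # keep the slope parameter positive
```

Note the separation of time scales: the adaptation rate eta is much smaller than the statistics smoothing rate alpha, reflecting the paper's point that such mechanisms must operate slowly relative to the neural dynamics they stabilize.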


