
Child-friendly divorcing: Incremental Hierarchy Learning in Bayesian Networks

Florian Röhrbein, Julian Eggert, Edgar Körner, "Child-friendly divorcing: Incremental Hierarchy Learning in Bayesian Networks", Proceedings of the 2009 International Joint Conference on Neural Networks (IJCNN), 2009.

Abstract

The autonomous learning of concept hierarchies is still an open research question. Here we present a learning scheme for Bayesian networks that results in a nested structure of sub- and superclass relationships. It is based on so-called parent divorcing, but exploits the similarity of all nodes involved as expressed by their connectivity patterns. When the procedure is applied to simple object-property pairings, a nested taxonomic hierarchy emerges. We further show how the learning procedure can be aligned with basic findings from developmental psychology. To this end, we performed a set of simulations which clearly indicate that a fixed developmental order of sensory maturation is crucial for the emerging conceptual system. The learning procedure itself is biologically plausible, since it works incrementally, uses only local information, and reduces computational effort by building a more efficient representation.
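
The abstract describes the procedure only at a high level. As a rough illustration of the underlying idea of parent divorcing guided by connectivity similarity (not the authors' actual algorithm), the sketch below performs one divorcing step on a toy object-property network. The dict-based network representation, the Jaccard similarity measure, the node name "H1", and the toy data are all illustrative assumptions.

```python
from itertools import combinations

def jaccard(a, b):
    """Connectivity similarity of two parents: overlap of the children they feed."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def children_of(net):
    """Invert the child -> parents map into a parent -> children map."""
    inv = {}
    for child, parents in net.items():
        for p in parents:
            inv.setdefault(p, set()).add(child)
    return inv

def divorce_step(net, new_name):
    """One incremental step: group the two parents with the most similar
    connectivity pattern under a new intermediate (superclass) node, which
    replaces them as parent of their shared children."""
    inv = children_of(net)
    best, best_sim = None, 0.0
    for p, q in combinations(inv, 2):
        sim = jaccard(inv[p], inv[q])
        if sim > best_sim:
            best, best_sim = (p, q), sim
    if best is None:                      # no overlapping parents left to divorce
        return None
    p, q = best
    for child in inv[p] & inv[q]:         # reroute shared children via the new node
        net[child] = (net[child] - {p, q}) | {new_name}
    net[new_name] = {p, q}                # divorced parents now feed the new node
    return new_name

# Toy object-property pairings: each property node lists its object parents.
net = {
    "barks":    {"dog"},
    "meows":    {"cat"},
    "has_fur":  {"dog", "cat"},
    "breathes": {"dog", "cat", "fish"},
}
divorce_step(net, "H1")   # groups 'dog' and 'cat' under the new node 'H1'
print(net)
```

Applied repeatedly, steps of this kind only ever inspect local connectivity and introduce one intermediate node at a time, which is the incremental, locally informed flavour of hierarchy building the abstract refers to.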


