On merging the fields of neural networks and adaptive data structures to yield new pattern recognition methodologies
Chapter, Peer reviewed
Original version: Oommen, B. J. (2011). On merging the fields of neural networks and adaptive data structures to yield new pattern recognition methodologies. In S. Kuznetsov, D. Mandal, M. Kundu & S. Pal (Eds.), Pattern Recognition and Machine Intelligence (Vol. 6744, pp. 13-16). Springer.
The aim of this talk is to explain a pioneering exploratory research endeavour that attempts to merge two completely different fields in Computer Science so as to yield fascinating results: the well-established fields of Neural Networks (NNs) and Adaptive Data Structures (ADS). The field of NNs deals with the training and learning capabilities of a large number of neurons, each possessing minimal computational properties. The field of ADS, on the other hand, concerns designing, implementing and analyzing data structures which adaptively change with time so as to optimize some access criterion. In this talk, we shall demonstrate how these fields can be merged, so that the neural elements are themselves linked together using a data structure. This structure can be a singly-linked list, a doubly-linked list, or even a Binary Search Tree (BST). While the results are quite generic, we shall, as a prima facie case, present results in which a Self-Organizing Map (SOM) with an underlying BST structure is adaptively restructured using conditional rotations. These rotations on the nodes of the tree are local, are performed in constant time, and guarantee a decrease in the Weighted Path Length of the entire tree. As a result, the algorithm, referred to as the Tree-based Topology-Oriented SOM with Conditional Rotations (TTO-CONROT), converges in such a manner that the neurons are ultimately placed in the input space so as to represent its stochastic distribution. Moreover, the neighborhood properties of the neurons suit the best BST that represents the data.
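The abstract's central mechanism — a local, constant-time rotation on a BST node that lowers the tree's Weighted Path Length — can be illustrated with a minimal sketch. This is not the TTO-CONROT algorithm itself (whose conditional-rotation criterion and SOM integration are defined in the chapter); it only shows, under the assumption that each node carries a hypothetical access count `freq`, how a single rotation promoting a frequently accessed node reduces the Weighted Path Length:

```python
class Node:
    def __init__(self, key, freq=1):
        self.key = key        # search key
        self.freq = freq      # hypothetical access count (an assumption for this sketch)
        self.left = None
        self.right = None

def weighted_path_length(root, depth=1):
    """Sum of freq * depth over all nodes, with the root at depth 1."""
    if root is None:
        return 0
    return (root.freq * depth
            + weighted_path_length(root.left, depth + 1)
            + weighted_path_length(root.right, depth + 1))

def rotate_up(parent, child):
    """Constant-time single rotation promoting `child` above `parent`.

    Only a fixed number of pointers change, regardless of tree size --
    this is the locality property the abstract refers to.
    """
    if parent.left is child:                              # right rotation
        parent.left, child.right = child.right, parent
    else:                                                 # left rotation
        parent.right, child.left = child.left, parent
    return child                                          # new subtree root

# Toy example: a heavily accessed node sits one level below the root.
root = Node('b', freq=1)
root.left = Node('a', freq=10)
before = weighted_path_length(root)     # 1*1 + 10*2 = 21
root = rotate_up(root, root.left)       # promote the hot node
after = weighted_path_length(root)      # 10*1 + 1*2 = 12
# after < before: the rotation decreased the Weighted Path Length
```

In the chapter's setting the decision of *when* to rotate is made by the conditional-rotation criterion, so that each rotation is only applied when it is guaranteed to decrease the Weighted Path Length of the entire tree.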
Published version of a chapter from the book Pattern Recognition and Machine Intelligence. Also available from the publisher at http://dx.doi.org/10.1007/978-3-642-21786-9_3