High performance associative memory models and symmetric connections
Two existing high-capacity training rules for the standard Hopfield associative memory architecture are examined. Both rules, based on the perceptron learning rule, produce asymmetric weight matrices, for which the simple dynamics (point attractors only) of a symmetric network can no longer be guaranteed. This paper examines the consequences of imposing a symmetry constraint during learning. The mean size of the attractor basins of trained patterns and the mean learning convergence time are analysed for the networks that arise from these learning rules, in both their asymmetric and symmetric instantiations. It is concluded that a symmetry constraint has no adverse effect on performance, and that it offers benefits in both learning time and network dynamics.
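The idea described above can be sketched as follows: a perceptron-style learning rule corrects a Hopfield network's weights whenever a unit's local field disagrees with a stored pattern, and a symmetry constraint can be imposed by mirroring each correction onto the transposed weight. This is a minimal illustrative sketch, not the paper's specific rules; the function names, learning rate, and stopping criterion are assumptions.

```python
import numpy as np

def train_perceptron_hopfield(patterns, symmetric=True, epochs=100, lr=1.0):
    """Perceptron-style learning for a Hopfield associative memory.

    Illustrative sketch (not the paper's exact rules): whenever a unit's
    local field is not aligned with the target pattern, its incoming
    weights are nudged in the perceptron direction. With symmetric=True,
    each correction is also applied to the mirrored entries, so the
    weight matrix W stays symmetric throughout training.
    """
    patterns = np.asarray(patterns, dtype=float)  # rows are +/-1 patterns
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for _ in range(epochs):
        stable = True
        for x in patterns:
            for i in range(n):
                h_i = W[i] @ x  # local field at unit i
                # fixed-point condition: field must align with the pattern bit
                if x[i] * h_i <= 0:
                    stable = False
                    delta = lr * x[i] * x / n
                    delta[i] = 0.0           # no self-connection
                    W[i, :] += delta
                    if symmetric:
                        W[:, i] += delta     # mirror update keeps W symmetric
        if stable:  # every unit of every pattern has an aligned field
            break
    return W

def recall(W, x, steps=20):
    """Synchronous recall dynamics: iterate x <- sign(W x) to a fixed point."""
    x = np.asarray(x, dtype=float)
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1.0
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x
```

With a small pattern load well below capacity, training converges quickly and the stored patterns become fixed points of the recall dynamics while W remains exactly symmetric.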
| Field | Value |
|---|---|
| Item Type | Other |
| Date Deposited | 14 Nov 2024 10:57 |
| Last Modified | 14 Nov 2024 10:57 |
File: 900907.pdf