- 95-221 Dimitri Petritis
- THERMODYNAMIC FORMALISM OF NEURAL COMPUTING
(267K, 51 pages, uuencoded postscript)
May 15, 95
Abstract. Neural networks are systems of interconnected processors mimicking
some functions of the brain.
After a rapid overview of neural computing, the thermodynamic formalism of
the learning procedure is introduced. Besides yielding efficient stochastic
learning algorithms, this formalism provides insight in terms of information
theory. The main emphasis is on the information-restitution process;
stochastic evolution is used as the starting point for introducing the
statistical mechanics of associative memory. Instead of formulating
problems in their most general setting, we prefer to state precise
results on specific models.
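The stochastic evolution underlying associative memory can be illustrated by a
minimal Hopfield-type sketch: Hebbian couplings store a few patterns, and
asynchronous Glauber dynamics at inverse temperature beta restores a corrupted
cue. This example is not taken from the paper; the parameter values and
function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebb_weights(patterns):
    """Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def glauber_recall(J, state, beta=4.0, sweeps=20):
    """Asynchronous Glauber dynamics: set spin i to +1 with prob 1/(1+exp(-2*beta*h_i))."""
    s = state.copy()
    N = s.size
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s  # local field on spin i
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1 if rng.random() < p_up else -1
    return s

# store a few random +/-1 patterns, corrupt one, and let the net restore it
N, P = 200, 3
patterns = rng.choice([-1, 1], size=(P, N))
J = hebb_weights(patterns)
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)  # 10% of spins flipped
cue[flip] *= -1
recalled = glauber_recall(J, cue)
overlap = recalled @ patterns[0] / N  # overlap near 1 means successful recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

At low temperature (large beta) and small pattern load, the dynamics typically
drives the overlap with the stored pattern close to 1, which is the
information-restitution phenomenon the abstract refers to.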
This report mainly presents those features that are relevant when
the neural net becomes very large.
A survey of the most recent results is given and the main open problems
are pointed out.