References
[1]
Y Abu-Mostafa and J St. Jacques.
Information capacity of the Hopfield model.
IEEE Transactions on Information Theory, IT-31(4):461--464,
1985.
Discusses the number of patterns a Hopfield net may be expected to be
able to memorise.
[2]
Ackley, Hinton, and Sejnowski.
A learning algorithm for Boltzmann machines.
Cognitive Science, 9:147--169, 1985.
The original reference for Boltzmann machines.
[3]
Aleksander and Stonham.
Guide to pattern recognition using RAMs.
Computers and Digital Techniques, 2(1):29--40, February 1979.
Aleksander's WISARD.
[4]
Amit, Gutfreund, and Sompolinsky.
Storing infinite numbers of patterns in a spin glass model of neural
networks.
Physical Review Letters, 55(14), September 1985.
Theoretical analysis of Hopfield nets.
[5]
J Bernstein.
Profiles: Marvin Minsky.
New Yorker, pages 50--126, December 1981.
Biography of Minsky, written before the resurgence of interest in
NNs. Fascinating references to his campaign with Papert against Rosenblatt.
[6]
D Bounds.
A statistical mechanical study of Boltzmann machines.
J. Phys. A, 20:2133--2145, 1987.
Theoretical analysis of Boltzmann machines.
[7]
Carpenter and Grossberg.
The ART of adaptive pattern recognition by a self-organizing neural network.
IEEE Computer, 21(3), March 1988.
Introduction to ART.
[8]
Fukushima and Miyake.
Neocognitron: A new algorithm for pattern recognition of deformations
and shifts in position.
Pattern Recognition, 15(6):455--469, 1982.
The Neocognitron unsupervised NN.
[9]
G Hinton.
Learning translation invariant recognition in massively parallel
networks.
In Proceedings of Parallel architectures and languages, Europe.
Springer Verlag, 1987.
Early and illustrative application of BackProp.
[10]
J J Hopfield.
Neurons with graded response have collective computational properties
like those of two-state neurons.
Proc. National Acad. Sci. USA, 81:3088--3092, 1984.
Deriving properties of `analogue' Hopfield networks.
[11]
J J Hopfield.
Neural networks and physical systems with emergent collective
computational abilities.
Proc. National Acad. Sci. USA, 79:2554--2558, April 1982.
Hopfield's original paper; readable and widely quoted.
[12]
J J Hopfield and Tank.
Neural computation of decisions in optimization problems.
Biological Cybernetics, 52:141--152, 1985.
Using Hopfield nets to solve the TSP (and other problems).
[13]
Kirkpatrick, Gelatt, and Vecchi.
Optimisation by simulated annealing.
Science, 220:671--680, 1983.
This paper provoked the storm of interest in simulated annealing
during the 1980s.
[14]
T Kohonen.
The ``neural'' phonetic typewriter.
Computer, pages 11--22, March 1988.
The best early application of Kohonen's ideas.
[15]
K S Lashley.
The problem of serial order in behaviour.
In L Jeffress, editor, Cerebral mechanisms in behaviour: The
Hixon symposium, pages 112--136. Wiley, 1951.
Important early reference.
[16]
Richard P Lippmann.
An introduction to computing with neural nets.
IEEE ASSP magazine, pages 4--22, April 1987.
Widely quoted and thorough early summary of the subject; well
written.
[17]
R Matthews and T Merriam.
Neural computation in stylometry.
Literary and Linguistic Computing, 8(4):203--209, 1993.
Application of BackProp; easy to understand.
[18]
McCulloch and Pitts.
A logical calculus of ideas immanent in nervous activity.
Bull. Math. Biophysics, 5:115--133, 1943.
Where it all began.
[19]
Pinker and Prince.
On language and connectionism: Analysis of a parallel distributed
processing model of language acquisition.
Cognition, 28:73--193, 1988.
Lengthy article refuting suggestions that NNs mimic human behaviour.
[20]
Prager and Fallside.
A comparison of the Boltzmann machine and the back propagation
network as recognisers of static speech patterns.
Computer Speech and Language, 2:179--183, 1987.
Demonstrates that Boltzmann machines outperform BackProp on similar
tasks.
[21]
Prager, Harrison, and Fallside.
Boltzmann machines for speech recognition.
Computer Speech and Language, 1:3--27, 1986.
Early and accessible application of Boltzmann machines.
[22]
B D Ripley.
Statistical aspects of neural networks.
In O E Barndorff-Nielsen, J L Jensen, and W S Kendall, editors,
Chaos and Networks - Statistical and Probabilistic Aspects. Chapman and
Hall, 1993.
A sceptical article, noting that traditional statistical techniques
can outperform NNs.
[23]
Rosenberg and Sejnowski.
The spacing effect on NetTalk, a massively parallel network.
In Proceedings of the 8th Annual Conference of the Cognitive
Science Society, pages 72--89, Hillsdale, NJ, 1986. Lawrence Erlbaum.
Anatomy of a NetTalk network.
[24]
D Rumelhart, G Hinton, and R Williams.
Learning internal representations by error propagation.
In D Rumelhart and J McClelland, editors, Parallel Distributed
Processing: Explorations in the Microstructure of Cognition. Foundations,
volume 1. MIT Press, 1986.
Provides an example of `Back propagation in time'.
[25]
Sejnowski and Rosenberg.
Parallel systems that learn to pronounce English text.
Complex Systems, 1:145--168, 1987.
One of the best early applications of BackProp.
[26]
D Servan-Schreiber, A Cleeremans, and J McClelland.
Learning sequential structure in simple recurrent networks.
In D Touretzky, editor, Advances in Neural Information
Processing Systems, volume 1. Morgan Kaufmann, 1989.
This paper exhibits the use of context units -- feedback from the
hidden layer of a BPN.
[27]
Wilson and Pawley.
On the stability of the TSP algorithm of Hopfield and Tank.
Biological Cybernetics, 58:63--70, 1988.
Analysis of Hopfield's TSP solution.
Applications
The following articles are referenced in the section on applications to
computer vision and include some non-ANN papers.
Bob Fisher
Mon Aug 4 14:24:13 BST 1997