We then introduce one kind of network in detail: the feedforward network trained by backpropagation of error. We discuss model architectures, training methods and data representation issues. We hope to cover everything you need to know to get backpropagation working for you. A range of applications and extensions to the basic model will be presented in the final section of the module.
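To give a flavour of what the lectures cover, here is a minimal sketch of a feedforward network trained by backpropagation of error. The architecture (one hidden layer of four sigmoid units), the XOR task, the learning rate and the epoch count are all illustrative assumptions, not a reference implementation from the course.

```python
import numpy as np

# A minimal sketch: a 2-4-1 feedforward network trained by
# backpropagation on XOR. All sizes and hyperparameters here are
# illustrative choices, not part of the course material.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: a classic task that no single-layer network can solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=1.0, size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)
lr = 0.5                                  # learning rate (assumed)

mse0 = float(np.mean((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - y) ** 2))

for epoch in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: propagate the error signal through each layer
    d_out = (out - y) * out * (1 - out)   # sigmoid derivative at output
    d_h = (d_out @ W2.T) * h * (1 - h)    # error signal at hidden layer
    # gradient-descent weight updates
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

mse = float(np.mean((out - y) ** 2))      # final training error
```

The forward pass, error backpropagation and weight update shown here correspond to the three steps discussed in the lectures; only the task and hyperparameters are invented for the example.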
Lecture 1: Introduction
Lecture 3: Advanced Topics
By popular demand, a list of English terms for mathematical expressions that we are using.
A few suggestions for possible project topics.
Lecture 3 as a ZIP file.
This module will consist of three extended lectures of three hours
each. Lectures will be on Fridays, as usual. There will
be no set assignments; however, material covered in this module may be
used in course examinations.
The best way to reach us is either at the lectures (during the break, or
after the end) or by email:
|Brainwave: a Java based simulator|
|tlearn: W*ndows, M*cintosh and Un*x implementation of backprop and variants. Written in C.|
|PDP++: C++ software with every conceivable bell and whistle. Un*x only. The manual also makes a good tutorial.|
Related stuff of interest: