Non-Symbolic Artificial Intelligence 2006
Inman Harvey, COGS 5C12, ext 8431, inmanh@cogs.sussex.ac.uk
You can get (buy) a copy of all lecture handouts from Celia in the
Informatics Library. Or you can download them all in one pdf file here
(4.6 MB, 87 pages).
Lecture notes for each individual lecture are below. The first link is to web
pages, the second to a PowerPoint file, and the third to a pdf. The pdf files
are best if you are going to print them out.
Subjects for each seminar are posted here:
Seminars week 2
Seminars week 3
Seminars week 4
Seminars week 5
(Remember: penalties are 10% for work up to 24 hrs late, and 100% after that!)
Firstly, a separate training part of the program should be written so that this ANN can have its weights and biases trained by back-propagation (or a variant).
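As a starting point, back-propagation for a small feedforward sigmoid network can be sketched as below. This is a minimal illustration in Python, not the required submission: the layer sizes, learning rate, and squared-error cost are illustrative assumptions, and the assignment leaves those choices to you.

```python
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    """A 4-input, one-hidden-layer, 2-output sigmoid network.
    The hidden-layer size (6) and learning rate (0.5) are
    illustrative assumptions, not values set by the assignment."""
    def __init__(self, n_in=4, n_hid=6, n_out=2):
        self.W1 = [[random.gauss(0, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
        self.b1 = [0.0] * n_hid
        self.W2 = [[random.gauss(0, 0.5) for _ in range(n_hid)] for _ in range(n_out)]
        self.b2 = [0.0] * n_out

    def forward(self, x):
        self.x = x
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.W1, self.b1)]
        self.o = [sigmoid(sum(w * hi for w, hi in zip(row, self.h)) + b)
                  for row, b in zip(self.W2, self.b2)]
        return self.o

    def backprop(self, target, lr=0.5):
        """One gradient-descent step on half squared error for one pattern;
        returns the error before the update."""
        d_o = [(o - t) * o * (1 - o) for o, t in zip(self.o, target)]
        d_h = [sum(self.W2[k][j] * d_o[k] for k in range(len(d_o)))
               * self.h[j] * (1 - self.h[j]) for j in range(len(self.h))]
        for k in range(len(d_o)):                 # output layer update
            for j in range(len(self.h)):
                self.W2[k][j] -= lr * d_o[k] * self.h[j]
            self.b2[k] -= lr * d_o[k]
        for j in range(len(d_h)):                 # hidden layer update
            for i in range(len(self.x)):
                self.W1[j][i] -= lr * d_h[j] * self.x[i]
            self.b1[j] -= lr * d_h[j]
        return 0.5 * sum((o - t) ** 2 for o, t in zip(self.o, target))
```

For the assignment proper you would loop this over all 16 parity patterns per epoch rather than a single pattern, and experiment with momentum or other variants.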
Secondly, you should write an alternative Genetic Algorithm training method for finding suitable values for all the weights and biases of the same network. Appropriate methods for encoding all the weights and biases on the genotype should be used, and a suitable fitness function designed.
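One common encoding is a flat genotype of real numbers, one gene per weight or bias, with fitness given by negated summed squared error over all patterns. The sketch below (Python, mutation-only truncation selection, no crossover) is one such assumed design, kept deliberately simple; population size, mutation width, and selection scheme are all choices you should explore yourself.

```python
import math, random

random.seed(1)

N_IN, N_HID, N_OUT = 4, 6, 2   # illustrative layer sizes, not prescribed
# one gene per weight and per bias, both layers
GENES = N_HID * (N_IN + 1) + N_OUT * (N_HID + 1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(geno, x):
    """Decode the flat genotype into the two weight layers and run the net.
    Each unit consumes its input weights then its bias, in genotype order."""
    g = iter(geno)
    h = [sigmoid(sum(next(g) * xi for xi in x) + next(g)) for _ in range(N_HID)]
    return [sigmoid(sum(next(g) * hi for hi in h) + next(g)) for _ in range(N_OUT)]

def parity_cases():
    """All 16 patterns: first target 1.0 for an odd number of 1s, second the opposite."""
    cases = []
    for n in range(16):
        bits = [(n >> i) & 1 for i in range(4)]
        odd = sum(bits) % 2
        cases.append((bits, [float(odd), 1.0 - odd]))
    return cases

CASES = parity_cases()

def fitness(geno):
    """Higher is better: negated summed squared error over all 16 patterns."""
    err = 0.0
    for x, t in CASES:
        o = forward(geno, x)
        err += sum((oi - ti) ** 2 for oi, ti in zip(o, t))
    return -err

def evolve(pop_size=30, gens=50, mut_sd=0.3):
    """Simple GA: keep the better half each generation, refill with
    Gaussian-mutated copies of randomly chosen survivors."""
    pop = [[random.gauss(0, 1) for _ in range(GENES)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]
        pop = elite + [[g + random.gauss(0, mut_sd) for g in parent]
                       for parent in random.choices(elite, k=pop_size - len(elite))]
    return max(pop, key=fitness)
```

A real run for the assignment would use many more generations, and you should consider crossover and how genotype encoding (e.g. gene ordering, mutation range) interacts with it.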
You should then use each training method independently, backprop and GA, on a version of the 4-bit parity problem. Here the 4 inputs can be any combination of 0s and 1s, and the desired target output of the first Output node is (as close as possible to) 0.0 when there is an even number of input 1s (i.e. 0, 2 or 4 1s) and 1.0 otherwise; the desired target for the second Output node is the opposite (1.0 for even, 0.0 for odd).
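The target mapping just described can be written out directly, which is useful for checking both training methods against the same table. A short Python sketch:

```python
from itertools import product

def parity_targets():
    """Target table for the 4-bit parity task: for each of the 16 input
    patterns, the first output target is 1.0 for an odd number of 1s and
    0.0 for an even number (0, 2 or 4 ones); the second is the opposite."""
    table = {}
    for bits in product((0, 1), repeat=4):
        odd = sum(bits) % 2
        table[bits] = (float(odd), float(1 - odd))
    return table
```

Note that the all-zeros pattern counts as even (zero 1s), so its targets are (0.0, 1.0).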
Each training method, backprop and GA, should be optimised as far as
possible, and then a comparison drawn between the performance of the two
methods. Is this problem more appropriate for one method than the
other?
NB: this link to a note on Generalisation is relevant.