Non-Symbolic Artificial Intelligence 2004
Inman Harvey, COGS 5C12, ext 8431 | inmanh@cogs.sussex.ac.uk
My lecture notes will be placed on this website, usually the day they are delivered. The first link is to the web pages, the second to the PowerPoint file.
Lec 1: html or ppt
Lec 2: html or ppt
Lec 3: html or ppt
Lec 4: html or ppt
Lec 5: html or ppt
Lec 6: html or ppt
Lec 7: html or ppt
Lec 8: html or ppt
Lec 9: html or ppt
Lec 10: html or ppt
Lec 11: html or ppt
Lec 12: html or ppt or handouts (.prn)
Lec 13: html or ppt or handouts (.prn)
Lec 14: html or ppt or handouts (.prn)
Lec 15: now incorporated into Lec 16
Lec 16: html or ppt or handouts (.prn)
Group A Mon 14:00 in PEV1-2A12
Group B Mon 15:00 in PEV1-2A12
Group C Tue 14:00 in PEV1-2A12
Group D Tue 15:00 in PEV1-2A12
Subjects for each seminar are as follows:-
Seminar week 2
Seminar week 3
Seminar week 4
Seminar week 5
Seminar week 6
(Remember: a 10% penalty applies to work handed in up to 24 hours late, and a 100% penalty after that!)
Firstly, a separate training part of the program should be written such that this ANN can have its weights and biases trained by back-propagation (or a variant).
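As a rough illustration only, not the required program, here is a minimal back-propagation sketch in Python (using NumPy) for a fully connected network with one hidden layer of sigmoid units. The layer sizes (4 inputs, 2 outputs, to match the parity task described below), the learning rate, and all class and function names are hypothetical choices.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """Hypothetical 4-H-2 sigmoid network: weights W1, W2 and biases b1, b2."""
    def __init__(self, n_in=4, n_hid=5, n_out=2, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hid))
        self.b1 = np.zeros(n_hid)
        self.W2 = rng.normal(0.0, 0.5, (n_hid, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        self.h = sigmoid(x @ self.W1 + self.b1)       # hidden activations
        self.o = sigmoid(self.h @ self.W2 + self.b2)  # output activations
        return self.o

    def backprop_step(self, x, target, lr=0.5):
        """One gradient-descent update on a single input/target pair."""
        o = self.forward(x)
        # Output-layer error term (derivative of the sigmoid is o * (1 - o)).
        delta_o = (o - target) * o * (1.0 - o)
        # Hidden-layer error term, propagated back through W2.
        delta_h = (delta_o @ self.W2.T) * self.h * (1.0 - self.h)
        # Gradient-descent updates of all weights and biases.
        self.W2 -= lr * np.outer(self.h, delta_o)
        self.b2 -= lr * delta_o
        self.W1 -= lr * np.outer(x, delta_h)
        self.b1 -= lr * delta_h
        return float(np.sum((o - target) ** 2))       # squared error, for monitoring

Training would then consist of repeatedly presenting all the training patterns and calling backprop_step on each until the total error stops improving; adding a momentum term is one common variant.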
Secondly, you should write an alternative Genetic Algorithm training method for finding suitable values for all the weights and biases of the same network. Appropriate methods for encoding all the weights and biases on the genotype should be used, and a suitable fitness function designed.
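One possible shape for the GA part, purely as a sketch under stated assumptions and not the required design: encode all weights and biases as a single flat vector of real numbers, use truncation selection with one-point crossover and Gaussian mutation, and let the fitness function be supplied by the caller (for example, minus the network's error on the training patterns). The decoding helper below assumes the hypothetical MLP class from the previous sketch; all names and parameter values are assumptions.

import numpy as np

def genotype_to_net(genes, net):
    """Decode a flat real-valued genotype into the network's weights and biases."""
    shapes = [net.W1.shape, net.b1.shape, net.W2.shape, net.b2.shape]
    pieces, i = [], 0
    for s in shapes:
        n = int(np.prod(s))
        pieces.append(np.asarray(genes[i:i + n]).reshape(s))
        i += n
    net.W1, net.b1, net.W2, net.b2 = pieces
    return net

def evolve(fitness, genome_len, pop_size=50, generations=200, mut_sigma=0.1, seed=1):
    """Simple generational GA: truncation selection, one-point crossover, Gaussian mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.normal(0.0, 1.0, (pop_size, genome_len))
    for _ in range(generations):
        scores = np.array([fitness(g) for g in pop])
        order = np.argsort(scores)[::-1]           # best (highest fitness) first
        parents = pop[order[:pop_size // 2]]       # keep the top half unchanged
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, genome_len)      # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(0.0, mut_sigma, genome_len)  # Gaussian mutation
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(g) for g in pop])
    return pop[int(np.argmax(scores))]             # best genotype found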
You should then use each training method, backprop and GA, independently on a version of the 4-bit parity problem. Here the 4 inputs can be any combination of 0s and 1s, and the desired target for the first Output node is (as close as possible to) 0.0 when there is an even number of input 1s (i.e. 0, 2 or 4 1s) and 1.0 otherwise; the desired target for the second Output node is the opposite (1.0 for even, 0.0 for odd).
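For concreteness, a small helper (again hypothetical) that builds the 16 input patterns and their two complementary targets as just described:

import itertools
import numpy as np

def parity_data():
    """All 16 four-bit patterns; first target is 1.0 for odd parity, second is 1.0 for even."""
    inputs = np.array(list(itertools.product([0, 1], repeat=4)), dtype=float)
    odd = inputs.sum(axis=1) % 2                   # 1.0 when the number of input 1s is odd
    targets = np.stack([odd, 1.0 - odd], axis=1)   # first node: odd; second node: even
    return inputs, targets

With these patterns, one natural fitness for the GA sketch above is, for instance, minus the total squared error of the decoded network over all 16 patterns, so that higher fitness corresponds to lower error; this ties the sketches together but is only one reasonable choice.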
Each training method, backprop and GA, should be optimised as far as possible, and a comparison then drawn between the performance of the two methods. Is this problem more suited to one method than the other?