Non-Symbolic Artificial Intelligence

Inman Harvey COGS 5C12 ext 8431 inmanh@cogs.sussex.ac.uk

Lectures

Lectures are Wed 12:30/Thu 12:30/Fri 10:15 in ARUN-401, in weeks 1 to 4 of Summer term.

Please note: these slides were prepared in PowerPoint and transferred to HTML for the online version. For some reason they do not display well online (misalignments etc.), particularly with Netscape -- please address all complaints to Microsoft!

Lecture 1
Lecture 2
Lecture 3
Lecture 4
Lecture 5
Lecture 6
Lecture 7
Lecture 8
Lecture 9
Lecture 10
Lecture 11
Lecture 12

Lecture 12 will be on a topic to be decided in consultation with the students the week before -- either covering a previous topic in more depth, or a new topic. The current version on the website (and in notes) is based on what was asked for last year.

Lecture Notes

Full lecture notes for lectures 1--12 will be available from Celia in the COGS library, for the cost of photocopying. The papers for Seminars in weeks 2 and 5 will also be available from there (NB: both are also available online, follow links below).

Seminars

Seminars will run from Week 2 to Week 5. You will be split into 4 groups as listed here (these are the authoritative up-to-date lists!):

Tue 14:00 PEV1-2B13
Tue 15:00 PEV1-2B13
Thu 15:00 PEV1-2A12
Thu 16:00 PEV1-2A12

You will be expected to attend the allotted seminars. If you need to change your seminar time, then please arrange to swap with someone in another seminar group AND notify me by email at the beginning of the week.

Seminar week 2
Seminar week 3
Seminar week 4
Seminar week 5


Assessed Coursework

50% of your assessment comes from a programming exercise, with a short (max 2000 words) report. This must be completed and handed in to the COGS School Office by 4pm on Thurs May 30 (week 6).

(Remember: a 10% penalty if up to 24 hrs late, a 100% penalty after that!)


Assessed Programming Project
Your program should implement a 3-layer Artificial Neural Net (ANN), with 4 Input nodes, 4 Hidden nodes and 2 Output nodes. The Hidden nodes each receive weighted inputs from all of the previous Input layer, plus a bias; the Output nodes likewise from the Hidden layer. Sigmoid transfer functions [ 1/(1+e^(-x)) ] should be used at nodes where appropriate. Using this ANN code, two separate training methods should then be implemented, for training the weights and biases on any set of Input/Output training examples:
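As a rough illustration of the required architecture, here is a minimal pure-Python sketch of one forward pass through the 4-4-2 net with sigmoid transfer functions. The function names and the weights-as-nested-lists representation are my own choices, not part of the specification; your implementation may differ.

```python
import math

def sigmoid(x):
    """Logistic transfer function 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, b_hidden, w_output, b_output):
    """One forward pass through a 4-4-2 net.
    w_hidden: 4 lists of 4 weights (one list per Hidden node);
    b_hidden: 4 biases; w_output: 2 lists of 4 weights; b_output: 2 biases."""
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(w_hidden, b_hidden)]
    outputs = [sigmoid(sum(w * h for w, h in zip(ws, hidden)) + b)
               for ws, b in zip(w_output, b_output)]
    return hidden, outputs
```

With all weights and biases zero, every node sees a net input of 0 and so outputs sigmoid(0) = 0.5, which is a handy sanity check.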

Firstly, a separate training part of the program should be written such that this ANN can have its weights and biases trained by back-propagation (or a variant).

Secondly, you should write an alternative Genetic Algorithm training method for finding suitable values for all the weights and biases of the same network. Appropriate methods for encoding all the weights and biases on the genotype should be used, and a suitable fitness function designed.
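One straightforward encoding is a flat genotype of 30 real numbers (16 hidden weights + 4 hidden biases + 8 output weights + 2 output biases), with fitness defined as negative summed squared error over the training set. The sketch below, with truncation selection, one-point crossover and Gaussian mutation, is only one of many reasonable GA designs; all names and parameter values here are illustrative assumptions.

```python
import math, random

GENES = 4*4 + 4 + 2*4 + 2   # 30 reals for the 4-4-2 net

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def decode(g):
    """Unpack a flat genotype into (wh, bh, wo, bo)."""
    wh = [g[i*4:(i+1)*4] for i in range(4)]
    bh = g[16:20]
    wo = [g[20 + i*4:20 + (i+1)*4] for i in range(2)]
    bo = g[28:30]
    return wh, bh, wo, bo

def net_out(inputs, g):
    wh, bh, wo, bo = decode(g)
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(wh, bh)]
    return [sigmoid(sum(w * h for w, h in zip(ws, hidden)) + b)
            for ws, b in zip(wo, bo)]

def fitness(g, examples):
    """Negative summed squared error over all examples: higher is better."""
    return -sum(sum((t - o) ** 2 for t, o in zip(targets, net_out(inp, g)))
                for inp, targets in examples)

def evolve(examples, pop_size=30, generations=50, mut_rate=0.1, mut_sd=0.5):
    pop = [[random.uniform(-1, 1) for _ in range(GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda g: fitness(g, examples), reverse=True)
        parents = scored[:pop_size // 2]          # truncation selection
        pop = parents[:]                          # elitist: parents survive
        while len(pop) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENES)      # one-point crossover
            child = a[:cut] + b[cut:]
            child = [x + random.gauss(0, mut_sd) if random.random() < mut_rate
                     else x for x in child]
            pop.append(child)
    return max(pop, key=lambda g: fitness(g, examples))
```

Rank-based or tournament selection, different mutation schemes, and different weight ranges are all worth experimenting with when you optimise the GA.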

You should then use each training method independently, backprop and GA, on a version of the 4-bit parity problem. Here the 4 inputs can be any combination of 0s and 1s, and the desired target output of the first Output node is (as close as possible to) 0.0 when there is an even number of input 1s (i.e. 0, 2 or 4 1s) and 1.0 otherwise; the desired target for the second Output node is the opposite (1.0 for even, 0.0 for odd).
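The full training set is just the 16 possible input patterns paired with their two targets as described above. A small sketch of generating it (the function name and tuple layout are my own):

```python
from itertools import product

def parity_examples():
    """All 16 input patterns for 4-bit parity.
    First target is 0.0 for an even number of 1s, 1.0 for odd;
    the second target is the opposite."""
    examples = []
    for bits in product([0, 1], repeat=4):
        odd = sum(bits) % 2 == 1
        targets = (1.0, 0.0) if odd else (0.0, 1.0)
        examples.append((bits, targets))
    return examples
```

Note that the two targets always sum to 1.0, so a correct net's two outputs should be complementary.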

Each training method, backprop and GA, should be optimised as far as possible, and a comparison then drawn between the performance of the two methods. Is this problem more appropriate for one method than the other?


Exam

50% of your assessment comes from an unseen exam, scheduled for 11th June 2002. This is one and a half hours, and you should answer 2 out of the given 3 questions.

For reference, here is a copy of the [Summer 2000 Exam]

and the [Summer 2001 Exam]