Non-Symbolic Artificial Intelligence 2006

Inman Harvey COGS 5C12 ext 8431 inmanh@cogs.sussex.ac.uk

Lectures

Lectures are Tue 11:00/Thu 16:00/Fri 09:00 in ARUN-401, in weeks 1 to 6 of Summer term (week 1 starts on the Thursday).

You can buy a copy of all lecture handouts from Celia in the Informatics Library, or you can download them all as a single pdf file here (4.6 MB, 87 pages).

Lecture notes for each individual lecture are below. The first link is to web pages, the second to a PowerPoint file, the third to a pdf. The pdf files are best if you are going to print them out.

 Lec 1 [ppt] [pdf]
 Lec 2 [ppt] [pdf]
 Lec 3 [ppt] [pdf]
 Lec 4 [ppt] [pdf]
 Lec 5 [ppt] [pdf]
 Lec 6 [ppt] [pdf]
 Lec 7 [ppt] [pdf]
 Lec 8 [ppt] [pdf]
 Lec 9 [ppt] [pdf]
 Lec 10 [ppt] [pdf]
 Lec 11 [ppt] [pdf]
 Lec 12 [ppt] [pdf]
 Lec 13 [ppt] [pdf]
 Lec 14 [ppt] [pdf]

Link to last year's 2005 NSAI Lectures

The syllabus for the 2005 lectures was broadly similar. Use this link.

Seminars

Seminars will run from Week 2 to Week 6. You will be in one of the two separate groups listed here; seminar times are
Thursdays 09:00-10:50 in PEV1-1A3
Fridays 14:00-15:50 in PEV1-1A1

Subjects for each seminar are posted here:-

Seminars week 2
Seminars week 3 
Seminars week 4
Seminars week 5


Coursework

50% of your assessment comes from a programming exercise, with a short (max 2000 words) report. This must be completed and handed in to the Informatics School office by 4pm on Thurs May 25 (week 6).

(Remember: a 10% penalty for up to 24 hrs late, a 100% penalty after that!)


Assessed Programming Project
Your program should implement a three-layer Artificial Neural Net (ANN), with 4 Input nodes, 4 Hidden nodes and 2 Output nodes. Each Hidden node receives weighted inputs from all of the Input layer, plus a bias; the Output nodes likewise from the Hidden layer. The sigmoid transfer function [ 1/(1+e^(-x)) ] should be used at nodes where appropriate. Using this ANN code, two separate training methods should then be implemented, for training the weights and biases on any set of Input/Output training examples:-
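As one illustration (in Python, though no particular language is prescribed), a forward pass through such a 4-4-2 net might be sketched as below. The function names and the weight layout are just one possible choice, not part of the specification:

```python
import math

def sigmoid(x):
    """Sigmoid transfer function 1/(1+e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hid, b_hid, w_out, b_out):
    """One forward pass through a 4-4-2 net.
    w_hid: 4x4 (one row of input weights per hidden node), b_hid: 4 biases;
    w_out: 2x4 (one row of hidden weights per output node), b_out: 2 biases.
    Returns (hidden activations, output activations)."""
    hidden = [sigmoid(sum(w * i for w, i in zip(row, inputs)) + b)
              for row, b in zip(w_hid, b_hid)]
    outputs = [sigmoid(sum(w * h for w, h in zip(row, hidden)) + b)
               for row, b in zip(w_out, b_out)]
    return hidden, outputs
```

Note that with all weights and biases at zero, every node outputs sigmoid(0) = 0.5, which is a handy sanity check on your own implementation.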

Firstly, a separate training part of the program should be written such that this ANN can have its weights and biases trained by back-propagation (or a variant).

Secondly, you should write an alternative Genetic Algorithm training method for finding suitable values for all the weights and biases of the same network. Appropriate methods for encoding all the weights and biases on the genotype should be used, and a suitable fitness function designed.
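A minimal GA along these lines might directly encode all 30 parameters (16 hidden weights + 4 hidden biases + 8 output weights + 2 output biases) as a flat list of reals, with fitness derived from summed squared error. The sketch below uses binary tournament selection and Gaussian mutation; these, and all the hyperparameters shown, are just one of many reasonable design choices and are not tuned:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def decode(geno):
    """Unpack a flat 30-gene genotype into the 4-4-2 net's parameters.
    The ordering (hidden weights, hidden biases, output weights,
    output biases) is an arbitrary but fixed layout."""
    w_hid = [geno[i * 4:(i + 1) * 4] for i in range(4)]
    b_hid = geno[16:20]
    w_out = [geno[20 + k * 4:20 + (k + 1) * 4] for k in range(2)]
    b_out = geno[28:30]
    return w_hid, b_hid, w_out, b_out

def net_error(geno, examples):
    """Summed squared error of the decoded net over all examples
    (lower is fitter)."""
    w_hid, b_hid, w_out, b_out = decode(geno)
    total = 0.0
    for x, target in examples:
        hid = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
               for row, b in zip(w_hid, b_hid)]
        out = [sigmoid(sum(w * h for w, h in zip(row, hid)) + b)
               for row, b in zip(w_out, b_out)]
        total += sum((t - o) ** 2 for t, o in zip(target, out))
    return total

def evolve(examples, pop_size=30, gens=100, mut_sd=0.3, mut_rate=0.1):
    """Generational GA: binary tournaments pick a parent, whose child
    gets Gaussian creep mutation on each gene with prob mut_rate.
    Returns the lowest-error genotype in the final population."""
    pop = [[random.uniform(-2, 2) for _ in range(30)]
           for _ in range(pop_size)]
    for _ in range(gens):
        new_pop = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)
            parent = a if net_error(a, examples) < net_error(b, examples) else b
            child = [g + random.gauss(0, mut_sd)
                     if random.random() < mut_rate else g
                     for g in parent]
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=lambda g: net_error(g, examples))
```

Recombination, elitism, and the fitness scaling you choose are exactly the kind of design decisions your report should discuss.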

You should then use independently each training method, backprop and GA, on a version of the 4-bit parity problem. Here the 4 inputs can be any combination of 0s and 1s, and the desired target output of the first Output node is (as close as possible to) 0.0 when there is an even number of input 1s (i.e. 0, 2 or 4 1s) and 1.0 otherwise; the desired target for the second Output node is the opposite (1.0 for even, 0.0 for odd).
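The full training set here is just the 16 possible bit patterns. A sketch of generating them (again in Python, with the targets laid out as described above):

```python
from itertools import product

def parity_examples():
    """The 16 training cases for 4-bit parity. First output target
    is 1.0 for an odd number of input 1s, 0.0 for even; the second
    output target is the complement."""
    examples = []
    for bits in product([0, 1], repeat=4):
        odd = sum(bits) % 2
        examples.append((list(bits), [float(odd), float(1 - odd)]))
    return examples
```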

Each training method, backprop and GA, should be optimised as far as possible, and then a comparison drawn between performance with the two methods. Is this problem more appropriate for one method than the other?

NB: this link to a note on Generalisation is relevant.


Exam

50% of your assessment comes from an unseen exam on Wed 14th June at 9:30. The exam lasts one and a half hours, and you should answer 2 of the 3 given questions.
For reference, copies of past exam papers can be found [here]. The syllabus for 2005, 2004, 2002 and 2001 was similar to this year's, but the 2003 syllabus was a bit different.
Warning: you will have to type in "Non-Symbolic Artificial Intelligence" in full as the name of the course; the database doesn't seem to like abbreviations.