Lectures
Lectures are on Mondays at 12:00, Thursdays at 11:00 and Fridays at 10:00,
in ARUN-401, in weeks 1 to 6 of Summer term.
N.B.: Monday May 2nd (week 3) is a Bank Holiday.
My lecture notes will be posted on this website, usually the day they are
delivered. The first link goes to the web pages, the second to the
PowerPoint file, and the third to the PDF.
Lec 1 [ppt] [pdf]
Lec 2 [ppt] [pdf]
Lec 3 [ppt] [pdf]
Lec 4 [ppt] [pdf]
Lec 5 [ppt] [pdf]
Lec 6 [ppt] [pdf]
Lec 7 [ppt] [pdf]
Lec 8 [ppt] [pdf]
Lec 9 [ppt] [pdf]
Lec 10 [ppt] [pdf]
Lec 11 [ppt] [pdf] (a guest lecture by Eric Vaughan on current work in Passive Dynamic Walking)
Lec 12 [ppt] [pdf]
Lec 13 [ppt] [pdf]
Lec 14 [ppt] [pdf]
Lec 15 [ppt] [pdf]
Link to last year's (2004) NSAI lectures: the syllabus for the 2004
lectures was broadly similar. Use this link.
Seminars
Seminars will run from Week 2 to Week 6. You will be in one of two
separate groups; seminar times are:
Mondays 14:00-15:50 in PEV1-1A1
Thursdays 09:00-10:50 in PEV1-2A12
Subjects for each seminar will be posted here:-
Seminars week 2
Seminars week 3 (N.B.: because of the Monday Bank Holiday, the Monday
session moves to Thursday 10:00 in PEV1-2A12)
Seminars week 4
Seminars week 5
Seminar week 6: one session only, Mon 23 May 14:00 in PEV1-1A1:
"Agony Aunt" session, bring your (NSAI) problems.
Coursework
50% of your assessment comes from a Programming exercise, with a short
(max 2000 words) report. This must be completed and handed in to the
Informatics School office by 4pm on Thursday May 26 (week 6).
(Remember the penalties: 10% up to 24 hours late, 100% after that!)
Assessed Programming Project
Your program should implement a 3-layer Artificial Neural Net (ANN),
with 4 Input nodes, 4 Hidden nodes and 2 Output nodes. The Hidden nodes
each receive weighted inputs from all of the previous Input layer, plus
a bias; the Output nodes likewise from the Hidden layer. Sigmoid
transfer functions [ 1/(1+e^(-x)) ] should be used at nodes where
appropriate. Using this ANN code, two separate training methods should
then be implemented, for training the weights and biases on any set of
Input/Output training examples:-
Firstly, a separate training part of the program should be written
such
that this ANN can have its weights and biases trained by
back-propagation (or a variant).
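A minimal sketch of how such a back-propagation trainer might be structured, assuming online (per-example) updates with the standard sigmoid delta rule; the learning rate, weight-initialisation range, epoch count and all function names here are illustrative choices of my own, not requirements of the assignment:

```python
import math
import random

def sigmoid(x):
    # logistic transfer function 1/(1+e^(-x)), as specified for the nodes
    return 1.0 / (1.0 + math.exp(-x))

def train_backprop(examples, lr=0.5, epochs=500, seed=0):
    # Online back-propagation for the 4-4-2 sigmoid net described above.
    # examples: list of (input_list_of_4, target_list_of_2) pairs.
    rng = random.Random(seed)
    wh = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(4)]  # hidden weights
    bh = [rng.uniform(-1, 1) for _ in range(4)]                      # hidden biases
    wo = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(2)]  # output weights
    bo = [rng.uniform(-1, 1) for _ in range(2)]                      # output biases

    def predict(x):
        # forward pass: each node sums weighted inputs plus a bias
        h = [sigmoid(sum(w * v for w, v in zip(ws, x)) + b)
             for ws, b in zip(wh, bh)]
        o = [sigmoid(sum(w * v for w, v in zip(ws, h)) + b)
             for ws, b in zip(wo, bo)]
        return h, o

    for _ in range(epochs):
        for x, target in examples:
            h, o = predict(x)
            # output delta: (t - o) * o * (1 - o), using the sigmoid derivative
            d_o = [(t - y) * y * (1 - y) for t, y in zip(target, o)]
            # hidden deltas propagate the output deltas back through wo
            d_h = [h[j] * (1 - h[j]) * sum(d_o[k] * wo[k][j] for k in range(2))
                   for j in range(4)]
            for k in range(2):
                for j in range(4):
                    wo[k][j] += lr * d_o[k] * h[j]
                bo[k] += lr * d_o[k]
            for j in range(4):
                for i in range(4):
                    wh[j][i] += lr * d_h[j] * x[i]
                bh[j] += lr * d_h[j]

    return lambda x: predict(x)[1]
```

Variants mentioned in the lectures (momentum, batch updates) would slot into the weight-update loop.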
Secondly, you should write an alternative Genetic Algorithm training
method for finding suitable values for all the weights and biases of
the
same network. Appropriate methods for encoding all the weights and
biases on the genotype should be used, and a suitable fitness function
designed.
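One possible shape for the GA trainer, again as a sketch only: here the genotype is simply the 30 real-valued weights and biases (16 + 4 hidden, 8 + 2 output) laid out in a fixed order, fitness is the negative summed squared error over the training set, and truncation selection with Gaussian mutation stands in for whatever selection and variation operators you actually choose; all parameter values and names are my own illustrative assumptions:

```python
import math
import random

def train_ga(examples, pop_size=30, gens=100, seed=0):
    # GA over a 30-gene real-valued genotype encoding every weight and bias.
    rng = random.Random(seed)

    def sig(x):
        return 1.0 / (1.0 + math.exp(-x))

    def predict(geno, x):
        # decode: genes 0-15 hidden weights, 16-19 hidden biases,
        # 20-27 output weights, 28-29 output biases
        wh = [geno[4 * j:4 * j + 4] for j in range(4)]
        bh = geno[16:20]
        wo = [geno[20 + 4 * k:24 + 4 * k] for k in range(2)]
        bo = geno[28:30]
        h = [sig(sum(w * v for w, v in zip(ws, x)) + b) for ws, b in zip(wh, bh)]
        return [sig(sum(w * v for w, v in zip(ws, h)) + b) for ws, b in zip(wo, bo)]

    def fitness(geno):
        # negative summed squared error over all training examples
        return -sum((t - y) ** 2
                    for x, targ in examples
                    for t, y in zip(targ, predict(geno, x)))

    pop = [[rng.uniform(-5, 5) for _ in range(30)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]          # truncation selection
        children = []
        for _ in range(pop_size - len(survivors)):
            parent = rng.choice(survivors)
            # Gaussian mutation of each gene with small probability
            children.append([g + rng.gauss(0, 0.5) if rng.random() < 0.2 else g
                             for g in parent])
        pop = survivors + children
    best = max(pop, key=fitness)
    return best, predict
```

Crossover is omitted here for brevity; adding it, and tuning mutation rate and population size, is part of the optimisation the assignment asks for.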
You should then use each training method independently, backprop and GA,
on a version of the 4-bit parity problem. Here the 4 inputs can be any
combination of 0s and 1s, and the desired target output of the first
Output node is (as close as possible to) 0.0 when there is an even
number of input 1s (i.e. 0, 2 or 4 1s) and 1.0 otherwise; the desired
target for the second Output node is the opposite (1.0 for even, 0.0
for
odd).
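The full 4-bit parity training set described above has only 16 patterns, so it can be enumerated directly; as a sketch (the function name is my own):

```python
from itertools import product

def parity_examples():
    # All 16 input patterns. First target node: 1.0 for an odd number
    # of input 1s, 0.0 for even; second node is the complement.
    data = []
    for bits in product([0, 1], repeat=4):
        odd = sum(bits) % 2
        data.append((list(bits), [float(odd), float(1 - odd)]))
    return data
```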
Each training method, backprop and GA, should be optimised as far as
possible, and then a comparison drawn between performance with the two
methods. Is this problem more appropriate for one method than the
other?
NB: this link to a note on Generalisation is relevant.
Exam
50% of your assessment comes from an unseen exam, scheduled for Tuesday
21st June 2005 at 09:30.
The exam lasts one and a half hours, and you should answer 2 of the 3
questions given.
For reference, copies of past exam papers can be found
[here].
The syllabus for 2004, 2002 and 2001 was similar to this year's, but
2003's was a bit different.
Warning: you will have to type in "Non-Symbolic Artificial
Intelligence" in full as the name of the course; the database doesn't
seem to accept abbreviations.