Neural Networks and Connectionist Computing
Fall 2001, TuTh 9:30-10:45am, 120 TMCB
Professor:
Tony Martinez
3360 TMCB, x6464
Office Hours: by appointment
http://axon.cs.byu.edu/~martinez
TA:
Jennifer Stallard
Neural Network Lab: 3325 TMCB, 378-1660
e-mail: jrstall@cs.byu.edu
Office Hours: M 1-2:50pm and W 10-11:50am
Also available by appointment or whenever in Neural Network Lab
Office hours on Oct 24th are changed to: 2-3pm and 5-6pm
Midterm review session is Tues, Oct 23rd at 5pm in 1120 TMCB


Goals: Introduce and study the philosophy, utility, and models of connectionist computing, such that students are able to propose original research with potential follow-up in a graduate research program, and expand students' creativity in all aspects of computing.

Text: A prepared packet of overhead copies, which can be purchased at the bookstore. You will be expected to read the assigned literature before (and, optionally, again after) the scheduled lecture. To help motivate reading, I will pass around a sheet each day for you to mark whether you have done a complete and careful reading, a partial reading, or no reading of the material for that day's lecture. A complete reading counts for 3 points, a partial reading for 1 point, and no reading for 0 points. The grading will be non-linear, such that missing one or two readings does not hurt much, but the penalty picks up quickly after that.
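As a rough illustration only (the course does not state an exact formula; the quadratic penalty below is my own assumption), a non-linear reading score along these lines might look like this in Python:

# Hypothetical non-linear reading score: forgiving of a missed reading
# or two, but the penalty accelerates as more readings are missed.
# The quadratic penalty is an illustrative assumption, not the course formula.
def reading_score(points_earned, points_possible):
    missed = 1.0 - points_earned / points_possible   # fraction of points missed
    return 100.0 * (1.0 - missed ** 2)               # flat near 100%, steep below

# Example with 15 readings at 3 points each (45 points possible):
print(reading_score(39, 45))  # missed 2 readings: ~98.2%
print(reading_score(24, 45))  # missed 7 readings: ~78.2%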

Prerequisites: Senior or Graduate standing, 380, 312, Math 343 (linear algebra), Creativity.

Software: You can do your simulations, etc., in the department open labs using the NeuroSolutions simulator that comes with the book. It is already installed on the open lab machines, so you should not need the CD there. It looks like a pretty good simulator with numerous options; note, however, that the version we have does not allow saving weights.

Extra Literature: I have a number of papers in my office that can be looked over and copied under the constraint of the 20-minute rule (the paper must be back to me within 20 minutes). I can also request almost any paper you wish through interlibrary loan (and will do so), but it usually takes 2-3 weeks, so plan ahead.

Grading (approximate): Reading: 10%, Simulations and Homework: 25%, Midterm: 21%, Project: 22%, Final: 22% (Fri., Dec. 15, 7am-10am). Grading is on a curve, and some amount of subjectivity is allowed for attendance, participation, perceived effort, etc. If you think, you'll be all right.

Late assignments: Assignments are expected on time (at the beginning of class on the due date). Late papers will be marked off 5% per school day late. However, if you have unusual circumstances (illness, being out of town, something unique from what the other students face, etc.) and you inform me of them, I will not take off any late points. Nothing will be accepted after the last day of class instruction.

Project: An in-depth effort on a particular aspect of neural networks. A relatively extensive literature search in the area is expected, with a subsequent bibliography. Projects typically fall along the following spectrum:

Best: Some of your own original thinking and the proposal of a network, learning paradigm, system, etc. This (like the other project types) typically benefits from some computer simulation to bear out its potential.
Very Good: Starting from an in-depth study of some current model, strive to extend it through some new mechanisms.
Not Bad: A study of a current model with an in-depth analysis of its strengths, weaknesses, potential, and suggested research.
Not Good: A mere description of a current model.

The earlier you start the better. Note that in a semester course like this, you will have to choose a topic when we have covered only half of the material. That does not mean your project must be limited to topics from the first half of the semester. Use your own initiative and the resources available (library literature, texts, me, etc.) to find any topic of interest to you, regardless of whether we have covered or will cover it in class. Interesting models which we will probably not have time to cover in depth in class include: Feldman nets, Kohonen maps, HOTLUs, BAMs, CMAC, ASN, Cognitron, Neocognitron, BoltzCONS, Michie Boxes, Cauchy Machines, Counterpropagation, Madaline II, Associative Networks, RCE, etc.


Topic (number of class periods) Reading Assignment
1. Introduction to Neural Networks, Learning, NN Goals (2) Appendix B, Ch. 1
2. Brain and Nervous System (2) Your Neural Network
3. Definitions, Theory, Learning, Applications, and General Mechanisms of Neural Networks (2) Ch. 2
4. Delta Rule Models - Linear Associators, Perceptron, Adaline, Quadric Machines, Higher-Order Networks, Committee Machines, and separability issues (3) Ch. 3.1-3.3 Delta Rule Simulation (see the delta rule sketch after the schedule)
5. Back-Propagation (2) Ch. 3.4-3.12 Back-Propagation Simulation
6. Applications, Learning ensembles (2) Ch. 4
7. Back-Propagation variants and extensions (2)
Midterm (1)
8. Radial Basis Function networks, Ontogenic networks (2) Ch. 5 RBF Simulation
9. Hopfield Networks (2) Ch. 6, 11.10
10. Boltzmann Machines (1) Paper
11. Unsupervised learning, Competitive Learning (2) Ch. 7 CL Simulation
12. Survey of other models, implementation, future research (1-2)
Oral Presentations (2-3) Final Project Paper
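For students wanting a head start on the Delta Rule Simulation in item 4, here is a minimal sketch of the Widrow-Hoff (delta rule) update for a single linear unit. The AND data, learning rate, and epoch count are illustrative choices of mine, not taken from the course materials:

# Minimal delta rule (Widrow-Hoff) sketch: w <- w + lr * (t - y) * x,
# applied to a single linear unit with a bias term.
def train_delta(inputs, targets, lr=0.1, epochs=100):
    n = len(inputs[0])
    w = [0.0] * n   # weights, one per input
    b = 0.0         # bias
    for _ in range(epochs):
        for x, t in zip(inputs, targets):
            y = sum(wi * xi for wi, xi in zip(w, x)) + b  # linear output
            err = t - y                                   # target minus output
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learn the linearly separable AND function with -1/+1 targets.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
T = [-1, -1, -1, 1]
w, b = train_delta(X, T)
for x, t in zip(X, T):
    y = sum(wi * xi for wi, xi in zip(w, x)) + b
    print(x, "target", t, "-> output", 1 if y > 0 else -1)

Since AND is linearly separable, thresholding the trained linear output at zero should classify all four patterns correctly; non-separable data, by contrast, is where the separability issues of item 4 show up.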