Neural Networks and Connectionist Computing

Fall 99, TuTh 9:35-10:50 am, 120 TMCB

Professor: Tony Martinez, 3366 TMCB, x6464, Office Hours: TuTh 3:00-4:00 pm or by appointment, http://axon.cs.byu.edu/~martinez

TA: Butch Istook, 378-3877, 2220 TMCB, Office Hours: TuTh 2-3 pm, MW 10-11 am, or by appointment


Goals: Introduce and study the philosophy, utility, and models of connectionist computing, such that students are able to propose original research with potential follow-up in a graduate research program. Expand the creativity of the students in all aspects of computing.

Text: Mohamad Hassoun, Fundamentals of Artificial Neural Networks, plus a prepared packet of overhead copies, which can be purchased at the bookstore. You will be expected to read the assigned literature before and optionally after the scheduled lecture. To help motivate reading, I will pass around a sheet for you to mark whether you have done a complete and careful reading, a partial reading, or no reading. A complete reading counts 3 points, a partial reading 1 point, and no reading 0 points. Each day, mark whether you have done the reading for that day's lecture. The grading will be non-linear: missing one or two readings does not hurt much, but the penalty picks up fast after that (see the illustrative sketch at the end of this syllabus).

Prerequisites: Senior or graduate standing, 380, 312, Math 343 (linear algebra), and creativity.

Software: You can do your simulations, etc., in the department open labs using SNNS (Stuttgart Neural Network Simulator). One of a number of mirror sites for SNNS is http://hertz.ee.sun.ac.za/SNNSinfo/. The software is free, and it is a fairly comprehensive and powerful package used by a number of neural network researchers. It has reasonable and extensive on-line help and overviews.

Extra Literature: I have a number of papers in my office that can be looked over and copied under the constraint of the 20-minute rule (the paper must be back to me within 20 minutes). I can also send for almost any paper you wish through interlibrary loan (and will do so), but it usually takes 2-3 weeks, so plan ahead.

Grading (approximate): Reading: 10%, Simulations and Homework: 25%, Midterm: 21%, Project: 22%, Final: 22% (Mon., Dec. 13, 7-10 am). Grading is on a curve, and some amount of subjectivity is allowed for attendance, participation, perceived effort, etc. If you think, you'll be all right.

Late Assignments: Assignments are expected on time (beginning of class on the due date). Late papers will be marked off at 5% per school day late. However, if you have unusual circumstances (sick, out of town, something unique from what all the other students have, etc.) and you inform me of them, I will not take off any late points. Nothing will be accepted after the last day of class instruction.

Project: An in-depth effort on a particular aspect of neural networks. A relatively extensive literature search in the area is expected, with a subsequent bibliography. Projects typically rank as follows:

Best: Some of your own original thinking and a proposal of a network, learning paradigm, system, etc. This (like the other project types) typically benefits from some computer simulation to bear out its potential.

Very Good: Starting from an in-depth study of some current model, strive to extend it through some new mechanisms.

Not Bad: A study of a current model with an in-depth analysis of its strengths, weaknesses, potential, and suggested research.

Not Good: A description of a current model.

The earlier you start, the better. Note that in a semester course like this, you will have to choose a topic when we have covered only half of the material. That does not mean your project must cover items related to the first half of the semester. You should use your own initiative and the resources available (library literature, texts, me, etc.) to peruse and find any topic of interest to you, regardless of whether we have covered or will cover it in class. Interesting models which we will probably not have time to cover in depth in class include: Feldman nets, Kohonen maps, HOTLUs, BAMs, CMAC, ASN, Cognitron, Neocognitron, BoltzCONS, Michie Boxes, Cauchy Machines, Counterpropagation, Madaline II, Associative Networks, RCE, etc.
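
A note on the non-linear reading grade: no exact formula is given above, so the following Python sketch is purely a hypothetical illustration of one scheme with the stated shape (missing one or two readings costs little, but the penalty grows quickly after that). The function name, the quadratic curve, and the 30-lecture example are assumptions for illustration, not the actual course formula.

    # Hypothetical illustration only -- not the actual grading formula.
    def reading_score(points_earned, num_lectures, max_per_lecture=3):
        """Map raw reading points (3 complete / 1 partial / 0 none per
        lecture) to a 0-100 score with a gentle-then-steep penalty."""
        fraction = points_earned / (num_lectures * max_per_lecture)
        shortfall = 1.0 - fraction
        # Squaring the shortfall keeps one or two missed readings nearly
        # free, while larger shortfalls are penalized increasingly hard.
        return 100.0 * (1.0 - shortfall ** 2)

    # Example: two complete readings missed out of 30 lectures -> about 99.6
    print(reading_score(points_earned=84, num_lectures=30))

Any curve that is flat near a perfect record and steep far from it would have the same qualitative effect; the quadratic above is simply the shortest such example.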