CS 678 - Advanced Machine Learning and Neural Networks



Winter 2015, TuTh 9:30-10:45am, 134 TMCB

Professor: Tony Martinez, 3326 TMCB, x6464, http://axon.cs.byu.edu/~martinez


Office Hours: by appointment


Course Website: http://axon.cs.byu.edu/~martinez/classes/678/


Goals:  Continue the study of the philosophy, utility, and models of machine learning, such that students are able to propose original research with potential for follow-up in a graduate research program, and expand students' creativity in all aspects of computing.


Prerequisites:  CS 478 (Introduction to Neural Networks and Machine Learning)


Text:  Papers available from the class website.  We will also read a few chapters from Machine Learning by Tom Mitchell.  You are expected to read the assigned literature before the scheduled lecture, and optionally review it again afterward.


Assignments:  We will have three projects during the semester in which you build and experiment with new machine learning models.  You will also present a paper of your choice to the class.  There will be a few small homework assignments, a midterm, and a final.


Grading (approximate weights):

- Paper Presentation
- Recurrent Neural Network Project
- Deep Learning Project
- Model-of-Your-Choice Project
- Homework
- Midterm and Final Exams

Grading is on a curve, and some amount of subjectivity is allowed for attendance, participation, perceived effort, etc.  If you think, you'll be all right.  The midterm exam will be in the Testing Center Feb. 23-25, and the final exam will be held in our classroom on Friday, April 17, from 7:00-10:00am.


Late assignments:  Assignments are due on time (hard copy at the beginning of class on the due date).  Late work will be marked down 5% per school day late, up to a maximum of 50% off.  However, if you have unusual circumstances (illness, travel, something unique to your situation, etc.) and let me know about them, I will not deduct any late points.  Nothing can be accepted after the last day of class instruction.


Topics (Order is approximate):

- Von Neumann bottleneck / neurobiology primer
- Advanced Backpropagation Concepts
  o On-line vs. Batch learning (see the sketch following this list)
  o Classification-Based Learning
  o Other (Higher-Order Nets, Ontogenic Nets)
- Recurrent Neural Networks (Elman Nets, BPTT, RTRL)
- Support Vector Machines (with a brief review of Quadric/Higher-Order Machines and RBF networks)
- Hopfield Networks
- Boltzmann Machines
- Deep Learning
- HMMs (with Baum-Welch learning, i.e., the EM algorithm), with speech recognition as a detailed example application
- MULTCONS and Hopfield Extensions
- Semi-Supervised Learning
- Rule-Based Learning (Sequential Covering, CN2)
- Reinforcement Learning
- Ensembles (Variations, BMC vs. BMA, Oracle Learning, etc.)
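
The on-line vs. batch distinction under Advanced Backpropagation Concepts is easy to see in code.  The sketch below is a minimal, illustrative Python example, not course material: it trains a single linear unit with gradient descent on synthetic data, once with per-example (on-line) updates and once with one averaged (batch) update per epoch.  The data, learning rate, and epoch count are assumptions chosen only for illustration.

# Minimal sketch contrasting on-line (per-example) and batch weight updates
# for a single linear unit trained with gradient descent on squared error.
# The data, learning rate, and epoch count below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 100 examples, 3 features
w_true = np.array([1.5, -2.0, 0.5])           # weights used to generate targets
y = X @ w_true + 0.1 * rng.normal(size=100)   # noisy linear targets

def online_epoch(w, lr=0.01):
    # On-line (stochastic) learning: update the weights after every example.
    for x_i, y_i in zip(X, y):
        error = y_i - x_i @ w
        w = w + lr * error * x_i
    return w

def batch_epoch(w, lr=0.01):
    # Batch learning: average the gradient over all examples, then update once.
    errors = y - X @ w
    return w + lr * (X.T @ errors) / len(X)

w_online = np.zeros(3)
w_batch = np.zeros(3)
for _ in range(500):
    w_online = online_epoch(w_online)
    w_batch = batch_epoch(w_batch)

print("on-line:", np.round(w_online, 2))   # both should approach w_true
print("batch:  ", np.round(w_batch, 2))

The usual trade-off, which the course treats in more depth: on-line updates are noisier but typically make progress with fewer passes over the data, while batch updates follow the exact gradient of the training-set error.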


Next Group of Topics as Time Allows (Based on Needs):

- Bias: Interesting/Computable Problems, Bias-Variance Decomposition
- ADIB (Automatic Discovery of Inductive Bias) / latest lab research
- Structured Prediction
- Manifold Learning / Non-Linear Dimensionality Reduction
- Record Linkage / Family History Directions
- Feature Selection
- Computational Learning Theory
- Transfer Learning
- Other Unsupervised Learning Models


Topics and Readings Schedule:


Readings Presentations Schedule:

Project Presentations Schedule:

University and Other Items: