CS 678: Advanced Machine Learning and Neural Networks
Syllabus
Winter 2018, TuTh 1:35-2:50pm, 3718 HBLL
Professor: Tony Martinez, 3326 TMCB, x6464, http://axon.cs.byu.edu/~martinez
Office Hours: by appointment
Course Website: http://axon.cs.byu.edu/~martinez/classes/678/
Goals: Continuation of the study of the philosophy, utility, and models of machine learning, such that students are able to propose original research with potential follow-up in a graduate research program. Expand the creativity of students in all aspects of computing.
Prerequisites: CS 478 (Introduction to Neural Networks and Machine Learning)
Text: Papers available from the class website. We will also read a few chapters from Machine Learning by Tom Mitchell. You are expected to read the assigned literature before, and optionally again after, the scheduled lecture.
Assignments: We will have two projects during the semester, allowing you to build and experiment with some new machine learning models. There will also be a few homeworks, a midterm, and a final.
Grading (approximate):
Homeworks                      8%
Your Model Readings            4%
Model of Your Choice Project  22%
Deep Learning Project         22%
Midterm                       22%
Final                         22%
Grading is on a curve, with some subjective adjustment allowed for attendance, participation, perceived effort, etc.
The midterm exam will be in the testing center Feb. 21-23, and the final exam will be in class on Monday, April 23 from 2:30pm-5:30pm.
Late assignments: Assignments are due on time (hard copy at the beginning of class on the due date). Late papers will be marked down 10% per school day late. However, if you have unusual circumstances (sickness, travel, something unique from what all the other students have, etc.) and inform me of them, I will not deduct late points. Nothing can be accepted after the last day of class instruction.
Topics (order is approximate):
• Von Neumann bottleneck/neurobiology primer
• Advanced Backpropagation Concepts
  o Online vs. Batch
  o Classification-Based Learning
  o Other (Higher-Order Nets, Ontogenic Nets)
• Hopfield Networks
• Boltzmann Machines
• Recurrent Neural Networks (Elman Nets, BPTT, RTRL)
• Deep Learning
• Support Vector Machines (with brief review of Quadric/Higher-Order Machines and RBF networks)
• HMMs (with Baum-Welch learning, an EM algorithm), with detailed speech recognition as the example platform
• MULTCONS, Hopfield Extensions
• Rule-Based Learning (Sequential Covering, CN2)
• Semi-Supervised Learning
Next Group of Topics as Time Allows (Based on Needs):
• Ensembles (Variations, BMC vs. BMA, Oracle Learning, etc.)
• Bias: Interesting/computable problems, Bias-Variance Decomposition
• ADIB (Automatic Discovery of Inductive Bias)/Latest Lab Research
• Structured Prediction
• Manifold Learning/Non-Linear Dimensionality Reduction
• Record Linkage/Family History Directions
• Meta-Learning
• Feature Selection
• Computational Learning Theory
• Transfer Learning
• Transduction
• Reinforcement Learning
• Other Unsupervised Learning Models
Model of Your Choice Presentations Schedule:
Deep Learning Model Presentations Schedule: