CS 678 - Advanced Machine Learning and Neural Networks
Syllabus
Winter 2019, TuTh 9:30-10:45am, 134 TMCB
Professor: Tony Martinez, 3336 TMCB, x6464, http://axon.cs.byu.edu/~martinez
Office Hours: by appointment
Course Website: http://axon.cs.byu.edu/~martinez/classes/678/
Goals: Continuation of the study of the philosophy, utility, and models of machine learning, such that students are able to propose original research with potential follow-up in a graduate research program. Expand the creativity of the students in all aspects of computing.
Prerequisites: CS 478 (Introduction to Neural Networks and Machine Learning)
Text: Papers available from the class website. We will also read a few chapters from Machine Learning by Tom Mitchell. You are expected to read the assigned literature before (and optionally again after) the scheduled lecture.
Assignments: We will have two projects during the semester, allowing you to build and experiment with some new machine learning models. There will also be a few homework assignments, a midterm, and a final.
Grading (~):
Homeworks: 8%
Your Model Readings: 4%
Model of Your Choice Project: 22%
Deep Learning Project: 22%
Midterm: 22%
Final: 22%
Grading is on a curve, with some subjective adjustment allowed for attendance, participation, perceived effort, etc.
The midterm exam will be in the Testing Center Feb. 25-27, and the final exam will be in class on Friday, April 19, from 7:00-10:00am.
Late assignments: Assignments are expected on time (hard copy at the beginning of class on the due date). Late papers will be marked off at 10% per school day late. However, if you have unusual circumstances (sick, out of town, something unique from what the other students face, etc.) and you inform me of them, I will not take off late points. Nothing can be accepted after the last day of class instruction.
Topics (Order is approximate):
• Von Neumann bottleneck/neurobiology primer
• Advanced Backpropagation Concepts
  o On-line vs. Batch
  o Classification Based Learning
  o Other (Higher Order Nets, Ontogenic Nets)
• Hopfield Networks
• Boltzmann Machines
• Recurrent Neural Networks (Elman Nets, BPTT, RTRL)
• Deep Learning
• Support Vector Machines (with a brief review of Quadric/Higher Order Machines and RBF networks)
• HMMs (with Baum-Welch learning - the EM algorithm), with detailed speech recognition as the example platform
• MULTCONS, Hopfield Extensions
• Rule Based Learning (Sequential Covering, CN2)
• Semi-Supervised Learning
Next Group of Topics as Time Allows (Based on Needs):
• Ensembles (Variations, BMC vs. BMA, Oracle Learning, etc.)
• Bias: Interesting/computable problems, Bias-Variance Decomposition
• ADIB (Automatic Discovery of Inductive Bias)/Latest Lab Research
• Structured Prediction
• Manifold Learning/Non-Linear Dimensionality Reduction
• Meta-Learning
• Feature Selection
• Computational Learning Theory
• Transfer Learning
• Transduction
• Other Unsupervised Learning Models
Model of Your Choice Presentations Schedule:
Deep Learning Model Presentations Schedule: