This is an inactive course webpage. Find the one for your current semester.
For information about CSE517a in spring 2018 go HERE.
- Thanks for the semester!
- Thanks for filling out the course evaluation!
- Check Piazza for homework and autograder related announcements.
- Pick-up Final Exam and Resolve Grade Issues:
- TUE 5/9 3-4pm in Jolley 222 (Marion)
- Any grade issues need to be resolved by TUE 5/9 5pm!
- No office hours WED 5/10 4-5pm!
This course assumes a basic understanding of machine learning (including the theoretical foundations and principles of ML as well as hands-on implementation experience). CSE517a covers advanced topics at the frontier of the field in-depth. Topics to be covered include kernel methods (support vector machines, Gaussian processes), neural networks (deep learning), and unsupervised learning. Depending on developments in the field, the course will also cover some advanced topics, which may include learning from structured data, active learning, and practical machine learning (feature selection, dimensionality reduction).
Prerequisites: CSE 247, CSE 417T, ESE 326, Math 233, Math 309, and solid hands-on experience in Matlab/Octave. Please scroll down to the bottom of this page for more information.
Instructor: Marion Neumann
Office: Jolley Hall Room 222
Office Hours: WED 4-5pm (or individual appointment* – please avoid drop-ins without an appointment)
*request individual appointments via email and allow for 2-3 days reply and scheduling time
TA Office Hours:
MON 4-6pm in Jolley 224 (Shali)
TUE 10am-12pm in Jolley 224 (Wenjia)
THU 3-6pm in Jolley 517 (Ruxin & Mengyan)
FRI 10am-12pm in Jolley 517 (Zimu)
Please ask any questions related to the course materials and homework problems on Piazza. Other students might have the same questions or may be able to provide a quick answer. Any posting of solutions to problems (written or in the form of source or pseudo code) will result in a grade of zero for that particular problem for ALL students.
Grades on BB
- Main course book: A First Course in Machine Learning, 2nd edition (FCML) by Rogers and Girolami (We will use this book for further reading, mathematical derivations, and written homework problems.)
- Useful reference book: The Elements of Statistical Learning (ESL) by Hastie, Tibshirani, Friedman (This book is freely available online, so do not hesitate to consult it for additional information.)
- From CSE417t: Learning from Data (LFD) by Abu-Mostafa, Magdon-Ismail, and Lin (Keep your copy from CSE417t around. You might need it again. This book is a terrific resource!)
CAUTION: CSE517a in Spring 2017 has CSE417t as a prerequisite!
If you have taken an ML course at another university, you should be familiar with the topics listed below. Note that since students in CSE417t are expected to spend 10 hours a week studying the conceptual materials and working on the assignments, it is not possible to acquire the same depth of knowledge through self-study. Also note that online classes such as the Coursera course Introduction to ML do not provide the same level of depth as CSE417t.
You will also need to make up the homework assignments (especially the MATLAB implementations), as we will use some of those implementations for projects in CSE517a and build upon a solid working knowledge of implementing ML algorithms in MATLAB.
We will not be able to provide any help/support for general SVN and MATLAB issues, nor for conceptual questions on course materials from prerequisite courses.
This includes basic coding/debugging/testing issues with MATLAB as well as questions about efficient implementation.
List of pre-requisite materials from cse417t for cse517a:
– Supervised Learning Setup
– training, testing, validation, generalization
– training error, testing error, generalization error
– loss functions for regression, classification
– Perceptron algorithm (analysis and implementation in MATLAB)
– linear regression (least squares model)
– linear classification (logistic regression)
– gradient descent
– non-linear feature space transformation
– (hyper)parameter selection, model selection
– cross validation
– regularization, structural risk minimization
– bias-variance decomposition of the error
– parametric vs non-parametric models
– multi-class classification
– k-NN model (2-optimal, implementation in MATLAB)
– KD-trees, Ball-trees
– Decision Trees: Learning, Pruning, and Prediction (analysis and implementation in MATLAB)
– Bagging, random forests (analysis and implementation in MATLAB)
– Boosting, Adaboost (theoretical analysis and implementation in MATLAB)
– Support Vector Machines (primal and dual optimization, slack, kernel SVM)
– Neural Networks (back propagation algorithm)
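As a rough indication of the implementation skill level expected, you should be comfortable coding an algorithm like the perceptron (listed above) from scratch. Below is a minimal sketch for reference – shown in Python/NumPy rather than the course's MATLAB, with illustrative variable names, not course-provided code:

```python
import numpy as np

def perceptron_train(X, y, max_iter=100):
    """Minimal perceptron sketch: X is (n, d), labels y are in {-1, +1}.
    Returns a weight vector w with the bias folded in via an appended 1."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append constant bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(max_iter):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi              # perceptron update rule
                mistakes += 1
        if mistakes == 0:                 # converged on linearly separable data
            break
    return w

# usage on a tiny linearly separable toy set
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
```

On separable data the loop halts once a full pass produces no mistakes; the equivalent MATLAB version from CSE417t should feel just as routine to write.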