This introductory course on machine learning will give an overview of many concepts, techniques, and algorithms in machine learning, beginning with topics such as classification and linear regression and ending with more recent topics such as boosting, support vector machines, hidden Markov models, and Bayesian networks. The course will give the student the basic ideas and intuition behind modern machine learning methods, as well as a somewhat more formal understanding of how, why, and when they work. The underlying theme of the course is statistical inference, as it provides the foundation for most of the methods covered.
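To give a flavor of one of the first topics mentioned above, here is a minimal illustrative sketch (not course-provided code) of linear regression: fitting a line y ≈ a + b·x to data by least squares, using the standard closed-form solution. The data and function name are invented for illustration.

```python
# Illustrative sketch of least-squares linear regression,
# one of the early topics in the course. Fits y ≈ a + b*x.
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n          # mean of x
    my = sum(ys) / n          # mean of y
    # Slope: covariance of (x, y) divided by variance of x.
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx           # intercept from the means
    return a, b

# Example: points lying exactly on y = 1 + 2x.
a, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(a, b)  # 1.0 2.0
```

With noisy data the same formula returns the line minimizing the sum of squared vertical errors; the course develops this and its statistical interpretation in much more depth.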
Prerequisites: (6.041 or 18.05) and 18.06; 6.034 is helpful
Lectures: Tue. and Thu. 1:00-2:30 PM in 2-190, first lecture Thu Sep 8
Recitations: You should have already signed up for a section on this Doodle poll. The times and locations are:
Office hours: Our regular OH schedule is listed below. We may cancel some office hours that we predict will have low turnout (e.g., just after a homework is due). Cancellations will always be announced beforehand on Piazza and the wiki. If you want to talk to staff members outside these hours, please e-mail the staff and we will set something up.
Instructor: Leslie Kaelbling (lpk at csail dot mit dot edu)
Lecture notes (slides and/or short write-ups), supplementary notes, and reading materials will be made available electronically via the wiki's Calendar.
These two books (taken together) provide good coverage for the course material.
We also provide programming resources and other useful information on the wiki.
Students may opt for an alternative grading formula (Project: 35%, Exam 1: 30%, Exam 2: 35%) by signing a contract to that effect before Oct 1. We strongly recommend doing the homework in any case.
Exams: There will be two evening exams, probably 7-9 PM, and no final exam.