Description
Topics covered in the course include:
" Bayesian decision theory: the theoretical statistical basis for recognition based on Bayes theorem from probability
" Maximum-likelihood and Bayesian parameter estimation: parameters of probability density functions
" Nonparametric techniques: Parzen window, k-nearest neighbor
" Linear discriminant functions: gradient descent, relaxation, minimum squared-error procedures such as LMS, and support vector machines
" Algorithm-independent machine learning
" Unsupervised learning and clustering
The course is quite mathematical. Students enrolling in this class are expected to have a good understanding of probability and random variables, both one-dimensional and multi-dimensional, as well as a solid background in linear algebra and calculus. Some of the necessary mathematics will be reviewed at the beginning of the course, but only as a quick review; this is not a math course.
Grades will be based on homework, tests, and small computer projects.