Monson H. Hayes

**The Perceptron**

**"Perceptron Learning,"**R. Rohas, Neural Networks, Springer-Verlag, Berlin 1996

In this book chapter, the author gives a nice presentation of perceptron learning, a proof of convergence, and a discussion of how it relates to linear programming.
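The learning rule presented there is simple enough to sketch in a few lines. Below is a minimal, illustrative implementation in pure Python (the toy data and epoch limit are assumptions, not taken from the reference): on each misclassified point, the weights are nudged toward the correct side via w ← w + y·x.

```python
# Hedged sketch of the perceptron learning rule: update weights only on
# mistakes; converges in finitely many updates if the data is separable.
def perceptron(data, epochs=100):
    # data: list of ((x1, x2), y) pairs with labels y in {-1, +1}
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, y in data:
            if y * (w[0] * x[0] + w[1] * x[1] + b) <= 0:  # misclassified
                w[0] += y * x[0]   # nudge the separating hyperplane
                w[1] += y * x[1]
                b += y
                mistakes += 1
        if mistakes == 0:  # a full pass with no errors: converged
            break
    return w, b

# toy linearly separable data (illustrative)
data = [((1.0, 1.0), 1), ((2.0, 0.5), 1), ((-1.0, -1.0), -1), ((-0.5, -2.0), -1)]
w, b = perceptron(data)
```

The convergence proof in the chapter guarantees that, for linearly separable data, the inner loop stops making mistakes after a bounded number of updates.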

**Statistical Learning Theory**

- "Probability Inequalites," Larry Wasserman

This is a set of notes for a course in statistical learning at CMU that contains the proof of the Hoeffding inequality.
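For reference, the inequality at the heart of these notes can be stated as follows (standard form, for independent bounded random variables): if $X_1, \ldots, X_n$ are independent with $a \le X_i \le b$ and $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$, then for any $\epsilon > 0$,

$$
P\left(\left|\bar{X}_n - \mathbb{E}[\bar{X}_n]\right| \ge \epsilon\right) \le 2\exp\!\left(\frac{-2n\epsilon^2}{(b-a)^2}\right).
$$

This is the bound that underlies the generalization guarantees used throughout statistical learning theory.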

- "Statistical Learning Theory: Models, Concepts and Results," Ulrike von Luxburg and Bernhard Schölkopf

This is a beautifully written exposition on statistical learning theory, and should serve as a great reference to help illustrate and solidify the concepts introduced in this course. It is full of insight and many excellent examples, and studying it carefully is well worth the effort.

**Support Vector Machines**

- "Support-Vector Networks," Corinna Cortes and Vladimir Vapnik

This is the seminal paper on support vector networks. "The support-vector network is a new learning machine for two-group classification problems" (from the Abstract). This is a must-read for anyone wanting to be well-informed about the initial development of SVMs. This paper has been cited over 13,000 times.
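As a reminder of the formulation developed in the paper, the soft-margin optimization problem can be written in its standard form (notation here is the conventional one, not necessarily the paper's):

$$
\min_{w,\, b,\, \xi} \;\; \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\xi_i
\quad \text{subject to} \quad
y_i\,(w \cdot x_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0,
$$

where the slack variables $\xi_i$ allow margin violations and the parameter $C$ trades off margin width against training errors.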

- "A Tutorial on Support Vector Machines for Pattern Recognition," Christopher J.C. Burges

This excellent paper provides a wonderfully insightful tutorial on support vector machines, and is highly recommended for learning about SVMs and gaining an understanding of their theory and use.

- "A Users Guide to Support Vector Machines," Asa Ben-Hur and Jason Weston

Once you have read and understood what a support vector machine is and how it works, this nice paper gives some insight into how to make them work in practice, and into how the SVM parameters affect the resulting classifier.
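The effect of the regularization parameter $C$ described in that guide can be seen even with a tiny hand-rolled linear SVM. The sketch below (pure Python; the data, learning rate, and epoch count are illustrative assumptions, not taken from the paper) trains by subgradient descent on the regularized hinge loss and prints the learned parameters for two values of $C$:

```python
# Hedged sketch: a tiny linear SVM trained by subgradient descent on the
# objective (1/2)||w||^2 + C * sum(hinge losses), to show how C changes
# the learned classifier. Small C favors a small ||w|| (wide margin,
# tolerating violations); large C pushes toward separating the data.
def train_linear_svm(data, C, epochs=200, lr=0.01):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            margin = y * (w[0] * x[0] + w[1] * x[1] + b)
            # subgradient of (1/2)||w||^2 + C * max(0, 1 - margin)
            gw0, gw1, gb = w[0], w[1], 0.0
            if margin < 1:  # hinge term is active for this point
                gw0 -= C * y * x[0]
                gw1 -= C * y * x[1]
                gb -= C * y
            w[0] -= lr * gw0
            w[1] -= lr * gw1
            b -= lr * gb
    return w, b

# toy two-class data with labels in {-1, +1} (illustrative)
data = [((2.0, 2.0), 1), ((1.5, 2.5), 1), ((-2.0, -1.5), -1), ((-1.0, -2.0), -1)]
for C in (0.1, 10.0):
    w, b = train_linear_svm(data, C)
    print(C, w, b)  # larger C yields a larger ||w|| (harder margin)
```

In practice one would of course use a tuned library solver rather than this sketch; the point is only that $C$ directly controls the margin/violation trade-off the paper discusses.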