Neural Networks for Pattern Recognition – Statistical foundation, perspective and
alternatives.
Graduate course, spring ("Bet") semester 2007, Wednesdays ("Yom Revi'i") 17:00 – 20:00
Instructor: Prof. Mayer Aladjem, http://www.ee.bgu.ac.il/~aladjem
THIS COURSE IS INTENDED TO BE LARGELY SELF-CONTAINED
PREREQUISITES: basic undergraduate mathematical courses.
This course is suitable for all fields of specialization of Electrical & Computer Engineering.
This course will cover the theory, computational aspects, and practice of a variety of neural techniques
for data analysis. The presentation focuses on methods with the specific goal of predicting future
outcomes, in particular regression and classification methods. From the perspective of pattern
recognition, neural networks can be regarded as an extension of many conventional multivariate
statistical methods for data analysis. In recent years, neural computing has emerged as a practical
technology, with successful applications in many fields.
This course is based mainly on the book C.M. Bishop, "Neural Networks for Pattern Recognition",
1995. It will cover the material in Chapters 1-9. In addition, I will teach topics related to modern,
recently developed methods for independent component analysis and blind source separation.
The lecture notes and handouts will be available at http://hl2.bgu.ac.il.
Final Grade:
40% homework - MATLAB programming assignments for neural network simulations. Students are
encouraged to use the NETLAB toolbox described in the book I.T. Nabney, "NETLAB: Algorithms
for Pattern Recognition", 2001, which is intended to complement Bishop (1995). The NETLAB
toolbox is available for free at www.ncrg.aston.ac.uk/netlab.
30% seminar - an approximately one-hour talk on a topic (a journal paper) selected by the instructor.
30% interview - around 30 minutes with each student on the homework and the course topics.
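To give a feel for the simulation assignments, here is a minimal sketch of one syllabus topic, the single-layer network solved by Singular Value Decomposition. The homework itself uses MATLAB/NETLAB; this equivalent is in Python with hypothetical data and variable names.

```python
import numpy as np

# Illustrative only: a single-layer linear network fitted by the
# SVD-based pseudo-inverse (least-squares) solution.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                        # hypothetical inputs
true_W = np.array([[1.0], [-2.0], [0.5]])            # assumed true weights
y = X @ true_W + 0.01 * rng.normal(size=(100, 1))    # noisy targets

Xb = np.hstack([X, np.ones((100, 1))])               # append a bias column
W = np.linalg.pinv(Xb) @ y                           # pinv is computed via SVD

print(W.ravel())                                     # recovered weights + bias
```

With low noise, the recovered weights closely match the assumed true ones; the pseudo-inverse gives the minimum-norm least-squares solution directly, with no iterative training.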
Syllabus:
Fundamental concepts of statistical pattern recognition. Principal component analysis (PCA).
Maximum Likelihood (ML) procedure. Stochastic approximation. Non-parametric and mixture models
for density estimation. Expectation Maximization (EM) algorithm. Projection pursuit mixture density
estimation. Single Layer (SL) network. Solution by Singular Value Decomposition (SVD). Radial
Basis Function (RBF) networks. Regularization theory. Multi-Layer Perceptron (MLP) networks. Batch and
sequential training. Random initialization. Error functions. Statistical interpretation of the outputs and
the hidden units of the network. Visualization of high dimensional data. Learning and generalization.
Bias-variance dilemma. Regularization: weight decay, early stopping of the training, training with
noise. Architecture of the network: validation, complexity criteria, VC-dimension. Pre-processing and
feature extraction.
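As a small illustration of one topic above, the following sketch computes principal components via the SVD of a centred data matrix (hypothetical data; Python rather than the course's MATLAB):

```python
import numpy as np

# Illustrative only: PCA via SVD on hypothetical data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))            # 200 samples, 5 features
Xc = X - X.mean(axis=0)                  # centre each feature

# Right singular vectors (rows of Vt) are the principal directions,
# ordered by decreasing singular value.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

Z = Xc @ Vt[:2].T                        # project onto the top-2 components
explained = s**2 / np.sum(s**2)          # fraction of variance per component

print(Z.shape, explained.round(3))
```

The singular values come out in descending order, so the first rows of Vt capture the most variance; the `explained` fractions sum to one.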
Text books:
1. C.M. Bishop, "Neural Networks for Pattern Recognition", Oxford University Press, 1995.
2. I.T. Nabney, "NETLAB: Algorithms for Pattern Recognition", Springer, 2001.
3. T. Hastie, R. Tibshirani, J. Friedman, "The Elements of Statistical Learning – Data Mining, Inference, and Prediction", Springer, 2001.
4. S. Haykin, "Neural Networks", Prentice-Hall, Inc., 1999.
5. Current papers.