EE572

Title: MATHEMATICAL METHODS IN SIGNAL PROCESSING

 

Credits: 3

 

Catalog Description: Metric spaces, normed vector spaces, basis sets. The four fundamental subspaces of a linear transform. Approximation in Hilbert spaces: least squares filtering and estimation, linear regression, polynomial approximation, minimum norm solutions, and system identification. Matrix factorization, eigenvectors, singular value decomposition, iterative matrix inverses, pseudoinverse. Theory of constrained optimization and dynamic programming.
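As a flavor of the least squares and pseudoinverse material in the description above, here is a minimal sketch in Python/NumPy (the course itself uses MATLAB; the matrix A and observations b below are made-up illustrative data):

```python
import numpy as np

# Overdetermined system A x ~ b: more equations than unknowns,
# so we seek the least-squares solution x = A+ b.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])      # illustrative 3x2 design matrix
b = np.array([1.0, 2.0, 2.0])   # illustrative observations

# Moore-Penrose pseudoinverse, computed internally via the SVD
x_pinv = np.linalg.pinv(A) @ b

# The same solution from the normal equations (A'A) x = A' b
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

print(np.allclose(x_pinv, x_ne))  # prints True
```

Both routes give the unique least-squares fit here because A has full column rank; when it does not, the pseudoinverse additionally picks out the minimum norm solution among all least-squares solutions.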

Coordinator: Bülent Sankur, Professor of Electrical-Electronics Engineering


Goals: Refresh the notions of linear algebra. Review vector-space concepts as applied to signal processing. Develop the concepts of projection, subspace, and orthogonality as they occur in applications such as approximation, series expansion, prediction, linear filtering, de-noising, and estimation. Build familiarity with matrix operations, their fundamental spaces, and the solution of large sets of simultaneous linear equations.
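The projection and orthogonality notions mentioned in the goals can be sketched numerically. The following Python/NumPy snippet (an illustrative example with made-up data, not course material) projects a vector onto the column space of a matrix and checks the principle of orthogonality:

```python
import numpy as np

# Columns of A span a 2-D subspace of R^3 (illustrative data).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([1.0, 2.0, 0.0])

# Orthogonal projector onto the column space: P = A (A'A)^{-1} A'
P = A @ np.linalg.inv(A.T @ A) @ A.T
y_hat = P @ y                 # projection of y onto col(A)
residual = y - y_hat

# Principle of orthogonality: the residual is perpendicular
# to every column of A, i.e. A' (y - y_hat) = 0.
print(np.allclose(A.T @ residual, 0.0))  # prints True
```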

Learning Objectives:


At the end of this course, students will be able to:


1.    Understand the theory and relevance of vector-space concepts, metrics, and matrices as principal operators in finite-dimensional Hilbert spaces;
2.    Analyze linear models and subspace solutions;
3.    Reason in terms of explanatory variables and dependent variables;
4.    Carry out filtering, estimation, prediction, and signal analysis and synthesis algorithms.


Textbook: T. Moon, W. Stirling, Mathematical Methods and Algorithms for Signal Processing, Prentice-Hall

Reference Texts:


-    G. Golub, C. Van Loan, Matrix Computations
-    T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning.



Prerequisites by Topic:


1.    Signals and systems
2.    Probability theory


Topics:


1.    Preliminaries: Metric spaces; norms and normed vector spaces. Orthogonal spaces and orthogonalization. Hilbert and Banach spaces.
2.    Subspace Properties: Basis sets for subspaces. Complements of sets and spaces. Linear mappings. The four fundamental subspaces of a linear transform.
3.    Approximation in Vector Spaces: Approximation in Hilbert spaces. Normal equations, positive definiteness, Grammian. Principle of orthogonality. Matrix formulation of the least squares problem. Best linear unbiased estimate (BLUE).
4.    Principle of Orthogonality: Continuous and discrete polynomial approximation. Linear regression and explanatory variables. Least squares filtering and estimation. Prediction and function fitting. Minimum mean square and least squares estimation. AR spectrum. Wavelet transform.
5.    Linear operators:  Examples of linear operators. Operator continuity and boundedness. Conjugate operator and conjugate spaces.  The four fundamental spaces of matrices.
6.    Matrix inverses: Right inverse and left inverse. Full row rank and full column rank. Least squares solutions, minimum norm solution. Matrix rank and invertibility. Rank in numerical analysis.
7.    Matrix Factorization:  LU factorization theory and practice, Cholesky, Unitary matrices and QR decomposition. Householder and Givens transformations.
8.    Eigenvalues and eigenvectors:  Eigenvectors, diagonalization and linear systems. Subspaces and invariance. The geometry of eigenvectors. Eigenfilters.
9.    Singular Value Decomposition: The SVD theorem. SVD properties. Pseudoinverse (Moore-Penrose) and system identification using SVD. Total least squares.
10.    Special matrices:  Modal matrices; Permutation matrices; Toeplitz matrices and Circulant matrices; Vandermonde matrices
11.   Optimization Methods: Theory of constrained optimization; Inequality constraints: Kuhn-Tucker; Dynamic programming; Shortest path algorithms; Maximum likelihood sequence estimation.
12.   Iterative Methods in Signal Processing: Contraction mappings; Newton's method and steepest descent; Applications: LMS adaptive filtering; Iterative matrix inverses.
13.    Applications: The EM algorithm in signal processing. Kalman filtering.
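As a flavor of the iterative methods in item 12, here is a minimal Python/NumPy sketch of LMS adaptive filtering identifying a short FIR system (the system coefficients, signal length, and step size are hypothetical, chosen only for illustration; the course itself works in MATLAB):

```python
import numpy as np

rng = np.random.default_rng(0)
h_true = np.array([0.5, -0.3, 0.2])   # "unknown" FIR system (made up)
N, taps, mu = 5000, 3, 0.05           # samples, filter length, step size

x = rng.standard_normal(N)            # white input signal
d = np.convolve(x, h_true)[:N]        # desired signal = system output

w = np.zeros(taps)                    # adaptive filter weights
for n in range(taps, N):
    u = x[n:n - taps:-1]              # most recent taps inputs, newest first
    e = d[n] - w @ u                  # a priori estimation error
    w += mu * e * u                   # LMS update: a stochastic
                                      # steepest-descent step

print(np.allclose(w, h_true, atol=1e-2))  # prints True
```

With a white input and a small enough step size, the weight vector converges toward the true impulse response; in this noiseless setup the error decays essentially to zero.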



Course Structure: The class meets for four 50-minute lecture sessions per week. Seven to eight sets of homework problems are assigned per semester. There is one in-class mid-term exam, 5 quizzes, and a final exam.


Computer Resources: Students are encouraged to use MATLAB to solve their homework problems.


Laboratory Resources:
None.


Grading:


1.    Homework sets (50%)
2.    One mid-term exam (20%).
3.    Final exam (20%).
4.    Quizzes (10%)


Outcome Coverage:


(a)    Apply math, science and engineering knowledge. This course requires probability theory, linear system theory, and vector-matrix algebra. Engineering intuition and common sense are required to translate signal processing problems into proper applications of linear-algebraic and statistical tools.
(b)    Design a system, component or process to meet desired needs. The students have to design a few statistical filtering algorithms.  
(c)    Use of modern engineering tools. Students use Matlab on a few occasions.


Prepared By: Bülent Sankur
 

 

Boğaziçi Üniversitesi - Elektrik ve Elektronik Mühendisliği Bölümü

34342 - Bebek / İSTANBUL

Tel: +90 212 359 64 14
Fax: +90 212 287 24 65

Copyright 2009 by Bogazici University EE Dept.