## Signal Processing & Artificial Intelligence Laboratory

**Lectures**

# Undergraduate Courses

## [ECE3008] Signals and Systems

## [MAT2017] Probability and Statistics

## [ENE4015] Signal Processing

# Graduate Courses

## [ELE6048] Detection and Estimation Theory

This course teaches the fundamental theory of statistical detection and estimation. Detection and estimation theory is a core element of information and signal processing, widely used in communications, automatic control, radar, and machine learning. The first half of the course covers detection theory, e.g., Bayesian hypothesis testing, the Neyman-Pearson criterion, minimax testing, composite hypothesis testing, and sequential detection. The second half covers popular estimation methods, including minimum variance unbiased estimation (MVUE), maximum likelihood estimation, Bayesian estimation, the expectation-maximization (EM) algorithm, and the Kalman smoother. Finally, we briefly introduce sequential Monte Carlo sampling methods and the particle filter.
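To give a flavor of the detection half of the course, the sketch below implements a Neyman-Pearson detector for a known positive DC level in white Gaussian noise. It is a minimal illustration under assumed names and parameters, not material taken from the course itself: under H0 the data is pure noise, under H1 a constant A > 0 is added, and the likelihood ratio test reduces to comparing the sample mean against a threshold set by the target false-alarm probability.

```python
import numpy as np
from statistics import NormalDist

# Illustrative sketch (not course code): detect a known DC level A > 0 in
# white Gaussian noise with variance sigma2.
#   H0: x[n] = w[n],   H1: x[n] = A + w[n],   w[n] ~ N(0, sigma2)
# The likelihood ratio test reduces to thresholding the sample mean; the
# threshold depends only on the false-alarm probability p_fa, not on A.

def np_detect(x, sigma2, p_fa):
    """Decide H1 (signal present) when the sample mean exceeds the threshold."""
    N = len(x)
    T = float(np.mean(x))                       # sufficient statistic
    # Under H0, T ~ N(0, sigma2 / N); choose gamma so P(T > gamma | H0) = p_fa.
    gamma = NormalDist().inv_cdf(1.0 - p_fa) * np.sqrt(sigma2 / N)
    return T > gamma
```

With 100 samples, unit noise variance, and p_fa = 0.01, the threshold on the sample mean is about 0.23, so a constant signal of amplitude 1 is detected easily while pure zero data is not.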

## [ELE9105] Statistical Signal Processing

In this course, we study a variety of modern statistical signal processing approaches. First, we briefly review basic probability and random processes. We then look at basic linear estimation theory, including the linear MMSE estimator, the gradient descent method, Newton's method, and the LMS algorithm. Students will learn how to model signals using the hidden Markov model (HMM), the Gaussian mixture model, the auto-regressive moving average (ARMA) model, Bayesian networks, Markov random fields, and Gauss-Markov processes. Finally, we learn various inference tools such as the forward-backward algorithm, the Viterbi algorithm, the Levinson-Durbin algorithm, the Baum-Welch algorithm, the expectation-maximization (EM) algorithm, and message-passing algorithms.
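As a small taste of the adaptive filtering portion, the sketch below shows the LMS algorithm identifying an unknown FIR system from input/output data. The function and variable names are illustrative assumptions, not course code: at each step the filter forms an instantaneous error and nudges its tap weights along the negative stochastic gradient.

```python
import numpy as np

# Illustrative sketch (not course code): LMS adaptive filter for system
# identification. The taps w adapt so that w @ x[n] tracks the desired
# signal d[n]; mu is the step size controlling convergence speed/stability.

def lms_identify(x, d, num_taps, mu):
    """Run LMS over (x, d) and return the final tap-weight vector."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        xn = x[n - num_taps + 1:n + 1][::-1]   # regressor: newest sample first
        e = d[n] - w @ xn                      # instantaneous error
        w += mu * e * xn                       # stochastic-gradient update
    return w
```

Driving a 2-tap unknown system h = [0.5, -0.3] with white noise and feeding its output as d, the estimated taps converge to h for a suitably small mu.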

## [ELE6055] Machine Learning Theory

In this course, we study modern machine learning theory. First, we learn statistical learning theory and its fundamental principles. Several regularization methods and Bayesian learning approaches are discussed. We then cover parametric learning methods, including logistic regression and neural networks, as well as nonparametric methods, including kernel methods, Gaussian processes, and support vector machines. Finally, we focus on the principles and applications of deep neural networks. Various deep network structures, including the CNN, RNN, and LSTM, will be explained with examples.
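To illustrate the parametric-learning portion, the sketch below trains logistic regression by batch gradient descent on the negative log-likelihood. It is a toy example with assumed names and hyperparameters, not course material: the model outputs a sigmoid probability, and the weights move against the average gradient of the loss.

```python
import numpy as np

# Illustrative sketch (not course code): logistic regression fitted with
# batch gradient descent on the average negative log-likelihood.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Return weights w minimizing the logistic loss over (X, y)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)               # predicted P(y = 1 | x)
        grad = X.T @ (p - y) / len(y)    # gradient of the average loss
        w -= lr * grad
    return w
```

On a small linearly separable set (with a constant bias column in X), the learned weights classify every training point correctly.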

## [ELE6056] Advanced Machine Learning Theory

This course introduces various topics in advanced machine learning theory. Students will learn basic statistical tools, including kernel methods, Gaussian processes, Gaussian mixture models, Bayesian graphical models, expectation-maximization, variational inference, and Monte Carlo sampling. With a solid understanding of these tools, we study several machine learning methods such as the support vector machine, the autoencoder, the hidden Markov model, and the restricted Boltzmann machine. Finally, we look at generative models, which have received much attention from the machine learning community, studying the deep Boltzmann machine, the variational autoencoder, and the generative adversarial network in detail. The prerequisite for this course is a graduate-level machine learning theory course, and students are expected to be familiar with basic probability, statistics, and linear algebra.
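To hint at how expectation-maximization fits a Gaussian mixture model, the sketch below runs EM on a deliberately simplified two-component 1-D mixture with unit variances and equal priors, updating only the means. This is an illustrative simplification under assumed names, not the full algorithm covered in the course.

```python
import numpy as np

# Illustrative sketch (not course code): EM for a two-component 1-D Gaussian
# mixture, simplified to unit variances and equal priors so that only the
# component means are re-estimated.

def em_two_means(x, mu, iters=50):
    """Alternate E-steps (responsibilities) and M-steps (weighted means)."""
    mu = np.array(mu, dtype=float)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        # (common factors cancel after normalization).
        lik = np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2)
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted sample means.
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    return mu
```

Starting from means at -1 and 1 on data clustered around -5 and 5, the iterations pull the two means onto the two clusters.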