Learning Outcomes

This course provides a mathematical foundation for the study of communication systems and their operation. It starts with a review of basic mathematical concepts, such as operations on functions in signal processing, Fourier transforms, probability density functions, and convolution. It then uses these tools to examine the operation of modern communication systems, covering both analogue (linear CW) and digital modulation, including amplitude modulation, frequency modulation, phase modulation, and multisymbol phase and quadrature amplitude modulation techniques. Fourier analysis is further applied to energy and power signals and the modulation theorem, and the concepts of correlation functions and spectral density of signals are introduced. Various sources of random signals and noise, including quantisation noise, thermal Nyquist-Johnson noise, and additive white Gaussian noise (AWGN), are studied in detail, together with their quantitative analysis and contributions to the bit error rate (BER) in digital systems. The course finishes with a discussion of the basics of information theory, the concepts of self-information and information entropy and their applications to source coding, and the evaluation of performance bounds (the Shannon-Hartley law), identifying how close commercially important systems come to these bounds.

On completion of the course, the student is expected to:

1. Perform shift and scale operations on communication signals using Woodward's notation.
2. Calculate the mean and variance using a probability density function.
3. Apply the Fourier Transform as a signal processing tool.
4. Understand and evaluate the effect of noise in communication systems.
5. Analyse analogue and digital signal modulation techniques.
6. Apply concepts of information theory to design Huffman codes to compress data.
7. Describe and apply the Shannon-Hartley bound.

Outline Syllabus

Mathematical review:
Shifting and scaling functions, Woodward's notation.
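To illustrate the shift-and-scale idea, here is a minimal Python sketch (illustrative only, not course material; the function names are this example's own). In Woodward's notation, a pulse g((t - t0)/T) is g delayed by t0 and stretched by a factor T:

```python
import math

def rect(t):
    """Woodward's rect: 1 for |t| < 1/2, 1/2 at the edges, 0 outside."""
    if abs(t) < 0.5:
        return 1.0
    if abs(t) == 0.5:
        return 0.5
    return 0.0

def sinc(t):
    """Woodward's sinc: sin(pi*t)/(pi*t), with sinc(0) = 1."""
    if t == 0:
        return 1.0
    return math.sin(math.pi * t) / (math.pi * t)

def shift_scale(g, t0, T):
    """Return the function t -> g((t - t0)/T): g shifted by t0, scaled by T."""
    return lambda t: g((t - t0) / T)

# A rect of width 2 centred at t = 3, i.e. rect((t - 3)/2):
p = shift_scale(rect, 3.0, 2.0)
```

Evaluating p shows the pulse is nonzero only on (2, 4), confirming that the shift moves the centre and the scale sets the width.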

Probability methods in communications:
Sample space, discrete and continuous random variables; Conditional probability and statistical independence, Bayes' theorem; Cumulative distribution function (CDF) and probability density function (pdf); Probability models and their application in communications; Statistical averages, calculation of mean and variance, tail function.
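The calculation of mean and variance from a pdf can be sketched numerically (an illustrative sketch, not course material; it uses simple trapezoidal integration and a uniform pdf on [0, 1] as the example):

```python
def moments(pdf, a, b, n=100000):
    """Mean and variance of a continuous pdf on [a, b] via the trapezoidal rule."""
    h = (b - a) / n
    m1 = m2 = 0.0
    for i in range(n + 1):
        t = a + i * h
        w = 0.5 if i in (0, n) else 1.0   # trapezoidal end-point weights
        m1 += w * t * pdf(t) * h          # E[X]
        m2 += w * t * t * pdf(t) * h      # E[X^2]
    return m1, m2 - m1 * m1              # variance = E[X^2] - (E[X])^2

# Uniform pdf on [0, 1]: f(t) = 1, so mean = 1/2 and variance = 1/12
mean, var = moments(lambda t: 1.0, 0.0, 1.0)
```

The numerical results match the analytic values 1/2 and 1/12 to within the integration error.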

Transform theory:
Linear time-invariant systems, Fourier series; Definition of Fourier forward and inverse transforms, some Fourier transform pairs; Modulation theorem; Energy and power signals and spectra, Parseval’s and Rayleigh’s theorems; Convolution integral and autocorrelation function, energy and power spectral density.
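Rayleigh's (Parseval's) theorem states that signal energy is the same whether computed in the time or the frequency domain. A minimal sketch with a naive discrete Fourier transform (illustrative only; for the DFT the statement is sum|x[n]|^2 = (1/N) sum|X[k]|^2):

```python
import cmath

def dft(x):
    """Naive O(N^2) discrete Fourier transform of a real or complex sequence."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

x = [1.0, 2.0, -1.0, 0.5, 0.0, -2.0, 1.5, 3.0]
X = dft(x)

time_energy = sum(abs(v) ** 2 for v in x)            # energy in the time domain
freq_energy = sum(abs(v) ** 2 for v in X) / len(x)   # energy from the spectrum
```

The two energies agree to machine precision, which is the discrete form of Rayleigh's theorem.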

Modulation:
Analysis of amplitude modulation; Time and frequency domain performance, baseband and bandpass signals, fractional bandwidth; Types of linear CW modulation, DSBTC and DSBSC modulation schemes.
Analysis of digital transmission, signal sampling (Nyquist theorem), modulation (PAM), and coding (PCM); Signal space and signal constellations; ASK, FSK and PSK modulation schemes, Multisymbol signalling, phase modulation and quadrature amplitude modulation (QAM) techniques.
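A signal constellation can be sketched directly as a set of points in the I/Q plane. The following illustrative example (not course material) builds the standard square 16-QAM constellation with unit-spaced levels and checks its average symbol energy, which for levels {-3, -1, +1, +3} on each rail is 10:

```python
# Square 16-QAM: each of the I and Q rails takes one of four amplitude levels.
levels = [-3, -1, 1, 3]
constellation = [complex(i, q) for i in levels for q in levels]

# Average symbol energy E[|s|^2]: mean level energy per rail is
# (9 + 1 + 1 + 9)/4 = 5, so the two-rail total is 10.
avg_energy = sum(abs(s) ** 2 for s in constellation) / len(constellation)
```

Each point carries log2(16) = 4 bits, which is why multisymbol signalling trades SNR for spectral efficiency.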

Noise in digital communication systems:
Noise and its calculation, noise correlation functions and power spectral density (Wiener-Khinchine theorem); Sources of noise: quantisation noise, thermal Nyquist-Johnson noise, additive white Gaussian noise (AWGN) in communication systems; Signal-to-noise ratio (SNR), bit-error rate (BER) in digital systems and its calculation.
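The BER calculation can be checked by Monte Carlo simulation. A minimal sketch (illustrative, not course material) compares simulated binary antipodal (BPSK) transmission over AWGN against the theoretical result BER = Q(sqrt(2 Eb/N0)):

```python
import math
import random

def q_function(x):
    """Gaussian tail function Q(x) = P(N > x) for N ~ N(0, 1)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_ber(ebno_db, n_bits=200000, seed=1):
    """Monte Carlo BER for BPSK over AWGN at a given Eb/N0 in dB."""
    random.seed(seed)
    ebno = 10 ** (ebno_db / 10)
    sigma = math.sqrt(1 / (2 * ebno))   # noise std dev for unit-energy symbols
    errors = 0
    for _ in range(n_bits):
        bit = random.getrandbits(1)
        s = 1.0 if bit else -1.0        # antipodal mapping
        r = s + random.gauss(0, sigma)  # received sample
        if (r > 0) != bool(bit):        # threshold detector at zero
            errors += 1
    return errors / n_bits

ebno_db = 4.0
theory = q_function(math.sqrt(2 * 10 ** (ebno_db / 10)))
sim = bpsk_ber(ebno_db)
```

With a few hundred thousand bits the simulated BER lands close to the Q-function prediction, confirming the AWGN analysis.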

Information theory and coding:
Introduction to information theory, information measure, self-information, information entropy, redundancy of message; Information coding, code performance; Discrete memoryless sources (DMSs), message statistics, digital source coding, the Huffman algorithm, code efficiency estimate; Shannon coding theorem and Shannon-Hartley bound.
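The Huffman algorithm and the entropy-based efficiency estimate can be sketched as follows (an illustrative sketch, not course material; for the dyadic source used here the Huffman code achieves the entropy exactly, so the efficiency is 1):

```python
import heapq
import math

def huffman(probs):
    """Build a Huffman code for a DMS given as a dict {symbol: probability}."""
    # Heap entries: (probability, tie-break counter, partial codeword dict).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)       # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
code = huffman(probs)

entropy = -sum(p * math.log2(p) for p in probs.values())     # H = 1.75 bit/symbol
avg_len = sum(probs[s] * len(w) for s, w in code.items())    # mean codeword length
efficiency = entropy / avg_len
```

Because the source probabilities are negative powers of two, the average codeword length equals the entropy and the code efficiency is 100%; for non-dyadic sources the Huffman code gets within one bit of the entropy.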