Module Description
Learning Outcomes
1. Perform shifting and scaling operations on signals, using Woodward's notation.
2. Calculate the mean and variance using a probability density function.
3. Apply the Fourier Transform as a signal processing tool.
4. Understand and evaluate the effect of noise in communication systems.
5. Analyse analogue and digital signal modulation techniques.
6. Apply concepts of information theory to design Huffman codes to compress data.
7. Describe and apply the Shannon-Hartley bound.
Outline Syllabus
Mathematical review:
Shifting and scaling functions, Woodward's notation.
Probability methods in communications:
Sample space, discrete and continuous random variables; Conditional probability and statistical independence, Bayes' theorem; Cumulative distribution function (CDF) and probability density function (pdf); Probability models and their application in communications; Statistical averages, calculation of mean and variance, tail function.
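As an illustration of the statistical averages above, the sketch below numerically estimates the mean and variance of a continuous random variable directly from its pdf; the exponential pdf, rate parameter and integration range are illustrative choices, not part of the syllabus.

```python
import math

def mean_and_variance(pdf, lo, hi, n=200_000):
    """Estimate E[X] and Var[X] for a continuous pdf by the
    midpoint rule on [lo, hi]: E[X] = ∫ x f(x) dx, Var[X] = E[X²] − E[X]²."""
    dx = (hi - lo) / n
    m = v = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = pdf(x) * dx
        m += x * p
        v += x * x * p
    return m, v - m * m

# Exponential pdf with rate lam = 2: mean = 1/lam = 0.5, variance = 1/lam² = 0.25
lam = 2.0
pdf = lambda x: lam * math.exp(-lam * x)
mean, var = mean_and_variance(pdf, 0.0, 20.0)
print(round(mean, 3), round(var, 3))  # ≈ 0.5 and 0.25
```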
Transform theory:
Linear time-invariant systems, Fourier series; Definition of Fourier forward and inverse transforms, some Fourier transform pairs; Modulation theorem; Energy and power signals and spectra, Parseval’s and Rayleigh’s theorems; Convolution integral and autocorrelation function, energy and power spectral density.
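Rayleigh's energy theorem from the list above can be checked numerically with a naive discrete Fourier transform; the test signal and its length are arbitrary assumptions for the sketch.

```python
import cmath, math

def dft(x):
    """Naive discrete Fourier transform, X[k] = Σ x[n] e^(−j2πkn/N), O(N²)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

# A short real signal: DC offset plus two tones
N = 64
x = [1.0 + math.cos(2 * math.pi * 5 * n / N) + 0.5 * math.sin(2 * math.pi * 12 * n / N)
     for n in range(N)]
X = dft(x)

# Rayleigh/Parseval for the DFT: time-domain energy = (1/N) × spectral energy
e_time = sum(v * v for v in x)
e_freq = sum(abs(v) ** 2 for v in X) / N
print(abs(e_time - e_freq) < 1e-8)  # True
```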
Modulation:
Analysis of amplitude modulation; Time and frequency domain performance, baseband and bandpass signals, fractional bandwidth; Types of linear CW modulation, DSB-TC and DSB-SC modulation schemes.
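The DSB-TC (transmitted-carrier AM) waveform and its envelope condition can be sketched as below; the message and carrier frequencies, sample rate and modulation index are illustrative assumptions.

```python
import math

def dsb_tc(m_index, fm, fc, fs, dur):
    """DSB-TC (conventional AM) samples:
    s(t) = (1 + m·cos(2πfm·t)) · cos(2πfc·t)."""
    n = int(fs * dur)
    return [(1 + m_index * math.cos(2 * math.pi * fm * k / fs))
            * math.cos(2 * math.pi * fc * k / fs) for k in range(n)]

# With modulation index m <= 1 the envelope 1 + m·cos(.) never goes negative,
# so a simple envelope detector can recover the message without distortion.
s = dsb_tc(0.5, 100, 2000, 48_000, 0.05)
print(max(abs(v) for v in s) <= 1.5 + 1e-9)  # peak magnitude bounded by 1 + m
```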
Analysis of digital transmission, signal sampling (Nyquist theorem), modulation (PAM), and coding (PCM); Signal space and signal constellations; ASK, FSK and PSK modulation schemes; Multisymbol signalling, phase modulation and quadrature amplitude modulation (QAM) techniques.
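Signal constellations such as those listed above can be generated numerically; the Gray mapping and unit-energy normalisation below are one common convention, not necessarily the one taught in the module.

```python
import cmath, math

def psk_constellation(M):
    """Unit-energy M-ary PSK: M points equally spaced on the unit circle."""
    return [cmath.exp(2j * math.pi * k / M) for k in range(M)]

def qam16_point(bits):
    """Map 4 bits to a 16-QAM point, Gray-coding each axis over
    levels {-3, -1, +1, +3}, then normalising to unit average energy."""
    gray = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}
    i = gray[(bits[0], bits[1])]
    q = gray[(bits[2], bits[3])]
    return complex(i, q) / math.sqrt(10)  # E[|s|²] = (5 + 5)/10 = 1

points = [qam16_point([(n >> 3) & 1, (n >> 2) & 1, (n >> 1) & 1, n & 1])
          for n in range(16)]
avg_energy = sum(abs(p) ** 2 for p in points) / 16
print(round(avg_energy, 6))  # 1.0 (average symbol energy is normalised)
```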
Noise in digital communication systems:
Noise and its calculation, noise correlation functions and power spectral density (Wiener-Khinchine theorem); Sources of noise, quantisation noise, thermal Nyquist-Johnson noise, additive white Gaussian noise (AWGN) in communication systems; Signal-to-noise ratio (SNR) and bit-error rate (BER) in digital systems and their calculation.
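The BER calculation for a digital system in AWGN can be checked against the Gaussian tail function by Monte-Carlo simulation; the sketch below uses BPSK, and the Eb/N0 value and bit count are arbitrary assumptions.

```python
import math, random

def q_function(x):
    """Gaussian tail function Q(x) = 0.5 · erfc(x / √2)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_ber_sim(ebn0_db, n_bits=200_000, seed=1):
    """Monte-Carlo BER of BPSK over AWGN at the given Eb/N0 in dB.
    Unit-energy symbols ±1; per-sample noise variance N0/2 = 1/(2·Eb/N0)."""
    rng = random.Random(seed)
    ebn0 = 10 ** (ebn0_db / 10)
    sigma = math.sqrt(1 / (2 * ebn0))
    errors = 0
    for _ in range(n_bits):
        bit = rng.randint(0, 1)
        tx = 1.0 if bit else -1.0
        rx = tx + rng.gauss(0.0, sigma)
        if (rx > 0) != (bit == 1):  # threshold detector at zero
            errors += 1
    return errors / n_bits

# Theory: Pb = Q(√(2·Eb/N0)); simulation should agree closely
ebn0_db = 4.0
theory = q_function(math.sqrt(2 * 10 ** (ebn0_db / 10)))
sim = bpsk_ber_sim(ebn0_db)
print(f"theory={theory:.4f}  simulated={sim:.4f}")  # both ≈ 0.0125
```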
Information theory and coding:
Introduction to information theory, information measure, self-information, information entropy, redundancy of message; Information coding, code performance; Discrete memoryless sources (DMSs), message statistics, digital source coding, the Huffman algorithm, code efficiency estimate; Shannon coding theorem and Shannon-Hartley bound.
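The Huffman algorithm, the code-efficiency estimate and the Shannon-Hartley bound named above lend themselves to a short sketch; the source statistics, channel bandwidth and SNR below are illustrative assumptions.

```python
import heapq, math

def huffman_code(probs):
    """Binary Huffman code for a discrete memoryless source.
    probs: dict symbol -> probability. Returns dict symbol -> codeword."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    tie = len(heap)  # unique tiebreaker so tuples never compare dicts
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least-probable nodes
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return heap[0][2]

source = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
code = huffman_code(source)
entropy = -sum(p * math.log2(p) for p in source.values())       # H ≈ 1.846 bit
avg_len = sum(source[s] * len(w) for s, w in code.items())      # L = 1.9 bit
print(code, f"H={entropy:.3f}, L={avg_len:.1f}, efficiency={entropy/avg_len:.1%}")

# Shannon-Hartley bound: C = B · log2(1 + SNR) for an AWGN channel
# (a 3 kHz channel at 30 dB SNR is an illustrative example)
B, snr_db = 3_000, 30
C = B * math.log2(1 + 10 ** (snr_db / 10))
print(f"C = {C:.0f} bit/s")  # ≈ 30 kbit/s
```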
Module Supervisor: Vito De Feo