Information Theory, Pattern Recognition, and Neural Networks

David MacKay, University of Cambridge

A series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms" (Cambridge University Press, 2003), which can be bought from Amazon and is also available free online. A subset of these lectures used to constitute a Part III Physics course at the University of Cambridge.

Introduction to information theory
* The possibility of reliable communication over unreliable channels. The (7,4) Hamming code and repetition codes.
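
For concreteness, here is a minimal Python sketch of a (7,4) Hamming code: four source bits are encoded into seven transmitted bits, and syndrome decoding corrects any single flipped bit. The particular generator and parity-check matrices are one standard choice, used here only for illustration.

```python
import numpy as np

# Generator and parity-check matrices for a (7,4) Hamming code
# (one common convention, assumed here for illustration).
G = np.array([[1, 0, 0, 0, 1, 0, 1],
              [0, 1, 0, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 0, 1, 1]])   # 4 source bits -> 7 transmitted bits
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [0, 1, 1, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])   # 3 parity checks

def encode(s):
    """Encode 4 source bits into a 7-bit codeword (mod-2 arithmetic)."""
    return (np.array(s) @ G) % 2

def decode(r):
    """Syndrome decoding: correct any single flipped bit, then return the 4 source bits."""
    syndrome = (H @ r) % 2
    if syndrome.any():
        # The syndrome matches exactly one column of H; flip that bit.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                r = r.copy()
                r[i] ^= 1
                break
    return r[:4]

t = encode([1, 0, 1, 1])
t_noisy = t.copy(); t_noisy[2] ^= 1      # channel flips one bit
print(decode(t_noisy))                    # recovers [1 0 1 1]
```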

Entropy and data compression
* Entropy, conditional entropy, mutual information, Shannon information content. The idea of typicality and the use of typical sets for source coding. Shannon's source coding theorem. Codes for data compression. Uniquely decodeable codes and the Kraft-McMillan inequality. Completeness of a symbol code. Prefix codes. Huffman codes. Arithmetic coding.
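
As a small illustration of these ideas, the sketch below computes the entropy of a source and builds a Huffman prefix code for it; the symbol probabilities are made up for the example. For dyadic probabilities such as these, the expected code length equals the entropy exactly.

```python
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(X) = -sum p_i log2 p_i, in bits per symbol."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Each heap entry: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: '0' + w for s, w in c0.items()}
        merged.update({s: '1' + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}   # made-up source
code = huffman_code(probs)
L = sum(probs[s] * len(code[s]) for s in probs)
print(code, entropy(probs.values()), L)   # expected length = entropy = 1.75 bits here
```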

Communication over noisy channels
* Definition of channel capacity. Capacity of binary symmetric channel; of binary erasure channel; of Z channel. Joint typicality, random codes, and Shannon's noisy channel coding theorem. Real channels and practical error-correcting codes. Hash codes.
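
The capacities named above are easy to evaluate. The sketch below uses the standard closed forms C = 1 - H2(f) for the binary symmetric channel and C = 1 - f for the binary erasure channel, and for the Z channel simply maximises the mutual information numerically over the input distribution (the grid search is just an illustrative shortcut).

```python
import numpy as np

def H2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def capacity_bsc(f):
    """Binary symmetric channel with flip probability f: C = 1 - H2(f)."""
    return 1 - H2(f)

def capacity_bec(f):
    """Binary erasure channel with erasure probability f: C = 1 - f."""
    return 1 - f

def capacity_z(f, grid=10001):
    """Z channel (only the input 1 is flipped, with probability f).
    Maximise I(X;Y) = H2(p(1-f)) - p*H2(f) over the input probability p = P(x=1)."""
    ps = np.linspace(0, 1, grid)
    I = np.array([H2(p * (1 - f)) - p * H2(f) for p in ps])
    return I.max()

f = 0.1
print(capacity_bsc(f), capacity_bec(f), capacity_z(f))
# roughly 0.531, 0.900, 0.763 bits per channel use
```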

Statistical inference, data modelling and pattern recognition
* The likelihood function and Bayes' theorem. Clustering as an example.
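
A toy illustration of how Bayes' theorem links the likelihood function to clustering: given made-up one-dimensional data and two fixed Gaussian clusters, the posterior "responsibility" of each cluster for each point is likelihood times prior, normalised over clusters. All parameter values here are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical data drawn from two well-separated Gaussians.
np.random.seed(0)
x = np.concatenate([np.random.normal(-2, 1, 50), np.random.normal(3, 1, 50)])

means, sigma, priors = np.array([-2.0, 3.0]), 1.0, np.array([0.5, 0.5])

# Likelihood P(x | k) of each point under each cluster k (Gaussian, fixed sigma).
lik = np.exp(-0.5 * ((x[:, None] - means[None, :]) / sigma) ** 2)

# Bayes' theorem: P(k | x) is proportional to P(x | k) P(k), normalised over k.
post = lik * priors
post /= post.sum(axis=1, keepdims=True)

print(post[:3])   # responsibilities of the two clusters for the first three points
```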

Approximation of probability distributions
* Laplace's method. (Approximation of probability distributions by Gaussian distributions.)
* Monte Carlo methods: importance sampling, rejection sampling, Gibbs sampling, the Metropolis method (see the sketch after this list). (Slice sampling, Hybrid Monte Carlo, overrelaxation, exact sampling.)
* Variational methods and mean field theory. Ising models.
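
As a sketch of the Metropolis method mentioned above, applied to the Ising models also listed: flip one randomly chosen spin at a time and accept the flip with probability min(1, exp(-beta*dE)). The lattice size, temperature and sweep count are arbitrary choices for illustration.

```python
import numpy as np

def metropolis_ising(L=16, beta=0.5, sweeps=200, rng=np.random.default_rng(0)):
    """Metropolis sampling of a 2-D Ising model with periodic boundaries."""
    s = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps * L * L):
        i, j = rng.integers(L, size=2)
        # Energy change from flipping spin (i, j): dE = 2 * s_ij * (sum of neighbours).
        nb = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
        dE = 2 * s[i, j] * nb
        # Accept the flip if it lowers the energy, else with probability exp(-beta*dE).
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] *= -1
    return s

s = metropolis_ising()
print(abs(s.mean()))   # magnetisation per spin; large beta (low temperature) gives values near 1
```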

Neural networks and content-addressable memories
* The Hopfield network.
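
A minimal sketch of a Hopfield network as a content-addressable memory: patterns are stored with the Hebbian rule and recalled by asynchronous sign updates starting from a corrupted cue. The pattern count, network size and noise level are made-up values for the example.

```python
import numpy as np

def hopfield_weights(patterns):
    """Hebbian storage: W = (1/N) * sum over patterns of outer(x, x), zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0)
    return W

def recall(W, x, steps=20, rng=np.random.default_rng(0)):
    """Asynchronous updates: repeatedly set a random unit to the sign of its input field."""
    x = x.copy()
    N = len(x)
    for _ in range(steps * N):
        i = rng.integers(N)
        x[i] = 1 if W[i] @ x >= 0 else -1
    return x

rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 100))   # three random +/-1 memories, 100 units
W = hopfield_weights(patterns)

noisy = patterns[0].copy()
noisy[:20] *= -1                                # corrupt 20 of the 100 bits
restored = recall(W, noisy)
print((restored == patterns[0]).mean())         # fraction of bits recovered, typically close to 1.0
```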

Dates:
  • Free schedule
Course properties:
  • Language: English (GB)


