Information Theory, Pattern Recognition, and Neural Networks

David MacKay, University of Cambridge

A series of sixteen lectures covering the core of the book "Information Theory, Inference, and Learning Algorithms" (Cambridge University Press, 2003), which can be bought at Amazon and is also available free online. A subset of these lectures used to constitute a Part III Physics course at the University of Cambridge.

Introduction to information theory
* The possibility of reliable communication over unreliable channels. The (7,4) Hamming code and repetition codes.
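To make the first lecture's topic concrete, here is a minimal Python sketch (not taken from the lectures) of the (7,4) Hamming code and a length-3 repetition code, using one conventional choice of parity bits (t5 = s1+s2+s3, t6 = s2+s3+s4, t7 = s1+s3+s4, mod 2).

```python
# Minimal sketch: the (7,4) Hamming code and a length-3 repetition code.
# Illustrative only; the parity-check convention is one standard choice.

def hamming74_encode(s):
    """s: list of 4 source bits -> list of 7 transmitted bits."""
    s1, s2, s3, s4 = s
    return [s1, s2, s3, s4,
            (s1 + s2 + s3) % 2,
            (s2 + s3 + s4) % 2,
            (s1 + s3 + s4) % 2]

def hamming74_decode(r):
    """r: 7 received bits -> 4 decoded source bits (corrects any single flip)."""
    a = (r[0] + r[1] + r[2] + r[4]) % 2   # check on bits 1, 2, 3, 5
    b = (r[1] + r[2] + r[3] + r[5]) % 2   # check on bits 2, 3, 4, 6
    c = (r[0] + r[2] + r[3] + r[6]) % 2   # check on bits 1, 3, 4, 7
    # syndrome -> index of the single flipped bit (0-based), or None
    flip = {(1, 0, 1): 0, (1, 1, 0): 1, (1, 1, 1): 2, (0, 1, 1): 3,
            (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6}.get((a, b, c))
    r = list(r)
    if flip is not None:
        r[flip] ^= 1
    return r[:4]

def repetition3_decode(r):
    """Majority vote over three copies of one bit."""
    return 1 if sum(r) >= 2 else 0

if __name__ == "__main__":
    t = hamming74_encode([1, 0, 1, 1])
    t[2] ^= 1                             # the channel flips one bit
    print(hamming74_decode(t))            # -> [1, 0, 1, 1]
    print(repetition3_decode([1, 0, 1]))  # -> 1
```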

Entropy and data compression
* Entropy, conditional entropy, mutual information, Shannon information content. The idea of typicality and the use of typical sets for source coding. Shannon's source coding theorem. Codes for data compression. Uniquely decodeable codes and the Kraft-McMillan inequality. Completeness of a symbol code. Prefix codes. Huffman codes. Arithmetic coding.
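The entropy and symbol-code material lends itself to a short illustration. Below is a minimal sketch (not from the lectures) that computes the entropy of a small ensemble and builds a Huffman code for it using a heap; for the dyadic example distribution the expected code length equals the entropy, 1.75 bits.

```python
# Minimal sketch: entropy of an ensemble and a Huffman code built with a heap
# of (probability, tiebreaker, subtree) entries. Illustrative only.
import heapq
from math import log2

def entropy(p):
    """H(X) = sum_x p(x) * log2(1/p(x)), in bits."""
    return sum(px * log2(1.0 / px) for px in p.values() if px > 0)

def huffman(p):
    """p: dict symbol -> probability. Returns dict symbol -> codeword string."""
    heap = [(px, i, {x: ""}) for i, (x, px) in enumerate(p.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # merge the two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {x: "0" + w for x, w in c0.items()}
        merged.update({x: "1" + w for x, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

if __name__ == "__main__":
    p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman(p)
    L = sum(p[x] * len(code[x]) for x in p)   # expected codeword length
    print(entropy(p), L, code)                # H = 1.75 bits and L = 1.75 here
```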

Communication over noisy channels
* Definition of channel capacity. Capacity of binary symmetric channel; of binary erasure channel; of Z channel. Joint typicality, random codes, and Shannon's noisy channel coding theorem. Real channels and practical error-correcting codes. Hash codes.
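As a small worked illustration of the capacity definitions (a sketch using the standard formulas, not anything specific to the lectures): the binary symmetric channel has C = 1 - H2(f), the binary erasure channel C = 1 - f, and the Z channel's capacity can be found by a one-dimensional search over the input probability q = P(x=1), since I(X;Y) = H2(q(1-f)) - q H2(f).

```python
# Minimal sketch: capacities of the binary symmetric channel, binary erasure
# channel, and Z channel with flip/erasure probability f. Illustrative only.
from math import log2

def H2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def capacity_bsc(f):
    return 1.0 - H2(f)

def capacity_bec(f):
    return 1.0 - f

def capacity_z(f, steps=100000):
    # 1-D search over the input probability q = P(x=1)
    return max(H2(q * (1 - f)) - q * H2(f)
               for q in (i / steps for i in range(steps + 1)))

if __name__ == "__main__":
    f = 0.1
    print(capacity_bsc(f), capacity_bec(f), capacity_z(f))
    # roughly 0.53, 0.90 and 0.76 bits per channel use
```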

Statistical inference, data modelling and pattern recognition
* The likelihood function and Bayes' theorem. Clustering as an example.
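As a stand-in for the clustering example in this part of the course, here is a minimal K-means sketch in Python, assuming 2-D data (the course's own treatment, e.g. soft K-means, is richer than this).

```python
# Minimal K-means sketch, assuming 2-D points. Illustrative only: assign each
# point to its nearest mean, move each mean to the average of its points, repeat.
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    means = rng.sample(points, k)                 # initialise from the data
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x, y in points:                       # assignment step
            d = [(x - mx) ** 2 + (y - my) ** 2 for mx, my in means]
            clusters[d.index(min(d))].append((x, y))
        for j, c in enumerate(clusters):          # update step
            if c:
                means[j] = (sum(x for x, _ in c) / len(c),
                            sum(y for _, y in c) / len(c))
    return means

if __name__ == "__main__":
    pts = [(0.1, 0.2), (0.0, -0.1), (0.2, 0.0),   # one cloud near (0, 0)
           (5.1, 4.9), (4.8, 5.2), (5.0, 5.0)]    # another near (5, 5)
    print(kmeans(pts, k=2))
```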

Approximation of probability distributions
* Laplace's method. (Approximation of probability distributions by Gaussian distributions.)
* Monte Carlo methods: Importance sampling, rejection sampling, Gibbs sampling, Metropolis method. (Slice sampling, Hybrid Monte Carlo, Overrelaxation, exact sampling)
* Variational methods and mean field theory. Ising models.
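To connect the Monte Carlo and Ising-model bullets above, here is a minimal single-spin-flip Metropolis sketch for a small 2-D Ising model with periodic boundaries; the lattice size, temperature and sweep count are illustrative assumptions, not values from the lectures.

```python
# Minimal sketch: Metropolis sampling of a 2-D Ising model (J = 1, periodic
# boundaries). Parameters are illustrative assumptions.
import math
import random

def metropolis_ising(L=16, beta=0.4407, sweeps=200, seed=0):
    rng = random.Random(seed)
    spin = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps):
        for _ in range(L * L):                    # one sweep = L*L proposals
            i, j = rng.randrange(L), rng.randrange(L)
            nn = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j] +
                  spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
            dE = 2 * spin[i][j] * nn              # energy change if we flip (i, j)
            # accept with probability min(1, exp(-beta * dE))
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                spin[i][j] *= -1
    return sum(sum(row) for row in spin) / (L * L)   # magnetisation per spin

if __name__ == "__main__":
    # beta chosen near the critical value ln(1 + sqrt(2)) / 2 ~= 0.4407
    print(metropolis_ising())
```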

Neural networks and content-addressable memories
* The Hopfield network.
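A minimal Hopfield-network sketch (not MacKay's own code): binary (+1/-1) patterns are stored with the Hebb rule, then recalled from a corrupted cue by asynchronous threshold updates.

```python
# Minimal Hopfield-network sketch. Illustrative only: Hebbian storage with a
# zero diagonal, then asynchronous recall from a noisy cue.
import random

def train(patterns):
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += x[i] * x[j] / n
    return W

def recall(W, x, sweeps=10, seed=0):
    rng = random.Random(seed)
    x = list(x)
    n = len(x)
    for _ in range(sweeps):
        for i in rng.sample(range(n), n):         # asynchronous updates
            a = sum(W[i][j] * x[j] for j in range(n))
            x[i] = 1 if a >= 0 else -1
    return x

if __name__ == "__main__":
    stored = [1, 1, 1, 1, -1, -1, -1, -1]
    noisy = [1, -1, 1, 1, -1, -1, 1, -1]          # two bits flipped
    W = train([stored])
    print(recall(W, noisy) == stored)             # -> True
```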

Dates:
  • Free schedule
Course properties:
  • Language: English
