Infinite surprise – the iridescent personality of Kullback-Leibler divergence

Kullback-Leibler divergence is not just something used to train variational autoencoders or Bayesian networks (and not just a hard-to-pronounce name, either). It is a fundamental concept in information theory, put to use in a vast range of applications. Most interestingly, it is not always about constraint, regularization, or compression. Quite the contrary: sometimes it is about novelty, discovery, and surprise.
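As a reference point before we go further (this is the standard textbook definition, not anything specific to one application), the Kullback-Leibler divergence between two discrete distributions p and q can be written as

\[
D_{\mathrm{KL}}(p \parallel q) \;=\; \sum_x p(x)\,\log \frac{p(x)}{q(x)},
\]

that is, the expected extra surprise, in log units, incurred when outcomes that actually follow p are described using q instead.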
