Half-bayesianism

De causa computatrorum


The reasonable power of the KL divergence

From intuition to deterministic inference

When doing Bayesian inference, or any other kind of inference, chances are you have heard of the KL divergence – short for Kullback–Leibler divergence. This quantity is pervasive in machine learning and artificial intelligence, and in this post I would like to explore with you, the reader, some reasons why this is so, especially for probabilistic inference. Eventually we will find ourselves dealing with Variational Inference and, with a little more patience, with Expectation Propagation.
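
As a quick reminder of the quantity in question, the KL divergence between two distributions q and p is usually written as

\[
\mathrm{KL}(q \,\|\, p) \;=\; \int q(x) \log \frac{q(x)}{p(x)} \, dx \;=\; \mathbb{E}_{x \sim q}\!\left[\log \frac{q(x)}{p(x)}\right],
\]

which is non-negative and equals zero only when q and p agree (almost everywhere).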

Read more...

My recommended research methodology

A few pieces of advice on how to organize a research effort

Fact is: research is a complex process. Done right, it is an art; done poorly, it can have far-reaching consequences across multiple dimensions and drag one into a hideous spiral of psychological issues. Right! Let's get down to business. We have all heard people talk about the importance of time management, yet, much like proper civic education, the subject is rarely discussed in any detail or practiced during formal training.

Read more...