Statistical Inference


Romain Brasselet



The course is organized in two weeks of two-hour lectures. The first week will open with an introduction to probability (aimed at biologists), focusing on its role in inference, as emphasized by Jaynes. It will discuss standard probability distributions (normal, binomial, Poisson) and how to characterize them. It will then move on to Bayes' theorem and inference (maximum likelihood, Fisher information, maximum a posteriori, predictions and Bayesian predictions). Finally, depending on demand, it will address a selection of these topics:

- correlation measures (Pearson, Spearman, Kendall, cross-correlations, partial correlations).

- linear regression (ordinary and Bayesian).

- how to perform a PCA and what exactly it does.

- permutation tests and bootstrap techniques.

- base rate fallacy, family-wise error rates and how to correct them (Bonferroni, Benjamini-Hochberg).
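As a small taste of the inference material above, here is a minimal sketch (in Python, with hypothetical data of 7 successes in 10 trials and an illustrative Beta(2, 2) prior) contrasting a maximum likelihood estimate with a maximum a posteriori estimate for a binomial proportion:

```python
# Hypothetical data: k successes in n Bernoulli trials
n, k = 10, 7

# Maximum likelihood estimate of the success probability
p_mle = k / n  # 0.7

# With a Beta(2, 2) prior, conjugacy gives a Beta(2 + k, 2 + n - k) posterior
a, b = 2 + k, 2 + n - k

# Maximum a posteriori estimate: the mode of the Beta posterior
p_map = (a - 1) / (a + b - 2)  # 8/12 ≈ 0.667, pulled toward the prior mean 0.5
```

The MAP estimate is shrunk toward the prior mean relative to the MLE, which is the typical effect of a weakly informative prior with small samples.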

The second week of lectures will be dedicated to Information Theory. It will introduce the concepts of entropy and mutual information, how to use them, and how they quantify and generalize most of the concepts seen in the first week. Depending on demand, the course will also introduce Maximum Entropy models and their link to Bayesian inference, Granger causality formalized through Transfer Entropy, and higher-level concepts such as synergy and redundancy.

For the dates on which the lectures take place, please consult the calendar.