
Fifteen Eighty Four

Academic perspectives from Cambridge University Press

5 Mar 2025

Creative use of prior, likelihood and posterior distributions to develop dependence models using hierarchical structures

Luis E. Nieto-Barajas

Bayes’ Theorem started as a way of obtaining conditional probabilities via the reversed conditionals, and was thus called the law of inverse probability. Bayesian statistical theory, however, uses it as a way of updating prior beliefs associated with uncertain events or quantities. It is common to describe the theorem in words as follows: the posterior is obtained as the product of the likelihood and the prior, divided by the prior predictive.
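In symbols (the notation here is ours, for illustration), with prior π(θ), likelihood f(x | θ) and prior predictive m(x), the theorem reads

\[
\pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{m(x)},
\qquad
m(x) = \int f(x \mid \theta)\,\pi(\theta)\,d\theta .
\]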

However, there is a creative way of using these three (four) components, prior, likelihood, posterior (and predictive), to create dependence models, by simply assigning the corresponding probability law to different (possibly latent) variables. A nice consequence of constructing dependence models in this way is that the marginal distributions for the variables of interest are invariant: the resulting variables are identically distributed but not independent, they are dependent.

For instance, let us consider the simplest dependence construction to create an exchangeable sequence. There is a common father with a particular probability law and many children whose probability law depends on the common father. After marginalising the law of the father we get a dependent (exchangeable) sequence of children with the same marginal distribution. Let us now add a second ancestor, say the grandfather, with a single son (the father), and the latter with many children. If we use the prior as the probability law of the grandfather, the likelihood as the probability law of the father, and the posterior as the probability law of each of the children, then we also get an exchangeable sequence of children whose marginal probability law is the same as the law of the grandfather.
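As a concrete sketch of this grandfather-father-children construction, the following small simulation uses a conjugate Gamma prior with a Poisson likelihood (this particular pair, and all the names and values below, are illustrative choices of ours, not taken from the book): the grandfather is drawn from the Gamma prior, the father from the Poisson likelihood given the grandfather, and each child from the resulting Gamma posterior. Marginally every child is again Gamma distributed with the grandfather's parameters, while siblings are positively correlated through the shared father.

import numpy as np

rng = np.random.default_rng(0)
a, b = 3.0, 2.0            # hyperparameters of the Gamma(a, b) prior (illustrative values)
n_children, n_sims = 5, 50_000

children = np.empty((n_sims, n_children))
for s in range(n_sims):
    theta = rng.gamma(a, 1.0 / b)          # grandfather ~ prior Gamma(a, b)
    x = rng.poisson(theta)                 # father | grandfather ~ likelihood Poisson(theta)
    # children | father ~ posterior Gamma(a + x, b + 1), conditionally i.i.d. given the father
    children[s] = rng.gamma(a + x, 1.0 / (b + 1), size=n_children)

# The marginal law of each child should match the grandfather's Gamma(a, b) prior
print("child mean:", children[:, 0].mean(), "vs prior mean:", a / b)
print("child variance:", children[:, 0].var(), "vs prior variance:", a / b**2)
# Siblings are dependent: their correlation is positive
print("sibling correlation:", np.corrcoef(children[:, 0], children[:, 1])[0, 1])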

In the previous context we can fix any desirable law for the grandfather (and hence for the children) and use an arbitrary law for the father to do the trick. Bayes’ Theorem is then used to obtain the corresponding posterior.

Other types of dependent sequences can also be constructed using the same ideas, based on the three building blocks: prior, likelihood and posterior. These constructions include Markov, moving average, seasonal and spatial models. All of these ideas are carefully reviewed in the book “Dependence Models via Hierarchical Structures” and are illustrated with several examples and applications.
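To give a flavour of how the same blocks can produce a Markov-type model (a sketch in our own notation, not a quotation from the book), one can alternate the likelihood and the posterior along a chain:

\[
y_1 \sim \pi(\cdot), \qquad x_t \mid y_t \sim f(\cdot \mid y_t), \qquad y_{t+1} \mid x_t \sim \pi(\cdot \mid x_t).
\]

Each transition passes first through the likelihood and then through the posterior, so if y_t has law π then so does y_{t+1}: the result is a Markov chain whose marginal distribution stays invariant, exactly as in the exchangeable case.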

I started developing this kind of dependence model during my PhD and have continued developing more models over 20 years of academic career. The whole idea of the book started during my second sabbatical in the Department of Statistics at the University of Oxford in the academic year 2015-2015. I was invited to deliver a graduate lecture and decided to speak about some of my most recent findings in dependence models. However, it was not until my third sabbatical leave in the Department of Statistical Sciences at the University of Toronto, in the academic year 2023-2024, that I was invited to deliver a series of research seminars for postgraduate students. It was then that I actually sat down and started writing the book.

Title: Dependence Models via Hierarchical Structures

Author: Luis E. Nieto-Barajas

ISBN: 9781009584111

About The Author

Luis E. Nieto-Barajas

Luis E. Nieto-Barajas is Full Professor and Head of the Department of Statistics at the Instituto Tecnológico Autónomo de México (ITAM). He was previously President of the Mexic...

