Advisor: Mathias Drton
In this talk, two aspects of statistical modeling with latent factors will be addressed:
1) Confirmatory factor analysis models many observable variables, each as a linear combination of a few latent factors, via a fixed factor loading matrix. It is known that imposing lower triangular constraints on the factor loading matrix makes the model parametrization identifiable. In the Bayesian framework, this can be achieved by assigning independent truncated normal priors to the diagonal entries of the matrix and normal priors to its strictly lower triangular entries. Such a prior, however, is vulnerable to reordering of the data, in the sense that the posterior distribution depends on how the random variables are arranged in the data matrix. We introduce an alternative prior that makes the posterior invariant to such reordering, while maintaining identifiability of the model parametrization.
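The triangular prior construction above can be sketched as follows. This is a minimal illustration, not the prior proposed in the talk: the function name, dimensions, and unit prior scales are assumptions, and the positive diagonal is drawn from a half-normal, i.e. a normal truncated to the positive axis.

```python
import numpy as np

def sample_loading_matrix(p, k, rng):
    """Draw a p x k factor loading matrix under the classical
    identifiability constraint: entries above the diagonal are zero,
    diagonal entries are positive, and strictly lower triangular
    entries are unrestricted.  Illustrative priors: standard normal
    for the lower triangle, standard normal truncated to (0, inf)
    (a half-normal) for the diagonal."""
    L = np.zeros((p, k))
    for i in range(p):
        # strictly lower triangular entries: normal prior
        for j in range(min(i, k)):
            L[i, j] = rng.normal(0.0, 1.0)
        # diagonal entries: positivity enforced by truncation
        if i < k:
            L[i, i] = abs(rng.normal(0.0, 1.0))
        # entries with j > i stay fixed at zero
    return L

rng = np.random.default_rng(0)
L = sample_loading_matrix(p=6, k=3, rng=rng)
```

Because the zero pattern and sign constraints depend on how the variables are ordered, permuting the rows of the data matrix changes which entries are constrained, which is exactly the order dependence the abstract's alternative prior is designed to remove.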
2) Linear structural equation models express linear functional relationships between random variables, together with additional independent error terms. Such a model can be represented by a directed graph whose nodes represent the variables and whose edges represent the directions of influence. We will examine parameter identifiability in the setting where one latent factor influences all observable variables and the graphical representation is acyclic. Necessary and sufficient graphical conditions for identifiability will be introduced.
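The model class in item 2 can be sketched by simulation. In this hypothetical parametrization (not taken from the talk), `B[i, j]` is the coefficient of observed variable j in the equation for variable i, acyclicity makes `B` strictly lower triangular after a topological ordering, and `gamma` holds the latent-to-observed coefficients of the single latent factor:

```python
import numpy as np

def simulate_latent_sem(B, gamma, n, rng):
    """Simulate n samples from a linear SEM x = B x + gamma * h + eps,
    where h is a single latent factor influencing every observed
    variable and eps are independent standard normal errors.
    Solving gives x = (I - B)^{-1} (gamma * h + eps); with samples
    stored as rows, X = (h gamma^T + E) (I - B)^{-T}."""
    p = B.shape[0]
    h = rng.normal(size=n)            # latent factor, one draw per sample
    E = rng.normal(size=(n, p))       # independent error terms
    return (np.outer(h, gamma) + E) @ np.linalg.inv(np.eye(p) - B).T

rng = np.random.default_rng(1)
p = 4
B = np.zeros((p, p))
B[1, 0], B[2, 1], B[3, 2] = 0.5, -0.3, 0.8   # edges respect the ordering: acyclic
gamma = np.ones(p)                            # latent factor points to all observed nodes
X = simulate_latent_sem(B, gamma, n=500, rng=rng)
```

The identifiability question the talk addresses is when the edge coefficients in `B` and `gamma` can be recovered from the distribution of `X` alone, with `h` unobserved; the graphical conditions characterize exactly which acyclic structures permit this.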