Population structure is a core concept in population genetics, capturing the substructure inherent in a collection of genotypes sampled from diverse locations. Given data from multiple loci, measures of informativeness aim to quantify the potential for accurate ancestry inference or, alternatively, to extract a 'population signal' and provide some sense of data 'clusteredness'. Motivated by Claude Shannon's axiomatic approach to deriving a measure of information in communication systems based on entropy, I first identify a set of intuitively justifiable criteria that any such quantitative information measure should satisfy, where the notion of communication noise can be made analogous to sampling noise. I show that standard information-theoretic measures such as mutual information or relative entropy cannot satisfactorily account for this sense of information, especially when knowledge of population size is considered, necessitating borrowing notions from statistical learning.
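As a rough illustration of the kind of measure the abstract refers to (my own toy sketch, not the speaker's method), one can compute the mutual information between a population label and a single biallelic locus; the allele frequencies below are invented for the example.

```python
import math

# Hypothetical allele frequencies for allele 'A' in two equally likely
# source populations (illustrative values only).
freqs = {"pop1": 0.8, "pop2": 0.3}
p_pop = 1.0 / len(freqs)

# Joint distribution p(allele, population).
joint = {}
for pop, f in freqs.items():
    joint[("A", pop)] = p_pop * f
    joint[("a", pop)] = p_pop * (1.0 - f)

# Marginal allele probabilities p(allele).
p_allele = {}
for (allele, _), p in joint.items():
    p_allele[allele] = p_allele.get(allele, 0.0) + p

# Mutual information I(allele; population) in bits:
# sum over cells of p * log2(p / (p(allele) * p(pop))).
mi = sum(
    p * math.log2(p / (p_allele[allele] * p_pop))
    for (allele, pop), p in joint.items()
    if p > 0
)
print(round(mi, 4))
```

A locus whose allele frequencies differ more sharply between populations yields higher mutual information; the abstract's point is that such standard measures, taken alone, fall short once sampling noise and population size enter the picture.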
Omri Tal is currently a research associate (postdoc) at the Max Planck Institute for Mathematics in the Sciences, Leipzig, and was previously a postdoc at CPNSS, London School of Economics (LSE). He received his PhD in 2012 from Tel Aviv University, School of Philosophy and The Cohn Institute for the History and Philosophy of Science and Ideas, under the supervision of Prof. Eva Jablonka ("Heritability, variation and classification: towards a new formalization of biological concepts"). In his research he takes a quantitative approach to the interpretation of central biological concepts. Using formal tools from probability, statistical learning and information theory, together with conceptual analysis, he aims to produce novel formulations that may enable a deeper appreciation, and also assist in the application, of otherwise technical concepts. He has worked on heritability and epigenetic information, and is currently focusing on capturing the notion of 'population structure' from an information-theoretic perspective. http://omrital.altervista.org/