In this talk I will demonstrate deep links between a core information-theoretic concept and features of genetic data. In essence, long stretches of genetic variants from a population may be captured as 'typical sequences' of a nonstationary source. I will introduce the concepts of typical genotypes, population entropy rate, and mutual typicality, and their relation to the asymptotic equipartition property. The asymptotic interplay of mutually typical genotypes and their geometric properties in high dimensions will provide motivation for typicality-based population assignment schemes. I will then highlight the surprising resilience of such schemes to noise arising from small training samples. Finally, I will discuss the prospects of further analogies and applications of information-theoretic concepts in genetics.
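
As a rough illustration of the kind of scheme alluded to above (a minimal sketch, not the speaker's actual method), the following Python snippet assumes independent biallelic loci coded 0/1 and hypothetical population allele-frequency vectors; it assigns a genotype to the population whose entropy rate its empirical log-likelihood rate best matches, i.e. the source with respect to which the genotype looks most 'typical' in the AEP sense.

import numpy as np

def entropy_rate(freqs):
    # Average per-site entropy (bits) of independent biallelic loci with allele frequencies `freqs`.
    p = np.clip(freqs, 1e-12, 1 - 1e-12)
    return float(np.mean(-(p * np.log2(p) + (1 - p) * np.log2(1 - p))))

def neg_log_likelihood_rate(genotype, freqs):
    # Empirical -(1/n) log2-likelihood of a 0/1 genotype vector under allele frequencies `freqs`.
    p = np.clip(freqs, 1e-12, 1 - 1e-12)
    ll = genotype * np.log2(p) + (1 - genotype) * np.log2(1 - p)
    return float(-np.mean(ll))

def assign_by_typicality(genotype, pop_freqs):
    # Assign to the population whose entropy rate is closest to the genotype's likelihood rate,
    # i.e. the source for which the sequence is (approximately) typical.
    scores = {name: abs(neg_log_likelihood_rate(genotype, f) - entropy_rate(f))
              for name, f in pop_freqs.items()}
    return min(scores, key=scores.get)

# Toy example with two hypothetical populations differing in allele-frequency profiles.
rng = np.random.default_rng(0)
n_loci = 5000
freqs_A = rng.uniform(0.1, 0.9, n_loci)
freqs_B = np.clip(freqs_A + rng.normal(0, 0.1, n_loci), 0.05, 0.95)
sample_from_A = (rng.random(n_loci) < freqs_A).astype(float)
print(assign_by_typicality(sample_from_A, {"A": freqs_A, "B": freqs_B}))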