The coding theorem from algorithmic information theory (AIT) - which should be much more widely taught in Physics! - suggests that many processes in nature may be highly biased towards simple outputs. Here "simple" means highly compressible, or more formally, having relatively low Kolmogorov complexity. I will explore applications to biological evolution, where the coding theorem implies an exponential bias towards outcomes with higher symmetry, and to deep learning neural networks, where the coding theorem predicts an Occam's-razor-like bias that may explain why these highly overparameterised systems work so well.
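For concreteness, here is the textbook statement (due to Levin) that underlies these claims; the notation is mine, not taken from the abstract. If a universal prefix Turing machine U is run on uniformly random input bits, the probability P(x) that it halts with output x satisfies

\[
P(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-|p|} \;=\; 2^{-K(x) + O(1)},
\qquad\text{equivalently}\qquad
-\log_2 P(x) \;=\; K(x) + O(1),
\]

where K(x) is the prefix Kolmogorov complexity of x. An output that is k bits simpler is therefore roughly 2^k times more probable, which is the exponential bias towards simple (compressible) outputs referred to above. Since K(x) itself is uncomputable, applications to concrete input-output maps such as genotype-phenotype maps or neural network parameter-function maps presumably rely on approximate versions of this relation.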