A Briefing of Cross-Entropy and KL Divergence

Cross-entropy and KL divergence both measure how different two probability distributions are. In information theory, the KL divergence KL(p‖q) measures the information loss, or the extra information required, when a distribution q is used to approximate the true distribution p.
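To make the relationship concrete, here is a minimal sketch in Python (assuming NumPy and natural-log units, i.e. nats) that computes cross-entropy and KL divergence for two small discrete distributions and checks the identity KL(p‖q) = H(p, q) − H(p); the example distributions are made up for illustration:

```python
import numpy as np

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i * log(p_i / q_i): the extra nats paid
    for encoding samples from p with a code optimized for q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

p = [0.5, 0.25, 0.25]   # "true" distribution
q = [0.4, 0.3, 0.3]     # approximating distribution

print(cross_entropy(p, q))                        # H(p, q)
print(kl_divergence(p, q))                        # KL(p || q)
print(cross_entropy(p, q) - cross_entropy(p, p))  # H(p, q) - H(p), equals KL(p || q)
```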

July 19, 2022 · 3 min · 530 words · Carter Y. CHENG