A Brief Overview of Cross-Entropy and KL Divergence
Cross-entropy and KL divergence both measure how different two probability distributions are. In information theory, the KL divergence KL(p ‖ q) measures the information lost, i.e. the extra bits needed, when using a distribution q to approximate the true distribution p.
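As a minimal sketch of these quantities, the snippet below computes cross-entropy and KL divergence for two small discrete distributions (the example distributions p and q are hypothetical and assumed strictly positive everywhere), and checks the standard identity H(p, q) = H(p) + KL(p ‖ q).

```python
import numpy as np

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum_x p(x) * log q(x), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q))

def kl_divergence(p, q):
    """KL divergence KL(p || q) = sum_x p(x) * log(p(x) / q(x)), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

# Hypothetical example distributions over three outcomes (strictly positive).
p = [0.7, 0.2, 0.1]   # "true" distribution
q = [0.5, 0.3, 0.2]   # approximating distribution

print(f"H(p, q)        = {cross_entropy(p, q):.4f}")
print(f"KL(p || q)     = {kl_divergence(p, q):.4f}")

# Sanity check of the identity H(p, q) = H(p) + KL(p || q),
# where H(p) is the entropy of p (cross-entropy of p with itself).
print(f"H(p) + KL      = {cross_entropy(p, p) + kl_divergence(p, q):.4f}")
```

Because KL(p ‖ q) is always non-negative, the cross-entropy H(p, q) is minimized exactly when q matches p, which is why minimizing cross-entropy loss drives a model's predicted distribution toward the data distribution.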