# Gamma vs Normal KL divergence? Got you covered.

Have you ever searched for a calculation on the Internet, only to be disappointed when you couldn’t find it?

Yeah, me too. I was trying to find the KL divergence between a Gamma and a Normal distribution, but eventually decided to compute it on my own.

So, if we consider the Gamma distribution ${Q=\mathrm{Gamma}(a,b)}$, with shape ${a}$ and rate ${b}$, and the Normal distribution ${P=\mathcal{N}(\mu,\sigma^2)}$, then the KL divergence of $Q$ with respect to $P$ is:

$$
D_{KL}(Q\|P) = -\left( H(Q) + \int_{0}^{\infty} Q(x)\log P(x)\,dx \right). \tag{1}
$$

The ${H(Q)}$ term is the entropy of ${Q}$,

$$
H(Q) = a - \log(b) + \log\Gamma(a) + (1-a)\psi(a), \tag{2}
$$
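Equation (2) is easy to sanity-check numerically. A quick sketch in Python, assuming SciPy is available (`gamma_entropy` is my own helper name; note that SciPy parameterizes the Gamma distribution by shape and *scale* ${=1/b}$, not rate):

```python
import numpy as np
from scipy.special import gammaln, digamma
from scipy.stats import gamma

def gamma_entropy(a, b):
    """Entropy of Gamma(shape=a, rate=b), per equation (2)."""
    return a - np.log(b) + gammaln(a) + (1.0 - a) * digamma(a)

a, b = 3.0, 2.0  # arbitrary example shape and rate
print(gamma_entropy(a, b))              # closed form from equation (2)
print(gamma.entropy(a, scale=1.0 / b))  # SciPy's value, scale = 1/rate
```

The two printed values should agree to machine precision.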

with ${\Gamma}$ being the Gamma function and ${\psi}$ being the digamma function. Also,

$$
\begin{aligned}
\int_{0}^{\infty} Q(x)\log P(x)\,dx = -\frac{1}{2\sigma^2 b^2 \Gamma(a)} &\left( \Gamma(a+2) - 2\mu b\,\Gamma(a+1) + \mu^2 b^2 \Gamma(a) \right) \\
&- \log\left(\sigma\sqrt{2\pi}\right). \tag{3}
\end{aligned}
$$

Substituting (2) and (3) into (1) gives the KL divergence between the two distributions.
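Putting everything together, here is a sketch of the closed form in Python, with a Monte Carlo sanity check (function and variable names are my own; assumes SciPy). The Gamma-function ratios in (3) simplify via ${\Gamma(a+1)=a\Gamma(a)}$ to the moments ${E[x]=a/b}$ and ${E[x^2]=a(a+1)/b^2}$:

```python
import numpy as np
from scipy.special import gammaln, digamma
from scipy.stats import gamma, norm

def kl_gamma_normal(a, b, mu, sigma):
    """KL( Gamma(shape=a, rate=b) || N(mu, sigma^2) ), per equations (1)-(3)."""
    # Equation (2): entropy of the Gamma distribution.
    entropy_q = a - np.log(b) + gammaln(a) + (1.0 - a) * digamma(a)
    # Equation (3), written with the Gamma moments E[x] = a/b, E[x^2] = a(a+1)/b^2.
    e_x = a / b
    e_x2 = a * (a + 1.0) / b**2
    cross = (-(e_x2 - 2.0 * mu * e_x + mu**2) / (2.0 * sigma**2)
             - np.log(sigma * np.sqrt(2.0 * np.pi)))
    # Equation (1).
    return -(entropy_q + cross)

# Monte Carlo sanity check: KL = E_Q[ log q(x) - log p(x) ].
a, b, mu, sigma = 2.5, 1.5, 1.0, 2.0  # arbitrary example parameters
x = gamma.rvs(a, scale=1.0 / b, size=1_000_000, random_state=0)
mc = np.mean(gamma.logpdf(x, a, scale=1.0 / b) - norm.logpdf(x, loc=mu, scale=sigma))
print(kl_gamma_normal(a, b, mu, sigma), mc)
```

The closed-form value and the Monte Carlo estimate should agree to a few decimal places for a sample this large.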