Going moment-picking

Yesterday, while trying to build a counter-example for a research problem, I started playing with the idea of designing a random variable not by picking a vanilla distribution, but by choosing its moments myself. That is, if I pick the moments, can I find a distribution that has them? More formally, if I provide a sequence {m_n}, is there a probability measure {P} such that

\displaystyle m_n = \int x^n dP(x)? \ \ \ \ \ (1)
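To see (1) in action in the easy direction, here is a minimal numerical sketch (my own, with an arbitrary choice of measure): fix a density and integrate {x^n} against it. The Beta(2, 3) density on {[0,1]} is picked purely for illustration.

# Sketch: compute the first few moments m_n = \int x^n dP(x) numerically
# for a hand-picked measure P (here, an illustrative Beta(2, 3) on [0, 1]).
from scipy import integrate, stats

density = stats.beta(2, 3).pdf  # illustrative choice of P, not from the post

for n in range(5):
    m_n, _ = integrate.quad(lambda x: x**n * density(x), 0.0, 1.0)
    print(f"m_{n} = {m_n:.6f}")

The interesting direction, of course, is the reverse one: starting from the numbers {m_n} and asking for {P}.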

 

Turns out this is quite the rabbit hole: not only is this an old problem, it has already been attacked by mathematicians like Hausdorff and Stieltjes. It is called the moment problem and, in full generality, deals with constructing Borel measures with prescribed moments.

For instance, the Hausdorff moment problem is exactly like (1), but the measure needs to be supported on {[0,1]}. That is,

\displaystyle m_n = \int_{0}^{1} x^n dP(x). \ \ \ \ \ (2)
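As a quick sanity check (my own toy example), take {P} to be the uniform measure on {[0,1]}; then

\displaystyle m_n = \int_{0}^{1} x^n dx = \frac{1}{n+1},

so the sequence 1, 1/2, 1/3, ... is certainly a Hausdorff moment sequence. The real question is which sequences arise this way.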

 

Hausdorff resolved this in 1921: a sequence {m_n} consists of the moments of a (Borel) measure on {[0,1]} if and only if, for every {n,k\geq 0}, it satisfies

\displaystyle (-1)^k \Delta^{k}m_n\geq 0. \ \ \ \ \ (3)

 

Here {\Delta} is the (forward) difference operator; think of it as a discrete version of the derivative of a function:

\displaystyle \Delta m_n = m_{n+1}-m_{n}, \qquad \Delta^2 m_n = \Delta(\Delta m_n) = \Delta(m_{n+1}-m_{n}) = m_{n+2}-2m_{n+1}+m_{n}, \qquad \mathrm{etc.}

Such a sequence is called completely monotonic, and this idea can be generalized to functions.
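One way to see why (3) is necessary (a remark of mine, not part of the theorem's statement): writing out {\Delta^k m_n} as an alternating sum and substituting the integrals gives

\displaystyle (-1)^k \Delta^k m_n = \int_{0}^{1} x^n (1-x)^k dP(x) \geq 0,

since the integrand is nonnegative on {[0,1]}; the hard part of Hausdorff's theorem is sufficiency. The condition is also easy to test numerically on a finite prefix of a candidate sequence. Here is a minimal sketch, assuming the forward difference {\Delta} defined above; the function name and the example sequences are my own.

# Check (-1)^k Delta^k m_n >= 0 for every n, k visible in a finite prefix
# of a candidate moment sequence (Hausdorff's necessary condition).
import numpy as np

def looks_completely_monotone(m, tol=1e-12):
    d = np.asarray(m, dtype=float)
    for k in range(len(m)):
        # d currently holds Delta^k m_n for n = 0, ..., len(d) - 1
        if np.any((-1) ** k * d < -tol):
            return False
        d = np.diff(d)  # apply Delta once more
    return True

# Moments of the uniform measure on [0,1], m_n = 1/(n+1): should pass.
print(looks_completely_monotone([1.0 / (n + 1) for n in range(12)]))  # True

# A made-up sequence with m_2 > m_1, impossible for a measure on [0,1].
print(looks_completely_monotone([1.0, 0.9, 0.95]))  # False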
