1 Complements to Shibu et al. (2023a)

In a recent article in this journal, Shibu et al. (2023a) “derive exact expressions for inverse moments for any positive integer valued random variable” (X, with probability mass function [p.m.f.] \(p_1,p_2,...\)) using “an integral representation for inverse moments involving the probability generating function” (p.g.f., \(G(s)= \mathbb {E}(s^X)=\sum _{x=1}^\infty p_x s^x\)). For the first inverse moment, \(\mu _{-1} \equiv {\mathbb E}(1/X)\), they show that \(\mu _{-1} = \lim _{t \rightarrow \infty } \{tH(t)\}\), where \(H(t) \equiv \int _0^1 G(s^t)\,ds\). I would first like to note a different version of this result, namely,

$$\begin{aligned} \mu _{-1} = \int _0^1 s^{-1} G(s)\,ds.\end{aligned}$$
(1)

To see this, note that the right-hand side equals

$$\sum _{x=1}^\infty p_x \int _0^1 s^{x-1}\,ds = \sum _{x=1}^\infty p_x /x = {\mathbb E}(1/X).$$

Note also that, for \(A>0\), by a similar argument,

$$\begin{aligned} \mathbb {E}\{1/(X+A)\} = \int _0^1 s^{A-1} G(s)\,ds.\end{aligned}$$
(2)
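As a quick numerical sanity check of (1) and (2), not part of the original development, the following Python sketch compares both integral representations with direct (truncated) summation over the p.m.f. of a geometric distribution on \(\{1,2,...\}\); the values of p and A are illustrative choices.

```python
# Numerical sanity check of (1) and (2) for the geometric distribution
# on {1, 2, ...}; a sketch with illustrative parameter values.
from scipy.integrate import quad

p = 0.3
G = lambda s: p * s / (1 - (1 - p) * s)      # geometric p.g.f.
pmf = lambda x: p * (1 - p) ** (x - 1)       # geometric p.m.f.

# Left-hand sides by (truncated) direct summation.
A = 2.5
mu_m1 = sum(pmf(x) / x for x in range(1, 5_000))
ex_A = sum(pmf(x) / (x + A) for x in range(1, 5_000))

# Right-hand sides: the integral representations (1) and (2).
rhs1, _ = quad(lambda s: G(s) / s, 0, 1)
rhs2, _ = quad(lambda s: s ** (A - 1) * G(s), 0, 1)

print(mu_m1, rhs1)   # both approximately 0.51599 = p(-log p)/(1-p)
print(ex_A, rhs2)    # agree to quadrature/summation accuracy
```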

Example expressions given by Shibu et al. (2023a) for \(\mu _{-1}\) for the geometric and negative binomial distributions are correct, but both can be evaluated beyond the integral forms given. The first reduces to \(p(-\log p)/(1-p)\). Using

$$\begin{aligned} \int _0^1 \frac{p^m s^{m-1}}{\{1-(1-p)s\}^m}\,ds&= \left( \frac{p}{1-p}\right) ^m\int _p^1 \frac{(1-w)^{m-1}}{w^m} \,dw \\&= \left( \frac{p}{1-p}\right) ^m \sum _{j=0}^{m-1}\binom{m-1}{j}(-1)^{m-1-j}\int _p^1 w^{-j-1} \,dw, \end{aligned}$$

(where the first equality follows from the substitution \(w=1-(1-p)s\) and the second from a binomial expansion), the second becomes

$$\left( \frac{p}{1-p}\right) ^m\left[ (-1)^{m-1} (-\log p)+ \sum _{j=1}^{m-1} \frac{1}{j} \binom{m-1}{j}(-1)^{m-1-j} \,\{(1/p)^{j}-1\}\right];$$

here, the integer m is both the index and the starting point of the negative binomial distribution. When \(m=1\), the latter expression reduces to the former since the sum then vanishes.
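As a hedged check on this closed form, the sketch below compares it with direct summation over the negative binomial p.m.f. (number of trials to the m-th success); p and m are illustrative choices.

```python
# Check of the closed-form negative binomial first inverse moment
# against direct summation; p and m are illustrative choices.
from math import comb, log

def mu_m1_closed(p, m):
    total = (-1) ** (m - 1) * (-log(p))
    for j in range(1, m):
        total += comb(m - 1, j) * (-1) ** (m - 1 - j) * ((1 / p) ** j - 1) / j
    return (p / (1 - p)) ** m * total

def mu_m1_direct(p, m, nmax=2_000):
    # X = trials to the m-th success, supported on {m, m+1, ...}
    return sum(comb(x - 1, m - 1) * p ** m * (1 - p) ** (x - m) / x
               for x in range(m, nmax))

p, m = 0.4, 3
print(mu_m1_closed(p, m), mu_m1_direct(p, m))  # both approximately 0.1604
```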

Shibu et al. (2023a) also provide an expression for higher-order integer inverse moments, as a limiting value of an expression involving H(t) and lower-order integer inverse moments. To complement this with a corresponding direct integral expression, I note that Cressie et al. (1981) gave the formula

$$\begin{aligned} \mu _{-q} \equiv {\mathbb E}(1/X^q) = \frac{1}{\Upgamma (q)} \int _0^\infty t^{q-1}M(-t)\,dt, \end{aligned}$$
(3)

for \(q>0\), where M(t) is the moment generating function, given in terms of the p.g.f. by \(M(t) = G(e^{t})\). To see this, note that the right-hand side of (3) equals

$$\frac{1}{\Upgamma (q)} \sum _{x=1}^\infty p_x \int _0^\infty t^{q-1}e^{-xt}\,dt = \frac{1}{\Upgamma (q)} \sum _{x=1}^\infty p_x \Upgamma (q) /x^q = {\mathbb E}(1/X^q).$$

In terms of the p.g.f., therefore, we have

$$\begin{aligned} \mu _{-q} = \frac{1}{\Upgamma (q)} \int _0^\infty t^{q-1}G(e^{-t})\,dt = \frac{1}{\Upgamma (q)} \int _0^1 s^{-1}(-\log s)^{q-1}G(s)\,ds. \end{aligned}$$
(4)

Note that (4) reduces to (1) when \(q=1\).
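As a check on (4) beyond \(q=1\), the sketch below evaluates the second inverse moment of a zero-truncated Poisson distribution, whose p.g.f. is \((e^{\lambda s}-1)/(e^{\lambda }-1)\), both by the integral representation and by direct summation; the distribution and \(\lambda \) are illustrative choices.

```python
# Hedged numerical check of (4) with q = 2 for a zero-truncated
# Poisson distribution; lambda = 2 is an illustrative choice.
from math import exp, factorial, log
from scipy.integrate import quad

lam, q = 2.0, 2
G = lambda s: (exp(lam * s) - 1) / (exp(lam) - 1)   # zero-truncated Poisson p.g.f.

# Right-hand side of (4); Gamma(q) = Gamma(2) = 1.
rhs, _ = quad(lambda s: (-log(s)) ** (q - 1) * G(s) / s, 0, 1)

# Direct summation of E(1/X^q) over the p.m.f.
pmf = lambda x: exp(-lam) * lam ** x / (factorial(x) * (1 - exp(-lam)))
direct = sum(pmf(x) / x ** q for x in range(1, 100))

print(rhs, direct)  # agree to quadrature accuracy
```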

Example 1

For the geometric distribution (starting at 1) with p.g.f. \(G(s) = ps/\{1-(1-p)s\}\), (4) shows that

$$\begin{aligned} \mu _{-q} = \frac{p}{\Upgamma (q)} \int _0^1 \frac{(-\log s)^{q-1}}{1-(1-p)s}\,ds= \frac{p}{\Upgamma (q)} \int _0^\infty \frac{w^{q-1}}{e^w-(1-p)}\,dw. \end{aligned}$$
(5)

While \(\mu _{-1} = p(-\log p)/(1-p)\), as noted above, (5) shows that for \(q=2,3,...\),

$$\mu _{-q} = \frac{p\,\textrm{Li}_q(1-p)}{1-p},$$

where \(\textrm{Li}_q(1-p)\) is the polylogarithm function as given, for example, in NIST (2024, Section 25.12). This route yields the integral representation of the polylogarithm function; working directly from the p.m.f. instead, minor manipulation gives \(p/(1-p)\) times the series definition of the same function:

$$\mu _{-q} = \sum _{x=1}^\infty \frac{1}{x^q}\,p(1-p)^{x-1}=\frac{p}{1-p} \sum _{x=1}^\infty \frac{(1-p)^x}{x^q}.$$

The second inverse moment, consequently, involves the dilogarithm function.
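Numerically, the polylogarithm form can be checked against the series using mpmath's polylog; p and q below are illustrative choices.

```python
# Check of the polylogarithm expression for geometric inverse moments,
# using mpmath; p and q are illustrative choices.
from mpmath import polylog, nsum, inf

p, q = 0.3, 2
closed = p * polylog(q, 1 - p) / (1 - p)
series = nsum(lambda x: p * (1 - p) ** (x - 1) / x ** q, [1, inf])
print(closed, series)  # both approximately 0.3812
```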

Note that the general results above apply to any positive, discrete-valued random variable, not just integer-valued ones, via a mere notational change to the indexing of x.

2 Related Expressions, Particularly Inverse Factorial Moments

“When studying discrete distributions, it is often advantageous to use the factorial moments” (Johnson et al., 2005, p.52). This is because many discrete distributions involve factorial terms, usually arising from combinatorial considerations. So, in a discrete, integer-valued context, it might be expected that inverse factorial moments are easier to work with than inverse power moments. For \(\nu _{-n} \equiv \mathbb {E}[1/\{X(X+1)\cdots (X+n-1)\}]= \mathbb {E}\{(X-1)!/(X+n-1)!\}\), \(n=1,2,...,\) I earlier gave a closely related formula with an unfortunate typographical error (the power \(n-1\) is missing from the theorem in Jones, 1987), which I here correct to

$$\begin{aligned} \nu _{-n} = \frac{1}{(n-1)!} \int _0^1 s^{-1} (1-s)^{n-1} G(s)\,ds. \end{aligned}$$
(6)

To confirm this, observe that the right-hand side is

$$\begin{aligned} \frac{1}{(n-1)!} \sum _{x=1}^\infty p_x \int _0^1 s^{x-1}(1-s)^{n-1}\,ds&= \frac{1}{(n-1)!} \sum _{x=1}^\infty p_x B(x,n)\\&= \frac{1}{(n-1)!} \sum _{x=1}^\infty p_x \frac{(x-1)!\,(n-1)!}{(x+n-1)!}\\&= {\mathbb E}\{(X-1)!/(X+n-1)!\}; \end{aligned}$$

equation (6) too reduces to (1) when \(n=1\). As shown in Jones (1987), (6) can also be proved by expanding \(1/\{X(X+1) \cdots (X+n-1)\}\) in partial fractions and using (2). And there is also an integral representation for \(\nu _{-n}\) rather like (3) for \(\mu _{-q}\) but involving the factorial moment generating function \(F_X(y)=\mathbb {E}\{(1+y)^X\}=G(1+y)\) rather than the moment generating function. This representation is

$$\begin{aligned} \nu _{-n} = \frac{1}{(n-1)!} \int _0^1 y^{n-1} F_{X-1}(-y)\,dy \end{aligned}$$
(7)

(the corollary in Jones, 1987). By way of proof, the right-hand side of (7) is

$$ \frac{1}{(n-1)!} \sum _{x=1}^\infty p_x \int _0^1 y^{n-1}(1-y)^{x-1}\,dy = \frac{1}{(n-1)!} \sum _{x=1}^\infty p_x \int _0^1 s^{x-1}(1-s)^{n-1}\,ds$$

and the argument is completed as above.
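Both (6) and (7) can be confirmed numerically; the sketch below does so for the geometric distribution, using the simplification \(F_{X-1}(-y) = G(1-y)/(1-y) = p/\{p+(1-p)y\}\) available in that case, with illustrative values of p and n.

```python
# Sketch verifying (6) and (7) for the geometric distribution;
# p and n are illustrative choices.
from math import factorial
from scipy.integrate import quad

p, n = 0.3, 3
G = lambda s: p * s / (1 - (1 - p) * s)     # geometric p.g.f.

# (6): integral of s^{-1} (1-s)^{n-1} G(s) over (0, 1).
rhs6, _ = quad(lambda s: (1 - s) ** (n - 1) * G(s) / s, 0, 1)
rhs6 /= factorial(n - 1)

# (7): F_{X-1}(-y) = G(1-y)/(1-y) = p / (p + (1-p) y) for the geometric.
rhs7, _ = quad(lambda y: y ** (n - 1) * p / (p + (1 - p) * y), 0, 1)
rhs7 /= factorial(n - 1)

# Direct summation of E{(X-1)!/(X+n-1)!}, i.e. of 1/{X(X+1)(X+2)} for n = 3.
direct = sum(p * (1 - p) ** (x - 1) / (x * (x + 1) * (x + 2))
             for x in range(1, 5_000))
print(rhs6, rhs7, direct)  # all agree to quadrature/summation accuracy
```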

Example 2

By using (6), the \(n\)th inverse factorial moment of the geometric distribution is

$$\begin{aligned} \nu _{-n} = \frac{p}{(n-1)!} \int _0^1 \frac{(1-s)^{n-1}}{1-(1-p)s}\,ds= \frac{p}{n!} \,_2F_1(1,1;n+1;1-p) \end{aligned}$$
(8)

where \(\,_2F_1(1,1;n+1;1-p)\) is a special case of the Gauss hypergeometric function (NIST, 2024, Section 15). Again, this approach directly yields an integral representation of the special function concerned; minor manipulation of the direct formula for the inverse factorial moment yields the usual series representation thereof.
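For instance, (8) can be checked against its defining integral with SciPy's hyp2f1; the parameter values below are illustrative.

```python
# Check of (8) using SciPy's Gauss hypergeometric function;
# p and n are illustrative choices.
from math import factorial
from scipy.integrate import quad
from scipy.special import hyp2f1

p, n = 0.3, 4
closed = p / factorial(n) * hyp2f1(1, 1, n + 1, 1 - p)
integral, _ = quad(lambda s: (1 - s) ** (n - 1) / (1 - (1 - p) * s), 0, 1)
integral *= p / factorial(n - 1)
print(closed, integral)  # agree to quadrature accuracy
```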

More generally, other works concerning integral representations of inverse moments include Chao and Strawderman (1972), Adell et al. (1996), and Cressie and Borkent (1986).

3 Complement to Shibu et al. (2023b)

Shibu et al. (2023b) generalize the univariate results of Shibu et al. (2023a) to the multivariate case. Let \(X_1,...,X_k\) follow a positive integer valued multivariate distribution with p.m.f. \(p_{x_1 \ldots x_k}\), p.g.f. \(G(s_1,...,s_k) = \mathbb {E}(s_1^{X_1} \cdots s_k^{X_k})\) and inverse moments \(\mu _{-q_1,\ldots ,-q_k}= \mathbb {E}(X_1^{-q_1} \cdots X_k^{-q_k})\). Then, Shibu et al. (2023b) show that

$$\mu _{-1,\ldots ,-1} = \lim _{t_1 \rightarrow \infty ,...,t_k \rightarrow \infty } t_1 \cdots t_k \int _0^1 \cdots \int _0^1 G(s_1^{t_1},\ldots ,s_k^{t_k})\,ds_1 \cdots ds_k $$

together with a complicated expression for general \(\mu _{-q_1,\ldots ,-q_k}\) involving limits of multiple summations each involving lower order inverse moments. Formulas (1) and (4) generalise straightforwardly to the multivariate case, however. We have

$$\begin{aligned} \mu _{-q_1,...,-q_k} = \frac{1}{\Upgamma (q_1)\cdots \Upgamma (q_k)} \int _0^1\cdots \int _0^1 \frac{(-\log s_1)^{q_1-1}}{s_1}\cdots \frac{(-\log s_k)^{q_k-1}}{s_k}\,G(s_1,...,s_k)\,ds_1 \cdots ds_k. \end{aligned}$$
(9)

The proof is no more difficult than in the univariate case: the right-hand side of (9) is

$$\begin{aligned}&\frac{1}{\Upgamma (q_1)\cdots \Upgamma (q_k)} \int _0^\infty \cdots \int _0^\infty t_1^{q_1-1}\cdots t_k^{q_k-1}\,G(e^{-t_1},...,e^{-t_k})\,dt_1 \cdots dt_k\\&\quad = \frac{1}{\Upgamma (q_1)\cdots \Upgamma (q_k)} \sum _{x_1=1}^\infty \cdots \sum _{x_k=1}^\infty p_{x_1...x_k} \int _0^\infty \cdots \int _0^\infty t_1^{q_1-1}\cdots t_k^{q_k-1}\,e^{-x_1t_1}\cdots e^{-x_k t_k}\,dt_1 \cdots dt_k\\&\quad = \frac{1}{\Upgamma (q_1)\cdots \Upgamma (q_k)} \sum _{x_1=1}^\infty \cdots \sum _{x_k=1}^\infty p_{x_1...x_k} \left( \int _0^\infty t_1^{q_1-1}\,e^{-x_1t_1}\,dt_1\right) \cdots \left( \int _0^\infty t_k^{q_k-1}\,e^{-x_kt_k}\,dt_k\right) \\&\quad = \frac{1}{\Upgamma (q_1)\cdots \Upgamma (q_k)} \sum _{x_1=1}^\infty \cdots \sum _{x_k=1}^\infty p_{x_1...x_k} \frac{\Upgamma (q_1)}{x_1^{q_1}} \cdots \frac{\Upgamma (q_k)}{x_k^{q_k}}\\&\quad = \sum _{x_1=1}^\infty \cdots \sum _{x_k=1}^\infty \frac{p_{x_1...x_k}}{x_1^{q_1}\cdots x_k^{q_k}} = \mu _{-q_1 ,...,-q_k}. \end{aligned}$$
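A minimal numerical check of (9), in the simplest bivariate case of independent geometric components so that G factorizes, is sketched below; all parameter values are illustrative.

```python
# Sketch checking (9) for k = 2 with independent geometric components,
# so G(s1, s2) factorizes; all parameter values are illustrative.
from math import gamma, log
from scipy.integrate import dblquad

p1, p2, q1, q2 = 0.4, 0.6, 1, 2
g = lambda s, p: p * s / (1 - (1 - p) * s)    # univariate geometric p.g.f.
G = lambda s1, s2: g(s1, p1) * g(s2, p2)      # joint p.g.f. under independence

integrand = lambda s1, s2: ((-log(s1)) ** (q1 - 1) / s1
                            * (-log(s2)) ** (q2 - 1) / s2 * G(s1, s2))
rhs, _ = dblquad(integrand, 0, 1, 0, 1)
rhs /= gamma(q1) * gamma(q2)

# Direct double summation over the joint p.m.f.
direct = sum(p1 * (1 - p1) ** (x1 - 1) * p2 * (1 - p2) ** (x2 - 1)
             / (x1 ** q1 * x2 ** q2)
             for x1 in range(1, 300) for x2 in range(1, 300))
print(rhs, direct)  # agree to quadrature/summation accuracy
```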

Example 3

Example 2 of Shibu et al. (2023b) concerns inverse moments of the Barbiero (2019) bivariate geometric distribution which has p.g.f. of the form

$$G(s_1,s_2) = \sum _{j=1}^4 \frac{a_j s_1s_2}{(1-b_js_1)(1-c_js_2)}$$

where \(a_j,b_j,c_j\), \(j=1,...,4\), are certain simple functions of the parameters \(0<\theta _1,\theta _2<1\) and \(-1\le \alpha \le 1\). Since in the bivariate case

$$\mu _{-1,-1} = \int _0^1\int _0^1 \frac{G(s_1,s_2)}{s_1s_2}\,ds_1ds_2,$$

it is easy to see that

$$\mu _{-1,-1} = \sum _{j=1}^4 \frac{a_j}{b_jc_j} \log (1-b_j)\log (1-c_j).$$

This is simpler than the formula given by Shibu et al. (2023b).
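As a final hedged check, the sketch below compares the closed form above with direct numerical integration for a p.g.f. of Barbiero's form; the \(a_j, b_j, c_j\) values used are illustrative stand-ins, not the actual functions of \(\theta _1, \theta _2\) and \(\alpha \), since the identity holds term by term for any values with \(0<b_j,c_j<1\).

```python
# Check of the bivariate closed form above for a p.g.f. of Barbiero's
# form; the a_j, b_j, c_j below are illustrative stand-ins, NOT the
# actual functions of theta_1, theta_2 and alpha.
from math import log
from scipy.integrate import dblquad

a = [0.5, 0.3, 0.15, 0.05]
b = [0.4, 0.3, 0.2, 0.1]
c = [0.5, 0.25, 0.35, 0.45]

G = lambda s1, s2: sum(a[j] * s1 * s2 / ((1 - b[j] * s1) * (1 - c[j] * s2))
                       for j in range(4))

numeric, _ = dblquad(lambda s1, s2: G(s1, s2) / (s1 * s2), 0, 1, 0, 1)
closed = sum(a[j] / (b[j] * c[j]) * log(1 - b[j]) * log(1 - c[j])
             for j in range(4))
print(numeric, closed)  # agree to quadrature accuracy
```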