The inverse of the normal distribution does not provide a suitable example, because if $Y = Z^{-1}$ where $Z \sim \operatorname{Normal}(0,1)$, then $\operatorname{E}[Y]$ is undefined. However, we can consider a doubled version of the inverse gamma distribution: define $$f_X(x) = \frac{|x|}{2}e^{-|x|}, \quad -\infty < x < \infty.$$ It is trivial to see that this function indeed defines a density. Now let $Y = X^{-1}$, from which we find that the density of $Y$ is $$f_Y(y) = f_X(y^{-1})y^{-2} = \frac{1}{2y^2|y|} e^{-1/|y|}, \quad y \ne 0.$$ This function does have a well-defined expectation, since $$\int_{y=0}^\infty y f_Y(y) \, dy = \frac{1}{2},$$ so $\operatorname{E}[|Y|] = 1$ is finite. Then, because $f_Y$ is an even function, we trivially find $\operatorname{E}[Y] = 0$.
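As a sanity check, here is a small numerical sketch (my own, using `scipy.integrate.quad`) verifying that $f_X$ and $f_Y$ each integrate to 1 and that the half-integral $\int_0^\infty y f_Y(y)\,dy$ is $1/2$, which by the odd symmetry of $y f_Y(y)$ gives $\operatorname{E}[Y] = 0$:

```python
import numpy as np
from scipy.integrate import quad

def f_X(x):
    # double gamma density: |x|/2 * exp(-|x|)
    return 0.5 * abs(x) * np.exp(-abs(x))

def f_Y(y):
    # density of Y = 1/X: f_X(1/y) * y^(-2) = exp(-1/|y|) / (2 y^2 |y|)
    return np.exp(-1.0 / abs(y)) / (2.0 * y**2 * abs(y))

# f_X integrates to 1 (split at 0 for accuracy)
total_X = quad(f_X, -np.inf, 0)[0] + quad(f_X, 0, np.inf)[0]

# f_Y is even, so integrate the positive half and double it
total_Y = 2 * quad(f_Y, 0, np.inf)[0]

# the half-mean integral from the text; should be 1/2
half_mean = quad(lambda y: y * f_Y(y), 0, np.inf)[0]

print(total_X, total_Y, half_mean)
```

The split at the origin and the even-symmetry trick are just to keep the improper integrals well behaved for the quadrature routine.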
Now, as to whether the harmonic mean of a sample taken from $Y$ is, in some sense, the "best" estimator of the population mean of $Y$ because $\bar x$ is the "best" estimator of the mean of $X$ when $Y = 1/X$, I am not so sure. This is because, while we can say that the estimator $\tilde y = n (\sum_{i=1}^n y_i^{-1})^{-1}$ has expectation $$\operatorname{E}[\tilde y] = n \operatorname{E}\left[\left(\sum_{i=1}^n y_i^{-1}\right)^{-1}\right],$$ it cannot be said that the RHS is in general equal to $$n \left(\operatorname{E}\left[\sum_{i=1}^n y_i^{-1}\right]\right)^{-1},$$ inasmuch as we cannot generally write $$\operatorname{E}[g(X)] = g(\operatorname{E}[X]):$$ that is, the expectation of a function of a random variable does not generally equal the function evaluated at the variable's expected value. If you could say that, then the expectation would pass through the sum via linearity and you would get $$n \left(\sum_{i=1}^n \operatorname{E}[y_i^{-1}]\right)^{-1} = n \left(n \operatorname{E}[X]\right)^{-1} = \operatorname{E}[X]^{-1}.$$ And again, you run into the same problem/fallacy: you cannot claim that this last expression equals $\operatorname{E}[Y]$. Therefore, the idea of considering inverse distributions seems dubious to me.
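To see the fallacy concretely, here is a small Monte Carlo sketch of my own. I use a plain $\operatorname{Gamma}(3,1)$ for $X$ (an assumption, chosen instead of the double gamma above so that every quantity is positive and finite): then $\operatorname{E}[X] = 3$ and $\operatorname{E}[Y] = \operatorname{E}[1/X] = 1/2$, yet the harmonic-mean estimator $\tilde y = 1/\bar x$ concentrates near $n/(3n-1) \approx 1/\operatorname{E}[X] = 1/3$, not near $\operatorname{E}[Y]$:

```python
import numpy as np

rng = np.random.default_rng(0)
shape, n, reps = 3.0, 5, 200_000  # X ~ Gamma(3, 1): E[X] = 3, E[Y] = 1/(shape-1) = 1/2

x = rng.gamma(shape, 1.0, size=(reps, n))
y = 1.0 / x  # samples from Y = 1/X

# harmonic-mean estimator of E[Y]: tilde_y = n / sum(1/y_i) = 1 / xbar
tilde_y = n / (1.0 / y).sum(axis=1)

print(y.mean())        # close to E[Y] = 1/2
print(tilde_y.mean())  # close to n/(3n-1) = 5/14, not E[Y]
print(1.0 / 3.0)       # 1/E[X], the value the fallacious argument predicts
```

So $\operatorname{E}[\tilde y]$ is neither $\operatorname{E}[Y]$ nor even exactly $\operatorname{E}[X]^{-1}$, which is the point of the derivation above.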