Calculating Fisher Information for a Bernoulli Random Variable

\begin{equation} I_X(p)=E_p \left[\frac{X^2}{p^2}\right]-2E_p \left[ \frac{X - X^2}{p(1-p)} \right] + E_p \left[ \frac{X^2 - 2X + 1}{(1-p)^2}\right] \tag{1}. \end{equation} For a Bernoulli RV, we know \begin{align} E(X) &= 0\cdot\Pr(X = 0) + 1\cdot\Pr(X = 1) = p,\\ E(X^2) &= 0^2\cdot\Pr(X = 0) + 1^2\cdot\Pr(X = 1) = p. \end{align} Substituting into $(1)$, we get \begin{equation} I_X(p)=\frac{p}{p^2}-2\,\frac{p-p}{p(1-p)}+\frac{p-2p+1}{(1-p)^2} = \frac{1}{p}+\frac{1-p}{(1-p)^2} = \frac{1}{p} + \frac{1}{1-p} = \frac{1}{p(1-p)}. \end{equation}


Actually, the Fisher information of $X$ about $p$ is $$I_X(p)=E_p\left[\left(\frac{d}{dp}\log f(X\mid p) \right)^2 \right],$$ that is $$I_X(p)=E_p\left[\left(\frac{d}{dp}\log\left(p^X(1-p)^{1-X}\right)\right)^2\right].$$

I've only replaced every $x$ with $X$, which may seem like a subtlety, but then you get $$I_X(p)=E_p\left(\frac{X^2}{p^2}\right)-2E_p\left(\frac{X(1-X)}{p(1-p)}\right)+E_p\left(\frac{(1-X)^2}{(1-p)^2}\right).$$
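As an aside, the expansion of the squared score into these three terms is just $(a-b)^2 = a^2 - 2ab + b^2$; it can be checked symbolically with a small Python sketch using sympy (assuming it is installed):

```python
import sympy as sp

p, X = sp.symbols('p X')

# Score of a Bernoulli observation: d/dp log(p^X (1-p)^(1-X))
score = X/p - (1 - X)/(1 - p)

# The three-term expansion appearing above
expansion = X**2/p**2 - 2*X*(1 - X)/(p*(1 - p)) + (1 - X)**2/(1 - p)**2

# The difference simplifies to zero, confirming the expansion
print(sp.simplify(score**2 - expansion))
```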

The expectation is there because $X$ is a random variable. So, for instance: $$E_p\left(\frac{X^2}{p^2}\right)=\frac{E_p\left(X^2\right)}{p^2}=\frac{p}{p^2}=\frac1p.$$

Here I used the fact that $E_p(X^2)=p$, which can easily be seen as $$E_p(X^2)=0^2\cdot p_X(0)+1^2\cdot p_X(1)=0^2(1-p)+1^2p=p,$$ or from the observation that $X\sim \operatorname{Be}(p) \implies X^n\sim \operatorname{Be}(p)$ for every $n\ge 1$ (since $X$ takes only the values $0$ and $1$, we have $X^n=X$). Then you can go on with the remaining terms.
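The identity $E_p(X^n)=p$ is also easy to confirm numerically; here is a minimal Monte Carlo sketch in Python (the choice $p=0.3$ and the sample size are arbitrary illustrations):

```python
import random

random.seed(0)
p = 0.3
n_samples = 200_000

# Draw Bernoulli(p) samples; note that X**n == X when X is 0 or 1
xs = [1 if random.random() < p else 0 for _ in range(n_samples)]

# Monte Carlo estimate of E[X^2], which should be close to p
mean_x2 = sum(x**2 for x in xs) / n_samples
print(mean_x2)
```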


Additionally, an equivalent formula can be proved for $I_X(p)$, provided the second derivative of $\log f$ is well defined. This is $$I_X(p)=-E_p\left(\frac{d^2}{dp^2}\log f(X\mid p) \right),$$ and it often yields simpler expressions. In this case, for instance, you get $$I_X(p)=-E_p\left(\frac{d^2}{dp^2}\log p^X(1-p)^{1-X}\right)=$$ $$=-E_p\left(-\frac X{p^2}-\frac{1-X}{(1-p)^2} \right) = \frac {E_p(X)}{p^2}+\frac{E_p(1-X)}{(1-p)^2}=$$ $$=\frac {p}{p^2}+\frac{1-p}{(1-p)^2}=\frac 1p+\frac 1{1-p}=\frac 1{p(1-p)},$$ as desired.
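Both forms of the Fisher information can be verified symbolically for the Bernoulli case; a sketch with sympy (assuming it is installed), where the expectation over $X\in\{0,1\}$ is taken by hand:

```python
import sympy as sp

p, x = sp.symbols('p x', positive=True)

# Log-likelihood of a Bernoulli observation x given parameter p
log_f = x * sp.log(p) + (1 - x) * sp.log(1 - p)

# E_p[g(X)] for a Bernoulli RV: g(0)*(1-p) + g(1)*p
def E(expr):
    return sp.simplify((1 - p) * expr.subs(x, 0) + p * expr.subs(x, 1))

# First form: E[(d/dp log f)^2]
score = sp.diff(log_f, p)
I1 = E(score**2)

# Second form: -E[d^2/dp^2 log f]
I2 = -E(sp.diff(log_f, p, 2))

# Both are equivalent to 1/(p*(1-p))
print(I1, I2)
```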