
# Finding the expected value and variance for $Y_n^{(c)}$

2018-06-12 17:05:23

I am to find all $\beta > 0$ such that the following series converges:

$$\sum \limits_{n = 1}^{\infty} n^{- \beta} \big(X_n - E(X_n) \big). \tag{1}$$

Here the $X_n$ are independent random variables, each exponentially distributed with rate parameter $\alpha$ $(\alpha > 0)$, so $E(X_n) = \frac{1}{\alpha}$ and $Var(X_n) = \frac{1}{\alpha^2}$.

Let's define:

$$Y_n = n^{- \beta} \big(X_n - E(X_n) \big).$$

It's easy to calculate that:

$E(Y_n) = 0,$

$Var(Y_n) = \frac{n^{-2 \beta}}{\alpha^2}.$
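Spelling the computation out, using the standard exponential moments $E(X_n) = \frac{1}{\alpha}$ and $Var(X_n) = \frac{1}{\alpha^2}$:

$$E(Y_n) = n^{-\beta}\big(E(X_n) - E(X_n)\big) = 0, \qquad Var(Y_n) = n^{-2\beta}\, Var(X_n) = \frac{n^{-2\beta}}{\alpha^2}.$$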

From Kolmogorov's two-series theorem we can easily justify that $(1)$ converges almost surely for $\beta > \frac{1}{2}$: the $Y_n$ are independent with $E(Y_n) = 0$, and $\sum_{n} Var(Y_n) = \alpha^{-2} \sum_{n} n^{-2\beta} < \infty$ exactly when $2\beta > 1$.
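As a numerical sanity check (my own illustration, not part of the argument), the partial sums $S_N = \sum_{n \le N} n^{-\beta}(X_n - \frac{1}{\alpha})$ should stabilize for $\beta > \frac{1}{2}$, since the tail variance $\alpha^{-2}\sum_{n > N} n^{-2\beta} \to 0$. The parameter values below are arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, N = 1.0, 0.75, 200_000  # beta > 1/2, so a.s. convergence is expected

n = np.arange(1, N + 1)
x = rng.exponential(scale=1 / alpha, size=N)   # X_n ~ Exp(alpha), mean 1/alpha
terms = n ** (-beta) * (x - 1 / alpha)         # Y_n = n^{-beta} (X_n - E X_n)
partial = np.cumsum(terms)                     # S_1, S_2, ..., S_N

# Late partial sums should differ little: the tail standard deviation past
# n = 10^5 is about sqrt(sum_{n > 10^5} n^{-1.5}) ~ 0.08.
print(partial[99_999], partial[-1])
```

For $\beta \le \frac{1}{2}$ the same experiment shows the partial sums wandering, which is what motivates the three-series analysis below.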

Now I would like to use Kolmogorov's three series theorem to check whether $(1)$ converges for $\beta \le \frac{1}{2}$.

Let's consider:

$$Y_n^{(c)} = \begin{cases} Y_n, & \text{for } |Y_n| \le c,\\ 0, & \text{otherwise.}\end{cases}$$

My question is: how can I find

$E(Y_n^{(c)})$,

$Var(Y_n^{(c)})$?
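One way to set this up (a sketch of the computation, not a full answer): since $|Y_n| \le c$ is equivalent to $\frac{1}{\alpha} - c n^{\beta} \le X_n \le \frac{1}{\alpha} + c n^{\beta}$, and $X_n \ge 0$, put $a_n = \max\!\big(0, \tfrac{1}{\alpha} - c n^{\beta}\big)$ and $b_n = \tfrac{1}{\alpha} + c n^{\beta}$, and integrate against the exponential density:

$$E\big(Y_n^{(c)}\big) = \int_{a_n}^{b_n} n^{-\beta}\Big(x - \frac{1}{\alpha}\Big)\, \alpha e^{-\alpha x}\, dx, \qquad E\Big(\big(Y_n^{(c)}\big)^2\Big) = \int_{a_n}^{b_n} n^{-2\beta}\Big(x - \frac{1}{\alpha}\Big)^2 \alpha e^{-\alpha x}\, dx,$$

with $Var\big(Y_n^{(c)}\big) = E\big((Y_n^{(c)})^2\big) - E\big(Y_n^{(c)}\big)^2$. Note that for every fixed $c > 0$ and $\beta > 0$ we have $c n^{\beta} \ge \frac{1}{\alpha}$ for all large $n$, so eventually $a_n = 0$ and only the upper tail is truncated; the integrals are then elementary (integration by parts).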