We introduce here some notation that is very useful in probability and statistics.

Definition 1 For a given sequence of random variables $latex {R_{n}}&fg=000000$,

$latex {(i)}&fg=000000$ $latex {X_{n}=o_{P}(R_{n})}&fg=000000$ means $latex {X_{n}=Y_{n}R_{n}}&fg=000000$ with $latex {Y_{n}}&fg=000000$ converging to $latex 0&fg=000000$ in probability.

$latex {(ii)}&fg=000000$ $latex {X_{n}=O_{P}(R_{n})}&fg=000000$ means $latex {X_{n}=Y_{n}R_{n}}&fg=000000$ with the family $latex {(Y_{n})_{n}}&fg=000000$ uniformly tight.
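A small simulation makes the two definitions concrete (a sketch, not part of the original post: it takes $latex {X_{n}}&fg=000000$ to be the sample mean of $latex {n}&fg=000000$ standard normals, so by the law of large numbers $latex {X_{n}=o_{P}(1)}&fg=000000$, while by the central limit theorem $latex {X_{n}=O_{P}(1/\sqrt{n})}&fg=000000$):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mean(n, reps=2000):
    # reps independent copies of the sample mean of n standard normals
    return rng.standard_normal((reps, n)).mean(axis=1)

for n in (100, 10_000):
    xbar = sample_mean(n)
    # o_P(1): P(|X_n| > 0.1) shrinks as n grows
    print(n, "P(|X_n| > 0.1) ~", np.mean(np.abs(xbar) > 0.1))
    # O_P(1/sqrt(n)): sqrt(n) * X_n stays tight (approximately N(0,1))
    print(n, "95% quantile of sqrt(n)|X_n| ~",
          np.quantile(np.sqrt(n) * np.abs(xbar), 0.95))
```

The rescaled quantity $latex {\sqrt{n}X_{n}}&fg=000000$ keeps roughly the same 95% quantile (near 1.96) at every $latex {n}&fg=000000$, which is exactly the uniform tightness in part $latex {(ii)}&fg=000000$.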

These notations express whether the sequence $latex {X_{n}}&fg=000000$ converges in probability to zero or is bounded in probability at the “rate” $latex {R_{n}}&fg=000000$. The same algebraic rules that hold for the deterministic $latex {o}&fg=000000$ and $latex {O}&fg=000000$ symbols apply to the probabilistic ones:

$latex \displaystyle \begin{array}{rl} o_{p}(1)+o_{p}(1) & =o_{p}(1)\\ o_{p}(1)+O_{p}(1) & =O_{p}(1)\\ O_{p}(1)o_{p}(1) & =o_{p}(1)\\ \left(1+o_{p}(1)\right){}^{-1} & =O_{p}(1)\\ o_{p}(R_{n}) & =R_{n}o_{p}(1)\\ O_{p}(R_{n}) & =R_{n}O_{p}(1)\\ o_{p}\left(O_{p}(1)\right) & =o_{p}(1). \end{array} &fg=000000$

To verify these rules, we only need to write the symbols $latex {o_{p}(1)}&fg=000000$ and $latex {O_{p}(1)}&fg=000000$ explicitly in terms of random variables. **You could try to prove some and tell me about it in the comments.** For example, the first rule says: If $latex {X_{n}\stackrel{P}{\rightarrow}0}&fg=000000$ and $latex {Y_{n}\stackrel{P}{\rightarrow}0}&fg=000000$, then $latex {Z_{n}=X_{n}+Y_{n}\stackrel{P}{\rightarrow}0}&fg=000000$ by the continuous mapping theorem.
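As a second illustration, here is a sketch of the product rule $latex {O_{p}(1)o_{p}(1)=o_{p}(1)}&fg=000000$, using only the definitions above:

```latex
\textbf{Claim:} $O_{p}(1)\,o_{p}(1)=o_{p}(1)$. Write $X_{n}=Y_{n}Z_{n}$ with
$Y_{n}\xrightarrow{P}0$ and $(Z_{n})_{n}$ uniformly tight. Fix
$\varepsilon>0$ and $\delta>0$, and pick $M>0$ such that
$\sup_{n}\mathbb{P}\left(|Z_{n}|>M\right)<\delta$. On the event
$\{|Z_{n}|\leq M\}$, $|X_{n}|>\varepsilon$ forces $|Y_{n}|>\varepsilon/M$, so
\[
\mathbb{P}\left(|X_{n}|>\varepsilon\right)
\leq \mathbb{P}\left(|Y_{n}|>\varepsilon/M\right)
  + \mathbb{P}\left(|Z_{n}|>M\right)
\leq \mathbb{P}\left(|Y_{n}|>\varepsilon/M\right)+\delta.
\]
The first term tends to zero, hence
$\limsup_{n}\mathbb{P}\left(|X_{n}|>\varepsilon\right)\leq\delta$ for every
$\delta>0$, i.e. $X_{n}\xrightarrow{P}0$.
```

The same splitting argument, with the tightness of $latex {(Z_{n})_{n}}&fg=000000$ supplying the bound $latex {M}&fg=000000$, also proves $latex {o_{p}\left(O_{p}(1)\right)=o_{p}(1)}&fg=000000$.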

The following lemma will allow us to replace deterministic quantities by random quantities in the relations $latex {o}&fg=000000$ and $latex {O}&fg=000000$.

Lemma 2 Let $latex {X_{n}}&fg=000000$ be a sequence of random vectors that converges to zero in probability. Then, for every $latex {p>0}&fg=000000$ and every function $latex {R}&fg=000000$ defined in a neighborhood of zero with $latex {R(0)=0}&fg=000000$, as $latex {h\rightarrow0}&fg=000000$,

$latex {(i)}&fg=000000$ $latex {R(h)=o(\|h\|^{p})\Rightarrow R(X_{n})=o_{P}(\|X_{n}\|^{p})}&fg=000000$.

$latex {(ii)}&fg=000000$ $latex {R(h)=O(\|h\|^{p})\Rightarrow R(X_{n})=O_{P}(\|X_{n}\|^{p})}&fg=000000$.

*Proof:* Define

$latex \displaystyle {\displaystyle g(h)=\begin{cases} R(h)/\|h\|^{p} & \text{if }h\not=0\\ 0 & \text{if }h=0. \end{cases}} &fg=000000$

Then, $latex {R(X_{n})=g(X_{n})\|X_{n}\|^{p}}&fg=000000$.

$latex {(i)}&fg=000000$ The function $latex {g}&fg=000000$ is continuous at zero by construction. We deduce by the continuous mapping theorem that $latex {g(X_{n})\stackrel{P}{\rightarrow}g(0)=0}&fg=000000$.

$latex {(ii)}&fg=000000$ By hypothesis there are $latex {M>0}&fg=000000$ and $latex {\delta>0}&fg=000000$ such that $latex {|g(h)|\leq M}&fg=000000$ whenever $latex {\|h\|\leq\delta}&fg=000000$. Thus, $latex {\mathbb P\left(|g(X_{n})|>M\right)\leq\mathbb P\left(\|X_{n}\|>\delta\right)}&fg=000000$. The last term goes to zero by hypothesis and therefore $latex {g(X_{n})}&fg=000000$ is tight.

$latex \Box&fg=000000$
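To see how Lemma 2 is typically used (an illustrative example, not part of the lemma itself), consider the remainder of a first-order Taylor expansion of the logarithm:

```latex
Take $R(h)=\log(1+h)-h$. Then $R(0)=0$, and a second-order Taylor
expansion gives $R(h)=O(h^{2})$ as $h\rightarrow0$. Part $(ii)$ of the
lemma therefore yields, for any sequence $X_{n}\xrightarrow{P}0$,
\[
\log(1+X_{n})=X_{n}+O_{P}\!\left(X_{n}^{2}\right).
\]
In particular, if $X_{n}=O_{P}(n^{-1/2})$, the remainder is $O_{P}(n^{-1})$,
which is the key step behind delta-method arguments.
```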
