A first minimax lower bound in the two-hypothesis scenario

Photos of Johann Radon and Otto Nikodym. Sources: Apprendre les Mathématiques and Wikipedia.

Consider the simplest case, $latex {M=1}&fg=000000$, with two hypotheses $latex {\{f_{1},f_{2}\}}&fg=000000$ belonging to $latex {\mathcal{F}}&fg=000000$. According to the last post, we only need to find lower bounds for the minimax probability of error $latex {p_{e,1}}&fg=000000$. Today, we will find a bound using …
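To see why bounding $latex {p_{e,1}}&fg=000000$ suffices, here is a sketch of the standard two-point reduction (in the spirit of Tsybakov's Introduction to Nonparametric Estimation; the $latex {2s_{n}}&fg=000000$-separation below is an assumption we add, not something stated in the excerpt). Suppose $latex {d(f_{1},f_{2})\geq 2s_{n}}&fg=000000$. Any estimator $latex {\hat{f}_{n}}&fg=000000$ induces the minimum-distance test $latex {\psi=\arg\min_{j}d(\hat{f}_{n},f_{j})}&fg=000000$, and the triangle inequality gives $latex {d(\hat{f}_{n},f_{j})\geq s_{n}}&fg=000000$ under $latex {f_{j}}&fg=000000$ whenever $latex {\psi\neq j}&fg=000000$. Hence, by a Markov-type bound,

$latex \displaystyle \sup_{f\in\mathcal{F}}\mathbb{E}\left[d^{2}(\hat{f}_{n},f)\right]\geq\max_{j\in\{1,2\}}\mathbb{E}_{f_{j}}\left[d^{2}(\hat{f}_{n},f_{j})\right]\geq s_{n}^{2}\max_{j\in\{1,2\}}\mathbb{P}_{f_{j}}\left(\psi\neq j\right)\geq s_{n}^{2}\,p_{e,1}, &fg=000000$

where $latex {p_{e,1}\triangleq\inf_{\psi}\max_{j\in\{1,2\}}\mathbb{P}_{f_{j}}(\psi\neq j)}&fg=000000$, so any lower bound on $latex {p_{e,1}}&fg=000000$ immediately yields a minimax lower bound.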

A general reduction scheme for minimax lower bounds

In the last post, we defined a minimax lower bound as $latex \displaystyle \mathcal{R}^{*}\geq cs_{n} &fg=000000$ where $latex {\mathcal{R}^{*}\triangleq\inf_{\hat{f}_{n}}\sup_{f\in\mathcal{F}}\mathbb{E}\left[d^{2}(\hat{f}_{n},f)\right]}&fg=000000$ and $latex {s_{n}\rightarrow0}&fg=000000$. The main difficulty with this definition is that we must take the supremum over a massive set $latex {\mathcal{F}}&fg=000000$ and then the infimum over all possible estimators of $latex {f}&fg=000000$.
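The reduction scheme sidesteps both obstacles at once. Here is a sketch, under assumptions typical of this scheme which the excerpt does not spell out: a finite subfamily $latex {f_{1},\dots,f_{M+1}\in\mathcal{F}}&fg=000000$ with $latex {d(f_{j},f_{k})\geq 2s_{n}}&fg=000000$ for $latex {j\neq k}&fg=000000$, indexed to match the $latex {M=1}&fg=000000$ case above. Restricting the supremum to this subfamily and passing from estimation to testing via the minimum-distance test $latex {\psi}&fg=000000$ gives

$latex \displaystyle \mathcal{R}^{*}\geq\inf_{\hat{f}_{n}}\max_{1\leq j\leq M+1}\mathbb{E}_{f_{j}}\left[d^{2}(\hat{f}_{n},f_{j})\right]\geq s_{n}^{2}\inf_{\psi}\max_{1\leq j\leq M+1}\mathbb{P}_{f_{j}}\left(\psi\neq j\right), &fg=000000$

so it is enough to lower bound the minimax probability of error of a finite testing problem.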

The Delta method: Main Result

Let $latex {T_{n}}&fg=000000$ be an estimator of $latex {\theta}&fg=000000$, and suppose we want to estimate the parameter $latex {\phi(\theta)}&fg=000000$, where $latex {\phi}&fg=000000$ is a known function. It is natural to estimate $latex {\phi(\theta)}&fg=000000$ by $latex {\phi(T_{n})}&fg=000000$. We can then ask: how can the asymptotic properties of $latex {T_{n}}&fg=000000$ be transferred to $latex {\phi(T_{n})}&fg=000000$?
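For reference, the classical answer (the textbook statement of the first-order delta method, e.g. in van der Vaart's Asymptotic Statistics; the post's own formulation may differ): if $latex {\sqrt{n}(T_{n}-\theta)\stackrel{d}{\rightarrow}N(0,\sigma^{2})}&fg=000000$ and $latex {\phi}&fg=000000$ is differentiable at $latex {\theta}&fg=000000$, then

$latex \displaystyle \sqrt{n}\left(\phi(T_{n})-\phi(\theta)\right)\stackrel{d}{\rightarrow}N\left(0,\phi'(\theta)^{2}\sigma^{2}\right). &fg=000000$

For instance, taking $latex {T_{n}=\bar{X}_{n}}&fg=000000$ and $latex {\phi(x)=x^{2}}&fg=000000$, the central limit theorem for $latex {\bar{X}_{n}}&fg=000000$ transfers to $latex {\sqrt{n}(\bar{X}_{n}^{2}-\mu^{2})\stackrel{d}{\rightarrow}N(0,4\mu^{2}\sigma^{2})}&fg=000000$ when $latex {\mu\neq0}&fg=000000$.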

The probability versions for the Big-O and little-o notations

We introduce here some notation that is very useful in probability and statistics.

Definition 1. For a given sequence of random variables $latex {R_{n}}&fg=000000$:

$latex {(i)}&fg=000000$ $latex {X_{n}=o_{P}(R_{n})}&fg=000000$ means $latex {X_{n}=Y_{n}R_{n}}&fg=000000$ with $latex {Y_{n}}&fg=000000$ converging to $latex 0&fg=000000$ in probability;

$latex {(ii)}&fg=000000$ $latex {X_{n}=O_{P}(R_{n})}&fg=000000$ means $latex {X_{n}=Y_{n}R_{n}}&fg=000000$ with the family $latex {(Y_{n})_{n}}&fg=000000$ uniformly tight.
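As a concrete illustration (a standard example, not taken from the excerpt): if $latex {X_{1},\dots,X_{n}}&fg=000000$ are i.i.d. with mean $latex {\mu}&fg=000000$ and finite variance, then by the central limit theorem $latex {\sqrt{n}(\bar{X}_{n}-\mu)}&fg=000000$ converges in distribution and is therefore uniformly tight, so

$latex \displaystyle \bar{X}_{n}-\mu=O_{P}\left(n^{-1/2}\right), &fg=000000$

and in particular $latex {\bar{X}_{n}-\mu=o_{P}(1)}&fg=000000$, which is exactly the weak law of large numbers.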