Assume that $X_{1}, \ldots, X_{n}$ is a random sample from $N(\mu, 1)$, with $\mu \in \mathbb{R}$. We want to test $H_{0}: \mu=0$ against $H_{1}: \mu=1$. For a fixed integer $m \in \{1, \ldots, n\}$, the following statistics are defined:
$\begin{aligned} T_{1} &= \frac{\left(X_{1}+\ldots+X_{m}\right)}{m} \\ T_{2} &= \frac{\left(X_{2}+\ldots+X_{m+1}\right)} {m} \\ \vdots &=\vdots \\ T_{n-m+1} &= \frac{\left(X_{n-m+1}+\ldots+X_{n}\right)}{m} . \end{aligned}$
Fix $\alpha \in(0,1)$.
Consider the test
Reject $H_{0}$ if $\max \{T_{i}: 1 \leq i \leq n-m+1\}>c_{m, \alpha}$
Find a choice of $c_{m, \alpha} \in \mathbb{R}$ in terms of the standard normal distribution function $\Phi$ that ensures that the size of the test is at most $\alpha$.
Show that the problem is equivalent to finding $c_{m, \alpha}$ such that $P_{\mu = 0}(\max \{T_{i}: 1 \leq i \leq n-m+1\} > c_{m, \alpha}) \leq \alpha$.
$P_{\mu = 0}(\max \{T_{i}: 1 \leq i \leq n-m+1\} > c_{m, \alpha})$
$= P_{\mu = 0}\left( \bigcup_{i=1}^{n-m+1} \{T_i > c_{m, \alpha}\} \right)$
Use Boole's Inequality to get
$P_{\mu = 0}\left( \bigcup_{i=1}^{n-m+1} \{T_i > c_{m, \alpha}\} \right) \leq \sum_{i = 1}^{n-m+1} P_{\mu = 0}(T_i > c_{m, \alpha})$. So it suffices to choose $c_{m, \alpha}$ so that $P_{\mu = 0}(T_i > c_{m, \alpha}) = \frac{\alpha}{n-m+1}$ for each $i$; the sum is then exactly $\alpha$.
Show that under $H_0$, $T_i \sim N\left(0, \frac{1}{m}\right)$, so that $\sqrt{m}\, T_i \sim N(0, 1)$. Hence, find $c_{m, \alpha}$.
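The union-bound argument above can be sanity-checked numerically. Below is a small sketch (plain Python, standard library only; the function names `cutoff` and `simulated_size` are ours, not part of the problem) that computes the Bonferroni-style cutoff $c_{m,\alpha} = \frac{1}{\sqrt{m}}\,\Phi^{-1}\!\left(1 - \frac{\alpha}{n-m+1}\right)$ implied by the hints, and estimates the size of the test under $H_0$ by Monte Carlo.

```python
import random
from statistics import NormalDist

def cutoff(m, n, alpha):
    # Bonferroni choice: make each P(T_i > c) = alpha/(n-m+1) under H0.
    # Since sqrt(m)*T_i ~ N(0,1), c = Phi^{-1}(1 - alpha/(n-m+1)) / sqrt(m).
    return NormalDist().inv_cdf(1 - alpha / (n - m + 1)) / m**0.5

def simulated_size(m, n, alpha, reps=20000, seed=0):
    # Monte Carlo estimate of P_{mu=0}(max_i T_i > c); should be <= alpha.
    rng = random.Random(seed)
    c = cutoff(m, n, alpha)
    rejections = 0
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(n)]
        t_max = max(sum(x[i:i + m]) / m for i in range(n - m + 1))
        if t_max > c:
            rejections += 1
    return rejections / reps

# e.g. simulated_size(5, 20, 0.05) typically comes out well below 0.05,
# since Boole's inequality is conservative for overlapping averages.
```

Note that the simulated size is usually strictly smaller than $\alpha$: the events $\{T_i > c_{m,\alpha}\}$ overlap heavily, so Boole's inequality is far from tight here.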
See the full solution below.
If $\{x_{n}\}_{n \geq 1}$ is a sequence of real numbers such that $\lim _{n \rightarrow \infty} \frac{x_{n}}{n}=0.001$, then
(A) $\{x_{n}\}_{n \geq 1}$ is a bounded sequence
(B) $\{x_{n}\}_{n \geq 1}$ is an unbounded sequence
(C) $\{x_{n}\}_{n \geq 1}$ is a convergent sequence
(D) $\{x_{n}\}_{n \geq 1}$ is a monotonically decreasing sequence
If $\{x_{n}\}_{n \geq 1}$ were bounded, show that $\lim _{n \rightarrow \infty} \frac{x_{n}}{n}=0$, by the sandwich theorem.
If $\{x_{n}\}_{n \geq 1}$ were convergent, show that $\lim _{n \rightarrow \infty} \frac{x_{n}}{n}=0$, by the algebra of limits.
If $\{x_{n}\}_{n \geq 1}$ were monotonically decreasing and bounded below, then it would be convergent by the Monotone Convergence Theorem. Now consider the case where it is not bounded below, i.e. $\lim _{n \rightarrow \infty} x_{n} = -\infty$.
Find the limit of $\frac{x_{n}}{n}$ in each of these cases.
Hence, the sequence must be unbounded. See the full solution and proof idea below.
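A concrete witness makes option (B) vivid while ruling out the others. The sequence below is our own illustrative choice, not from the problem: $x_n = 0.001\,n + \sin n$ satisfies $\frac{x_n}{n} \to 0.001$, yet is unbounded, not convergent, and not monotone.

```python
import math

# Illustrative sequence (our choice): x_n = 0.001*n + sin(n).
# Its ratio x_n/n tends to 0.001, but x_n itself grows without bound.
def x(n):
    return 0.001 * n + math.sin(n)

# The ratio approaches 0.001 as n grows large ...
ratios = [x(n) / n for n in (10**3, 10**5, 10**7)]
# ... while the sequence values themselves blow up.
values = [x(n) for n in (10**3, 10**5, 10**7)]
```

The $\sin n$ term also keeps the sequence non-monotone, so only unboundedness survives.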
The multivariate limit is really akin to the univariate limit. But how can we make that precise? We discuss the following aspects in this regard.
📌 Firstly, we discuss the ideas of proving and disproving Univariate Limits.
📌 Then come Multivariate Limits - how to prove and disprove them?
📌 Thereafter, Iterated Limits appear - Understanding and Geometry.
📌 Hence, we discover the Relationship between Multivariate Limits and Iterated Limits.
📌 We end with Food for Thought.
We discover a rich relationship between the two, and we lay out all the possible cases relating multivariate limits and iterated limits.
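As a taste of the relationship discussed above, here is a classic example (our choice of illustration): $f(x,y) = \frac{xy}{x^2+y^2}$ has both iterated limits equal to $0$ at the origin, yet along the line $y = x$ the values stay at $\frac{1}{2}$, so the multivariate limit does not exist.

```python
# f(x, y) = x*y / (x^2 + y^2): both iterated limits at (0,0) are 0,
# but the multivariate limit does not exist.
def f(x, y):
    return x * y / (x**2 + y**2)

# Approaching (0,0) along the x-axis (y = 0): values are identically 0.
along_axis = [f(t, 0.0) for t in (0.1, 0.01, 0.001)]

# Approaching along the diagonal y = x: values are stuck at 1/2.
along_diag = [f(t, t) for t in (0.1, 0.01, 0.001)]
```

Two different paths, two different limiting values: that is exactly how one disproves a multivariate limit.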
Learn. Enjoy. Practice. Repeat.
Stay Tuned!
Maximum Likelihood Estimation is an algorithm for finding a reasonable estimator. Personally, it really woos my mind - simple and yet so beautiful. The Method of Moments is simpler. It doesn't woo me :p. Still, the two methods share a lot of similarities, so we have set off to explore them. Finally, we leave you with plenty of food for thought. After all, we are all explorers at heart.
We discover a rich relationship between the two. We discover the score function and much more.
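To preview how the two methods can agree in spirit yet disagree in value, here is a small sketch using the $\text{Uniform}(0, \theta)$ model; this model is our choice of illustration, not from the post. The MLE of $\theta$ is $\max_i X_i$, while the Method of Moments estimator is $2\bar{X}$ (since $E[X] = \theta/2$).

```python
import random

# Illustrative model (our choice): X_1, ..., X_n ~ Uniform(0, theta).
rng = random.Random(42)
theta = 3.0
sample = [rng.uniform(0, theta) for _ in range(1000)]

# MLE: the likelihood is maximized at the sample maximum;
# it never exceeds theta, but always underestimates it slightly.
mle = max(sample)

# Method of Moments: solve E[X] = theta/2 for theta using the
# sample mean; unbiased, but it can exceed theta.
mom = 2 * sum(sample) / len(sample)
```

With a large sample both land near $\theta = 3$, but by different routes - one from the order statistics, one from the first moment.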
Learn. Enjoy. Practice. Repeat.