Now $\frac{x}{2^n}\rightarrow0$ and $f$ is a continuous function, hence $\lim_{n\to\infty}f(\frac{x}{2^n})=f(0)$.
[Because we know that if $f$ is a continuous function and $x_n$ is a sequence that converges to $x$, then $\lim_{n\to\infty}f(x_n)=f(x)$.]
Using similar logic we can show that $f(y)=f(0)$, so $f(x)=f(y)$ for any $x,y\in \mathbb{R}$.
IIT JAM Stat Mock Test Toppers
We are really happy with the performance of our students, and so we have started naming the toppers of the IIT JAM Stat Mock Tests. The toppers on this leaderboard are listed according to their performance in the IIT JAM Stat Mock Tests.
This is a very beautiful sample problem from ISI MStat PSB 2006 Problem 8. It is based on the basic idea of Maximum Likelihood Estimators, but with a bit of thinking. Give it a thought!
Problem- ISI MStat PSB 2006 Problem 8
Let \((X_1,Y_1),\ldots,(X_n,Y_n)\) be a random sample from the discrete distribution with joint probability mass function
\(f_{X,Y}(x,y) = \begin{cases} \frac{\theta}{4} & (x,y)=(0,0) \text{ or } (1,1) \\ \frac{2-\theta}{4} & (x,y)=(0,1) \text{ or } (1,0) \end{cases}\)
with \(0 \le \theta \le 2\). Find the maximum likelihood estimator of \(\theta\).
Prerequisites
Maximum Likelihood Estimators
Indicator Random Variables
Bernoulli Trials
Solution :
This is a very beautiful problem, not very difficult, but its beauty is hidden in its simplicity. Let's explore!!
Observe that the given pmf, as it stands, doesn't take us anywhere, so we should think out of the box; but before going out of the box, let's collect what's in the box!
So, from the given pmf we get \(P\big((X,Y)=(0,0) \text{ or } (1,1)\big)=2\times \frac{\theta}{4}=\frac{\theta}{2}\),
Similarly, \(P\big((X,Y)=(0,1) \text{ or } (1,0)\big)=2\times \frac{2-\theta}{4}=\frac{2-\theta}{2}=1-P\big((X,Y)=(0,0) \text{ or } (1,1)\big)\)
So clearly, this is pushing us towards Bernoulli trials, isn't it!!
So, let's treat the matching pairs, i.e. those with \(x=y\), as our successes, and the other possibilities as failures; then our success probability is \(\frac{\theta}{2}\), where \(0\le \theta \le 2\). So, if \(S\) is the number of successful pairs in our given sample of size \(n\), then it is evident that \(S \sim \text{Binomial}(n, \frac{\theta}{2})\).
So, now it's simplified by all means, and we know the MLE of the success probability in a binomial is the proportion of successes in the sample.
Hence, \(\frac{\hat{\theta}_{MLE}}{2}= \frac{s}{n}\), where \(s\) is the number of those pairs in our sample where \(X_i=Y_i\).
So, \(\hat{\theta}_{MLE}=\frac{2s}{n}\), where \(s\) is the number of pairs in the sample of the form \((0,0)\) or \((1,1)\).
Hence, we are done !!
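To see the estimator in action, here is a minimal simulation sketch (not part of the original solution), assuming Python with numpy; the variable names and the true value \(\theta=1.2\) are our own choices. It draws pairs from the given pmf and checks that \(\hat{\theta}_{MLE}=2s/n\) lands near the true \(\theta\).

```python
# Simulation sketch: draw (X, Y) pairs from the given joint pmf and
# check that theta_hat = 2 * (#matching pairs) / n is close to theta.
import numpy as np

rng = np.random.default_rng(0)
theta, n = 1.2, 100_000  # our arbitrary true theta and sample size

# Index i stands for the support point: 0 -> (0,0), 1 -> (1,1), 2 -> (0,1), 3 -> (1,0)
probs = [theta / 4, theta / 4, (2 - theta) / 4, (2 - theta) / 4]
idx = rng.choice(4, size=n, p=probs)

s = np.isin(idx, [0, 1]).sum()  # number of "successes", i.e. pairs with x == y
theta_hat = 2 * s / n
print(theta_hat)  # should be close to 1.2
```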
Food For Thought
Say \(X\) and \(Y\) are two independent exponential random variables with means \(\mu\) and \(\lambda\) respectively. But you observe two other variables, \(Z\) and \(W\), such that \(Z=\min(X,Y)\) and \(W\) takes the value \(1\) when \(Z=X\) and \(0\) otherwise. Can you find the MLEs of the parameters?
ISI MStat PSB 2009 Problem 8 | How big is the Mean?
This is a simple and regular sample problem from ISI MStat PSB 2009 Problem 8. It is based on testing the nature of the mean of the Exponential distribution. Give it a try!
Problem- ISI MStat PSB 2009 Problem 8
Let \(X_1,\ldots,X_n\) be i.i.d. observations from the density
\(f(x)=\frac{1}{\mu}\exp(-\frac{x}{\mu}), \ x>0\)
where \(\mu >0\) is an unknown parameter.
Consider the problem of testing the hypothesis \(H_o : \mu \le \mu_o\) against \(H_1 : \mu > \mu_o\).
(a) Show that the test with critical region \([\bar{X} \ge \mu_o {\chi^2}_{2n,1-\alpha}/2n]\), where \( {\chi^2}_{2n,1-\alpha} \) is the \((1-\alpha)\)th quantile of the \({\chi^2}_{2n}\) distribution, has size \(\alpha\).
(b) Give an expression of the power in terms of the c.d.f. of the \({\chi^2}_{2n}\) distribution.
Prerequisites
Likelihood Ratio Test
Exponential Distribution
Chi-squared Distribution
Solution :
This problem is quite regular and simple. From the given form of the hypotheses, it is almost clear that using Neyman-Pearson directly can land you in trouble, so let's go for something more general, that is, Likelihood Ratio Testing.
Hence, the likelihood function of \(\mu\) for the given sample is,
\(L(\mu | \vec{X})=(\frac{1}{\mu})^n \exp(-\frac{\sum_{i=1}^n X_i}{\mu}), \ \mu>0\); also observe that the sample mean \(\bar{X}\) is the MLE of \(\mu\).
The key fact is that for i.i.d. exponentials, \(\frac{2\sum_{i=1}^n X_i}{\mu}=\frac{2n\bar{X}}{\mu} \sim {\chi^2}_{2n}\). So the rejection probability of the given test is \(P_{\mu}(\bar{X} \ge \mu_o {\chi^2}_{2n,1-\alpha}/2n)=P(\frac{2n\bar{X}}{\mu} \ge \frac{\mu_o}{\mu}{\chi^2}_{2n,1-\alpha})\), which is increasing in \(\mu\). Hence the size is attained at \(\mu=\mu_o\) and equals \(P({\chi^2}_{2n} \ge {\chi^2}_{2n,1-\alpha})=\alpha\), proving (a). For (b), the power at any \(\mu > \mu_o\) is \(1-F_{2n}(\frac{\mu_o}{\mu}{\chi^2}_{2n,1-\alpha})\), where \(F_{2n}\) is the c.d.f. of the \({\chi^2}_{2n}\) distribution.
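A quick way to convince yourself of (a) is by simulation. Below is a minimal sketch (our own check, not part of the official solution), assuming Python with numpy and scipy; all names and parameter values are ours. It generates samples at \(\mu=\mu_o\) and checks that the test rejects about \(100\alpha\%\) of the time.

```python
# Simulation sketch: at mu = mu_0 the test should reject with probability alpha.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
n, mu0, alpha, reps = 20, 3.0, 0.05, 50_000  # our arbitrary choices

# Critical value for X-bar: mu_0 * chi^2_{2n, 1-alpha} / (2n)
cutoff = mu0 * chi2.ppf(1 - alpha, df=2 * n) / (2 * n)

samples = rng.exponential(scale=mu0, size=(reps, n))  # data generated under mu = mu_0
size_hat = (samples.mean(axis=1) >= cutoff).mean()
print(size_hat)  # should be close to alpha = 0.05
```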
ISI MStat PSB 2009 Problem 4 | Polarized to Normal
This is a very beautiful sample problem from ISI MStat PSB 2009 Problem 4. It is based on the idea of Polar Transformations, but needs a good deal of observation to realize that. Give it a try!
Problem- ISI MStat PSB 2009 Problem 4
Let \(R\) and \(\theta\) be independent and non-negative random variables such that \(R^2 \sim {\chi_2}^2 \) and \(\theta \sim U(0,2\pi)\). Fix \(\theta_o \in (0,2\pi)\). Find the distribution of \(R\sin(\theta+\theta_o)\).
Prerequisites
Convolution
Polar Transformation
Normal Distribution
Solution :
This problem may get nasty if one tries to find the required distribution by the so-called CDF method. It's better to observe a bit before moving forward!! Recall how we derive the probability distribution of the sample variance of a sample from a normal population?
Yes, you are thinking right, we need to use Polar Transformation !!
But before transforming, let's make some modifications to reduce future complications.
Given \(\theta \sim U(0,2\pi)\) and \(\theta_o\) some fixed number in \((0,2\pi)\), let \(Z=\theta+\theta_o \sim U(\theta_o,2\pi +\theta_o)\).
Hence, we need to find the distribution of \(R\sin Z\). Since \(\sin\) has period \(2\pi\), \(R\sin Z\) has the same distribution as \(R\sin U\) with \(U \sim U(0,2\pi)\). Now set \(X=R\cos U\) and \(Y=R\sin U\); by the polar transformation (this is exactly the Box-Muller construction, using the independence of \(R^2 \sim {\chi_2}^2\) and \(U\)), \(X\) and \(Y\) are i.i.d. \(N(0,1)\) random variables.
Since we need the distribution of \(Y=R\sin Z=R\sin(\theta+\theta_o)\), we integrate the joint pdf \(f_{X,Y}\) with respect to \(x\) over the real line, and we end up with the conclusion that,
\(R\sin(\theta+\theta_o) \sim N(0,1)\). Hence, we are done!!
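Here is a small Monte Carlo sketch (ours, just to confirm the conclusion), assuming Python with numpy; \(\theta_o=1.0\) is an arbitrary fixed value of our choosing. The sample mean and standard deviation of \(R\sin(\theta+\theta_o)\) should come out near \(0\) and \(1\).

```python
# Simulation sketch: R^2 ~ chi^2_2, theta ~ U(0, 2*pi), independent;
# then R * sin(theta + theta_0) should look like a N(0,1) draw.
import numpy as np

rng = np.random.default_rng(0)
n, theta0 = 200_000, 1.0  # theta_0 is an arbitrary fixed value in (0, 2*pi)

r = np.sqrt(rng.chisquare(df=2, size=n))   # R, so that R^2 ~ chi^2_2
theta = rng.uniform(0.0, 2 * np.pi, size=n)

y = r * np.sin(theta + theta0)
print(y.mean(), y.std())  # should be close to 0 and 1, the N(0,1) moments
```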
Food For Thought
From the above solution, the distribution of \(R\cos(\theta+\theta_o)\) is also determinable, right!! Can you go further and investigate the distribution of \(\tan(\theta+\theta_o)\)?? Here \(R\) and \(\theta\) are the same variables as defined in the question.
ISI MStat PSB 2008 Problem 7 | Finding the Distribution of a Random Variable
This is a very beautiful sample problem from ISI MStat PSB 2008 Problem 7 based on finding the distribution of a random variable . Let's give it a try !!
Problem- ISI MStat PSB 2008 Problem 7
Let \( X\) and \( Y\) be exponential random variables with parameters 1 and 2 respectively. Another random variable \( Z\) is defined as follows.
A coin, with probability p of Heads (and probability 1-p of Tails) is tossed. Define \( Z\) by \( Z=\begin{cases} X & , \text { if the coin turns Heads } \\ Y & , \text { if the coin turns Tails } \end{cases} \) Find \( P(1 \leq Z \leq 2)\)
Prerequisites
Cumulative Distribution Function
Exponential Distribution
Solution :
Let \(F_{i}\) be the CDF of \(i\) for \(i=X, Y, Z\). Then we have,
\( F_{Z}(z) = P(Z \le z) = P( Z \le z \mid \text{coin turns Heads})P(\text{coin turns Heads}) + P( Z \le z \mid \text{coin turns Tails}) P(\text{coin turns Tails}) \)
\( = P( X \le z)p + P(Y \le z )(1-p) = F_{X}(z)\,p+F_{Y}(z)\,(1-p) \)
Therefore, the pdf of \(Z\) is given by \( f_{Z}(z)= pf_{X}(z)+(1-p)f_{Y}(z) \), where \( f_{X}\) and \( f_{Y} \) are the pdfs of \(X\) and \(Y\) respectively.
So, \( P(1 \leq Z \leq 2) = \int_{1}^{2} \{pe^{-z} + (1-p) 2e^{-2z}\} dz = p \frac{e-1}{e^2} +(1-p) \frac{e^2-1}{e^4} \)
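The closed form is easy to check by simulation. A small sketch (ours, not part of the solution), assuming Python with numpy and an arbitrary choice \(p=0.3\):

```python
# Simulation sketch: toss the coin, draw Z from Exp(rate 1) or Exp(rate 2),
# and compare the Monte Carlo estimate of P(1 <= Z <= 2) with the closed form.
import numpy as np

rng = np.random.default_rng(0)
p, reps = 0.3, 500_000  # our arbitrary head probability and replication count

heads = rng.random(reps) < p
z = np.where(heads,
             rng.exponential(scale=1.0, size=reps),   # X ~ Exp(rate 1), mean 1
             rng.exponential(scale=0.5, size=reps))   # Y ~ Exp(rate 2), mean 1/2

mc = ((1 <= z) & (z <= 2)).mean()
exact = p * (np.e - 1) / np.e**2 + (1 - p) * (np.e**2 - 1) / np.e**4
print(mc, exact)  # the two values should agree closely
```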
Food For Thought
Find the distribution function of \( K=\frac{X}{Y} \) and then find \( \lim_{K \to \infty} P(K >1 ) \)
ISI MStat PSB 2008 Problem 2 | Definite integral as the limit of the Riemann sum
This is a very beautiful sample problem from ISI MStat PSB 2008 Problem 2 based on definite integral as the limit of the Riemann sum . Let's give it a try !!
Problem- ISI MStat PSB 2008 Problem 2
For \( k \geq 1,\) let \( a_{k}=\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{m=1}^{kn} \exp \left(-\frac{1}{2} \frac{m^{2}}{n^{2}}\right) \)
Find \( \lim_{k \rightarrow \infty} a_{k} \) .
Prerequisites
Integration
Gamma function
Definite integral as the limit of the Riemann sum
Solution :
\( a_{k}=\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{m=1}^{kn} \exp \left(-\frac{1}{2} \frac{m^{2}}{n^{2}}\right) = \int_{0}^{k} e^{\frac{-y^2}{2}} dy \); to see why the sum converges to this integral, see Definite integral as the limit of the Riemann sum.
So \( \lim_{k \rightarrow \infty} a_{k} = \int_{0}^{\infty} e^{\frac{-y^2}{2}} dy \). Substituting \( z=\frac{y^2}{2} \), we get \( \int_{0}^{ \infty} z^{\frac{1}{2} -1} e^{-z} \frac{1}{\sqrt{2}} dz =\frac{ \Gamma(\frac{1}{2}) }{\sqrt{2}} = \sqrt{\frac{\pi}{2}} \)
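Numerically, the convergence is easy to watch. A short sketch (ours, not from the original post), assuming Python with numpy, that evaluates the Riemann sum for a large \(n\) and a few values of \(k\):

```python
# Numerical sketch: evaluate the Riemann sum (1/n) * sum_{m=1}^{kn} exp(-m^2 / (2n^2))
# for a large n and growing k, and compare with the limit sqrt(pi/2).
import numpy as np

n = 10_000
for k in (1, 2, 5, 10):
    m = np.arange(1, k * n + 1)
    a_k = np.exp(-0.5 * (m / n) ** 2).sum() / n
    print(k, a_k)

print(np.sqrt(np.pi / 2))  # the a_k values should approach this number
```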
Statistical Insight
Let \( X \sim N(0,1) \), i.e. \(X\) is a standard normal random variable. Then
\( Y=|X| \), called the folded Normal, has pdf \( f_{Y}(y)= \begin{cases} \frac{2}{\sqrt{2 \pi }} e^{\frac{-y^2}{2}} & y>0 \\ 0 & \text{otherwise} \end{cases} \). (Verify!)
So, from the above we can say that \( \int_{0}^{ \infty} e^{\frac{-y^2}{2}} dy = \frac{\sqrt{2 \pi }}{2} \int_{0}^{ \infty} f_{Y}(y) dy \)
\( =\frac{\sqrt{2 \pi }}{2} \times 1 = \sqrt{\frac{\pi}{2}} \) (as \(f_{Y}\) is the pdf of the folded Normal distribution, it integrates to \(1\)).
Food For Thought
Find the same when \( a_{k}=\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{m=1}^{kn} {(\frac{m}{n})}^{5} \exp \left(-\frac{1}{2} \frac{m}{n}\right) \).
This is Cauchy's functional equation, and \(\phi(x)\) is also continuous. Hence \( \phi(x)=cx \) for some constant \(c\), so \( 1+\log_5 (g(x))=cx \Rightarrow g(x)=5^{cx-1} \).
Now \(g(1)=1 \Rightarrow 5^{c-1}=1 \Rightarrow c=1 \).
Therefore, \(g(x)=5^{x-1}\).
Food For Thought
Let \( f:\mathbb{R} \to \mathbb{R} \) be a non-constant, 3 times differentiable function. If \( f(1+ \frac{1}{n})=1\) for all integers \(n\), then find \( f''(1) \).
ISI MStat PSB 2009 Problem 6 | abNormal MLE of Normal
This is a very beautiful sample problem from ISI MStat PSB 2009 Problem 6. It is based on the idea of Restricted Maximum Likelihood Estimators and Mean Squared Errors. Give it a try!
Problem- ISI MStat PSB 2009 Problem 6
Suppose \(X_1,.....,X_n\) are i.i.d. \(N(\theta,1)\), \(\theta_o \le \theta \le \theta_1\), where \(\theta_o < \theta_1\) are two specified numbers. Find the MLE of \(\theta\) and show that it is better than the sample mean \(\bar{X}\) in the sense of having smaller mean squared error.
Prerequisites
Maximum Likelihood Estimators
Normal Distribution
Mean Squared Error
Solution :
This is a very interesting problem! We all know that if the condition "\(\theta_o \le \theta \le \theta_1\), for some specified numbers \(\theta_o < \theta_1\)" had not been given, then the MLE would simply have been \(\bar{X}=\frac{1}{n}\sum_{k=1}^n X_k\), the sample mean of the given sample. But due to the restriction on \(\theta\), things get interestingly complicated.
So, to simplify a bit, let's write the likelihood function of \(\theta\) given the sample \(\vec{X}=(X_1,\ldots,X_n)'\),
\(L(\theta |\vec{X})=(\frac{1}{\sqrt{2\pi}})^n \exp(-\frac{1}{2}\sum_{k=1}^n(X_k-\theta)^2)\), when \(\theta_o \le \theta \le \theta_1\). Now, taking natural log on both sides and differentiating with respect to \(\theta\), we find that \(\frac{d}{d\theta}\log L(\theta|\vec{X})=\sum_{k=1}^n(X_k-\theta)=n(\bar{X}-\theta)\), so the likelihood increases for \(\theta<\bar{X}\) and decreases for \(\theta>\bar{X}\).
Now, verify that if \(\bar{X} < \theta_o\), then \(L(\theta |\vec{X})\) is decreasing throughout \([\theta_o,\theta_1]\), hence the likelihood is maximized at \(\theta_o\) itself. Similarly, when \(\theta_o \le \bar{X} \le \theta_1\), the maximum is attained at \(\bar{X}\); and lastly, when \(\bar{X} > \theta_1\), the likelihood function is increasing throughout \([\theta_o,\theta_1]\), so the maximum is attained at \(\theta_1\).
Hence, the Restricted Maximum Likelihood Estimator of \(\theta\), say \(\hat{\theta}\), is
\(\hat{\theta} = \begin{cases} \theta_o & \bar{X} < \theta_o \\ \bar{X} & \theta_o \le \bar{X} \le \theta_1 \\ \theta_1 & \bar{X} > \theta_1 \end{cases}\)
Now, for the mean squared error: since the true \(\theta\) lies in \([\theta_o,\theta_1]\), clamping \(\bar{X}\) to this interval can only move it closer to \(\theta\). So \((\hat{\theta}-\theta)^2 \le (\bar{X}-\theta)^2\) for every sample, with strict inequality on the event \(\bar{X} \notin [\theta_o,\theta_1]\), which has positive probability. Taking expectations, \(E(\hat{\theta}-\theta)^2 < E(\bar{X}-\theta)^2\), i.e. \(\hat{\theta}\) has smaller mean squared error than \(\bar{X}\).
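The MSE claim is also easy to see empirically. A minimal simulation sketch (ours, not part of the solution), assuming Python with numpy and arbitrary choices \(\theta_o=-0.5\), \(\theta_1=0.5\), \(\theta=0.4\):

```python
# Simulation sketch: compare the MSE of X-bar with the MSE of the clamped
# (restricted) MLE for a true theta inside [theta_0, theta_1].
import numpy as np

rng = np.random.default_rng(0)
n, theta0, theta1, theta, reps = 10, -0.5, 0.5, 0.4, 100_000  # arbitrary choices

xbar = rng.normal(loc=theta, scale=1.0, size=(reps, n)).mean(axis=1)
mle = np.clip(xbar, theta0, theta1)   # the restricted MLE derived above

print(((xbar - theta) ** 2).mean())   # MSE of the sample mean
print(((mle - theta) ** 2).mean())    # MSE of the restricted MLE -- smaller
```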
Food For Thought
Now, can you find an unbiased estimator of \(\theta^2\)?? Okay!! Now it's quite easy, right!! But is the estimator you are thinking about the best unbiased estimator?! Calculate its variance and check whether the variance attains the Cramér-Rao Lower Bound.
Give it a try!! You may need the help of Stein's Identity.