Infinite Series - ISI B.MATH 2006 | Problem 1

Problem

If $\sum_{n=1}^{\infty} \frac{1}{n^2} =\frac{{\pi}^2}{6}$ then $\sum_{n=1}^{\infty} \frac{1}{(2n-1)^2}$ is equal to

(A) $\frac{{\pi}^2}{24}$ (B) $\frac{{\pi}^2}{8}$ (C) $\frac{{\pi}^2}{6}$ (D) $\frac{{\pi}^2}{3}$

Hint

Try to split the summation into the sum of the reciprocals of the squares of the odd numbers and of the even numbers, and take advantage of the given infinite sum.

Solution

$\sum_{n=1}^{\infty} \frac{1}{n^2} =\frac{{\pi}^2}{6}$

$\Rightarrow \sum_{n=1}^{\infty} \frac{1}{(2n)^2} + \sum_{n=1}^{\infty} \frac{1}{(2n-1)^2}= \frac{{\pi}^2}{6} $

$\Rightarrow \frac{1}{4}\sum_{n=1}^{\infty} \frac{1}{{n^2}} + \sum_{n=1}^{\infty} \frac{1}{(2n-1)^2} = \frac{{\pi}^2}{6} $

we know $\sum_{n=1}^{\infty} \frac{1}{n^2} =\frac{{\pi}^2}{6}$

So from the above equation we get

$\sum_{n=1}^{\infty} \frac{1}{(2n-1)^2} = \frac{{\pi}^2}{6} - \frac{{\pi}^2}{6\cdot4} = \frac{{\pi}^2}{6} - \frac{{\pi}^2}{24}$

$\Rightarrow \sum_{n=1}^{\infty} \frac{1}{(2n-1)^2} = \frac{{\pi}^2}{8} $

So the correct answer is option (B).
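As a quick numerical sanity check, here is a minimal Python sketch (assuming numpy is available; the cutoff $N$ is an illustrative choice) showing the partial sums of $\sum_{n=1}^{\infty} \frac{1}{(2n-1)^2}$ approaching $\frac{{\pi}^2}{8} \approx 1.2337$:

```python
import numpy as np

# Partial sum of 1/(2n-1)^2 for n = 1, ..., N
N = 100_000
n = np.arange(1, N + 1)
partial_sum = np.sum(1.0 / (2 * n - 1) ** 2)

print(partial_sum)       # ~1.23369...
print(np.pi ** 2 / 8)    # 1.2337005501361697
```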

ISI B.STAT PAPER 2018 | Subjective

Problem

Let $f:\mathbb{R} \rightarrow \mathbb{R}$ be a continuous function such that for all $x \in \mathbb{R}$ and all $t\geq 0$,

$$f(x)=f(k^t x)$$

where $k>1$ is a fixed constant. Show that $f$ is a constant function.

Hint

Case-1

When $x,y$ are of the same sign: choose any two arbitrary numbers $x,y$ and, using the functional relationship, prove that $f(x)=f(y)$.

Case-2

When $x,y$ are of opposite signs: show that $$f(x)=f(\frac{x}{2})=f(\frac{x}{4})=\dots$$ and use continuity to show that $f(x)=f(0)$.

Solution


Let us take any two real numbers $x$ and $y$.

Case-1

$x$ and $y$ are of the same sign. WLOG $0<x<y$.

Then $\frac{y}{x}>1$, so since $k>1$ there is a number $t\geq 0$ such that $\frac{y}{x}=k^t$. Hence $f(y)=f(k^t x)=f(x)$ [using $f(x)=f(k^t x)$]. (If both are negative, say $x<y<0$, then $\frac{x}{y}>1$ and the same argument gives $f(x)=f(k^s y)=f(y)$ with $s=\log_k \frac{x}{y} \geq 0$.)

Case-2

$x,y$ are of opposite signs. WLOG $x<0<y$.

Observe that for any real $x$, taking $t=\log_k 2\geq 0$ in the functional relationship applied at $\frac{x}{2}$ gives

$$f(\frac{x}{2})=f(k^{\log_k 2}\cdot\frac{x}{2})=f(2\cdot\frac{x}{2})=f(x)$$

Using this logic repeatedly we get

$$f(x)=f(\frac{x}{2})=f(\frac{x}{4})=\dots =f(\frac{x}{2^n})$$


Now $\frac{x}{2^n}\rightarrow 0$ as $n\to\infty$ and $f$ is a continuous function, hence $\lim_{n\to\infty}f(\frac{x}{2^n})=f(0)$, so $f(x)=f(0)$.


[Because we know that if $f$ is a continuous function and $x_n$ is a sequence that converges to $x$, then $\lim_{n\to\infty}f(x_n)=f(x)$.]


Using similar logic we can show that $f(y)=f(0)$, so $f(x)=f(y)$ for any $x,y\in \mathbb{R}$. Hence $f$ is a constant function.
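For intuition, here is a small Python sketch (numpy assumed; the functions and the value $k=2$ are purely illustrative, not part of the problem): a constant function passes a numerical check of $f(x)=f(k^t x)$ over a grid of $t\geq 0$, while a non-constant one such as $\sin$ fails it.

```python
import numpy as np

def satisfies_relation(f, x, k=2.0):
    """Check f(x) == f(k**t * x) on a grid of t >= 0."""
    ts = np.linspace(0.0, 5.0, 50)
    return np.allclose(f(x), f(k ** ts * x))

def const(x):
    # A constant function trivially satisfies f(x) = f(k^t x).
    return np.full_like(np.asarray(x, dtype=float), 3.0)

print(satisfies_relation(const, 1.5))   # True
print(satisfies_relation(np.sin, 1.5))  # False: a non-constant f fails
```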


IIT JAM Stat Mock Test Toppers

We are really happy with the performance of our students, so we have decided to recognize the toppers of the IIT JAM Stat Mock Tests. The toppers are named in this leader board according to their performance in the IIT JAM Stat Mock Tests.

So, here goes the list:

IIT JAM Mock Test 1 (Full):
1. Somyadipta Ghosh - 88.5%
2. Mainack Paul - 83.7%
3. Abhradiptaa Ghosh - 78.7%
4. Prabirkumar Das - 71.2%
5. Debepsita Mukherjee - 68%

IIT JAM Mock Test 2 (Full):
1. Somyadipta Ghosh - 74.2%
2. Mainack Paul - 68.2%
3. Prabirkumar Das - 58.6%
4. Saikat Kar - 57.6%
5. Debepsita Mukherjee - 49.7%

IIT JAM Mathematics Mock Test 1:
1. Bidisha Ghosh - 51.4%
2. Mainack Paul - 51%
3. Somyadipta Ghosh - 50.3%

IIT JAM Mathematics Mock Test 2:
1. Abhradiptaa Ghosh - 57.8%
2. Debepsita Mukherjee - 54.7%
3. Srija Mukherjee - 52.5%

IIT JAM Statistics Mock Test 1:
1. Somyadipta Ghosh - 68%
2. Mainack Paul - 64%
3. Debepsita Mukherjee - 56%
4. Srija Mukherjee - 52%
5. Abhradiptaa Ghosh - 52%

IIT JAM Statistics Mock Test 2:
1. Somyadipta Ghosh - 56.7%
2. Mainack Paul - 56.7%

IIT JAM Probability Mock Test 1:
1. Mainack Paul - 80%
2. Anis Pakrashi - 76.7%
3. Somyadipta Ghosh - 76.7%
4. Prabirkumar Das - 73.3%

IIT JAM Probability Mock Test 2:
1. Abhradiptaa Ghosh - 80%
2. Mainack Paul - 76%
3. Srija Mukherjee - 76%
4. Anis Pakrashi - 68%
5. Prabirkumar Das - 68%

These Mock Tests are part of our Cheenta Statistics Bronze Learning Path. You can learn more about it here.


ISI MStat PSB 2006 Problem 8 | Bernoullian Beauty

This is a very beautiful sample problem from ISI MStat PSB 2006 Problem 8. It is based on the basic idea of Maximum Likelihood Estimators, but needs a bit of thinking. Give it a thought!

Problem- ISI MStat PSB 2006 Problem 8


Let \((X_1,Y_1),......,(X_n,Y_n)\) be a random sample from the discrete distributions with joint probability

\(f_{X,Y}(x,y) = \begin{cases} \frac{\theta}{4} & (x,y)=(0,0) \text{ or } (1,1) \\ \frac{2-\theta}{4} & (x,y)=(0,1) \text{ or } (1,0) \end{cases}\)

with \(0 \le \theta \le 2\). Find the maximum likelihood estimator of \(\theta\).

Prerequisites


Maximum Likelihood Estimators

Indicator Random Variables

Bernoulli Trials

Solution :

This is a very beautiful problem, not very difficult, but its beauty is hidden in its simplicity. Let's explore!!

Observe that the given pmf, as it stands, does not take us anywhere directly, so we should think out of the box; but before going out of the box, let's collect what's in the box!

So, from the given pmf we get, \(P\big((X,Y)=(0,0) \text{ or } (1,1)\big)=2\times \frac{\theta}{4}=\frac{\theta}{2}\),

Similarly, \(P\big((X,Y)=(0,1) \text{ or } (1,0)\big)=2\times \frac{2-\theta}{4}=\frac{2-\theta}{2}=1-P\big((X,Y)=(0,0) \text{ or } (1,1)\big)\)

So, clearly it is giving us a push towards involving Bernoulli trials, isn't it !!

So, let's treat the pairs with a match, i.e. \(x=y\), as our success, and the other possibilities as failure; then our success probability is \(\frac{\theta}{2}\), where \(0\le \theta \le 2\). So, if \(S\) is the number of successful pairs in our given sample of size \(n\), then it is evident that \(S \sim Binomial(n, \frac{\theta}{2})\).

So now it is simplified by all means, and we know that the MLE of the success probability in a binomial model is the proportion of successes in the sample.

Hence, \(\frac{\hat{\theta}_{MLE}}{2}= \frac{s}{n}\), where \(s\) is the number of those pairs in our sample where \(X_i=Y_i\).

So, \(\hat{\theta}_{MLE}=\frac{2\,(\text{number of pairs in the sample of the form } (0,0) \text{ or } (1,1))}{n}\).

Hence, we are done !!
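Here is a small simulation sketch in Python (numpy assumed; the true \(\theta\) is an illustrative choice, not from the problem) that directly simulates the Bernoulli reduction used above and confirms that \(\hat{\theta}_{MLE}=\frac{2s}{n}\) recovers the true \(\theta\):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 1.2, 100_000   # illustrative true theta in [0, 2]

# Each pair (X_i, Y_i) "matches" (x = y) with probability theta/2.
matches = rng.random(n) < theta / 2
theta_hat = 2 * matches.sum() / n   # the MLE derived above

print(theta_hat)   # close to the true theta = 1.2
```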


Food For Thought

Say \(X\) and \(Y\) are two independent exponential random variables with means \(\mu\) and \(\lambda\) respectively. But you observe two other variables, \(Z\) and \(W\), such that \(Z=\min(X,Y)\) and \(W\) takes the value \(1\) when \(Z=X\) and \(0\) otherwise. Can you find the MLEs of the parameters?

Give it a try !!




ISI MStat PSB 2009 Problem 8 | How big is the Mean?

This is a very simple and regular sample problem from ISI MStat PSB 2009 Problem 8. It is based on testing the nature of the mean of the Exponential distribution. Give it a try!

Problem- ISI MStat PSB 2009 Problem 8


Let \(X_1,\ldots,X_n\) be i.i.d. observations from the density,

\(f(x)=\frac{1}{\mu}exp(-\frac{x}{\mu}) , x>0\)

where \(\mu >0\) is an unknown parameter.

Consider the problem of testing the hypothesis \(H_o : \mu \le \mu_o\) against \(H_1 : \mu > \mu_o\).

(a) Show that the test with critical region \([\bar{X} \ge \mu_o {\chi^2}_{2n,1-\alpha}/2n]\), where \( {\chi^2}_{2n,1-\alpha} \) is the \((1-\alpha)\)th quantile of the \({\chi^2}_{2n}\) distribution, has size \(\alpha\).

(b) Give an expression of the power in terms of the c.d.f. of the \({\chi^2}_{2n}\) distribution.

Prerequisites


Likelihood Ratio Test

Exponential Distribution

Chi-squared Distribution

Solution :

This problem is quite regular and simple. From the given form of the hypotheses, it is almost clear that the Neyman-Pearson lemma (built for simple hypotheses) can land you in trouble here, so let's go for something more general, that is, Likelihood Ratio Testing.

Hence, the likelihood function of \(\mu\) for the given sample is,

\(L(\mu | \vec{X})=(\frac{1}{\mu})^n \exp(-\frac{\sum_{i=1}^n X_i}{\mu}) , \ \mu>0\); also observe that the sample mean \(\bar{X}\) is the MLE of \(\mu\).

So, the Likelihood Ratio statistic is,

\(\lambda(\vec{x})=\frac{\sup_{\mu \le \mu_o}L(\mu |\vec{x})}{\sup_\mu L(\mu |\vec{x})} \\ =\begin{cases} 1 & \mu_o \ge \bar{X} \\ \frac{L(\mu_o|\vec{x})}{L(\bar{X}|\vec{x})} & \mu_o < \bar{X} \end{cases} \)

So, our test function is ,

\(\phi(\vec{x})=\begin{cases} 1 & \lambda(\vec{x})<k \\ 0 & \text{otherwise} \end{cases}\)

We reject \(H_o\) at size \(\alpha\) when \(\phi(\vec{x})=1\), where \(k\) is chosen so that \(E_{H_o}(\phi) \le \alpha\).

Hence, \(\lambda(\vec{x}) < k \\ \Rightarrow L(\mu_o|\vec{x})<kL(\bar{X}|\vec{x}) \\ \Rightarrow -n\ln \mu_o -\frac{n\bar{X}}{\mu_o} < \ln k -n \ln \bar{X} -n \\ \Rightarrow n \ln \bar{X}-\frac{n\bar{X}}{\mu_o} < K^* \),

for some constant \(K^*\) collecting all the terms that do not involve \(\bar{X}\).

Let \(g(\bar{x})=n\ln \bar{x} -\frac{n\bar{x}}{\mu_o}\); here \(K^*\) and \(\mu_o\) are fixed quantities.

Observe that \(g'(\bar{x})=\frac{n}{\bar{x}}-\frac{n}{\mu_o}<0\) for \(\bar{x} > \mu_o\), so \(g\) is a decreasing function of \(\bar{x}\) for \(\bar{x} \ge \mu_o\).

Hence, there exists a \(c\) such that for \(\bar{x} \ge c\), we have \(g(\bar{x}) < K^*\).

So, the critical region of the test is of form \(\bar{X} \ge c\), for some \(c\) such that,

\(P_{H_o}(\bar{X} \ge c)=\alpha \), for some \(0 \le \alpha \le 1\), where \(\alpha\) is the size of the test.

Now, our task is to find \(c\), and for that observe that if \(X \sim Exponential(\theta)\) (parametrized by its mean \(\theta\)), then \(\frac{2X}{\theta} \sim {\chi^2}_2\),

Hence, in this problem, since the \(X_i\)'s follow \(Exponential(\mu)\), we have \(\frac{2\sum_{i=1}^n X_i}{\mu}=\frac{2n\bar{X}}{\mu} \sim {\chi^2}_{2n}\), so,

\(P_{H_o}(\bar{X} \ge c)=\alpha \\ \Rightarrow P_{H_o}(\frac{2n\bar{X}}{\mu_o} \ge \frac{2nc}{\mu_o})=\alpha \\ \Rightarrow P({\chi^2}_{2n} \ge \frac{2nc}{\mu_o})=\alpha \),

which gives \(c=\frac{\mu_o {\chi^2}_{2n;1-\alpha}}{2n}\),

Hence, the rejection region is indeed \([\bar{X} \ge \frac{\mu_o {\chi^2}_{2n;1-\alpha}}{2n}]\).

Hence Proved !

(b) Now, we know that the power of the test is,

\(\beta(\mu)= E_{\mu}(\phi) = P_{\mu}(\bar{X} \ge \frac{\mu_o {\chi^2}_{2n;1-\alpha}}{2n}) = P({\chi^2}_{2n} \ge \frac{\mu_o}{\mu}{\chi^2}_{2n;1-\alpha}) = 1 - F_{{\chi^2}_{2n}}(\frac{\mu_o}{\mu}{\chi^2}_{2n;1-\alpha}) \),

where \(F_{{\chi^2}_{2n}}\) denotes the c.d.f. of the \({\chi^2}_{2n}\) distribution.

Hence, the power of the test is expressed through the c.d.f. of the \({\chi^2}_{2n}\) distribution.
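For the curious, here is a short Python sketch (scipy assumed; the values of \(n\), \(\mu_o\) and \(\alpha\) are illustrative) computing the critical value \(c\) and the power function derived above:

```python
from scipy.stats import chi2

n, mu0, alpha = 10, 2.0, 0.05            # illustrative values
q = chi2.ppf(1 - alpha, df=2 * n)        # (1 - alpha) quantile of chi^2_{2n}
c = mu0 * q / (2 * n)                    # critical value for the sample mean

def power(mu):
    # beta(mu) = P(chi^2_{2n} >= (mu0/mu) * q)
    return chi2.sf(mu0 / mu * q, df=2 * n)

print(c)
print(power(mu0))   # equals alpha: the size of the test
print(power(4.0))   # power grows as mu moves past mu0
```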


Food For Thought

Can you use any other testing procedure to conduct this test ?

Think about it !!




ISI MStat PSB 2009 Problem 4 | Polarized to Normal

This is a very beautiful sample problem from ISI MStat PSB 2009 Problem 4. It is based on the idea of Polar Transformations, but needs a good deal of observation to realize that. Give it a try!

Problem- ISI MStat PSB 2009 Problem 4


Let \(R\) and \(\theta\) be independent and non-negative random variables such that \(R^2 \sim {\chi_2}^2 \) and \(\theta \sim U(0,2\pi)\). Fix \(\theta_o \in (0,2\pi)\). Find the distribution of \(R\sin(\theta+\theta_o)\).

Prerequisites


Convolution

Polar Transformation

Normal Distribution

Solution :

This problem may get nasty if one tries to find the required distribution by the so-called CDF method. It's better to observe a bit before moving forward!! Recall how we derive the probability distribution of the sample variance of a sample from a normal population??

Yes, you are thinking right, we need to use Polar Transformation !!

But before transforming, let's make some modifications to reduce future complications.

Given, \(\theta \sim U(0,2\pi)\) and \(\theta_o \) is some fixed number in \((0,2\pi)\), so, let \(Z=\theta+\theta_o \sim U(\theta_o,2\pi +\theta_o)\).

Hence, we need to find the distribution of \(R\sin Z\). Now, from the given and modified information, the joint pdf of \(R\) and \(Z\) is,

\(f_{R,Z}(r,z)=\frac{r}{2\pi}\exp(-\frac{r^2}{2}) \ \ , r>0, \ \theta_o \le z \le 2\pi +\theta_o \)

Now, let the transformation be \((R,Z) \to (X,Y)\),

\(X=R\cos Z \\ Y=R\sin Z\), Also, here \(X,Y \in \mathbb{R}\)

Hence, \(R^2=X^2+Y^2 \\ Z= \tan^{-1} (\frac{Y}{X}) \)

Hence, verify that the Jacobian of the transformation is \(J(\frac{r,z}{x,y})=\frac{1}{r}\).

Hence, the joint pdf of \(X\) and \(Y\) is,

\(f_{X,Y}(x,y)=f_{R,Z}(\sqrt{x^2+y^2}, \tan^{-1}(\frac{y}{x}))\, |J(\frac{r,z}{x,y})| =\frac{1}{2\pi}\exp(-\frac{x^2+y^2}{2})\) , \(x,y \in \mathbb{R}\).

Yeah, Now it is looking familiar right !!

Since we need the distribution of \(Y=R\sin Z=R\sin(\theta+\theta_o)\), we integrate \(f_{X,Y}\) w.r.t. \(x\) over the real line, and we end up with the conclusion that,

\(R\sin(\theta+\theta_o) \sim N(0,1)\). Hence, we are done!!
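A quick Monte Carlo sketch in Python (numpy assumed; \(\theta_o=1\) is an illustrative choice) supports the conclusion that \(R\sin(\theta+\theta_o)\) behaves like a standard normal:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta0 = 1_000_000, 1.0                    # theta0: an arbitrary fixed angle

r = np.sqrt(rng.chisquare(df=2, size=n))      # R^2 ~ chi^2_2
theta = rng.uniform(0.0, 2 * np.pi, size=n)   # theta ~ U(0, 2*pi)
y = r * np.sin(theta + theta0)

print(y.mean(), y.var())                      # ~0 and ~1, matching N(0,1)
```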


Food For Thought

From the above solution, the distribution of \(R\cos(\theta+\theta_o)\) is also determinable, right!! Can you go further and investigate the distribution of \(\tan(\theta+\theta_o)\)?? \(R\) and \(\theta\) are the same variables as defined in the question.

Give it a try !!




ISI MStat PSB 2008 Problem 7 | Finding the Distribution of a Random Variable

This is a very beautiful sample problem from ISI MStat PSB 2008 Problem 7 based on finding the distribution of a random variable . Let's give it a try !!

Problem- ISI MStat PSB 2008 Problem 7


Let \( X\) and \( Y\) be exponential random variables with parameters 1 and 2 respectively. Another random variable \( Z\) is defined as follows.

A coin with probability \(p\) of Heads (and probability \(1-p\) of Tails) is tossed. Define \( Z\) by \( Z=\begin{cases} X & , \text { if the coin turns Heads } \\ Y & , \text { if the coin turns Tails } \end{cases} \)

Find \( P(1 \leq Z \leq 2)\).

Prerequisites


Cumulative Distribution Function

Exponential Distribution

Solution :

Let \( F_{i} \) be the CDF of \(i\), for \(i=X,Y,Z\); then we have,

\( F_{Z}(z) = P(Z \le z) = P( Z \le z \mid \text{coin turns Heads} )P(\text{coin turns Heads}) + P( Z \le z \mid \text{coin turns Tails} ) P( \text{coin turns Tails}) \)

\(= P( X \le z)\,p + P(Y \le z ) (1-p) = F_{X}(z)\,p+F_{Y}(z) (1-p) \)

Therefore the pdf of \(Z\) is given by \( f_{Z}(z)= pf_{X}(z)+(1-p)f_{Y}(z) \), where \( f_{X}\) and \(f_{Y} \) are the pdfs of \(X\) and \(Y\) respectively.

So , \( P(1 \leq Z \leq 2) = \int_{1}^{2} \{pe^{-z} + (1-p) 2e^{-2z}\} dz = p \frac{e-1}{e^2} +(1-p) \frac{e^2-1}{e^4} \)
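A short numerical check in Python (numpy assumed; \(p=0.3\) is illustrative, and the parameters 1 and 2 are read as rates, matching the densities \(e^{-z}\) and \(2e^{-2z}\) used above):

```python
import numpy as np

p = 0.3   # illustrative coin probability
closed_form = p * (np.e - 1) / np.e**2 + (1 - p) * (np.e**2 - 1) / np.e**4

# Monte Carlo: Z = X (rate 1) on Heads, Y (rate 2, i.e. scale 1/2) on Tails
rng = np.random.default_rng(0)
n = 1_000_000
heads = rng.random(n) < p
z = np.where(heads, rng.exponential(1.0, n), rng.exponential(0.5, n))

print(closed_form, np.mean((1 <= z) & (z <= 2)))   # the two should agree
```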

Food For Thought

Find the distribution function of \( K=\frac{X}{Y} \) and then find \( \lim_{K \to \infty} P(K >1 ) \)




ISI MStat PSB 2008 Problem 2 | Definite integral as the limit of the Riemann sum

This is a very beautiful sample problem from ISI MStat PSB 2008 Problem 2 based on definite integral as the limit of the Riemann sum . Let's give it a try !!

Problem- ISI MStat PSB 2008 Problem 2


For \( k \geq 1,\) let \( a_{k}=\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{m=1}^{kn} \exp \left(-\frac{1}{2} \frac{m^{2}}{n^{2}}\right) \)

Find \( \lim_{k \rightarrow \infty} a_{k} \) .

Prerequisites


Integration

Gamma function

Definite integral as the limit of the Riemann sum

Solution :

\( a_{k}=\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{m=1}^{kn} \exp \left(-\frac{1}{2} \frac{m^{2}}{n^{2}}\right) = \int_{0}^{k} e^{\frac{-y^2}{2}} dy \), since the sum is a Riemann sum for this integral with partition points \( y_m=\frac{m}{n} \); for the details, see Definite integral as the limit of the Riemann sum.

Therefore, \( \lim_{k \to \infty} a_{k}= \int_{0}^{ \infty} e^{\frac{-y^2}{2}} dy \) ----(1). Let \( \frac{y^2}{2}=z \Rightarrow dy= \frac{dz}{\sqrt{2z}} \).

Substituting we get, \( \int_{0}^{ \infty} z^{\frac{1}{2} -1} e^{-z} \frac{1}{\sqrt{2}} dz =\frac{ \Gamma(\frac{1}{2}) }{\sqrt{2}} = \sqrt{\frac{\pi}{2}} \)
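A small Python sketch (numpy assumed; the mesh size \(n\) is an illustrative choice) evaluating the Riemann sum for a few values of \(k\) shows \(a_k\) approaching \( \sqrt{\frac{\pi}{2}} \approx 1.2533\):

```python
import numpy as np

def a_k(k, n=100_000):
    # Riemann sum (1/n) * sum_{m=1}^{kn} exp(-(m/n)^2 / 2)
    m = np.arange(1, k * n + 1)
    return np.sum(np.exp(-0.5 * (m / n) ** 2)) / n

for k in (1, 2, 5):
    print(k, a_k(k))
print(np.sqrt(np.pi / 2))   # the limit: 1.2533...
```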

Statistical Insight

Let \( X \sim N(0,1) \) i.e X is a standard normal random variable then,

\( Y=|X| \), called the folded Normal, has pdf \( f_{Y}(y)= \begin{cases} \frac{2}{\sqrt{2 \pi }} e^{\frac{-y^2}{2}} & , y>0 \\ 0 &, \text{otherwise} \end{cases} \) . (Verify!)

So, from (1) we can say that \( \int_{0}^{ \infty} e^{\frac{-y^2}{2}} dy = \frac{\sqrt{2 \pi }}{2} \int_{0}^{ \infty} f_{Y}(y)\, dy = \frac{\sqrt{2 \pi }}{2} \times 1 = \sqrt{\frac{\pi}{2}} \) (as \(f_Y\), being the pdf of the folded Normal distribution, integrates to \(1\)), agreeing with the value found above.


Food For Thought

Find the same when \( a_{k}=\lim_{n \rightarrow \infty} \frac{1}{n} \sum_{m=1}^{kn} {\left(\frac{m}{n}\right)}^{5} \exp \left(-\frac{1}{2} \frac{m}{n}\right) \).




ISI MStat PSB 2008 Problem 3 | Functional equation

This is a very beautiful sample problem from ISI MStat PSB 2008 Problem 3 based on Functional equation . Let's give it a try !!

Problem- ISI MStat PSB 2008 Problem 3


Let \(g\) be a continuous function with \( g(1)=1 \) such that \( g(x+y)=5 g(x) g(y) \) for all \( x, y .\) Find \( g(x) \).

Prerequisites


Continuity & Differentiability

Differential equation

Cauchy's functional equation

Solution :

We are given that \(g\) is a continuous function such that \( g(x+y)=5 g(x) g(y) \) for all \( x, y \), and \( g(1)=1 \).

Now putting \(x=y=0\), we get \( g(0)=5{g(0)}^2 \Rightarrow g(0)=0\) or \(g(0)= \frac{1}{5} \).

If \(g(0)=0\), then \( g(x)=g(x+0)=5g(x)g(0)=0 \) for all \(x\), but we are given that \(g(1)=1\). Hence, contradiction.

So, \(g(0)=\frac{1}{5} \) .

Now , we can write \( g'(x)= \lim_{h \to 0} \frac{g(x+h)-g(x)}{h} = \lim_{h \to 0} \frac{5g(x)g(h)-g(x)}{h} \)

\(= 5g(x) \lim_{h \to 0} \frac{g(h)- \frac{1}{5} }{ h} = 5g(x) \lim_{h \to 0} \frac{g(h)- g(0) }{ h} = 5g(x)g'(0) \) (by definition)

Therefore, \( g'(x)=5g'(0)g(x)= Kg(x) \), where \(K=5g'(0)\) is a constant.

Now we will solve the differential equation. Let \(y=g(x)\); then from the above we have

\( \frac{dy}{dx} = Ky \Rightarrow \frac{dy}{y}=K\,dx \) . Integrating both sides we get,

\( \ln y=Kx+c \), where \(c\) is the integration constant. So we get \( y=e^{Kx+c} \Rightarrow g(x)=e^{Kx+c} \).

Solving \(g(0)=\frac{1}{5}\) and \(g(1)=1\) for \(K\) and \(c\), we finally get \( g(x)=\frac{1}{5} e^{(\ln 5) x} =5^{x-1}\).

But there is a little mistake in this solution .

What's the mistake ?

Ans: Here we assumed that \(g\) is differentiable at \(x=0\), which may not be true.

Correct Solution comes here!

We are given that \( g(x+y)=5 g(x) g(y) \) for all \( x, y \). First note that \(g\) is strictly positive: \( g(x)=5{g(\frac{x}{2})}^2 \ge 0 \), and if \(g(a)=0\) for some \(a\), then \(g(x)=5g(a)g(x-a)=0\) for all \(x\), contradicting \(g(1)=1\); so the logarithm below is well defined. Now taking log on both sides we get,

\( log(g(x+y))=log5+log(g(x))+log(g(y)) \Rightarrow log_5 (g(x+y))=1+log_5 (g(x))+log_5 (g(y)) \)

\( \Rightarrow log_5 (g(x+y)) +1= log_5 (g(x))+1+log_5 (g(y)) +1 \Rightarrow \phi(x+y)=\phi(x)+\phi(y) \) , where \( \phi(x)=1+log_5 (g(x)) \)

It is Cauchy's functional equation, and \(\phi(x)\) is continuous. Hence \( \phi(x)=cx \) for some constant \(c\) \( \Rightarrow 1+log_5 (g(x))=cx \Rightarrow g(x)=5^{cx-1} \).

Now \(g(1)=1 \Rightarrow 5^{c-1}=1 \Rightarrow c=1 \).

Therefore, \(g(x)=5^{x-1} \).
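A quick numerical verification in Python (numpy assumed; the sampling range is illustrative) that \(g(x)=5^{x-1}\) indeed satisfies the functional equation:

```python
import numpy as np

def g(x):
    return 5.0 ** (x - 1)

rng = np.random.default_rng(0)
x, y = rng.uniform(-5, 5, 1000), rng.uniform(-5, 5, 1000)

print(np.allclose(g(x + y), 5 * g(x) * g(y)))   # True
print(g(1.0))                                   # 1.0, as required
```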


Food For Thought

Let \( f:\mathbb{R} \to \mathbb{R} \) be a non-constant, 3 times differentiable function. If \( f(1+ \frac{1}{n})=1\) for all integers \(n\), then find \( f''(1) \).




ISI MStat PSB 2009 Problem 6 | abNormal MLE of Normal

This is a very beautiful sample problem from ISI MStat PSB 2009 Problem 6. It is based on the idea of Restricted Maximum Likelihood Estimators and Mean Squared Errors. Give it a try!

Problem-ISI MStat PSB 2009 Problem 6


Suppose \(X_1,.....,X_n\) are i.i.d. \(N(\theta,1)\), \(\theta_o \le \theta \le \theta_1\), where \(\theta_o < \theta_1\) are two specified numbers. Find the MLE of \(\theta\) and show that it is better than the sample mean \(\bar{X}\) in the sense of having smaller mean squared error.

Prerequisites


Maximum Likelihood Estimators

Normal Distribution

Mean Squared Error

Solution :

This is a very interesting problem! We all know that if the condition "\(\theta_o \le \theta \le \theta_1\), for some specified numbers \(\theta_o < \theta_1\)" had not been given, then the MLE would simply have been \(\bar{X}=\frac{1}{n}\sum_{k=1}^n X_k\), the sample mean of the given sample. But due to the restriction over \(\theta\), things get interestingly complicated.

So, to simplify a bit, let's write the likelihood function of \(\theta\) given this sample, \(\vec{X}=(X_1,....,X_n)'\),

\(L(\theta |\vec{X})={(\frac{1}{\sqrt{2\pi}})}^n \exp(-\frac{1}{2}\sum_{k=1}^n(X_k-\theta)^2)\), when \(\theta_o \le \theta \le \theta_1\).

Now taking natural log on both sides and differentiating, we find that,

\(\frac{d\ln L(\theta|\vec{X})}{d\theta}= \sum_{k=1}^n (X_k-\theta) \).

Now, verify that if \(\bar{X} < \theta_o\), then \(L(\theta |\vec{X})\) is a decreasing function of \(\theta\) over \([\theta_o, \theta_1]\), hence the maximum likelihood is attained at \(\theta_o\) itself. Similarly, when \(\theta_o \le \bar{X} \le \theta_1\), the maximum is attained at \(\bar{X}\); lastly, when \(\bar{X} > \theta_1\), the likelihood function is increasing over \([\theta_o, \theta_1]\), hence the maximum is attained at \(\theta_1\).

Hence, the Restricted Maximum Likelihood Estimator of \(\theta\) is,

\(\hat{\theta_{RML}} = \begin{cases} \theta_o & \bar{X} < \theta_o \\ \bar{X} & \theta_o\le \bar{X} \le \theta_1 \\ \theta_1 & \bar{X} > \theta_1 \end{cases}\)

Now we check that \(\hat{\theta_{RML}}\) is a better estimator than \(\bar{X}\) in terms of Mean Squared Error (MSE).

Now, \(MSE_{\theta}(\bar{X})=E_{\theta}(\bar{X}-\theta)^2=\int_{-\infty}^{\infty} (\bar{x}-\theta)^2f_{\bar{X}}(\bar{x})\,d\bar{x}\)

\(=\int_{-\infty}^{\theta_o} (\bar{x}-\theta)^2f_{\bar{X}}(\bar{x})\,d\bar{x}+\int_{\theta_o}^{\theta_1} (\bar{x}-\theta)^2f_{\bar{X}}(\bar{x})\,d\bar{x}+\int_{\theta_1}^{\infty} (\bar{x}-\theta)^2f_{\bar{X}}(\bar{x})\,d\bar{x}\)

\(\ge \int_{-\infty}^{\theta_o} (\theta_o-\theta)^2f_{\bar{X}}(\bar{x})\,d\bar{x}+\int_{\theta_o}^{\theta_1} (\bar{x}-\theta)^2f_{\bar{X}}(\bar{x})\,d\bar{x}+\int_{\theta_1}^{\infty} (\theta_1-\theta)^2f_{\bar{X}}(\bar{x})\,d\bar{x}\)

[since for \(\bar{x}<\theta_o \le \theta\) we have \((\bar{x}-\theta)^2 \ge (\theta_o-\theta)^2\), and for \(\bar{x}>\theta_1 \ge \theta\) we have \((\bar{x}-\theta)^2 \ge (\theta_1-\theta)^2\)]

\(=E_{\theta}(\hat{\theta_{RML}}-\theta)^2=MSE_{\theta}(\hat{\theta_{RML}})\).

Hence proved !!
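A simulation sketch in Python (numpy assumed; \(\theta_o\), \(\theta_1\), the true \(\theta\) and \(n\) are illustrative choices) comparing the two mean squared errors:

```python
import numpy as np

rng = np.random.default_rng(0)
theta0, theta1 = 0.0, 1.0           # illustrative bounds on theta
theta, n, reps = 0.1, 10, 200_000   # true theta inside [theta0, theta1]

xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
theta_rml = np.clip(xbar, theta0, theta1)   # the restricted MLE

print(np.mean((xbar - theta) ** 2))        # MSE of the sample mean (~1/n)
print(np.mean((theta_rml - theta) ** 2))   # smaller MSE, as proved
```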


Food For Thought

Now, can you find an unbiased estimator for \(\theta^2\)? Okay!! Now it's quite easy, right!! But is the estimator you are thinking about the best unbiased estimator? Calculate the variance and also check whether the variance attains the Cramér-Rao Lower Bound.

Give it a try !! You may need the help of Stein's Identity.

