ISI B.Stat Paper 2018 | Subjective

Problem

Let $f:\mathbb{R} \rightarrow \mathbb{R}$ be a continuous function such that for all $x \in \mathbb{R}$ and all $t\geq 0$,

$$f(x)=f(k^tx),$$

where $k>1$ is a fixed constant. Show that $f$ must be a constant function.

Hint

Case-1


Choose any two arbitrary numbers $x, y$ of the same sign and, using the functional relationship, prove that $f(x)=f(y)$.

Case-2


When $x, y$ are of opposite signs, show that $$f(x)=f\left(\frac{x}{2}\right)=f\left(\frac{x}{4}\right)=\dots$$
and use continuity to show that $f(x)=f(0)$.

Solution


Let us take any two real numbers $x$ and $y$.

Case-1

$x$ and $y$ are of the same sign. WLOG $0<x<y$.

Then $\frac{y}{x}>1$, so there is a number $t\geq 0$ such that $\frac{y}{x}=k^t$. Hence
$$f(y)=f(k^tx)=f(x) \quad [\text{using } f(x)=f(k^tx)]$$
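The key step of Case 1 can be checked numerically: for same-signed $0<x<y$, the exponent $t=\log_k\frac{y}{x}$ is non-negative and carries $x$ to $y$. The values of $k$, $x$, $y$ below are arbitrary illustrative choices, not from the problem.

```python
import math

# For 0 < x < y and fixed k > 1, t = log_k(y/x) is >= 0 and k^t * x = y,
# so the relation f(x) = f(k^t x) forces f(x) = f(y).
k = 3.0
x, y = 0.7, 5.2            # same sign, x < y (illustrative values)
t = math.log(y / x, k)     # t = log_k(y/x)
assert t >= 0
assert math.isclose(k**t * x, y)
```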

case-2

$x, y$ are of opposite signs. WLOG $x<0<y$.

For any $t\geq 0$ we have $f(x)=f(k^tx)$. Then

$\Rightarrow f(k^tx)=f\left(k^t\cdot 2\cdot\frac{x}{2}\right)$

$\Rightarrow f\left(k^t\cdot 2\cdot\frac{x}{2}\right)=f\left(k^tk^{\log_k 2}\frac{x}{2}\right)$

$\Rightarrow f\left(k^tk^{\log_k 2}\frac{x}{2}\right)=f\left(k^{t+\log_k 2}\frac{x}{2}\right)$

$\Rightarrow f\left(k^{t+\log_k 2}\frac{x}{2}\right)=f\left(\frac{x}{2}\right)$, since $t+\log_k 2\geq 0$, so the functional relation applied at $\frac{x}{2}$ gives this last equality.


Using this argument repeatedly we get

$$f(x)=f\left(\frac{x}{2}\right)=f\left(\frac{x}{4}\right)=\dots=f\left(\frac{x}{2^n}\right)$$


Now $\frac{x}{2^n}\rightarrow 0$ as $n\to\infty$, and $f$ is a continuous function, hence $\lim_{n\to\infty}f\left(\frac{x}{2^n}\right)=f(0)$.

[Because we know that if $f$ is a continuous function and $x_n$ is a sequence that converges to $x$, then $\lim_{n\to\infty}f(x_n)=f(x)$.]


Using similar logic we can show that $f(y)=f(0)$. So $f(x)=f(y)$ for any $x, y\in \mathbb{R}$, i.e., $f$ is constant.


Symmetry, Counting, and Partition | ISI MStat PSB 2015 Problem 4

This problem is an application of counting non-negative integer solutions and a symmetry argument. It is from ISI MStat 2015 PSB Problem 4.

Problem

Suppose 15 identical balls are placed in 3 boxes labeled A, B and C. What is the number of ways in which Box A can have more balls than
Box C?


Solution (No Algebra)

There are three possible cases.

The symmetry argument

The number of ways in which Box A has more balls than Box C = The number of ways in which Box C has more balls than Box A.

Isn't that obvious? The balls are not biased towards any box; why would they be?

Total Number of Ways = The number of ways in which Box A has more balls than Box C + The number of ways in which Box C has more balls than Box A + The number of ways in which Box A has the same number of balls as Box C.

The number of ways in which Box A has the same number of balls as Box C is 8, right?

Why? They can have 0 balls each, 1 ball each, 2 balls each, ..., up to 7 balls each (8 balls each would already require 16 balls).

Therefore, the number of ways in which Box A has more balls than Box C \(= \frac{{17 \choose 2} - 8}{2} = \frac{136 - 8}{2} = 64 = \left(\frac{n+1}{2}\right)^2\). [\( n = 15 \)]
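The whole count is small enough to verify by brute force. A minimal sketch, enumerating every non-negative integer solution of \(a+b+c=15\):

```python
# Enumerate all placements of 15 identical balls into boxes A, B, C,
# i.e., all non-negative integer solutions of a + b + c = 15.
n = 15
solutions = [(a, b, n - a - b)
             for a in range(n + 1)
             for b in range(n + 1 - a)]

total = len(solutions)                             # C(17, 2) = 136 placements
ties = sum(1 for a, b, c in solutions if a == c)   # A and C equal: 8 ways
a_more = sum(1 for a, b, c in solutions if a > c)  # A has more than C: 64 ways
```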

Challenge Problem

Suppose 15 identical balls are placed in 3 boxes labeled A, B, and C.
What is the number of ways in which Box A has no fewer balls than
Box B and Box B has no fewer balls than Box C?

This is related to the topic of mathematics called Partitions of Numbers.
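If you want to check your answer to the challenge, a brute-force count is easy to write. The triples below with \(a \geq b \geq c\) are exactly the partitions of 15 into at most 3 parts.

```python
# Count non-negative (a, b, c) with a + b + c = 15 and a >= b >= c:
# these correspond to partitions of 15 into at most 3 parts.
n = 15
count = sum(1
            for a in range(n + 1)
            for b in range(n + 1 - a)
            if a >= b >= n - a - b)
```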

Likelihood & the Moment | ISI MStat 2016 PSB Problem 7

This problem is a beautiful example in which the maximum likelihood estimator coincides with the method of moments estimator. In fact, we pose a more general question: when exactly are they equal? This is from ISI MStat 2016 PSB Problem 7. Stay tuned.

Problem

Let \(X_{1}, X_{2}, \ldots, X_{n}\) be independent and identically distributed random variables with common probability mass function
$$
f(x ; \theta)=\frac{x \theta^{x}}{h(\theta)} \quad \text { for } x=1,2,3, \ldots
$$
where \(0<\theta<1\) is an unknown parameter and \(h(\theta)\) is a function of \(\theta\). Show that the maximum likelihood estimator of \(\theta\) is also a method of moments estimator.


Solution

This \(h(\theta)\) looks really irritating.

So first, let us find \( h(\theta) \).

\( \sum_{x = 1}^{\infty} f(x ; \theta) = \sum_{x = 1}^{\infty} \frac{x \theta^{x}}{h(\theta)} = 1 \)

\( \Rightarrow h(\theta) = \sum_{x = 1}^{\infty} {x \theta^{x}} \)

\( \Rightarrow (1 - \theta) \times h(\theta) = \sum_{x = 1}^{\infty} {\theta^{x}} = \frac{\theta}{1 - \theta} \Rightarrow h(\theta) = \frac{\theta}{(1 - \theta)^2}\).
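The closed form just derived can be sanity-checked with a long partial sum; \(\theta = 0.4\) below is an arbitrary test value.

```python
# Check h(theta) = sum_{x>=1} x * theta^x against theta / (1 - theta)^2
# via a partial sum (the tail is negligibly small for 0 < theta < 1).
theta = 0.4
partial_sum = sum(x * theta**x for x in range(1, 500))
closed_form = theta / (1 - theta)**2
```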

Maximum Likelihood Estimator of \(\theta\)

\( L(\theta)=\prod_{i=1}^{n} f\left(x_{i} | \theta\right) \)

\( l(\theta) = \log(L(\theta)) = \sum_{i=1}^{n} \log \left(f\left(x_{i} | \theta\right)\right) \)

Note: everything not involving \( \theta \) is absorbed into the constant \(c\).

\( \Rightarrow l(\theta) = c + n\bar{X}\log(\theta) - n\log(h(\theta)) \)

\( l^{\prime}(\theta) = 0 \overset{Check!}{\Rightarrow} \hat{\theta}_{mle} = \frac{\bar{X} -1}{\bar{X} +1}\)

Method of Moments Estimator

We need to know \( E(X) \).

\( E(X) = \sum_{x = 1}^{\infty} xf(x ; \theta) = \sum_{x = 1}^{\infty} \frac{x^2 \theta^{x}}{h(\theta)} \).

\( E(X)(1 - \theta) = \sum_{x = 1}^{\infty} \frac{(2x-1)\theta^{x}}{h(\theta)} \).

\( E(X)\theta(1 - \theta) = \sum_{x = 1}^{\infty} \frac{(2x-1)\theta^{x+1}}{h(\theta)} \)

\( E(X)((1 - \theta) - \theta(1 - \theta)) =\frac{\sum_{x = 1}^{\infty} 2\theta^{x} - \theta }{h(\theta)} = \frac{\theta(1 + \theta)}{(1 - \theta)h(\theta)}\).

\( \Rightarrow E(X) = \frac{\theta(1 + \theta)}{(1 - \theta)^3h(\theta)} = \frac{1+\theta}{1-\theta}.\)
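This closed form for \(E(X)\) can also be checked with a partial sum; \(\theta = 0.3\) below is an arbitrary test value.

```python
# Check E(X) = sum_{x>=1} x^2 * theta^x / h(theta) against (1+theta)/(1-theta),
# with h(theta) = theta / (1 - theta)^2 as derived above.
theta = 0.3
h = theta / (1 - theta)**2
mean = sum(x * x * theta**x for x in range(1, 500)) / h
```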

\( E(X) = \bar{X} \Rightarrow \frac{1+\hat{\theta}_{mom}}{1-\hat{\theta}_{mom}}= \bar{X} \Rightarrow \hat{\theta}_{mom} = \frac{\bar{X} -1}{\bar{X} +1}\)
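To see the agreement concretely, here is a sketch on a small made-up sample (the data values are hypothetical, chosen only for illustration): grid-search the log-likelihood and compare the maximizer with the closed form \(\frac{\bar{X}-1}{\bar{X}+1}\).

```python
import math

# Hypothetical observations with support x >= 1 (illustrative only).
data = [1, 2, 2, 3, 5, 1, 4, 2]
n = len(data)
xbar = sum(data) / n

def log_likelihood(theta):
    # l(theta) = sum_i log f(x_i; theta), with h(theta) = theta / (1 - theta)^2.
    h = theta / (1 - theta) ** 2
    return sum(math.log(x) + x * math.log(theta) for x in data) - n * math.log(h)

# Grid-search the maximizer over (0, 1) and compare with the closed form.
grid = [i / 10000 for i in range(1, 10000)]
theta_mle = max(grid, key=log_likelihood)
theta_mom = (xbar - 1) / (xbar + 1)
```

Since the likelihood depends on the data only through \(\bar{X}\), the grid maximizer lands within one grid step of \(\hat{\theta}_{mom}\).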

Food For Thought and Research Problem

The normal (unknown mean and variance), exponential, and Poisson distributions all have sufficient statistics that are moments, and for each the MLE and MoM estimators coincide (not strictly true for cases like the Poisson, where there are multiple MoM estimators).

So, when do you think, the Method of Moments Estimator = Maximum Likelihood Estimator?

The Pitman–Koopman Lemma tells us that such a family must be an exponential family.

Also, you can prove that there exists a specific form of the exponential family for which this happens.

Stay tuned for more exciting such stuff!