ISI MStat Entrance 2021 Problems and Solutions PSA & PSB

This post contains ISI MStat Entrance PSA and PSB 2021 Problems and Solutions that can be very helpful and resourceful for your ISI MStat Preparation.

Download Paper
PSA Paper
PSB Paper
ISI MStat 2021 PSA Answer Key and Solutions

Click on the links to learn about the detailed solution. (Coming Soon)

1. 49 (Rolle's Theorem)

2. 2 (4 - number of linear constraints)

3. k = 2 (a = -d, and form a biquadratic which has two real solutions)

4. 0 (divide by $x^4$, use the $\frac{\sin x}{x}$ limit result)

5. $\frac{p}{q}$ must be a rational number. (The product must be a rational number.)

6. $\alpha = 1, \beta =1$ (Use sandwich theorem on an easy inequality on ceiling of x)

7. $\frac{2n}{n+1}$ (Use geometry and definite integration)

8. $2+ \sqrt{5}$ (Just write down the Pythagorean theorem in terms of the variables and solve)

9. 10 (Use the roots of unity)

10. $\frac{3}{8}$ (Find out the cases when it is non zero, and use classical probability)

11. $\frac{(n+1)^n}{n!}$ (Use ${{n} \choose {r}}={{n-1} \choose {r-1}}+{{n-1} \choose r}$)

12. $P(\pi)$ is even for all $\pi$. (Observe that there is one more odd number than even numbers, so there will be at least one odd-odd match)

13. It is equal to 12. (The $(i,j)$th element is $a_{ii}b_{ij}c_{jj}$; then use a GP series.)

14. 160 (Use the fact that any permutation can be written as a composition of transpositions. Observe that the given condition is equivalent to saying that 2 transpositions are not possible)

15. $m_t < \infty$ for all $t \geq 0$ (All monotone functions are bounded on [a,b])

16. $H(x) = \frac{1-F(-x)+ F(x)}{2}$ (If $F(x)$ is right continuous, $F(-x)$ is left continuous.)

17. $\frac{1}{25}$ (Use the distribution function of $\frac{X}{Y}$)

18. 3 (Find the distribution of order statistic, and find the expectation)

19. (II) but not (I) (If $F(x)$ is right continuous, $F(-x)$ is left continuous.)

20. $20\lambda^4$ (Use gamma integral to find the $E(X_{1}^4)$.)

21. The two new observations are 15 and 5. (Use the conditions to set up two linear equations for the observations.)

22. It is less than 2. (Use the beta coefficients in terms of sample covariance and sample variance, and compare)

23. 4:3 (Use Bayes' Theorem)

24. The two-sample t-test statistic and the ANOVA statistic yield the same power for any non-zero value of $\mu_1 - \mu_2$ and for any $n,m$. (Both test statistics are one-to-one functions of one another)

25. $t^3 - 1 - 2(t-1)$

26. $\frac{2 \sum_{i=1}^{n} X_i}{n(n+1)}$ (Use the invariance property of MLE)

27. $Y_1^2 + Y_2^2 + Y_1Y_2$ (Write the bivariate normal distribution in terms of $Y_1, Y_2$ and use Neyman Factorization Theorem.)

28. can be negative (Simpson's Paradox)

29. $2z$ (There are three random variables, $N$ = stopping time to get $Y=1$, $Y$ and $X$. Use the conditioning properly. Take your time)

30. $\frac{40}{3}$ (Use the fact that a Poisson conditioned on the sum of independent Poissons follows a Binomial)
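Several of the hints above can be sanity-checked numerically. For instance, the binomial identity quoted in the hint for item 11 (Pascal's rule) is easy to verify in a few lines of Python:

```python
from math import comb

# Pascal's rule, quoted in the hint for item 11:
# C(n, r) = C(n-1, r-1) + C(n-1, r)
for n in range(1, 20):
    for r in range(1, n):
        assert comb(n, r) == comb(n - 1, r - 1) + comb(n - 1, r)

print("Pascal's rule holds for all 1 <= r < n < 20")
```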


ISI MStat 2021 PSB Solutions
Coming soon.

ISI MStat PSB 2021 Problem 1

Solution

ISI MStat PSB 2021 Problem 2

Solution

ISI MStat PSB 2021 Problem 3

Solution

ISI MStat PSB 2021 Problem 4

Solution

ISI MStat PSB 2021 Problem 5

Solution

ISI MStat PSB 2021 Problem 6

Solution

ISI MStat PSB 2021 Problem 7

Solution

ISI MStat PSB 2021 Problem 8

Solution

ISI MStat PSB 2021 Problem 9

Solution

Please suggest changes in the comment section.

Cheenta Statistics Department
ISI MStat and IIT JAM Training Program

Is Multivariate Limit = Iterated Limit? Multivariate Limit Demystified

Is Multivariate Limit equal to Iterated Limit?

The multivariate limit is really akin to the univariate limit. But how can we explain that?

We discuss the following aspects in this regard.

📌 Firstly, we discuss the ideas of proving and disproving Univariate Limits.
📌 Then come Multivariate Limits - how to prove and disprove them?
📌 Thereafter, Iterated Limits appear - understanding and geometry.
📌 Hence, we discover the relationship between Multivariate Limits and Iterated Limits.
📌 We end with Food for Thought.

Iterated Limits are a bypass. Do they really explain the Multivariate Limit?

We discover a rich relationship between the two. We give all the cases possible between multivariate limits and iterated limits.

Hints, Solution, and More

Enjoy the video

Build your foundations.

Ace your Exams.

Learn. Enjoy. Practice. Repeat.

Some Useful Links:

Stay Tuned!

When Maximum Likelihood = Method of Moments?

Is Maximum Likelihood = Method of Moments?

Maximum Likelihood Estimation is an algorithm to find a reasonable estimator. Personally, it really woos my mind - simple and yet so beautiful. Method of Moments is simpler. It doesn't woo me :p. However, they still have a lot of similarities. Thus, we have set off to explore them. Finally, we leave you with a lot of food for thought. After all, we are all explorers at heart.

We ask "Is MLE = MOM? If not, when?"

We discover a rich relationship between the two. We discover the score function and so much more exciting.

Hints, Solution, and More


ISI MStat 2020 PSB Problem 9 | Discussion & Solution

ISI MStat 2020 PSB Problem 9

This post discusses problem 9 of the ISI MStat 2020 PSB Entrance Exam.

A finite population has \(N\) units, with \(x_{i}\) being the value associated with the \(i^{\text{th}}\) unit, \(i=1,2, \ldots, N\). Let \(\bar{x}_{N}\) be the population mean.

A statistician carries out the following experiment.

Step 1: Draw an SRSWOR of size \(n\), call it \(S_{1}\), and denote the sample mean by \(\bar{X}_{n}\).

Step 2: Draw an SRSWR of size \(m\) from \(S_{1}\). The \(x\)-values of the sampled units are denoted by \(\{Y_{1}, \cdots, Y_{m}\}\).

Hints, Solution, and More

Do subscribe to our channel to get instant notification of Live Session, so that you can join us live in the master classes!


Is MLE always a function of a Sufficient Statistic?

Is MLE always a function of a Sufficient Statistic?

MLE is an algorithm to find a reasonable estimator (personally, it really woos my mind - simple and yet so beautiful.).

Now, well - life is hard. People have devised ways to check whether an estimator is good or not - why would they care whether I like it or not?

So, they have developed Small Sample Properties and Large Sample Properties to do the quality control of MLE.

This post tests the flamboyance of MLE in terms of the idea of "Sufficiency".

We ask: "Is MLE sufficient? How are MLE and Sufficiency related?"

We discover a rich relationship between the two. Again, MLE wins my heart. Does it win yours? Check with the hints and the solution.

Hints, Solution, and More


ISI MStat 2020 PSB Problem 6 Problem & Solution

ISI MStat 2020 PSB Problem 6

This post discusses problem 6 of the ISI MStat 2020 PSB Entrance Exam.

Suppose individuals are classified into three categories \(C_1, C_2\) and \(C_3\).

Let \(p^2\), \((1-p)^2\) and \(2p(1-p)\) be the respective population proportions, where \(p \in (0,1)\). A random sample of \(N\) individuals is selected from the population and the category of each selected individual recorded.

For \(i=1,2,3\), let \(X_i\) denote the number of individuals in the sample belonging to category \(C_i\). Define \(U = X_1 + \frac{X_3}{2}\).

Hints, Solution, and More


ISI MStat Entrance 2020 Problems and Solutions PSA & PSB

This post contains ISI MStat Entrance PSA and PSB 2020 Problems and Solutions that can be very helpful and resourceful for your ISI MStat Preparation.

ISI MStat Entrance 2020 Problems and Solutions - Subjective Paper


ISI MStat 2020 Problem 1

Let \(f(x)=x^{2}-2x+2\). Let \(L_{1}\) and \(L_{2}\) be the tangents to its graph at \(x=0\) and \(x=2\) respectively. Find the area of the region enclosed by the graph of \(f\) and the two lines \(L_{1}\) and \(L_{2}\).

Solution

ISI MStat 2020 Problem 2

Find the number of \(3 \times 3\) matrices \(A\) such that the entries of \(A\) belong to the set \(\mathbb{Z}\) of all integers, and such that the trace of \(A^{t}A\) is 6. (\(A^{t}\) denotes the transpose of the matrix \(A\).)

Solution

ISI MStat 2020 Problem 3

Consider $n$ independent and identically distributed positive random variables $X_{1}, X_{2}, \ldots, X_{n}$. Suppose $S$ is a fixed subset of $\{1,2, \ldots, n\}$ consisting of $k$ distinct elements, where $1 \leq k<n$.
(a) Compute
$$
\mathrm{E}\left[\frac{\sum_{i \in S} X_{i}}{\sum_{i=1}^{n} X_{i}}\right]
$$
(b) Assume that the $X_{i}$'s have mean $\mu$ and variance $\sigma^{2}, 0<\sigma^{2}<\infty$. If $j \notin S$, show that the correlation between $\left(\sum_{i \in S} X_{i}\right) X_{j}$ and $\sum_{i \in S} X_{i}$ lies between $-\frac{1}{\sqrt{k+1}}$ and $\frac{1}{\sqrt{k+1}}$.

Solution

ISI MStat 2020 Problem 4

Let X1,X2,…,Xn be independent and identically distributed random variables. Let Sn=X1+⋯+Xn. For each of the following statements, determine whether they are true or false. Give reasons in each case.

(a) If Sn∼Exp with mean n, then each Xi∼Exp with mean 1 .

(b) If Sn∼Bin(nk,p), then each Xi∼Bin(k,p)

Solution

ISI MStat 2020 Problem 5

Let U1,U2,…,Un be independent and identically distributed random variables each having a uniform distribution on (0,1) . Let X=min{U1,U2,…,Un}, Y=max{U1,U2,…,Un}

Evaluate E[X∣Y=y] and E[Y∣X=x].

Solution

ISI MStat 2020 Problem 6

Suppose individuals are classified into three categories \(C_1, C_2\) and \(C_3\). Let \(p^2\), \((1-p)^2\) and \(2p(1-p)\) be the respective population proportions, where \(p \in (0,1)\). A random sample of \(N\) individuals is selected from the population and the category of each selected individual recorded.

For \(i=1,2,3\), let \(X_i\) denote the number of individuals in the sample belonging to category \(C_i\). Define \(U = X_1 + \frac{X_3}{2}\).

(a) Is \(U\) sufficient for \(p\)? Justify your answer.

(b) Show that the mean squared error of \(\frac{U}{N}\) is \(\frac{p(1-p)}{2N}\).
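The mean squared error claimed in (b) can be checked by simulation before attempting a proof. Below is a minimal sketch with arbitrarily chosen values of \(p\) and \(N\) (not part of the problem):

```python
import random

random.seed(2)
p, N, reps = 0.3, 50, 20_000              # illustrative values, not from the problem
probs = [p**2, (1 - p)**2, 2 * p * (1 - p)]  # proportions of C1, C2, C3

def draw_u():
    # Sample N individuals, count categories C1 and C3, return U = X1 + X3/2.
    x1 = x3 = 0
    for _ in range(N):
        u = random.random()
        if u < probs[0]:
            x1 += 1
        elif u >= probs[0] + probs[1]:
            x3 += 1
    return x1 + x3 / 2

est = [draw_u() / N for _ in range(reps)]
mse = sum((e - p) ** 2 for e in est) / reps
print(f"simulated MSE = {mse:.5f}, claimed p(1-p)/(2N) = {p*(1-p)/(2*N):.5f}")
```

The two printed numbers should agree to within Monte Carlo error, which supports the stated formula.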

Solution

ISI MStat 2020 Problem 7

Consider the following model:
$$
y_{i}=\beta x_{i}+\varepsilon_{i} x_{i}, \quad i=1,2, \ldots, n
$$
where $y_{i}, i=1,2, \ldots, n$ are observed; $x_{i}, i=1,2, \ldots, n$ are known positive constants and $\beta$ is an unknown parameter. The errors $\varepsilon_{1}, \varepsilon_{2}, \ldots, \varepsilon_{n}$ are independent and identically distributed random variables having the
probability density function
$$
f(u)=\frac{1}{2 \lambda} \exp \left(-\frac{|u|}{\lambda}\right),-\infty<u<\infty
$$
and $\lambda$ is an unknown parameter.
(a) Find the least squares estimator of $\beta$.
(b) Find the maximum likelihood estimator of $\beta$.

Solution

ISI MStat 2020 Problem 8

Assume that $X_{1}, \ldots, X_{n}$ is a random sample from $N(\mu, 1)$, with $\mu \in \mathbb{R}$. We want to test $H_{0}: \mu=0$ against $H_{1}: \mu=1$. For a fixed integer $m \in \{1, \ldots, n\}$, the following statistics are defined:

$$
\begin{aligned}
T_{1} &=\left(X_{1}+\ldots+X_{m}\right) / m \\
T_{2} &=\left(X_{2}+\ldots+X_{m+1}\right) / m \\
\vdots &=\vdots \\
T_{n-m+1} &=\left(X_{n-m+1}+\ldots+X_{n}\right) / m .
\end{aligned}
$$

Fix $\alpha \in(0,1)$. Consider the test

reject $H_{0}$ if $\max \{T_{i}: 1 \leq i \leq n-m+1\} > c_{m, \alpha}$.

Find a choice of $c_{m, \alpha} \in \mathbb{R}$ in terms of the standard normal distribution function $\Phi$ that ensures that the size of the test is at most $\alpha$.

Solution

ISI MStat 2020 Problem 9

ISI MStat 2020 - Objective Paper


ISI MStat 2020 PSA Answer Key

Click on the links to learn about the detailed solution.

1. C    2. D    3. A    4. B    5. A
6. B    7. C    8. A    9. C    10. A
11. C    12. D    13. C    14. B    15. B
16. C    17. D    18. B    19. B    20. C
21. C    22. D    23. A    24. B    25. D
26. B    27. D    28. D    29. B    30. C


ISI MStat 2020 Probability Problems Discussion [Recorded Class]


Testing of Hypothesis | ISI MStat 2016 PSB Problem 9

This is a problem from the ISI MStat Entrance Examination, 2016 involving the basic idea of Type 1 error of Testing of Hypothesis but focussing on the fundamental relationship of Exponential Distribution and the Geometric Distribution.

The Problem:

Suppose \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from an exponential distribution with mean \(\lambda\).

Assume that the observed data is available on \(\left[X_{1}\right], \ldots,\left[X_{n}\right]\), instead of \(X_{1}, \ldots, X_{n},\) where \([x]\) denotes the largest integer less than or equal to \(x\).

Consider a test for \(H_{0}: \lambda=1\) vs \(H_{1}: \lambda>1\) which rejects \(H_{0}\) when \(\sum_{i=1}^{n}\left[X_{i}\right]>c_{n} .\)

Given \(\alpha \in(0,1),\) obtain values of \(c_{n}\) such that the size of the test converges to \(\alpha\) as \(n \rightarrow \infty\).

Prerequisites:

(a) Testing of Hypothesis

(b) Type 1 Error

(c) Exponential Distribution

(d) Relationship of Exponential Distribution and Geometric Distribution

(e) Central Limit Theorem

Solution:

Proof:

Let \(X \sim \text{Exponential}(\lambda)\) and, for a fixed \(a>0\), let \(Y=[X/a]\); we claim that \(Y\) is Geometric. \(Y\) is clearly discrete, taking values in the set of non-negative integers, due to the flooring. Then, for any integer \(n \geq 0\) we have
\(
P(Y=n)=P(X \in[an, a(n+1)))=\int_{an}^{a(n+1)} \lambda \mathrm{e}^{-\lambda x}\, dx=(1-p)^{n} p
\)
where \(p=1-e^{-\lambda a} \in(0,1),\) as \(\lambda>0\) and \(a>0\).
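This lemma is easy to confirm by simulation. A minimal sketch, with arbitrarily chosen \(\lambda\) and \(a\) (any positive values work):

```python
import math
import random

random.seed(0)
lam, a = 2.0, 0.5            # arbitrary rate and bin width, for illustration only
p = 1 - math.exp(-lam * a)   # success probability of the claimed Geometric

# Y = [X / a] for X ~ Exponential(rate = lam); random.expovariate takes the rate.
samples = [int(random.expovariate(lam) // a) for _ in range(200_000)]

# Compare empirical P(Y = n) with (1 - p)^n * p for small n.
for n in range(4):
    emp = sum(y == n for y in samples) / len(samples)
    theo = (1 - p) ** n * p
    print(f"P(Y={n}): empirical {emp:.4f} vs theoretical {theo:.4f}")
```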

Testing of Hypothesis

\(H_{0}: \lambda=1\) vs \(H_{1}: \lambda>1\)

We reject \(H_{0}\) when \(\sum_{i=1}^{n}\left[X_{i}\right]>c_{n} .\)

Here, the size of the test, i.e. the Type 1 error (for a simple hypothesis), is \( \alpha_n = P(S_n > c_{n} | \lambda=1)\), where \(S_n = \sum_{i=1}^{n}[X_i]\).

We want to select \(c_n\) such that \(\alpha_n \to \alpha\).

\(S_n\) ~ NBinom(\(n,p\)) (counting failures before the \(n\)-th success, since each \([X_i]\) is Geometric on \(\{0,1,2,\ldots\}\)), where \( p = 1-e^{-1} \) under \(H_0\); each \([X_i]\) has mean \(\frac{1-p}{p}\) and variance \(\frac{1-p}{p^2}\).

Now, \(\frac{\sqrt{n}(\frac{S_n}{n} - \frac{1-p}{p})}{\sqrt{\frac{1-p}{p^2}}} \rightarrow Z \sim N(0,1)\) by the Central Limit Theorem.

Observe that thus, \( \alpha_n = P(S_n > c_{n} | \lambda=1) \rightarrow P(Z > \frac{\sqrt{n}(\frac{c_n}{n} - \frac{1-p}{p})}{\sqrt{\frac{1-p}{p^2}}}) = \alpha\).

Thus, \( \frac{\sqrt{n}(\frac{c_n}{n} - \frac{1-p}{p})}{\sqrt{\frac{1-p}{p^2}}} = z_{\alpha} \), the upper-\(\alpha\) quantile of \(N(0,1)\).

Solving for \(c_n\) gives \(c_n = \frac{n(1-p)}{p} + z_{\alpha}\frac{\sqrt{n(1-p)}}{p}\), where \( p = 1-e^{-1} \).
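Numerically, using the Geometric-on-\(\{0,1,2,\ldots\}\) mean \(\frac{1-p}{p}\) and variance \(\frac{1-p}{p^2}\) under \(H_0\), the cutoff \(c_n \approx \frac{n(1-p)}{p} + z_{\alpha}\frac{\sqrt{n(1-p)}}{p}\) can be computed as follows. This is my own sketch; \(\alpha\) and \(n\) are illustrative choices, not part of the exam problem:

```python
import math
from statistics import NormalDist

alpha, n = 0.05, 100                       # illustrative size and sample size
p = 1 - math.exp(-1)                       # success probability under H0 (lambda = 1)
z_alpha = NormalDist().inv_cdf(1 - alpha)  # upper-alpha standard normal quantile

mean = (1 - p) / p                         # E[X_i] for a floored Exp(1) observation
sd = math.sqrt(1 - p) / p                  # its standard deviation
c_n = n * mean + z_alpha * math.sqrt(n) * sd
print(f"c_n = {c_n:.2f}")
```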

Food for Thought

If X ~ Exponential(\(\lambda\)), then what is the distribution of {X} [the fractional part of X]? This question is crucial in getting back the Exponential Distribution from the Geometric Distribution.

Rather, the food for thought asks you: how do we get the Exponential Distribution back from the Geometric Distribution?

Stay Tuned. Stay Blessed! See you in the next post.

ISI MStat PSB 2006 Problem 8 | Bernoullian Beauty

This is a very beautiful sample problem from ISI MStat PSB 2006 Problem 8. It is based on the basic idea of Maximum Likelihood Estimators, but with a bit of thinking. Give it a thought!

Problem- ISI MStat PSB 2006 Problem 8


Let \((X_1,Y_1),......,(X_n,Y_n)\) be a random sample from the discrete distributions with joint probability

\(f_{X,Y}(x,y) = \begin{cases} \frac{\theta}{4} & (x,y)=(0,0) \ and \ (1,1) \\ \frac{2-\theta}{4} & (x,y)=(0,1) \ and \ (1,0) \end{cases}\)

with \(0 \le \theta \le 2\). Find the maximum likelihood estimator of \(\theta\).

Prerequisites


Maximum Likelihood Estimators

Indicator Random Variables

Bernoulli Trials

Solution :

This is a very beautiful problem, not very difficult, but its beauty is hidden in its simplicity. Let's explore!!

Observe that the given pmf is as good as useless in taking us anywhere, so we should think out of the box. But before going out of the box, let's collect what's in the box!

So, from the given pmf we get \(P(\text{getting a pair of the form } (1,1) \text{ or } (0,0))=2\times \frac{\theta}{4}=\frac{\theta}{2}\).

Similarly, \(P(\text{getting a pair of the form } (0,1) \text{ or } (1,0))=2\times \frac{2-\theta}{4}=\frac{2-\theta}{2}=1-P(\text{getting a pair of the form } (1,1) \text{ or } (0,0))\).

So, clearly it is giving us a push towards involving Bernoulli trials, isn't it!!

So, let's treat the matching pairs, i.e. those with \(x=y\), as our successes, and the other possibilities as failures. Then our success probability is \(\frac{\theta}{2}\), where \(0\le \theta \le 2\). So, if \(S\) is the number of successful pairs in our given sample of size \(n\), then it is evident that \(S \sim Binomial(n, \frac{\theta}{2})\).

So, now it's simplified by all means, and we know that the MLE of the population proportion in a Binomial is the proportion of successes in the sample.

Hence, \(\frac{\hat{\theta}_{MLE}}{2}= \frac{s}{n}\), where \(s\) is the number of those pairs in our sample where \(X_i=Y_i\).

So, \(\hat{\theta}_{MLE}=\frac{2 \times (\text{number of pairs in the sample of the form } (0,0) \text{ or } (1,1))}{n}\).
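A quick simulation (with an arbitrarily chosen true \(\theta\), purely for illustration) confirms that the estimator \(\frac{2S}{n}\) recovers \(\theta\):

```python
import random

random.seed(1)
theta = 1.2        # arbitrary true value in [0, 2], for illustration
n = 100_000

# A pair is a "match" ((0,0) or (1,1)) with probability theta / 2.
s = sum(random.random() < theta / 2 for _ in range(n))
theta_hat = 2 * s / n   # the MLE derived above
print(f"theta_hat = {theta_hat:.3f}")
```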

Hence, we are done !!


Food For Thought

Say \(X\) and \(Y\) are two independent exponential random variables with means \(\mu\) and \(\lambda\) respectively. But you observe two other variables, \(Z\) and \(W\), such that \(Z=\min(X,Y)\) and \(W\) takes the value \(1\) when \(Z=X\) and \(0\) otherwise. Can you find the MLEs of the parameters?

Give it a try !!


ISI MStat PSB 2008 Problem 10
Outstanding Statistics Program with Applications

Subscribe to Cheenta at Youtube


ISI MStat PSB 2009 Problem 8 | How big is the Mean?

This is a very simple and regular sample problem from ISI MStat PSB 2009 Problem 8. It is based on testing the nature of the mean of the Exponential distribution. Give it a try!

Problem- ISI MStat PSB 2009 Problem 8


Let \(X_1,.....,X_n\) be i.i.d. observation from the density,

\(f(x)=\frac{1}{\mu}exp(-\frac{x}{\mu}) , x>0\)

where \(\mu >0\) is an unknown parameter.

Consider the problem of testing the hypothesis \(H_o : \mu \le \mu_o\) against \(H_1 : \mu > \mu_o\).

(a) Show that the test with critical region \([\bar{X} \ge \mu_o {\chi_{2n,1-\alpha}}^2/2n]\), where \( {\chi^2}_{2n,1-\alpha} \) is the \((1-\alpha)\)th quantile of the \({\chi^2}_{2n}\) distribution, has size \(\alpha\).

(b) Give an expression of the power in terms of the c.d.f. of the \({\chi^2}_{2n}\) distribution.

Prerequisites


Likelihood Ratio Test

Exponential Distribution

Chi-squared Distribution

Solution :

This problem is quite regular and simple. From the given form of the hypotheses, it is almost clear that using Neyman-Pearson can land you in trouble. So, let's go for something more general, that is, Likelihood Ratio Testing.

Hence, the likelihood function of \(\mu\) for the given sample is

\(L(\mu | \vec{X})=(\frac{1}{\mu})^n exp(-\frac{\sum_{i=1}^n X_i}{\mu}) , \mu>0\); also observe that the sample mean \(\bar{X}\) is the MLE of \(\mu\).

So, the Likelihood Ratio statistic is,

\(\lambda(\vec{x})=\frac{\sup_{\mu \le \mu_o}L(\mu |\vec{x})}{\sup_\mu L(\mu |\vec{x})} \\ =\begin{cases} 1 & \mu_o \ge \bar{X} \\ \frac{L(\mu_o|\vec{x})}{L(\bar{X}|\vec{x})} & \mu_o < \bar{X} \end{cases} \)

So, our test function is ,

\(\phi(\vec{x})=\begin{cases} 1 & \lambda(\vec{x})<k \\ 0 & otherwise \end{cases}\).

We reject \(H_o\) at size \(\alpha\) when \(\phi(\vec{x})=1\), where \(k\) is chosen so that \(E_{H_o}(\phi) \le \alpha\).

Hence, \(\lambda(\vec{x}) < k \\ \Rightarrow L(\mu_o|\vec{x})<kL(\bar{X}|\vec{x}) \\ \Rightarrow -n \ln \mu_o -\frac{n\bar{X}}{\mu_o} < \ln k -n \ln \bar{X} - n \\ \Rightarrow n \ln \bar{X}-\frac{n\bar{X}}{\mu_o} < K^* \),

for some constant \(K^*\).

Let \(g(\bar{x})=n\ln \bar{x} -\frac{n\bar{x}}{\mu_o}\), where \(K^*\) and \(\mu_o\) are fixed quantities, and observe that \(g\) is a decreasing function of \(\bar{x}\) for \(\bar{x} \ge \mu_o\).

Hence, there exists a \(c\) such that for \(\bar{x} \ge c\), we have \(g(\bar{x}) < K^*\). See the figure.
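The monotonicity claim follows from \(g'(\bar{x}) = \frac{n}{\bar{x}} - \frac{n}{\mu_o} \le 0\) for \(\bar{x} \ge \mu_o\). A numerical spot-check, with illustrative values of \(n\) and \(\mu_o\) chosen by me:

```python
import math

n, mu0 = 10, 2.0   # illustrative values, not from the problem

def g(x):
    # g(x) = n*ln(x) - n*x/mu0, as defined above
    return n * math.log(x) - n * x / mu0

xs = [mu0 + 0.5 * k for k in range(10)]   # grid on [mu0, mu0 + 4.5]
values = [g(x) for x in xs]
assert all(values[i] > values[i + 1] for i in range(len(values) - 1))
print("g is strictly decreasing on the grid above mu0")
```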

So, the critical region of the test is of form \(\bar{X} \ge c\), for some \(c\) such that,

\(P_{H_o}(\bar{X} \ge c)=\alpha \), for some \(0 \le \alpha \le 1\), where \(\alpha\) is the size of the test.

Now, our task is to find \(c\), and for that observe, if \(X \sim Exponential(\theta)\), then \(\frac{2X}{\theta} \sim {\chi^2}_2\),

Hence, in this problem, since the \(X_i\)'s follow \(Exponential(\mu)\), we have \(\frac{2n\bar{X}}{\mu} \sim {\chi^2}_{2n}\), so

\(P_{H_o}(\bar{X} \ge c)=\alpha \\ \Rightarrow P_{H_o}(\frac{2n\bar{X}}{\mu_o} \ge \frac{2nc}{\mu_o})=\alpha \\ \Rightarrow P_{H_o}({\chi^2}_{2n} \ge \frac{2nc}{\mu_o})=\alpha \),

which gives \(c=\frac{\mu_o {\chi^2}_{2n;1-\alpha}}{2n}\),

Hence, the rejection region is indeed \([\bar{X} \ge \frac{\mu_o {\chi^2}_{2n;1-\alpha}}{2n}]\).

Hence Proved !
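With a chi-square quantile function, the cutoff is one line of code. A sketch using scipy, with illustrative parameter values of my own choosing:

```python
from scipy.stats import chi2

mu0, n, alpha = 2.0, 10, 0.05       # illustrative parameter values
q = chi2.ppf(1 - alpha, df=2 * n)   # (1 - alpha) quantile of chi^2_{2n}
c = mu0 * q / (2 * n)               # critical value for the sample mean
print(f"reject H0 when the sample mean >= {c:.3f}")
```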

(b) Now, we know that the power of the test is,

\(\beta= E_{\mu}(\phi) = P_{\mu}(\lambda(\vec{x})<k)=P_{\mu}(\bar{X} \ge \frac{\mu_o {\chi^2}_{2n;1-\alpha}}{2n}) \\ \Rightarrow \beta = P_{\mu}({\chi^2}_{2n} \ge \frac{\mu_o}{\mu}{\chi^2}_{2n;1-\alpha}) \).

Hence, the power of the test is expressed in terms of the c.d.f. of the chi-squared distribution.


Food For Thought

Can you use any other testing procedure to conduct this test ?

Think about it !!


Similar Problems and Solutions



ISI MStat PSB 2008 Problem 10
Outstanding Statistics Program with Applications
