# IGNOU MMT-008 Solved Assignment 2024 | M.Sc. MACS

Solved By – Narendra Kr. Sharma – M.Sc (Mathematics Honors) – Delhi University


## IGNOU MMT-008 Assignment Question Paper 2024


MMT-008 Assignment Question Paper
1. a) Consider the Markov chain having the following transition probability matrix.
$$\mathrm{P}=\left[\begin{array}{cccccc} \frac{1}{3} & \frac{2}{3} & 0 & 0 & 0 & 0 \\ \frac{2}{3} & \frac{1}{3} & 0 & 0 & 0 & 0 \\ \frac{1}{4} & 0 & \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{6} & \frac{1}{6} & \frac{1}{6} & \frac{1}{6} & \frac{1}{6} & \frac{1}{6} \\ 0 & 0 & \frac{1}{4} & \frac{3}{4} & 0 & 0 \\ 0 & 0 & \frac{1}{5} & \frac{4}{5} & 0 & 0 \end{array}\right]$$
where the rows and columns correspond to states $1, 2, \ldots, 6$.
i) Draw the diagram of a Markov chain.
ii) Classify the states of the Markov chain, i.e., persistent, transient, non-null and aperiodic states. Also check the irreducibility of the Markov chain.
iii) Find the closed sets.
iv) Find the probability of absorption into the closed classes. Also find the mean time up to absorption from transient states 3 and 4.
b) Determine the parameters of the bivariate normal distribution:
$$f(x, y)=k \exp \left[-\frac{8}{27}\left\{(x-7)^2-2(x-7)(y+5)+4(y+5)^2\right\}\right]$$
Also find the value of $k$.
2. a) Suppose that the probability of a dry day (State 0) following a rainy day (State 1) is $\frac{1}{3}$ and the probability of a rainy day following a dry day is $\frac{1}{2}$. Write the transition probability matrix of the above Markov chain.
Given that $1^{\text{st}}$ May is a dry day, calculate
i) the probability that $3^{\text{rd}}$ May is also a dry day.
ii) the stationary probabilities.
b) Let $\mathrm{X} \sim \mathrm{N}_4(\mu, \Sigma)$ with
$$\mu=\left(\begin{array}{c}2\\ 1\\ 3\\ -4\end{array}\right) \text{ and } \Sigma=\left[\begin{array}{cccc}1& 1& 1& 1\\ 1& 2& -2& -1\\ 1& -2& 9& -1\\ 1& -1& -1& 16\end{array}\right]$$
Suppose $\mathrm{Y}$ and $\mathrm{Z}$ are two partitioned subvectors of $\mathrm{X}$ such that $\mathrm{Y}^{\prime}=\left(\mathrm{x}_1\ \mathrm{x}_3\right)$ and $\mathrm{Z}^{\prime}=\left(\mathrm{x}_2\ \mathrm{x}_4\right)$.
i) Obtain the marginal distribution of $\mathrm{Y}^{\prime}$.
ii) Check the independence of $\mathrm{Y}^{\prime}$ and $\mathrm{Z}^{\prime}$.
iii) Obtain the conditional distribution of $\mathrm{Y}^{\prime} \mid \mathrm{Z}^{\prime}$, where $\mathrm{Y}^{\prime}=\left(\mathrm{x}_1\ \mathrm{x}_2\right)$ and $\mathrm{Z}^{\prime}=\left(\mathrm{x}_3\ \mathrm{x}_4\right)$.
iv) Find $\mathrm{E}\left(\mathrm{Y}^{\prime} \mid \mathrm{Z}^{\prime}\right)$, where $\mathrm{Y}^{\prime}$ and $\mathrm{Z}^{\prime}$ are the same as in (iii).
3. a) Suppose that customers arrive at a service counter in accordance with a Poisson process with mean rate 2 per minute. Obtain the probability that the interval between two successive arrivals is
i) more than 1 minute.
ii) 4 minutes or less.
iii) between 1 and 2 minutes.
4. a) Find the differential equation of the pure birth process with $\lambda_K=K \lambda$, where the process starts with one individual at time $t=0$. Hence, find $p_n(t)=P(N(t)=n)$ [$N(t)$ is the number present at time $t$], together with $\mathrm{E}(N(t))$ and $\operatorname{Var}(N(t))$. Also identify the distribution.
b) Let $\left\{X_n ; n \geq 1\right\}$ be an i.i.d. sequence of interoccurrence times with common probability mass function given by
$$\mathrm{P}\left(X_n=0\right)=\frac{2}{3}, \quad \mathrm{P}\left(X_n=1\right)=\mathrm{P}\left(X_n=2\right)=\frac{1}{6}.$$
Let $\left\{N_t ; t \geq 0\right\}$ be the corresponding renewal process. Find the Laplace transform $\tilde{M}_t$ of the renewal function $M_t$.
5. The body dimensions of a certain species have been recorded. The information on body length $L$ and body weight $W$ is given below:

| Body length L (in mm) | Body weight W (in mg) |
| :---: | :---: |
| 45 | 2.9 |
| 48 | 2.4 |
| 45 | 2.8 |
| 48 | 2.9 |
| 44 | 2.4 |
| 45 | 2.3 |
| 45 | 3.1 |
| 42 | 1.7 |
| 50 | 2.4 |
| 52 | 3.7 |
At 5% level of significance, test the hypothesis that all variances are equal and all covariances are equal in variance-covariance matrix for the given data.
[You may like to use the values ${\chi}_{9,0.05}^{2}=3.84$, ${\chi}_{10,0.05}^{2}=4.10$, ${\chi}_{11,0.05}^{2}=5.09$.]
6. The Tooth Care Hospital provides free dental service to patients every Saturday morning. There are 3 dentists on duty, who are equally qualified and experienced. It takes 20 minutes on average for a patient to get treatment, and the actual time taken is known to vary approximately exponentially around this average. The patients arrive according to a Poisson distribution with an average of 6 per hour. The officer of the hospital wants to investigate the following:
i) The expected number of patients in the queue.
ii) The average time that a patient spends at the clinic.
iii) The average percentage idle time for each of the dentists.
7. a) For the two-state Markov chain, whose transition probability matrix is
$$\mathrm{P}=\left(\begin{array}{cc}1-\mathrm{p}& \mathrm{p}\\ \mathrm{p}& 1-\mathrm{p}\end{array}\right); \quad 0 \leq \mathrm{p} \leq 1$$
Find all stationary distributions.
b) Let $\mathrm{p}_K$, where $K=0,1,2$, be the probability that an individual generates $K$ offspring. Find the p.g.f. of $\left\{\mathrm{p}_K\right\}$. Also, calculate the probability of extinction when
i) $\mathrm{p}_0=\frac{1}{4}, \mathrm{p}_1=\frac{1}{4}$ and $\mathrm{p}_2=\frac{1}{2}$.
ii) $\mathrm{p}_0=\frac{2}{3}, \mathrm{p}_1=\frac{1}{6}$ and $\mathrm{p}_2=\frac{1}{6}$.
8. a) Let $\mathrm{p}=3$ and $\mathrm{m}=1$, and suppose the random variables $\mathrm{X}_1, \mathrm{X}_2$ and $\mathrm{X}_3$ have the positive definite covariance matrix:
$$\Sigma=\left[\begin{array}{ccc}1& 0.4& 0.3\\ 0.4& 1& 0.2\\ 0.3& 0.2& 1\end{array}\right]$$
Write its factor model.
b) For $X$ distributed as $N_3(\mu, \Sigma)$, find the distribution of
$$\left[\begin{array}{ccc}\mathrm{X}_1& -\mathrm{X}_2& \mathrm{X}_3\\ -\mathrm{X}_1& \mathrm{X}_2& \mathrm{X}_3\end{array}\right]$$
9. a) The joint density function of random variables $\mathrm{X}, \mathrm{Y}$ and $\mathrm{Z}$ is given as
$$\mathrm{f}(\mathrm{x}, \mathrm{y}, \mathrm{z})=\mathrm{K} \cdot \mathrm{x} \cdot \mathrm{e}^{-(\mathrm{y}+\mathrm{z})}; \quad \text{where } 0<\mathrm{x}<2,\ \mathrm{y} \geq 0 \text{ and } \mathrm{z} \geq 0.$$
Find
i) the constant $\mathrm{K}$.
ii) the marginal distributions of $\mathrm{X}, \mathrm{Y}$ and $\mathrm{Z}$.
iii) $\mathrm{E}(\mathrm{X}), \mathrm{E}(\mathrm{Y})$ and $\mathrm{E}(\mathrm{Z})$.
iv) the conditional expectation of $\mathrm{Y}$ given $\mathrm{X}$ and $\mathrm{Z}$.
v) the correlation coefficient between $\mathrm{X}$ and $\mathrm{Y}$.
b) For the model $\mathrm{M}|\mathrm{M}|1|\mathrm{N}|\mathrm{FIFO}$, calculate the steady-state solution for $\mathrm{P}_0$,
$\mathrm{E}(\mathrm{n})$ – the average number of customers in the system, and
$\mathrm{E}(\mathrm{V})$ – the average waiting time in the system.
10. State which of the following statements are true and which are false. Give a short proof or a counter example in support of your answer.
a) For three independent events $\mathrm{E}_1, \mathrm{E}_2$ and $\mathrm{E}_3$,
$$\mathrm{P}\left(\mathrm{E}_1 \cup \mathrm{E}_2 \cup \mathrm{E}_3\right)+\mathrm{P}\left(\overline{\mathrm{E}}_1\right) \mathrm{P}\left(\overline{\mathrm{E}}_2\right) \mathrm{P}\left(\overline{\mathrm{E}}_3\right)=0.$$
b) The range of the multiple and partial correlation coefficients is the open interval $(-1, 1)$.
c) If $\{X(t) ; t \geq 0\}$ is a Poisson process, then $N(t)=X\left(t+S_0\right)-X(t)$, where $S_0>0$ is a fixed constant, is also a Poisson process.
d) In Hotelling's $\mathrm{T}^2$, the value of $\mathrm{S}$ is given by
$$S=\frac{1}{n-1} \sum_{j=1}^n\left(X_j-\mu\right)\left(X_j-\mu\right)^{\prime}.$$
e) Let $X_{p \times 1} \sim N_p(\mu, \Sigma)$ and $X_{p \times n}$ be the data matrix; then the parameters involved in the above distribution are $p$ for $\mu$ and $\frac{1}{2} p(p+1)$ for $\Sigma$.

## MMT-008 Sample Solution 2024


MMT-008 Solved Assignment 2024
1. a) Consider the Markov chain having the following transition probability matrix.
$$\mathrm{P}=\left[\begin{array}{cccccc} \frac{1}{3} & \frac{2}{3} & 0 & 0 & 0 & 0 \\ \frac{2}{3} & \frac{1}{3} & 0 & 0 & 0 & 0 \\ \frac{1}{4} & 0 & \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{6} & \frac{1}{6} & \frac{1}{6} & \frac{1}{6} & \frac{1}{6} & \frac{1}{6} \\ 0 & 0 & \frac{1}{4} & \frac{3}{4} & 0 & 0 \\ 0 & 0 & \frac{1}{5} & \frac{4}{5} & 0 & 0 \end{array}\right]$$
where the rows and columns correspond to states $1, 2, \ldots, 6$.
i) Draw the diagram of a Markov chain.
ii) Classify the states of the Markov chain, i.e., persistent, transient, non-null and aperiodic states. Also check the irreducibility of the Markov chain.
iii) Find the closed sets.
iv) Find the probability of absorption into the closed classes. Also find the mean time up to absorption from transient states 3 and 4.
i) Diagram of the Markov Chain:
To draw the diagram of the Markov chain, represent each state as a node and draw a directed edge for every nonzero transition probability, labelled with that probability. The transitions are:
• From state 1: to state 1 with probability $\frac{1}{3}$ and to state 2 with probability $\frac{2}{3}$.
• From state 2: to state 1 with probability $\frac{2}{3}$ and to state 2 with probability $\frac{1}{3}$.
• From state 3: to states 1, 3, 5 and 6, each with probability $\frac{1}{4}$.
• From state 4: to each of the states 1–6 with probability $\frac{1}{6}$.
• From state 5: to state 3 with probability $\frac{1}{4}$ and to state 4 with probability $\frac{3}{4}$.
• From state 6: to state 3 with probability $\frac{1}{5}$ and to state 4 with probability $\frac{4}{5}$.
Note that states 1 and 2 lead only to each other, while states 3, 4, 5 and 6 can eventually feed into the pair {1, 2}.
ii) Classification of the States of the Markov Chain:
• Persistent (recurrent) states: A state is persistent if, starting from it, the chain returns to it with probability 1. States 1 and 2 are persistent: once the chain enters the set {1, 2} it can never leave, so each of these states is revisited with probability 1. As persistent states of a finite closed class, they are also non-null (their mean recurrence times are finite). Since $p_{11}=\frac{1}{3}>0$, state 1 is aperiodic, and hence so is state 2.
• Transient states: A state is transient if there is a positive probability of never returning to it. States 3, 4, 5 and 6 are transient, since from each of them the chain can enter {1, 2} (e.g. $3 \to 1$, $4 \to 1$, $5 \to 3 \to 1$, $6 \to 3 \to 1$) and then never come back.
• Irreducibility: A Markov chain is irreducible if every state can be reached from every other state. This chain is not irreducible, because, for instance, state 3 cannot be reached from state 1.
iii) Closed Sets:
A closed set in a Markov chain is a set of states such that once the process enters any state in the set, it cannot leave the set. Here the closed sets are {1, 2} and the whole state space {1, 2, 3, 4, 5, 6}; {1, 2} is the only proper closed set.
iv) Probability of Absorption into the Closed Classes:
Since {1, 2} is the only proper closed class and the chain is finite, absorption into {1, 2} from each of the transient states 3, 4, 5 and 6 occurs with probability 1.
Mean Time up to Absorption from Transient States 3 and 4:
Let $t_i$ denote the expected number of steps from transient state $i$ until the chain enters {1, 2}. Conditioning on the first step,
$$t_3=1+\frac{1}{4} t_3+\frac{1}{4} t_5+\frac{1}{4} t_6, \qquad t_4=1+\frac{1}{6}\left(t_3+t_4+t_5+t_6\right),$$
$$t_5=1+\frac{1}{4} t_3+\frac{3}{4} t_4, \qquad t_6=1+\frac{1}{5} t_3+\frac{4}{5} t_4.$$
Solving this system gives $t_4=\frac{582}{131} \approx 4.44$ and $t_3=\frac{662}{131} \approx 5.05$ (and $t_5 \approx 5.60$, $t_6 \approx 5.56$).
In conclusion, the Markov chain has persistent states 1 and 2 and transient states 3, 4, 5 and 6; the only proper closed set is {1, 2}; absorption into {1, 2} is certain from every transient state, and the mean times to absorption from states 3 and 4 are about 5.05 and 4.44 steps respectively.
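The absorption probabilities and mean absorption times for this chain can also be computed with the fundamental matrix $N=(I-Q)^{-1}$, where $Q$ is the transient-to-transient block of $P$ and $R$ the transient-to-closed-class block. A minimal NumPy sketch (an illustrative check, not part of the assignment answer):

```python
import numpy as np

# Transition matrix, rows/columns indexed by states 1..6
P = np.array([
    [1/3, 2/3, 0,   0,   0,   0  ],
    [2/3, 1/3, 0,   0,   0,   0  ],
    [1/4, 0,   1/4, 0,   1/4, 1/4],
    [1/6, 1/6, 1/6, 1/6, 1/6, 1/6],
    [0,   0,   1/4, 3/4, 0,   0  ],
    [0,   0,   1/5, 4/5, 0,   0  ],
])

transient = [2, 3, 4, 5]   # states 3, 4, 5, 6 (0-based indices)
closed    = [0, 1]         # the closed class {1, 2}

Q = P[np.ix_(transient, transient)]   # transient -> transient block
R = P[np.ix_(transient, closed)]      # transient -> closed-class block

N = np.linalg.inv(np.eye(4) - Q)      # fundamental matrix
absorb = N @ R                        # absorption probabilities into states 1, 2
steps  = N @ np.ones(4)               # mean number of steps to absorption

print(absorb.sum(axis=1))  # each row sums to 1: absorption is certain
print(steps)               # mean absorption times from states 3, 4, 5, 6
```

Running this shows each row of `absorb` summing to 1, and mean absorption times of about 5.05 and 4.44 steps from states 3 and 4, i.e. $\frac{662}{131}$ and $\frac{582}{131}$.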
b) Determine the parameters of the bivariate normal distribution:
$$f(x, y)=k \exp \left[-\frac{8}{27}\left\{(x-7)^2-2(x-7)(y+5)+4(y+5)^2\right\}\right]$$
Also find the value of $k$.
Compare the given density
$$f(x, y)=k \exp \left[-\frac{8}{27}\left\{(x-7)^2-2(x-7)(y+5)+4(y+5)^2\right\}\right]$$
with the standard bivariate normal density
$$f(x, y)=\frac{1}{2 \pi \sigma_x \sigma_y \sqrt{1-\rho^2}} \exp \left[-\frac{1}{2\left(1-\rho^2\right)}\left(\frac{\left(x-\mu_x\right)^2}{\sigma_x^2}-2 \rho \frac{\left(x-\mu_x\right)\left(y-\mu_y\right)}{\sigma_x \sigma_y}+\frac{\left(y-\mu_y\right)^2}{\sigma_y^2}\right)\right]$$
1. Mean of $x$ ($\mu_x$): 7, since the exponent is centred at $x-7$.
2. Mean of $y$ ($\mu_y$): $-5$, since $y+5=y-(-5)$.
3. Matching the coefficients of $(x-7)^{2}$, $(y+5)^{2}$ and $(x-7)(y+5)$ with the standard exponent gives the three equations
$\frac{1}{2(1-\rho^{2})\sigma_x^{2}}=\frac{8}{27},\qquad \frac{1}{2(1-\rho^{2})\sigma_y^{2}}=\frac{32}{27},\qquad \frac{\rho}{(1-\rho^{2})\sigma_x\sigma_y}=\frac{16}{27}.$
4. Correlation coefficient ($\rho$): dividing the first equation by the second gives $\sigma_x^{2}=4\sigma_y^{2}$, i.e. $\sigma_x=2\sigma_y$; dividing the third by the first gives $\frac{2\rho\,\sigma_x}{\sigma_y}=2$, so $\rho=\frac{1}{2}$ (positive, because the cross term $-2(x-7)(y+5)$ carries the same sign as the $-2\rho$ term in the standard form).
5. Variances: with $1-\rho^{2}=\frac{3}{4}$, the first equation gives $\sigma_x^{2}=\frac{27}{16(1-\rho^{2})}=\frac{9}{4}$, so $\sigma_x=\frac{3}{2}$; similarly $\sigma_y^{2}=\frac{27}{64(1-\rho^{2})}=\frac{9}{16}$, so $\sigma_y=\frac{3}{4}$.
To find the value of $k$, we use the fact that the integral of the bivariate normal density over the entire plane is 1:
$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)\,dx\,dy=1$
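With the parameters found above, this condition forces $k=\frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^{2}}}=\frac{8\sqrt{3}}{27\pi}\approx 0.1634$. As a sanity check (not part of the assignment), the minimal pure-Python sketch below integrates the given density numerically with a midpoint Riemann sum over a wide box around the mean and confirms the total mass is 1; the function names are illustrative, not from the course material.

```python
import math

# Parameters read off from the exponent (derived above)
MU_X, MU_Y = 7.0, -5.0
SIGMA_X, SIGMA_Y = 1.5, 0.75      # sigma_x^2 = 9/4, sigma_y^2 = 9/16
RHO = 0.5
K = 8 * math.sqrt(3) / (27 * math.pi)  # claimed normalizing constant

def f(x, y):
    """Density exactly as given in the problem statement."""
    u, v = x - 7.0, y + 5.0
    return K * math.exp(-(8.0 / 27.0) * (u * u - 2.0 * u * v + 4.0 * v * v))

def double_integral(n=300):
    """Midpoint Riemann sum of f over a +/- 6-sigma box around the mean."""
    ax, bx = MU_X - 6 * SIGMA_X, MU_X + 6 * SIGMA_X
    ay, by = MU_Y - 6 * SIGMA_Y, MU_Y + 6 * SIGMA_Y
    hx, hy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        x = ax + (i + 0.5) * hx
        for j in range(n):
            y = ay + (j + 0.5) * hy
            total += f(x, y)
    return total * hx * hy

if __name__ == "__main__":
    # K agrees with the closed form 1/(2*pi*sx*sy*sqrt(1-rho^2)),
    # and the numerical integral of f comes out to approximately 1.
    closed = 1.0 / (2 * math.pi * SIGMA_X * SIGMA_Y * math.sqrt(1 - RHO**2))
    print(round(K, 6), round(closed, 6), round(double_integral(), 4))
```

The same check also verifies the sign of $\rho$: with $\rho=-\frac{1}{2}$ instead, the cross term of the exponent would flip and the integral would no longer equal 1 for this value of $k$.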