# Probability | MA20225 Probability 2B Assignment Help


Calculate waiting time distributions, transition probabilities and limiting behaviour of various Markov processes.

$$S_{n}^{2}=\frac{1}{n} \sum_{i=1}^{n}\left(x_{i}-M_{n}\right)^{2}$$
where $M_{n}$ is the sample mean as defined earlier. We may expand the above expression,
$$S_{n}^{2}=\frac{1}{n} \sum_{i=1}^{n} x_{i}^{2}-\frac{2}{n} M_{n} \sum_{i=1}^{n} x_{i}+M_{n}^{2}=\frac{1}{n} \sum_{i=1}^{n} x_{i}^{2}-M_{n}^{2}$$
This is a more useful form of $S_{n}^{2}$ for the calculation of its expectation:
$$E\left(S_{n}^{2}\right)=\frac{1}{n} E\left(\sum_{i=1}^{n} x_{i}^{2}\right)-E\left(M_{n}^{2}\right)$$
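The algebraic identity above is easy to check numerically. The following sketch (the data below are hypothetical, generated only for illustration) compares the defining form of $S_n^2$ against the expanded form:

```python
import random

# Numerical check of the identity
#   S_n^2 = (1/n) * sum((x_i - M_n)^2) = (1/n) * sum(x_i^2) - M_n^2
# on a hypothetical simulated sample (values are arbitrary).
random.seed(0)
n = 1000
xs = [random.gauss(5.0, 2.0) for _ in range(n)]

M_n = sum(xs) / n                                    # sample mean
S2_def = sum((x - M_n) ** 2 for x in xs) / n         # definition of S_n^2
S2_expanded = sum(x * x for x in xs) / n - M_n ** 2  # expanded form

# The two forms agree up to floating-point rounding.
assert abs(S2_def - S2_expanded) < 1e-6
print(S2_def, S2_expanded)
```

The expanded form trades one pass over the centred data for a single pass over $x_i^2$, which is why it is convenient when taking expectations term by term.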

## MA20225 COURSE NOTES:

$$\sum_{j=1}^{k} P\left(C_{j} \mid A_{1} C_{2}\right)=0.0$$
Part (e) is most easily done by direct substitution,
$$\sum_{j=1}^{k} P\left(A_{j}^{c}\right)=\sum_{j=1}^{k}\left[1-P\left(A_{j}\right)\right]=k-\sum_{j=1}^{k} P\left(A_{j}\right)$$
and since we are told that the $A_{j}$'s are mutually exclusive and collectively exhaustive, we have
$$\sum_{j=1}^{k} P\left(A_{j}^{c}\right)=k-1.0$$
For part (f), we can use the definition of conditional probability and the given properties of lists 1 and 2 to write
$$\sum_{i=1}^{k} \sum_{j=1}^{l} P\left(A_{i}\right) P\left(B_{j} \mid A_{i}\right)=\sum_{i=1}^{k} \sum_{j=1}^{l} P\left(A_{i} B_{j}\right)=\sum_{i=1}^{k} P\left(A_{i}\right)=1.0$$
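The total-probability identity in part (f) can be verified on a small toy distribution. The probabilities below are made up for illustration; any table whose rows of $P(B_j \mid A_i)$ sum to 1 and whose $P(A_i)$ sum to 1 works:

```python
# Sketch: verify sum_i sum_j P(A_i) P(B_j | A_i) = sum_i P(A_i) = 1
# for a toy distribution (all numbers below are hypothetical).
P_A = [0.2, 0.5, 0.3]        # mutually exclusive, collectively exhaustive A_i
P_B_given_A = [              # row i holds P(B_j | A_i); each row sums to 1
    [0.1, 0.9],
    [0.4, 0.6],
    [0.7, 0.3],
]

total = sum(P_A[i] * P_B_given_A[i][j]
            for i in range(len(P_A))
            for j in range(len(P_B_given_A[0])))

assert abs(total - 1.0) < 1e-9
print(total)
```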

# Probability | MATH0069


Proof. Let $0<r \leq 1$. Then
$$E\left|X_{n}\right|^{r}=E\left|X_{n}-X+X\right|^{r}$$
so that, by the elementary inequality $|a+b|^{r} \leq|a|^{r}+|b|^{r}$ for $0<r \leq 1$,
$$E\left|X_{n}\right|^{r}-E|X|^{r} \leq E\left|X_{n}-X\right|^{r} .$$

Interchanging $X_{n}$ and $X$, we get
$$E|X|^{r}-E\left|X_{n}\right|^{r} \leq E\left|X_{n}-X\right|^{r}.$$

It follows that
$$\left|E|X|^{r}-E\left|X_{n}\right|^{r}\right| \leq E\left|X_{n}-X\right|^{r} \rightarrow 0 \quad \text { as } \quad n \rightarrow \infty.$$
For $r>1$, we use Minkowski’s inequality and obtain
$$\left[E\left|X_{n}\right|^{r}\right]^{1 / r} \leq\left[E\left|X_{n}-X\right|^{r}\right]^{1 / r}+\left[E|X|^{r}\right]^{1 / r}$$
and
$$\left[E|X|^{r}\right]^{1 / r} \leq\left[E\left|X_{n}-X\right|^{r}\right]^{1 / r}+\left[E\left|X_{n}\right|^{r}\right]^{1 / r}$$
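The bound used in the first half of the proof holds pointwise, since $\left||a|^{r}-|b|^{r}\right| \leq|a-b|^{r}$ for $0<r \leq 1$, so it also holds for sample averages. A quick simulation sketch (the distributions, noise scale, and $r=0.5$ are arbitrary choices, not from the notes):

```python
import random

# Sketch: empirically check  | E|X_n|^r - E|X|^r |  <=  E|X_n - X|^r
# for r = 0.5, with X_n a small perturbation of X (hypothetical data).
random.seed(1)
r = 0.5
N = 10_000
X = [random.gauss(0.0, 1.0) for _ in range(N)]
Xn = [x + random.gauss(0.0, 0.1) for x in X]   # X_n close to X in L^r

E_Xr = sum(abs(x) ** r for x in X) / N
E_Xnr = sum(abs(x) ** r for x in Xn) / N
E_diff = sum(abs(xn - x) ** r for xn, x in zip(Xn, X)) / N

# Holds sample by sample, hence for the averages as well.
assert abs(E_Xnr - E_Xr) <= E_diff
print(abs(E_Xnr - E_Xr), E_diff)
```

Shrinking the noise scale toward 0 drives both sides toward 0, mirroring the limit statement in the proof.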

## MATH0069 COURSE NOTES:

Proof. If $A$ is ancillary, then $P_{\theta}\{A(\mathbf{X}) \leq a\}$ is free of $\theta$ for all $a$; write $P\{A(\mathbf{X}) \leq a\}$ for its common value. Consider the conditional probability $g_{a}(s)=P\{A(\mathbf{X}) \leq a \mid S(\mathbf{X})=s\}$. Clearly
$$E_{\theta}\, g_{a}(S)=P_{\theta}\{A(\mathbf{X}) \leq a\}=P\{A(\mathbf{X}) \leq a\}.$$

Thus
$$E_{\theta}\left(g_{a}(S)-P\{A(\mathbf{X}) \leq a\}\right)=0$$
for all $\theta$. By completeness of $S$ it follows that
$$g_{a}(s)=P\{A(\mathbf{X}) \leq a\},$$
that is,
$$P_{\theta}\{A(\mathbf{X}) \leq a \mid S(\mathbf{X})=s\}=P\{A(\mathbf{X}) \leq a\}$$
with probability 1. Hence $A$ and $S$ are independent.

# Probability | MATH 605 Assignment Help


Let $F_{n}$ be a sequence of DFs defined by
$$F_{n}(x)= \begin{cases}0, & x<0 \\ 1-\frac{1}{n}, & 0 \leq x<n \\ 1, & n \leq x .\end{cases}$$
Clearly $F_{n} \stackrel{w}{\rightarrow} F$, where $F$ is the DF given by
$$F(x)= \begin{cases}0, & x<0 \\ 1, & x \geq 0 .\end{cases}$$

Note that $F_{n}$ is the DF of the RV $X_{n}$ with PMF $P\left\{X_{n}=0\right\}=1-\frac{1}{n}$, $P\left\{X_{n}=n\right\}=\frac{1}{n}$, and $F$ is the DF of the RV $X$ degenerate at 0. We have
$$E X_{n}^{k}=n^{k}\left(\frac{1}{n}\right)=n^{k-1}$$
where $k$ is a positive integer. Also $E X^{k}=0$, so that
$$E X_{n}^{k} \nrightarrow E X^{k} \quad \text { for any } k \geq 1$$
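The moments can be computed exactly from the two-point PMF above; exact rational arithmetic makes the divergence explicit (this sketch just restates the example, nothing new is assumed):

```python
from fractions import Fraction

# X_n = 0 with probability 1 - 1/n, and X_n = n with probability 1/n,
# so E X_n^k = 0^k * (1 - 1/n) + n^k * (1/n) = n^(k-1), exactly.
def moment(n: int, k: int) -> Fraction:
    return n ** k * Fraction(1, n)

for n in (10, 100, 1000):
    # k = 1 gives 1 for every n (but E X = 0); k = 2 gives n, which diverges.
    print(n, moment(n, 1), moment(n, 2))

assert moment(1000, 1) == 1
assert moment(1000, 2) == 1000
```

So even though $F_n \stackrel{w}{\rightarrow} F$, no moment of $X_n$ converges to the corresponding moment of $X$: weak convergence alone does not control the mass escaping to $n$.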

Proof. Since $X$ is an RV, we can, given $\varepsilon>0$, find a constant $k=k(\varepsilon)$ such that
$$P\{|X|>k\}<\frac{\varepsilon}{2} .$$
Also, $g$ is continuous on $\mathcal{R}$, so that $g$ is uniformly continuous on $[-k, k]$. It follows that there exists a $\delta=\delta(\varepsilon, k)$ such that
$$\left|g\left(x_{n}\right)-g(x)\right|<\varepsilon$$
whenever $|x| \leq k$ and $\left|x_{n}-x\right|<\delta$. Let