# Probability ST118-15/ST119-10


and it follows that when $X$ and $Y$ are independent
$$w\left(u_{1}, u_{2}\right)= \begin{cases}\dfrac{1}{2 \pi \sigma^{2}} \dfrac{2 u_{1}}{1+u_{2}^{2}}\, e^{-u_{1}^{2} / 2 \sigma^{2}}, & u_{1}>0,\ -\infty<u_{2}<\infty \\ 0, & u_{1} \leq 0\end{cases}$$
and
$$w_{2}\left(u_{2}\right)=\frac{1}{\pi\left(1+u_{2}^{2}\right)}, \quad-\infty<u_{2}<\infty$$
respectively.
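As a quick numerical sanity check (assuming $\sigma = 1$ and an arbitrary fixed $u_2$, both our own choices), integrating the joint density $w(u_1, u_2)$ over $u_1 > 0$ should recover the Cauchy marginal $w_2(u_2)$:

```python
import numpy as np

sigma = 1.0
u2 = 0.7                                  # an arbitrary fixed value of u2
u1 = np.linspace(0.0, 12.0, 200_001)      # grid covering the support u1 > 0
du = u1[1] - u1[0]

# joint density w(u1, u2) as stated above
w = (1.0 / (2 * np.pi * sigma**2)) * (2 * u1 / (1 + u2**2)) \
    * np.exp(-u1**2 / (2 * sigma**2))

marginal = np.sum(w) * du                 # Riemann-sum approximation of the u1-integral
cauchy = 1.0 / (np.pi * (1 + u2**2))      # the stated marginal w2(u2)
print(abs(marginal - cauchy) < 1e-4)      # True: the two agree
```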

## ST118-15/ST119-10 COURSE NOTES

Let $F$ and $G$ be two absolutely continuous DFs; then
$$H(x)=\int_{-\infty}^{\infty} F(x-y) G^{\prime}(y) d y=\int_{-\infty}^{\infty} G(x-y) F^{\prime}(y) d y$$
is also an absolutely continuous DF with PDF
$$H^{\prime}(x)=\int_{-\infty}^{\infty} F^{\prime}(x-y) G^{\prime}(y) d y=\int_{-\infty}^{\infty} G^{\prime}(x-y) F^{\prime}(y) d y$$
If
$$F(x)=\sum_{k} p_{k} \varepsilon\left(x-x_{k}\right) \quad \text { and } \quad G(x)=\sum_{j} q_{j} \varepsilon\left(x-y_{j}\right)$$
are two DFs, then
$$H(x)=\sum_{k} \sum_{j} p_{k} q_{j} \varepsilon\left(x-x_{k}-y_{j}\right)$$
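The discrete convolution formula above translates directly into code. A minimal sketch (the helper name `convolve_pmf` is our own):

```python
from collections import defaultdict

def convolve_pmf(p, q):
    """PMF of X + Y for independent discrete X ~ p and Y ~ q.

    p and q map support points x_k, y_j to probabilities p_k, q_j;
    the result realizes H(x) = sum_k sum_j p_k q_j eps(x - x_k - y_j).
    """
    h = defaultdict(float)
    for xk, pk in p.items():
        for yj, qj in q.items():
            h[xk + yj] += pk * qj
    return dict(h)

# Sum of two independent fair 0/1 coin indicators
p = {0: 0.5, 1: 0.5}
print(convolve_pmf(p, p))   # {0: 0.25, 1: 0.5, 2: 0.25}
```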

# Probability MATH0069


Proof. Let $0<r \leq 1$. Then, using the elementary inequality $|a+b|^{r} \leq |a|^{r}+|b|^{r}$ for $0<r \leq 1$,
$$E\left|X_{n}\right|^{r}=E\left|X_{n}-X+X\right|^{r} \leq E\left|X_{n}-X\right|^{r}+E|X|^{r},$$
so that
$$E\left|X_{n}\right|^{r}-E|X|^{r} \leq E\left|X_{n}-X\right|^{r} .$$

Interchanging $X_{n}$ and $X$, we get
$$E|X|^{r}-E\left|X_{n}\right|^{r} \leq E\left|X_{n}-X\right|^{r} .$$
It follows that
$$\left|E|X|^{r}-E\left|X_{n}\right|^{r}\right| \leq E\left|X_{n}-X\right|^{r} \rightarrow 0 \quad \text { as } \quad n \rightarrow \infty .$$
For $r>1$, we use Minkowski’s inequality and obtain
$$\left[E\left|X_{n}\right|^{r}\right]^{1 / r} \leq\left[E\left|X_{n}-X\right|^{r}\right]^{1 / r}+\left[E|X|^{r}\right]^{1 / r}$$
and
$$\left[E|X|^{r}\right]^{1 / r} \leq\left[E\left|X_{n}-X\right|^{r}\right]^{1 / r}+\left[E\left|X_{n}\right|^{r}\right]^{1 / r} .$$
Together these give $\left|\left[E\left|X_{n}\right|^{r}\right]^{1 / r}-\left[E|X|^{r}\right]^{1 / r}\right| \leq\left[E\left|X_{n}-X\right|^{r}\right]^{1 / r} \rightarrow 0$.
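The two Minkowski bounds can be checked empirically. A sketch with $r=2$ and the illustrative (our own) choice $X_{n}=X+1/n$, for which $\left[E\left|X_{n}-X\right|^{r}\right]^{1/r}=1/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)          # a sample standing in for X
r = 2.0

for n in (1, 10, 100, 1000):
    xn = x + 1.0 / n                      # X_n = X + 1/n converges to X in r-th mean
    lhs = abs(np.mean(np.abs(xn) ** r) ** (1 / r)
              - np.mean(np.abs(x) ** r) ** (1 / r))
    rhs = np.mean(np.abs(xn - x) ** r) ** (1 / r)   # equals 1/n exactly here
    assert lhs <= rhs + 1e-12             # the combined Minkowski bound holds
print("bound verified for r = 2")
```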

## MATH0069 COURSE NOTES

Proof. If $A$ is ancillary, then $P_{\theta}\left\{A(\mathbf{X}) \leq a\right\}$ is free of $\theta$ for all $a$. Consider the conditional probability $g_{a}(s)=P\left\{A(\mathbf{X}) \leq a \mid S(\mathbf{X})=s\right\}$. Clearly
$$E_{\theta}\, g_{a}(S)=P_{\theta}\left\{A(\mathbf{X}) \leq a\right\}=P\left\{A(\mathbf{X}) \leq a\right\} \quad \text{for all } \theta .$$
Thus
$$E_{\theta}\left(g_{a}(S)-P\left\{A(\mathbf{X}) \leq a\right\}\right)=0$$
for all $\theta$. By completeness of $S$ it follows that
$$g_{a}(S)=P\left\{A(\mathbf{X}) \leq a\right\} \quad \text{with probability } 1,$$
that is,
$$P_{\theta}\left\{A(\mathbf{X}) \leq a \mid S(\mathbf{X})=s\right\}=P\left\{A(\mathbf{X}) \leq a\right\}$$
with probability 1. Hence $A$ and $S$ are independent.
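This independence can be checked empirically for the normal location family $N(\theta, 1)$, where the sample mean is complete sufficient and $X_{1}-\bar{X}$ is ancillary. A sketch (sample size 5 and $\theta = 3$ are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 3.0                              # any theta: independence must not depend on it
x = rng.normal(theta, 1.0, size=(200_000, 5))

s = x.mean(axis=1)                       # S = sample mean, complete sufficient for theta
a = x[:, 0] - s                          # A = X_1 - mean, ancillary (distribution free of theta)

corr = np.corrcoef(s, a)[0, 1]
print(corr)                              # close to 0: the exact correlation is 0, as Basu predicts
```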

# Probability MATH 605


Let $F_{n}$ be a sequence of DFs defined by
$$F_{n}(x)= \begin{cases}0, & x<0 \\ 1-\frac{1}{n}, & 0 \leq x<n \\ 1, & x \geq n .\end{cases}$$
Clearly $F_{n} \stackrel{w}{\rightarrow} F$, where $F$ is the DF given by
$$F(x)= \begin{cases}0, & x<0 \\ 1, & x \geq 0\end{cases}$$

Note that $F_{n}$ is the DF of the RV $X_{n}$ with PMF $P\left\{X_{n}=0\right\}=1-\frac{1}{n}$, $P\left\{X_{n}=n\right\}=\frac{1}{n}$, and $F$ is the DF of the RV $X$ degenerate at $0$. We have
$$E X_{n}^{k}=n^{k}\left(\frac{1}{n}\right)=n^{k-1}$$
where $k$ is a positive integer. Also $E X^{k}=0$, so that
$$E X_{n}^{k} \nrightarrow E X^{k} \quad \text { for any } k \geq 1$$
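A short numerical illustration of this gap between weak convergence and moment convergence (the grid of $n$ and the evaluation point $x = 2.5$ are arbitrary choices of ours):

```python
import numpy as np

def F_n(x, n):
    """DF of X_n: mass 1 - 1/n at 0 and mass 1/n at n."""
    return np.where(x < 0, 0.0, np.where(x < n, 1.0 - 1.0 / n, 1.0))

x = 2.5
for n in (10, 100, 10_000):
    # F_n(x) -> F(x) = 1 at every continuity point x > 0, yet the moments
    # E X_n^k = n^(k-1) stay at 1 (k = 1) or diverge (k >= 2)
    print(n, float(F_n(x, n)), n ** 0, n ** 1)
```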

## MATH605 COURSE NOTES

Proof. Since $X$ is an RV, we can, given $\varepsilon>0$, find a constant $k=k(\varepsilon)$ such that
$$P\left\{|X|>k\right\}<\frac{\varepsilon}{2} .$$
Also, $g$ is continuous on $\mathcal{R}$, so that $g$ is uniformly continuous on $[-k, k]$. It follows that there exists a $\delta=\delta(\varepsilon, k)$ such that
$$\left|g\left(x_{n}\right)-g(x)\right|<\varepsilon$$
whenever $|x| \leq k$ and $\left|x_{n}-x\right|<\delta$. Let