and it follows that when $X$ and $Y$ are independent $$ w\left(u_{1}, u_{2}\right)= \begin{cases}\dfrac{1}{2 \pi \sigma^{2}}\, \dfrac{2 u_{1}}{1+u_{2}^{2}}\, e^{-u_{1}^{2} / 2 \sigma^{2}}, & u_{1}>0,\ -\infty<u_{2}<\infty, \\ 0, & u_{1} \leq 0,\end{cases} $$ so the joint PDF factors into the Rayleigh marginal $$ w_{1}\left(u_{1}\right)=\frac{u_{1}}{\sigma^{2}}\, e^{-u_{1}^{2} / 2 \sigma^{2}}, \quad u_{1}>0, $$ and the Cauchy marginal $$ w_{2}\left(u_{2}\right)=\frac{1}{\pi\left(1+u_{2}^{2}\right)}, \quad-\infty<u_{2}<\infty, $$ respectively.
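The marginals can be checked by simulation. The sketch below assumes (this is not stated explicitly in the excerpt, but matches the factorization) that $X$ and $Y$ are independent $N(0,\sigma^{2})$ and that $U_{1}=\sqrt{X^{2}+Y^{2}}$, $U_{2}=X/Y$; it then compares empirical CDF values against the Rayleigh median $\sigma\sqrt{2\ln 2}$ and the Cauchy CDF at $1$, which equals $3/4$.

```python
# Monte Carlo sanity check of the two marginals.
# Assumption (not in the excerpt): X, Y independent N(0, sigma^2),
# U1 = sqrt(X^2 + Y^2) (Rayleigh), U2 = X / Y (standard Cauchy).
import math
import random

random.seed(0)
sigma = 2.0
n = 200_000

u1_median = sigma * math.sqrt(2 * math.log(2))  # solves 1 - e^{-m^2/2s^2} = 1/2
below_u1 = 0
below_u2 = 0  # Cauchy CDF at 1: 1/2 + arctan(1)/pi = 3/4
for _ in range(n):
    x = random.gauss(0, sigma)
    y = random.gauss(0, sigma)
    below_u1 += math.hypot(x, y) <= u1_median
    below_u2 += x / y <= 1.0

p1 = below_u1 / n  # should be near 0.5
p2 = below_u2 / n  # should be near 0.75
print(p1, p2)
```

With $2\times 10^{5}$ samples the standard error of each empirical proportion is on the order of $10^{-3}$, so both estimates should sit well within a couple of percent of their targets.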
ST118-15/ST119-10 COURSE NOTES:
Let $F$ and $G$ be two absolutely continuous DFs; then $$ H(x)=\int_{-\infty}^{\infty} F(x-y) G^{\prime}(y)\, d y=\int_{-\infty}^{\infty} G(x-y) F^{\prime}(y)\, d y $$ is also an absolutely continuous DF with PDF $$ H^{\prime}(x)=\int_{-\infty}^{\infty} F^{\prime}(x-y) G^{\prime}(y)\, d y=\int_{-\infty}^{\infty} G^{\prime}(x-y) F^{\prime}(y)\, d y. $$ If $$ F(x)=\sum_{k} p_{k} \varepsilon\left(x-x_{k}\right) \quad \text { and } \quad G(x)=\sum_{j} q_{j} \varepsilon\left(x-y_{j}\right) $$ are two discrete DFs, where $\varepsilon$ denotes the DF of the point mass at $0$ (the unit step), then $$ H(x)=\sum_{k} \sum_{j} p_{k} q_{j} \varepsilon\left(x-x_{k}-y_{j}\right). $$
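The discrete case is just a double sum over point masses: the convolution places mass $p_{k} q_{j}$ at each atom $x_{k}+y_{j}$. A minimal sketch (the helper name `convolve_pmf` is ours, not from the notes), illustrated with the sum of two fair dice:

```python
# Convolution of two discrete DFs represented as {atom: probability} dicts.
# Each pair of atoms (x_k, y_j) contributes mass p_k * q_j at x_k + y_j,
# mirroring H(x) = sum_k sum_j p_k q_j eps(x - x_k - y_j).
from collections import defaultdict

def convolve_pmf(p, q):
    h = defaultdict(float)
    for xk, pk in p.items():
        for yj, qj in q.items():
            h[xk + yj] += pk * qj
    return dict(h)

die = {k: 1 / 6 for k in range(1, 7)}
two_dice = convolve_pmf(die, die)
print(two_dice[7])  # 6/36, the most likely total for two fair dice
```

Since the atoms of $H$ are sums of atoms of $F$ and $G$, the result automatically collapses coinciding sums (e.g. $1+6$ and $2+5$) into a single atom with the combined mass.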
The modern mathematical theory of probability has its roots in attempts to analyze games of chance by Gerolamo Cardano in the sixteenth century, and by Pierre de Fermat and Blaise Pascal in the seventeenth century (for example, the “problem of points”). Christiaan Huygens published a book on the subject in 1657, and in the nineteenth century Pierre Laplace completed what is today considered the classic interpretation.
Fix $x \in(\mu, \theta)$. Note that by Markov’s inequality, for any $t>0$, $n \geq 1$, $$ \begin{aligned} P\left(\bar{X}_{n} \geq x\right) &=P\left(e^{t \bar{X}_{n}} \geq e^{t x}\right) \\ & \leq e^{-t x} E\left(e^{t \bar{X}_{n}}\right) \\ &=\exp (-t x+n \log \phi(t / n)). \end{aligned} $$ Hence, $$ \begin{aligned} & n^{-1} \log P\left(\bar{X}_{n} \geq x\right) \leq-x \cdot \frac{t}{n}+\log \phi\left(\frac{t}{n}\right) \quad \text { for all } \quad t>0,\ n \geq 1 \\ \Rightarrow\ & \limsup_{n \rightarrow \infty} n^{-1} \log P\left(\bar{X}_{n} \geq x\right) \leq \inf_{t>0}\left\{-x t+\log \phi(t)\right\}=-\gamma(x) . \end{aligned} $$ This yields the upper bound. Next it will be shown that $$ \liminf_{n \rightarrow \infty} n^{-1} \log P\left(\bar{X}_{n} \geq x\right) \geq-\gamma(x). $$
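The rate function $\gamma(x)=\sup_{t>0}\{x t-\log \phi(t)\}$ is easy to evaluate numerically, since $t \mapsto -x t+\log \phi(t)$ is convex. As an illustration (the Bernoulli choice and the parameter values are ours, not from the notes), take $X_{1} \sim \operatorname{Bernoulli}(p)$, so $\phi(t)=1-p+p e^{t}$ and, for $x>p$, the supremum has the closed form $\gamma(x)=x \log (x / p)+(1-x) \log ((1-x) /(1-p))$, the relative entropy. The sketch finds the infimum by ternary search and compares:

```python
# Evaluate gamma(x) = sup_{t>0} { x t - log phi(t) } for a Bernoulli(p)
# summand, phi(t) = 1 - p + p e^t, and compare with the closed form.
# p and x below are illustrative choices with x in (mu, theta) = (p, 1).
import math

def gamma_numeric(x, p, t_hi=50.0, iters=200):
    # Minimize the convex map t -> -x t + log phi(t) by ternary search.
    f = lambda t: -x * t + math.log(1 - p + p * math.exp(t))
    lo, hi = 0.0, t_hi
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if f(m1) < f(m2):
            hi = m2
        else:
            lo = m1
    return -f((lo + hi) / 2)  # sup of x t - log phi(t)

def gamma_closed(x, p):
    # Relative entropy between Bernoulli(x) and Bernoulli(p).
    return x * math.log(x / p) + (1 - x) * math.log((1 - x) / (1 - p))

p, x = 0.3, 0.5
print(gamma_numeric(x, p), gamma_closed(x, p))  # should agree closely
```

The minimizer here is $t^{*}=\log \frac{x(1-p)}{(1-x) p}$, which lies well inside the search interval; the bound $P(\bar{X}_{n} \geq x) \leq e^{-n \gamma(x)}$ then decays exponentially in $n$.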