# Introduction to Statistics MATH10282


where $K$ is any one-dimensional kernel. Then there is a single bandwidth parameter $h$. At a target value $x=\left(x_{1}, \ldots, x_{d}\right)^{T}$, the local sum of squares is given by
$$\sum_{i=1}^{n} w_{i}(x)\left(Y_{i}-a_{0}-\sum_{j=1}^{d} a_{j}\left(x_{i j}-x_{j}\right)\right)^{2}$$
where
$$w_{i}(x)=K\left(\left\|x_{i}-x\right\| / h\right)$$

The estimator is
$$\widehat{r}_{n}(x)=\widehat{a}_{0}$$
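The local linear estimator above is a weighted least-squares problem at each target point. A minimal sketch follows; the Gaussian kernel and the function name `local_linear_fit` are illustrative choices, not part of the notes:

```python
import numpy as np

def local_linear_fit(X, Y, x0, h):
    """Local linear estimate r_hat(x0) = a0_hat via weighted least squares.

    X: (n, d) covariates, Y: (n,) responses, x0: (d,) target, h: bandwidth.
    Uses a Gaussian kernel K(u) = exp(-u^2 / 2) as an illustrative choice.
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    x0 = np.asarray(x0, dtype=float)
    # Weights w_i(x0) = K(||x_i - x0|| / h)
    dist = np.linalg.norm(X - x0, axis=1)
    w = np.exp(-0.5 * (dist / h) ** 2)
    # Design matrix for the local model a0 + sum_j a_j (x_ij - x0_j)
    D = np.hstack([np.ones((X.shape[0], 1)), X - x0])
    # Weighted least squares: minimize sum_i w_i (Y_i - D_i a)^2
    sw = np.sqrt(w)
    a, *_ = np.linalg.lstsq(sw[:, None] * D, sw * Y, rcond=None)
    return a[0]  # r_hat(x0) = a0_hat
```

When the data lie exactly on a hyperplane, the local linear fit reproduces it regardless of the bandwidth, which is a quick sanity check for the implementation.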

## MATH10282 COURSE NOTES

Generally one grows a very large tree, which is then pruned to form a subtree by collapsing regions together. The size of the tree is a tuning parameter, chosen as follows. Let $N_{m}$ denote the number of points in a terminal rectangle $R_{m}$ of a subtree $T$ and define
$$c_{m}=\frac{1}{N_{m}} \sum_{x_{i} \in R_{m}} Y_{i}, \quad Q_{m}(T)=\frac{1}{N_{m}} \sum_{x_{i} \in R_{m}}\left(Y_{i}-c_{m}\right)^{2}$$
Define the complexity of $T$ by
$$C_{\alpha}(T)=\sum_{m=1}^{|T|} N_{m} Q_{m}(T)+\alpha|T|$$
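Given the partition of the data into terminal rectangles, $C_{\alpha}(T)$ can be computed directly from the definitions of $c_m$ and $Q_m$. A sketch, assuming `leaves` holds the $Y$-values falling in each terminal rectangle (the names are illustrative):

```python
def cost_complexity(leaves, alpha):
    """C_alpha(T) = sum_m N_m * Q_m(T) + alpha * |T|.

    leaves: list of lists, the Y-values in each terminal rectangle R_m of T.
    alpha:  complexity penalty per leaf.
    """
    total = 0.0
    for ys in leaves:
        n_m = len(ys)
        c_m = sum(ys) / n_m                           # leaf mean c_m
        q_m = sum((y - c_m) ** 2 for y in ys) / n_m   # within-leaf MSE Q_m
        total += n_m * q_m                            # N_m * Q_m cancels the 1/N_m
    return total + alpha * len(leaves)                # + alpha * |T|
```

Larger $\alpha$ penalises the number of leaves $|T|$ more heavily, so minimising $C_{\alpha}(T)$ over subtrees trades fit against tree size.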

# Introduction to Statistics STAT 515


To determine the expected value of a chi-squared random variable, note first that for a standard normal random variable $Z$,
$$\begin{aligned} 1 &=\operatorname{Var}(Z) \\ &=E\left[Z^{2}\right]-(E[Z])^{2} \\ &=E\left[Z^{2}\right] \quad \text{since } E[Z]=0 \end{aligned}$$
Hence, $E\left[Z^{2}\right]=1$ and so
$$E\left[\sum_{i=1}^{n} Z_{i}^{2}\right]=\sum_{i=1}^{n} E\left[Z_{i}^{2}\right]=n$$

The expected value of a chi-squared random variable is equal to its number of degrees of freedom.
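This identity is easy to check by Monte Carlo; the simulation below (an illustration, not part of the notes) draws many $\chi^2_n$ variates as sums of squared standard normals:

```python
import numpy as np

# Monte Carlo check that E[sum_{i=1}^n Z_i^2] = n for iid standard normals.
rng = np.random.default_rng(0)
n, reps = 5, 200_000
Z = rng.standard_normal((reps, n))
chi_sq = (Z ** 2).sum(axis=1)   # one chi-squared_n draw per row
print(chi_sq.mean())            # close to n = 5
```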

Suppose now that we have a sample $X_{1}, \ldots, X_{n}$ from a normal population having mean $\mu$ and variance $\sigma^{2}$. Consider the sample variance $S^{2}$ defined by
$$S^{2}=\frac{\sum_{i=1}^{n}\left(X_{i}-\bar{X}\right)^{2}}{n-1}$$

## STAT515 COURSE NOTES

If the population mean $\mu$ is known, then the appropriate estimator of the population variance $\sigma^{2}$ is
$$\frac{\sum_{i=1}^{n}\left(X_{i}-\mu\right)^{2}}{n}$$
If the population mean $\mu$ is unknown, then the appropriate estimator of the population variance $\sigma^{2}$ is
$$S^{2}=\frac{\sum_{i=1}^{n}\left(X_{i}-\bar{X}\right)^{2}}{n-1}$$
$S^{2}$ is an unbiased estimator of $\sigma^{2}$, that is,
$$E\left[S^{2}\right]=\sigma^{2}$$
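Both estimators can be checked by simulation. In the sketch below (parameter values are illustrative), `ddof=1` makes NumPy divide by $n-1$, matching the definition of $S^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2 = 3.0, 4.0
n, reps = 5, 200_000
X = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

# Known-mean estimator: divide sum (X_i - mu)^2 by n.
known_mu = ((X - mu) ** 2).mean(axis=1)
# Unknown-mean estimator S^2: divide sum (X_i - Xbar)^2 by n - 1.
s2 = X.var(axis=1, ddof=1)

print(known_mu.mean(), s2.mean())  # both close to sigma^2 = 4
```

Dividing by $n$ instead of $n-1$ in the unknown-mean case would bias the estimator downward by a factor of $(n-1)/n$, which is why the $n-1$ divisor appears in $S^2$.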