Given a joint probability density $f(\mathbf{y} \mid \theta)$ for an $n \times 1$ observed data vector $\mathbf{y}$ and an unknown $p \times 1$ parameter vector $\theta$, denote the $p \times 1$ vector of first derivatives of the log-likelihood with respect to $\theta$ (the score) as
$$
g(\theta \mid \mathbf{y})=\partial \ln [f(\mathbf{y} \mid \theta)] / \partial \theta
$$
and the $p \times p$ matrix of second derivatives as
$$
\mathbf{H}=\mathbf{H}(\theta \mid \mathbf{y})=\partial^{2} \ln [f(\mathbf{y} \mid \theta)] / \partial \theta \partial \theta^{\prime} .
$$
The matrix $\mathbf{H}$ is the Hessian of the log-likelihood. Under standard regularity conditions the information matrix equality holds:
$$
E\left[g(\theta \mid \mathbf{y}) g(\theta \mid \mathbf{y})^{\prime}\right]=-E[\mathbf{H}(\theta \mid \mathbf{y})],
$$
so minus the expected Hessian equals the Fisher information, and $-\mathbf{H}$ evaluated at the maximum likelihood estimate is commonly used as its estimate.
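
As a quick sanity check, the equality can be verified by Monte Carlo simulation. Below is a minimal Python sketch (the Bernoulli example and all names are illustrative, not from the notes) for a single Bernoulli($p$) observation, whose score and second derivative are available in closed form:

```python
import numpy as np

# Monte Carlo check of the information matrix equality E[g g'] = -E[H]
# for one Bernoulli(p) observation, where
#   ln f(y | p) = y ln p + (1 - y) ln(1 - p)
#   g = y/p - (1 - y)/(1 - p)          # score (scalar: the parameter is 1-dim)
#   H = -y/p^2 - (1 - y)/(1 - p)^2     # second derivative

rng = np.random.default_rng(0)
p = 0.3
y = rng.binomial(1, p, size=200_000)

g = y / p - (1 - y) / (1 - p)
H = -y / p**2 - (1 - y) / (1 - p)**2

print(np.mean(g**2))  # approx 1 / (p * (1 - p)) = 4.762
print(-np.mean(H))    # same limit, so E[g g'] = -E[H] holds
```

Both printed values should agree to within Monte Carlo error, and both estimate the Fisher information $1/[p(1-p)]$.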

MT4113 COURSE NOTES:

For the logistic regression model, the log-likelihood is $\ell(\beta)=\sum_{i}\left[y_{i} \beta^{\prime} \mathbf{x}_{i}-\ln \left(1+\exp \left(\beta^{\prime} \mathbf{x}_{i}\right)\right)\right]$; differentiating with respect to $\beta$ gives
$$
\frac{\partial \ell(\beta)}{\partial \beta}=\sum_{i} \mathbf{x}_{i} y_{i}-\sum_{i} \mathbf{x}_{i} \hat{y}_{i}
$$
where $\hat{y}_{i}$ is the predicted value of $y_{i}$:
$$
\hat{y}_{i}=\frac{1}{1+\exp \left(-\beta^{\prime} \mathbf{x}_{i}\right)} .
$$
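
In code, the predicted values and the gradient take only a few vectorized lines. A minimal sketch, assuming `X` is the $n \times p$ design matrix, `y` the 0/1 response vector, and `beta` the coefficient vector (all names hypothetical):

```python
import numpy as np

def score(beta, X, y):
    """Gradient of the logistic log-likelihood: sum_i x_i (y_i - yhat_i)."""
    yhat = 1.0 / (1.0 + np.exp(-X @ beta))  # predicted values yhat_i
    return X.T @ (y - yhat)
```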
The next step is to set the derivative equal to 0 and solve for $\beta$ :
$$
\sum_{i} \mathbf{x}_{i} y_{i}-\sum_{i} \mathbf{x}_{i} \hat{y}_{i}=0 .
$$
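
This equation has no closed-form solution, so $\beta$ must be found numerically; a standard choice is Newton-Raphson, which also uses the Hessian $\mathbf{H}=-\sum_{i} \hat{y}_{i}\left(1-\hat{y}_{i}\right) \mathbf{x}_{i} \mathbf{x}_{i}^{\prime}$. A minimal sketch under the same assumed names as above (an illustration, not the course's reference implementation):

```python
import numpy as np

def fit_logistic(X, y, max_iter=25, tol=1e-8):
    """Newton-Raphson for the score equation sum_i x_i (y_i - yhat_i) = 0."""
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        yhat = 1.0 / (1.0 + np.exp(-X @ beta))  # predicted probabilities
        g = X.T @ (y - yhat)                    # score vector
        w = yhat * (1.0 - yhat)                 # Bernoulli variances
        H = -(X * w[:, None]).T @ X             # Hessian (negative definite)
        step = np.linalg.solve(H, g)            # Newton step H^{-1} g
        beta = beta - step
        if np.max(np.abs(step)) < tol:          # stop once the step is tiny
            break
    return beta

# Illustrative usage on simulated data:
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ np.array([-0.5, 1.0]))))
print(fit_logistic(X, y))  # should land near the true values (-0.5, 1.0)
```

Because the Hessian is negative definite, each Newton step moves uphill on the log-likelihood, and convergence is typically reached in well under ten iterations.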