Financial Mathematics Assignment Help | Mathematics for Economics EFIM10023 University of Bristol Assignment


Assignment-daixie™ provides financial mathematics assignment, exam, and tutoring services for the University of Bristol course Mathematics for Economics EFIM10023!

Instructions:

Mathematics is an essential tool for economics, as it allows economists to model economic phenomena and analyze economic data using rigorous and precise methods. Here are some of the key mathematical concepts that are important for economics:

  1. Calculus: Calculus is essential for modeling changes in economic variables over time. It is used to derive optimization problems, which are fundamental to economic analysis. Calculus is also used to analyze the behavior of economic functions, such as demand and supply curves.
  2. Linear Algebra: Linear algebra is used to solve systems of linear equations, which arise frequently in economics. It is also used in matrix algebra, which is important for input-output analysis, among other things.
  3. Probability and Statistics: Probability and statistics are essential for analyzing economic data, making inferences about populations from samples, and testing hypotheses. Probability is used to model the uncertainty that is inherent in economic decisions, while statistics is used to measure the variability in economic data.
  4. Optimization Theory: Optimization theory is used to model the behavior of economic agents, such as firms and consumers, who seek to maximize their objectives subject to constraints. This is the basis of microeconomic theory (a numerical sketch follows this list).
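As a concrete illustration of the optimization point above, here is a minimal Python sketch (my own example, not course material; the prices, income, and Cobb-Douglas exponents are hypothetical) solving a consumer's constrained utility-maximization problem numerically with SciPy.

```python
# A minimal sketch: a consumer maximizing Cobb-Douglas utility
# u(x, y) = x**0.6 * y**0.4 subject to the budget p_x*x + p_y*y <= m.
# All numbers are hypothetical, chosen only for illustration.
import numpy as np
from scipy.optimize import minimize

p_x, p_y, m = 2.0, 3.0, 12.0  # hypothetical prices and income

def neg_utility(q):
    x, y = q
    return -(x**0.6 * y**0.4)  # minimize the negative to maximize utility

# Inequality constraint: remaining budget must be nonnegative.
budget = {"type": "ineq", "fun": lambda q: m - p_x * q[0] - p_y * q[1]}

res = minimize(neg_utility, x0=[1.0, 1.0], constraints=[budget],
               bounds=[(1e-9, None), (1e-9, None)])
print(res.x)  # analytic answer: x* = 0.6*m/p_x = 3.6, y* = 0.4*m/p_y = 1.6
```

The printed solution agrees with the closed-form Cobb-Douglas demands, which is the kind of check numerical optimization makes easy.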

Problem 1.

Let $X_1, \ldots, X_n$ be independent random variables with $$ X_i \sim P_{\theta_i}, \text { for } i=1, \ldots, n $$ (a). For $P_\theta=N(\theta, 1)$, determine the maximum likelihood estimate of $$ \left(\theta_1, \ldots, \theta_n\right) $$ when there are no restrictions on the $\theta_i$.

Proof.

(a) For $P_{\theta}=N(\theta,1)$, the likelihood function is given by \begin{align*} L(\theta_1,\ldots,\theta_n;x_1,\ldots,x_n) &= \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{(x_i-\theta_i)^2}{2}\right) \\ &= \frac{1}{(2\pi)^{n/2}} \exp\left(-\frac{1}{2}\sum_{i=1}^{n}(x_i-\theta_i)^2\right). \end{align*} Taking the logarithm of the likelihood function, we obtain \begin{align*} \log L(\theta_1,\ldots,\theta_n;x_1,\ldots,x_n) &= -\frac{n}{2}\log(2\pi) - \frac{1}{2}\sum_{i=1}^{n}(x_i-\theta_i)^2. \end{align*} Since each $\theta_i$ appears in exactly one term of the sum, differentiating the log-likelihood with respect to $\theta_i$ gives \begin{align*} \frac{\partial}{\partial \theta_i} \log L(\theta_1,\ldots,\theta_n;x_1,\ldots,x_n) &= x_i-\theta_i. \end{align*} Setting each derivative equal to zero, we obtain the maximum likelihood estimate (MLE) \begin{align*} \hat{\theta}_i &= x_i, \quad i=1,\ldots,n. \end{align*} Therefore, with no restrictions on the $\theta_i$, the MLE of $\left(\theta_1,\ldots,\theta_n\right)$ is $\left(x_1,\ldots,x_n\right)$.
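As a sanity check on part (a), here is a minimal Python sketch (my own illustration, not from the assignment; the true parameter values and seed are arbitrary) that maximizes the log-likelihood numerically and confirms the MLE is $(x_1,\ldots,x_n)$.

```python
# Numerical check: with one N(theta_i, 1) observation per parameter,
# the log-likelihood separates across i, and each term
# -(x_i - theta_i)^2 / 2 peaks at theta_i = x_i.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
theta_true = np.array([0.5, -1.0, 2.0])  # arbitrary illustrative values
x = rng.normal(theta_true, 1.0)          # one observation per theta_i

def neg_log_lik(theta):
    return 0.5 * np.sum((x - theta) ** 2)  # additive constants dropped

res = minimize(neg_log_lik, x0=np.zeros_like(x))
print(res.x)  # numerically equal to x, i.e. the MLE is (x_1, ..., x_n)
print(x)
```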

Problem 2.

(b). In (a), for $n=2$, determine the maximum likelihood estimate when $\left(\theta_1, \theta_2\right)$ is restricted to satisfy $\theta_1 \leq \theta_2$.

Proof.

(b) For $n=2$, the log-likelihood is \begin{align*} \log L(\theta_1,\theta_2;x_1,x_2) &= -\log(2\pi) - \frac{(x_1-\theta_1)^2}{2} - \frac{(x_2-\theta_2)^2}{2}. \end{align*} Taking partial derivatives with respect to $\theta_1$ and $\theta_2$, we obtain \begin{align*} \frac{\partial}{\partial \theta_1} \log L(\theta_1,\theta_2;x_1,x_2) &= x_1 - \theta_1, \\ \frac{\partial}{\partial \theta_2} \log L(\theta_1,\theta_2;x_1,x_2) &= x_2 - \theta_2. \end{align*} Setting both derivatives equal to zero gives the unrestricted maximizer $(x_1, x_2)$. Under the restriction $\theta_1 \leq \theta_2$ there are two cases. If $x_1 \leq x_2$, the unrestricted maximizer already satisfies the constraint, so the restricted MLE is $(\hat{\theta}_1, \hat{\theta}_2) = (x_1, x_2)$. If $x_1 > x_2$, the constraint binds and the maximum lies on the boundary $\theta_1 = \theta_2 = \theta$; maximizing $-\frac{(x_1-\theta)^2 + (x_2-\theta)^2}{2}$ over $\theta$ gives $\hat{\theta}_1 = \hat{\theta}_2 = \frac{x_1 + x_2}{2}$.
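The two cases above translate directly into code. Here is a minimal Python sketch (my own illustration, not part of the assignment; the helper name restricted_mle_normal is hypothetical) implementing the restricted MLE for $n=2$.

```python
# Restricted MLE for two N(theta_i, 1) means with theta_1 <= theta_2.
def restricted_mle_normal(x1: float, x2: float) -> tuple[float, float]:
    if x1 <= x2:
        # Unrestricted maximizer (x1, x2) already satisfies the constraint.
        return (x1, x2)
    # Constraint binds: maximum is on the boundary theta_1 = theta_2,
    # where the pooled mean maximizes the log-likelihood.
    pooled = (x1 + x2) / 2
    return (pooled, pooled)

print(restricted_mle_normal(1.0, 3.0))  # (1.0, 3.0): constraint slack
print(restricted_mle_normal(3.0, 1.0))  # (2.0, 2.0): pooled on boundary
```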

Problem 3.

(c). Repeat (a) and (b) when $P_\theta$ is the Laplace distribution with density $$ f(x \mid \theta)=\frac{1}{2} \exp \{-|x-\theta|\},-\infty<x<\infty . $$

Proof.

(c) For $P_{\theta}$ the Laplace distribution, the likelihood function is given by \begin{align*} L(\theta_1,\ldots,\theta_n;x_1,\ldots,x_n) &= \prod_{i=1}^{n} \frac{1}{2} \exp\left(-|x_i-\theta_i|\right) \\ &= \frac{1}{2^n} \exp\left(-\sum_{i=1}^{n}|x_i-\theta_i|\right). \end{align*} Taking the logarithm of the likelihood function, we obtain \begin{align*} \log L(\theta_1,\ldots,\theta_n;x_1,\ldots,x_n) &= -n\log 2 - \sum_{i=1}^{n}|x_i-\theta_i|. \end{align*} Differentiating the log-likelihood function with respect to $\theta_i$, we obtain \begin{align*} \frac{\partial}{\partial \theta_i} \log L(\theta_1,\ldots,\theta_n;x_1,\ldots,x_n) &= \begin{cases} 1, & \theta_i < x_i, \\ -1, & \theta_i > x_i, \\ \text{undefined}, & \theta_i = x_i. \end{cases} \end{align*} The log-likelihood is therefore increasing in $\theta_i$ to the left of $x_i$ and decreasing to the right, so each term $-|x_i-\theta_i|$ is maximized at $\hat{\theta}_i = x_i$: as in (a), the unrestricted MLE is $(x_1,\ldots,x_n)$. For the analogue of (b), with $n=2$ and the restriction $\theta_1 \leq \theta_2$: if $x_1 \leq x_2$, the restricted MLE is again $(x_1,x_2)$; if $x_1 > x_2$, the maximum lies on the boundary $\theta_1=\theta_2=\theta$, where $|x_1-\theta|+|x_2-\theta|$ equals the constant $x_1-x_2$ for every $\theta \in [x_2,x_1]$, so any such $\theta$ (for example the midpoint $\frac{x_1+x_2}{2}$, a median of $x_1,x_2$) gives a restricted MLE. Unlike the normal case, the estimate is not unique.
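To illustrate the non-uniqueness in the Laplace case, here is a small numerical check (my own sketch; the data values are arbitrary) showing that the penalty $|x_1-\theta|+|x_2-\theta|$ is flat on $[x_2, x_1]$ when $x_1 > x_2$.

```python
# Laplace case, n = 2, x1 > x2: on the boundary theta_1 = theta_2 = t,
# the penalty |x1 - t| + |x2 - t| equals the constant x1 - x2 for every
# t in [x2, x1], so every such t is a restricted MLE.
import numpy as np

x1, x2 = 3.0, 1.0  # arbitrary data with x1 > x2
ts = np.linspace(x2, x1, 5)
penalty = np.abs(x1 - ts) + np.abs(x2 - ts)
print(penalty)  # [2. 2. 2. 2. 2.], i.e. flat and equal to x1 - x2
```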

This is a successful 2023 case of assignment help for University of Bristol Mathematics for Economics EFIM10023.