# Convex Optimization Assignment Help (convex optimization)

## Convex Optimization Assignment Writing (convex optimization)

### Convex Sets

• Level sets
• Concave functions

## About Convex Optimization

Unconstrained convex optimization problems can be solved efficiently with gradient descent (a special case of steepest descent) or Newton's method, combined with a line search to select an appropriate step size. Both methods can be proven to converge; Newton's method in particular converges quadratically once the iterates are near the optimum.
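As a minimal sketch of the approach described above, the following example (the quadratic objective, matrix `A`, and tolerance values are illustrative assumptions, not from the original text) applies gradient descent with a backtracking line search to an unconstrained convex quadratic:

```python
import numpy as np

# Illustrative convex objective: f(x) = 1/2 x^T A x - b^T x,
# with A symmetric positive definite, so the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    return A @ x - b

def gradient_descent(x, tol=1e-8, alpha=0.3, beta=0.5, max_iter=1000):
    """Gradient descent with backtracking line search (Armijo condition)."""
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is nearly zero
            break
        t = 1.0
        # Shrink the step until sufficient decrease is achieved
        while f(x - t * g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

x_star = gradient_descent(np.zeros(2))
print(x_star)  # approaches the solution of A x = b
```

Because the objective is convex, the line search guarantees monotone decrease of `f`, and the iterate converges to the unique minimizer.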

## Convex Optimization Homework

The primal optimal value can be expressed as
$$p^{\star}=\inf_{x} \sup_{\lambda \succeq 0} L(x, \lambda) .$$
By the definition of the dual function, we also have
$$d^{\star}=\sup_{\lambda \succeq 0} \inf_{x} L(x, \lambda) .$$
Thus, weak duality can be expressed as the inequality
$$\sup_{\lambda \succeq 0} \inf_{x} L(x, \lambda) \leq \inf_{x} \sup_{\lambda \succeq 0} L(x, \lambda),$$
and strong duality as the equality
$$\sup_{\lambda \succeq 0} \inf_{x} L(x, \lambda)=\inf_{x} \sup_{\lambda \succeq 0} L(x, \lambda) .$$
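As a concrete illustration of weak and strong duality (a standard textbook-style example, not taken from the original text), consider minimizing $x^2$ subject to $x \geq 1$:

$$\begin{aligned}
&\text{minimize } x^{2} \quad \text{subject to } 1 - x \leq 0, \\
&L(x,\lambda) = x^{2} + \lambda(1 - x), \\
&g(\lambda) = \inf_{x} L(x,\lambda) = \lambda - \frac{\lambda^{2}}{4} \quad \left(\text{attained at } x = \tfrac{\lambda}{2}\right), \\
&d^{\star} = \sup_{\lambda \geq 0} g(\lambda) = 1 \quad (\text{at } \lambda = 2), \qquad p^{\star} = \min_{x \geq 1} x^{2} = 1 .
\end{aligned}$$

Here $d^{\star} = p^{\star}$, so strong duality holds; this is expected because Slater's condition is satisfied (e.g. $x = 2$ is strictly feasible).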

## Applications of Convex Optimization

Ben-Haim and Elishakoff (1990) and Elishakoff et al. (1994) applied convex analysis to model uncertainty.