Suppose that $\alpha$ is a fixed point of a smooth function $g$, i.e. $g(\alpha) = \alpha$. The fixed-point iteration $x_{k+1} = g(x_k)$ is guaranteed to converge locally to $\alpha$ if

- $g'(\alpha) = 1$
- $|g'(\alpha)| < 1$
- $|g'(\alpha)| \leq 1$
- none of the above

Answer: $|g'(\alpha)| < 1$. For example, consider a function with $g'(\alpha) = 1$, such as $g(x) = x + x^3$ at $\alpha = 0$: for any $x_0 \neq 0$ the iterates move away from $\alpha$, so the weaker condition $|g'(\alpha)| \leq 1$ is not sufficient.
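The condition can be checked numerically; a minimal sketch (the functions $g$ below are illustrative choices, not from the quiz):

```python
import math

def fixed_point(g, x0, steps=50):
    """Iterate x_{k+1} = g(x_k) and return the final iterate."""
    x = x0
    for _ in range(steps):
        x = g(x)
    return x

# g(x) = cos x has a fixed point alpha ~ 0.739 with |g'(alpha)| = sin(alpha) < 1:
# the iteration converges from nearby starting points.
x = fixed_point(math.cos, 1.0)
print(abs(x - math.cos(x)))  # tiny residual: the iteration has converged

# g(x) = x + x**3 has the fixed point alpha = 0 with g'(0) = 1:
# the iterates move away from 0, illustrating why |g'(alpha)| <= 1 is not enough.
print(fixed_point(lambda t: t + t**3, 0.5, steps=6))
```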
Suppose that a sequence $y_k$ converges to $0$ with $\lim_{k \to \infty} |y_{k+1}| / |y_k| = 0$. Then

- $y_k$ converges linearly to $0$
- $y_k$ converges superlinearly to $0$

Answer: Linear convergence means $|y_{k+1}| / |y_k| \to c$ for some constant $0 < c < 1$; when the ratio of successive errors tends to $0$, the convergence is superlinear.
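The two rates are easy to tell apart by printing successive-error ratios; a small sketch with two illustrative sequences:

```python
# y_k = 2**-k      converges linearly to 0: the ratio is the constant 1/2.
# y_k = 2**-(2**k) converges superlinearly (quadratically): the ratio tends to 0.

linear = [2.0 ** -k for k in range(1, 8)]
superlinear = [2.0 ** -(2 ** k) for k in range(1, 8)]

linear_ratios = [b / a for a, b in zip(linear, linear[1:])]
super_ratios = [b / a for a, b in zip(superlinear, superlinear[1:])]

print(linear_ratios)  # every ratio equals 0.5
print(super_ratios)   # ratios shrink rapidly toward 0
```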
Consider a scalar equation $f(x) = 0$ with $f$ smooth. Which of the following methods has the highest local order of convergence?

- bisection method
- Newton's method
- secant method
- none of the above
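Bisection converges linearly (the bracket is halved each step), the secant method superlinearly (order about $1.618$), and Newton's method quadratically. The gap in iteration counts can be seen in a minimal sketch (the test equation $x^2 - 2 = 0$ is an illustrative choice):

```python
import math

def bisection(f, a, b, tol=1e-12):
    """Halve the bracket [a, b] until it is shorter than tol; count iterations."""
    n = 0
    while b - a > tol:
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0:
            b = m
        else:
            a = m
        n += 1
    return 0.5 * (a + b), n

def newton(f, df, x, tol=1e-12):
    """Newton iteration x <- x - f(x)/f'(x) until the residual is below tol."""
    n = 0
    while abs(f(x)) > tol:
        x = x - f(x) / df(x)
        n += 1
    return x, n

f = lambda x: x ** 2 - 2   # root sqrt(2)
df = lambda x: 2 * x

r_bis, n_bis = bisection(f, 1.0, 2.0)
r_new, n_new = newton(f, df, 1.5)
print(n_bis, n_new)  # bisection needs ~40 halvings; Newton only a handful of steps
```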
Consider a continuous function $f : \mathbb{R} \to \mathbb{R}$. Which of the following statements is true?

- if $f$ is coercive on $\mathbb{R}$, then $f$ has a global minimum in $\mathbb{R}$
- if $f$ has a unique global minimum in $\mathbb{R}$, then $f$ is coercive on $\mathbb{R}$
- none of the above
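Only the first implication holds (a continuous coercive function attains a global minimum). A standard counterexample to the converse, written out (illustrative, not from the source):

```latex
% Continuous, with the unique global minimizer x = 0, yet not coercive:
% f(x) tends to 0, not to +infinity, as |x| -> infinity.
f(x) = -e^{-x^{2}}, \qquad \min_{x \in \mathbb{R}} f(x) = f(0) = -1,
\qquad \lim_{|x| \to \infty} f(x) = 0 \neq +\infty .
```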
The function
- coercive
- convex
- strictly convex
- none of the above
The Hessian of the function
- positive definite
- negative definite
- indefinite
- none of the above
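Since the function whose Hessian is asked about did not survive extraction, here is a generic way to classify a symmetric $2 \times 2$ Hessian by the signs of its eigenvalues, using only the determinant (product of eigenvalues) and trace (sum of eigenvalues); the example functions are illustrative:

```python
# Classify a symmetric 2x2 Hessian H = [[a, b], [b, c]]:
#   det > 0 and trace > 0 -> both eigenvalues positive -> positive definite
#   det > 0 and trace < 0 -> both eigenvalues negative -> negative definite
#   det < 0               -> eigenvalues of opposite sign -> indefinite

def classify_hessian_2x2(a, b, c):
    det = a * c - b * b
    trace = a + c
    if det > 0 and trace > 0:
        return "positive definite"
    if det > 0 and trace < 0:
        return "negative definite"
    if det < 0:
        return "indefinite"
    return "semidefinite / degenerate"

# Hessian of f(x, y) = x**2 + y**2 is [[2, 0], [0, 2]]:
print(classify_hessian_2x2(2, 0, 2))   # positive definite
# Hessian of f(x, y) = x*y is [[0, 1], [1, 0]]:
print(classify_hessian_2x2(0, 1, 0))   # indefinite
```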
To optimize a function

- the function $f$
- the gradient $\nabla f$
- the Hessian $H_f$
Recall the Lagrangian function. How many iterations does Newton's method need to reach its stationary point?

- one
- depends on $\|A\|_2$
- depends on $\|A\|_2$ and $\|y\|_2$

Answer: one. Newton's method solves any quadratic optimization problem exactly after one iteration.
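The one-iteration claim can be verified directly: for a quadratic $f(x) = \tfrac{1}{2} x^T A x - b^T x$ the Newton step from any $x_0$ is $x_0 - A^{-1}(A x_0 - b) = A^{-1} b$, the exact minimizer. A minimal sketch (the matrix $A$ and vector $b$ are arbitrary illustrative data):

```python
# One Newton step on f(x) = 0.5 x^T A x - b^T x, with A symmetric positive definite.
A = [[4.0, 1.0],
     [1.0, 3.0]]
b = [1.0, 2.0]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def solve2(M, r):
    """Solve the 2x2 system M s = r by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(r[0] * M[1][1] - M[0][1] * r[1]) / det,
            (M[0][0] * r[1] - r[0] * M[1][0]) / det]

x0 = [10.0, -7.0]                                     # arbitrary starting point
grad = [g - bi for g, bi in zip(matvec(A, x0), b)]    # grad f(x0) = A x0 - b
step = solve2(A, grad)                                # Newton step: solve A s = grad
x1 = [xi - si for xi, si in zip(x0, step)]

residual = [g - bi for g, bi in zip(matvec(A, x1), b)]
print(residual)  # gradient is (numerically) zero: x1 minimizes f after one iteration
```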