Gradient descent: Difference between revisions
(Added Go)
(julia example)
The minimum is at x[0] = 0.10764302056464771, x[1] = -1.223351901171944
</pre>
=={{header|Julia}}==
<lang julia>using Optim, Base.MathConstants

f(x) = (x[1] - 1) * (x[1] - 1) * e^(-x[2]^2) + x[2] * (x[2] + 2) * e^(-2 * x[1]^2)

println(optimize(f, [0.1, -1.0], GradientDescent()))
</lang><pre>
Results of Optimization Algorithm
 * Algorithm: Gradient Descent
 * Starting Point: [0.1,-1.0]
 * Minimizer: [0.107626844383003,-1.2232596628723371]
 * Minimum: -7.500634e-01
 * Iterations: 14
 * Convergence: true
   * |x - x'| ≤ 0.0e+00: false
     |x - x'| = 2.97e-09
   * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: true
     |f(x) - f(x')| = 0.00e+00 |f(x)|
   * |g(x)| ≤ 1.0e-08: true
     |g(x)| = 2.54e-09
   * Stopped by an increasing objective: false
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 35
 * Gradient Calls: 35
</pre>
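For comparison, the same objective can be minimized without an optimization library by iterating x ← x − lr·∇f(x) directly. The sketch below (in Python, with a central-difference numerical gradient) is illustrative only: the step size, tolerance, and helper names are assumptions, not anything Optim.jl uses internally.

```python
import math

# Same objective as the Julia example:
# f(x, y) = (x-1)^2 * e^(-y^2) + y*(y+2) * e^(-2x^2)
def f(x):
    return ((x[0] - 1) ** 2 * math.exp(-x[1] ** 2)
            + x[1] * (x[1] + 2) * math.exp(-2 * x[0] ** 2))

def num_grad(fun, x, h=1e-7):
    """Central-difference approximation of the gradient of fun at x."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((fun(xp) - fun(xm)) / (2 * h))
    return g

def gradient_descent(fun, x, lr=0.1, tol=1e-10, max_iter=10_000):
    """Fixed-step descent; stops when the objective stops improving."""
    fx = fun(x)
    for _ in range(max_iter):
        g = num_grad(fun, x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
        fx_new = fun(x)
        if abs(fx_new - fx) < tol:
            break
        fx = fx_new
    return x

xmin = gradient_descent(f, [0.1, -1.0])
print("minimizer:", xmin)
print("minimum:  ", f(xmin))
```

Starting from the same point [0.1, -1.0], this plain fixed-step iteration should land close to the minimizer and minimum that Optim's GradientDescent reports above, just in more iterations, since it does no line search.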
=={{header|TypeScript}}==