Newton's Method for a step size to move in the direction of the gradient
I am reading an article that explains how Newton's method can give an ideal step size to move in the direction of the gradient.

I do not understand what $\epsilon$ is in the following part of the article, and I am confused about what $g$ is and what this math expression represents. Can someone help me understand it? Thank you!

Tags: calculus, gradient-descent, hessian-matrix
You may also be interested in Steffensen's method, another Newton variant whose step size shrinks as we approach the root.
– J.G., Dec 24 '18 at 23:37
If you approximate the cost near $x$ by $f(x+h) \approx \langle g, h \rangle + \frac{1}{2} \langle h, H h \rangle$ (where $g$ is the gradient of $f$ at $x$ and $H$ is the Hessian), and you consider a step in the direction $-g$, then the step size that extremises the approximate cost is given by the above formula. (That is, substitute $h = -\lambda g$ into the approximation and minimise over $\lambda$.)
– copper.hat, Dec 25 '18 at 1:40
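The step size copper.hat describes can be checked numerically. Substituting $h = -\epsilon g$ into the quadratic model and minimising over $\epsilon$ gives $\epsilon^* = \frac{g^\top g}{g^\top H g}$. Below is a minimal sketch in Python; the matrix `A`, the vector `b`, and the starting point `x` are hypothetical values chosen purely for illustration.

```python
import numpy as np

# Quadratic model sketch: f(x + h) ~ f(x) + <g, h> + 1/2 <h, H h>.
# With h = -eps * g, minimising over eps gives eps* = (g^T g) / (g^T H g).

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # symmetric positive definite, so it can serve as a Hessian
b = np.array([1.0, 1.0])

def f(x):
    """Hypothetical quadratic cost f(x) = 1/2 x^T A x - b^T x."""
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    """Gradient of f: g = A x - b."""
    return A @ x - b

x = np.array([2.0, -1.0])    # current iterate (arbitrary)
g = grad(x)
H = A                        # the Hessian of a quadratic cost is constant

eps = (g @ g) / (g @ H @ g)  # optimal step size along the direction -g
x_new = x - eps * g

print("step size:", eps)
print("cost before:", f(x), "cost after:", f(x_new))
```

On a genuinely quadratic cost this $\epsilon^*$ is the exact minimiser along $-g$ (so the new gradient is orthogonal to the old search direction); for a general cost it is only as good as the local quadratic approximation.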
asked Dec 24 '18 at 23:29 by YohanRoth