Newton's Method for a step size to move in the direction of the gradient

I am reading this article, which says that Newton's method can give us an ideal step size to move in the direction of the gradient.

I do not understand what $\epsilon$ is in the following part of the article:

[image from the article: the expression for the step size]

I am also confused about what $g$ is and what this expression represents. Can someone help me understand it?

Thank you!










Tags: calculus, gradient-descent, hessian-matrix






asked Dec 24 '18 at 23:29 by YohanRoth












  • You may also be interested in Steffensen's method, another Newton variant that has a shrinking step size as we approach the root. – J.G., Dec 24 '18 at 23:37










  • If you approximate the cost near $x$ by $f(x+h) \approx \langle g, h \rangle + \frac{1}{2} \langle h, H h \rangle$ ($g$ is the gradient of $f$, $H$ is the Hessian) and you consider a step in the direction $-g$, then the step size that extremises the approximate cost is given by the above formula. (That is, consider the approximation with parameter $h = -\lambda g$.) – copper.hat, Dec 25 '18 at 1:40
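
A short worked version of the calculation copper.hat describes, filling in the step that leads to the formula. This is an editorial sketch, not taken from the article; I write the step size as $\lambda$, which appears to be what the article calls $\epsilon$, with $g = \nabla f(x)$ and $H = \nabla^2 f(x)$. Substituting $h = -\lambda g$ into the second-order approximation gives

$$f(x - \lambda g) \;\approx\; f(x) - \lambda\, g^{\top} g + \tfrac{1}{2}\lambda^{2}\, g^{\top} H g .$$

Setting the derivative with respect to $\lambda$ to zero,

$$-g^{\top} g + \lambda\, g^{\top} H g = 0 \quad\Longrightarrow\quad \lambda^{*} = \frac{g^{\top} g}{g^{\top} H g},$$

which minimises the approximation provided $g^{\top} H g > 0$ (for instance, when $H$ is positive definite).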
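
In the spirit of J.G.'s comment, here is a minimal sketch of Steffensen's method in Python (an illustration, not from the article). It replaces the derivative in Newton's root-finding update with the divided difference $(f(x + f(x)) - f(x))/f(x)$, so the effective step shrinks as $f(x_n) \to 0$ near a root.

    def steffensen(f, x0, tol=1e-12, max_iter=100):
        """Steffensen's method for solving f(x) = 0."""
        x = x0
        for _ in range(max_iter):
            fx = f(x)
            if abs(fx) < tol:
                return x
            # Slope estimate using f(x) itself as the finite-difference increment.
            slope = (f(x + fx) - fx) / fx
            # Newton-style update with the estimated slope in place of f'(x).
            x = x - fx / slope
        return x

    # Example: approximate sqrt(2) as the root of x**2 - 2, starting from x0 = 1.
    print(steffensen(lambda x: x**2 - 2, 1.0))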


















  • $begingroup$
    You may also be interested in Steffensen's method, another Newton variant that has shrinking step size as we approach the root.
    $endgroup$
    – J.G.
    Dec 24 '18 at 23:37










  • $begingroup$
    If you approximate the cost near $x$ by $f(x+h) approx langle g, h rangle >+ { 1over 2} langle h, H h rangle$ ($g$ is the gradient of $f$, $H$ is the Hessian) and you consider a step in the direction $-g$ then the step size that extremises the approximate cost is given by the above formula. (That is, consider the approximation with parameter $h=-lambda g$.)
    $endgroup$
    – copper.hat
    Dec 25 '18 at 1:40















