In $\mathbb{R}^n$, is the dot product the only inner product?














Just what the title says. I'm reading, from various resources, that the inner product is a generalization of the dot product. However, the only example of an inner product I can find is still the dot product.

Does another "inner product" exist on $\mathbb{R}^n$?










vector-spaces inner-product-space

asked Dec 14 '18 at 11:02 – blue_note · edited Dec 14 '18 at 15:19 – Andrews



















  • This might be useful. – StackTD, Dec 14 '18 at 11:07










  • @StackTD: thanks!!! Exactly what I was looking for – blue_note, Dec 14 '18 at 11:15






  • And related: Is there a classification of the inner products on $\mathbb{R}^n$ up to isomorphism? – StackTD, Dec 14 '18 at 11:37










  • @StackTD: thanks again – blue_note, Dec 14 '18 at 11:38










  • As some other answers point out, there are infinitely many inner products (i.e., symmetric, positive-definite bilinear forms) on $\Bbb R^n$. But for any of them one can choose a basis of $\Bbb R^n$ with respect to which the bilinear form is the standard one: $({\bf x}, {\bf y}) = x_1 y_1 + \cdots + x_n y_n$. So up to isomorphism there is only one inner product on $\Bbb R^n$. (This is a special case of Sylvester's Law of Inertia.) – Travis, Dec 25 '18 at 0:51
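To see Travis's point concretely: if $A = R^\top R$ is a Cholesky factorization of a symmetric positive definite matrix, then in the basis formed by the columns of $R^{-1}$ the form $\langle x, y\rangle = x^\top A y$ reduces to the standard dot product. A minimal NumPy sketch of this change of basis (the matrix `A` below is an arbitrary illustrative choice):

```python
import numpy as np

# An arbitrary symmetric positive definite matrix (illustrative choice).
A = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 4.0]])

# np.linalg.cholesky returns lower-triangular L with A = L @ L.T,
# so R = L.T gives A = R.T @ R.
R = np.linalg.cholesky(A).T
R_inv = np.linalg.inv(R)  # columns of R_inv form the new basis

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)

# <., .>_A of two vectors with coordinates x, y in the new basis
# equals the standard dot product of the coordinate vectors.
assert np.isclose((R_inv @ x) @ A @ (R_inv @ y), x @ y)
```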
















3 Answers







Yes, and they are all of the form
$$
\langle x, y\rangle = x^T A y,
$$

where $x^T$ is the transpose of $x$ and $A$ is an $n\times n$ symmetric positive definite matrix.

Indeed, let $\langle x, y\rangle$ be a generic inner product on $\mathbb R^n$. Then for every $y$ the function
$$
f_y : x \mapsto \langle x, y\rangle
$$

is linear from $\mathbb R^n$ to $\mathbb R$, so there exists a vector $\alpha(y)\in\mathbb R^n$ such that
$$
\langle x, y\rangle = \alpha(y)^T x.
$$

Observe that
$$
\langle x, ay+by'\rangle = a\langle x, y\rangle + b\langle x, y'\rangle \Rightarrow \alpha(ay+by') = a\alpha(y) + b\alpha(y'),
$$

so $\alpha$ is a linear operator from $\mathbb R^n$ to itself, and there exists an $n\times n$ matrix $A$ such that
$$
\alpha(y) = A y,
$$

hence
$$
\alpha(y)^T x = y^T A^T x.
$$

Now recall that $\langle x, y\rangle = \langle y, x\rangle$; from this one easily proves that $A^T = A$, so $A$ must be symmetric.

Why must $A$ be positive definite? Because $\langle x, x\rangle \geq 0$, with equality if and only if $x = 0$. Applying this to the formula above yields exactly the definition of a positive definite matrix.
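For a concrete inner product on $\mathbb R^2$ that is not the dot product, here is a minimal NumPy sketch (the matrix `A` below is an arbitrarily chosen symmetric positive definite matrix):

```python
import numpy as np

# Arbitrary symmetric positive definite matrix (eigenvalues 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def inner(x, y):
    # <x, y> = x^T A y
    return x @ A @ y

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

print(inner(e1, e2))                   # 1.0: e1, e2 are not orthogonal here
print(inner(e1, e1), inner(e2, e2))    # 2.0 2.0: positive, as required
print(inner(e1, e2) == inner(e2, e1))  # True: symmetric
```

Under this inner product the standard basis vectors are no longer orthogonal, which is exactly what distinguishes it from the dot product.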






answered Dec 14 '18 at 11:24 – P De Donato (edited Dec 25 '18 at 0:33)
























  • We need $A$ to be positive definite also. – littleO, Dec 14 '18 at 11:26










  • Oh yes, I forgot it! – P De Donato, Dec 14 '18 at 11:28










  • Note that in some contexts, one only requires that an inner product be nondegenerate (and not necessarily positive definite). – Travis, Dec 25 '18 at 0:52










  • If it's positive semidefinite it becomes a semi-inner product and not an inner product. – P De Donato, Dec 28 '18 at 11:30




















The dot product on $\mathcal{R}^n$ is defined as follows:

$$(a,b) = a^i b^j (e_i,e_j) = a^i b^j \delta_{ij} = a^i b^i,$$

where $a,b \in \mathcal{R}^n$ and $e_i,e_j$ are standard basis vectors. I used the Einstein summation convention here.

In general we can express $a,b$ in a different basis, i.e. $a = \tilde{a}^i \tilde{e}_i$ and $b = \tilde{b}^i \tilde{e}_i$, where the $\tilde{e}_i$ are now not the standard basis but an arbitrary basis of $\mathcal{R}^n$, still assuming $(\cdot,\cdot)$ is positive-definite. This then gives:

$$(a,b) = \tilde{a}^i \tilde{b}^j (\tilde{e}_i,\tilde{e}_j) = \tilde{a}^i \tilde{b}^j A_{ij} \equiv a^T A b.$$

Note that $A$ now is not the identity matrix, as it is for the standard inner product.
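As a sketch of this construction, assuming an arbitrarily chosen (non-orthonormal) basis of $\mathcal{R}^2$, the Gram matrix $A_{ij} = (\tilde{e}_i, \tilde{e}_j)$ can be computed directly:

```python
import numpy as np

# Columns of E are an arbitrary non-orthonormal basis of R^2.
E = np.array([[1.0, 1.0],
              [0.0, 2.0]])

# Gram matrix A_ij = (e~_i, e~_j) under the standard dot product.
A = E.T @ E
print(A)  # [[1. 1.], [1. 5.]]: symmetric positive definite, not the identity

# For component vectors a~, b~ in this basis, (a, b) = a~^T A b~:
a_t, b_t = np.array([1.0, 1.0]), np.array([2.0, -1.0])
assert np.isclose(a_t @ A @ b_t, (E @ a_t) @ (E @ b_t))
```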






answered Dec 14 '18 at 11:33 – Dani (edited Dec 25 '18 at 14:36)

































The metric tensor allows the vector norm to remain constant under a change of basis vectors, and is an example of an inner product (page 15). In the simple setting of basis vectors constant in direction and magnitude from point to point in $\mathbb R^2$, here is an example:

Changing basis vectors from the Euclidean orthonormal basis $\small\left\{\vec e_1=\begin{bmatrix}1\\0 \end{bmatrix},\vec e_2=\begin{bmatrix}0\\1 \end{bmatrix}\right\}$ to

$$ \left\{\vec u_1=\color{red}2\vec e_1 + \color{red}1\vec e_2,\quad \vec u_2=\color{blue}{-\frac 1 2 }\vec e_1+\color{blue}{\frac 1 4} \vec e_2\right\}\tag 1$$

would result in a different norm for the vector $\vec v=\begin{bmatrix} 1\\1\end{bmatrix}_\text{e basis}$, whose norm is $\Vert \vec v \Vert^2=v_1^2 + v_2^2 = 2$, if its squared components were naively summed in the new basis.

In this regard, since vector components transform contravariantly, the change to the new coordinate system is given by the backward transformation matrix

$$B=\begin{bmatrix} \frac 1 4 & \frac 1 2\\-1&2\end{bmatrix},$$

i.e. the inverse of the forward transformation for the basis vectors as defined in $(1)$, which in matrix form corresponds to

$$F=\begin{bmatrix} \color{red} 2 & \color{blue}{-\frac 1 2}\\\color{red}1& \color{blue}{\frac 1 4}\end{bmatrix}.$$

Hence, the same vector $\vec v$ expressed in the new basis vectors is

$$\vec v_{\text{u basis}}=\begin{bmatrix} \frac 1 4 & \frac 1 2\\-1&2\end{bmatrix}\begin{bmatrix} 1\\1\end{bmatrix}=\begin{bmatrix} \frac 3 4\\1\end{bmatrix}.$$

The norm then entails an inner product with the new metric tensor. In the orthonormal Euclidean basis the metric tensor is simply the identity matrix. Now the metric tensor is a $(0,2)\text{-tensor}$ and transforms covariantly:

$$g_{\text{u basis}}=F^\top I\, F= \begin{bmatrix} 2 & 1\\-\frac 1 2& \frac 1 4\end{bmatrix}\begin{bmatrix} 1 & 0\\0& 1\end{bmatrix}\begin{bmatrix} 2 & -\frac 1 2\\1& \frac 1 4\end{bmatrix}=\begin{bmatrix} 5 & -\frac 3 4\\- \frac 3 4& \frac{5}{16}\end{bmatrix}.$$

The actual multiplication of basis vectors to obtain the metric tensor is explained in this presentation by @eigenchris. This metric tensor indeed renders the right norm of $\vec v$:

$$\begin{bmatrix}\frac 3 4 & 1\end{bmatrix}\begin{bmatrix} 5 & -\frac 3 4\\- \frac 3 4& \frac{5}{16}\end{bmatrix}\begin{bmatrix}\frac 3 4 \\ 1\end{bmatrix}=2,$$

following the operations here.
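The arithmetic above can be verified with a short NumPy sketch of the same numbers:

```python
import numpy as np

F = np.array([[2.0, -0.5],   # forward map: columns are u_1, u_2 in the e basis
              [1.0,  0.25]])
B = np.linalg.inv(F)         # backward map for the contravariant components

v_e = np.array([1.0, 1.0])   # v in the e basis
v_u = B @ v_e                # components in the u basis: [0.75, 1.0]

g_u = F.T @ F                # metric tensor in the u basis (F^T I F)
print(g_u)                   # [[ 5.      -0.75  ]
                             #  [-0.75     0.3125]]
print(v_u @ g_u @ v_u)       # 2.0, matching ||v||^2 = 1^2 + 1^2
```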






answered Dec 15 '18 at 22:57 – Antoni Parellada (edited Dec 16 '18 at 1:49)























