The sum of two vectors is the vector whose components are the sums of the corresponding components [closed]
Consider geometric vectors, for example, where addition is defined by the parallelogram law. How can I prove that the sum of two vectors is the vector whose components, with respect to some basis, are the sums of the components of the individual vectors? More generally, how can I do the same for any vector space whose addition is consistent with the definition of a vector space but has a non-obvious formulation, such as the parallelogram law?
algebra-precalculus euclidean-geometry
asked Dec 21 '18 at 11:42, edited Dec 21 '18 at 12:15 – Akhil
closed as off-topic by Kavi Rama Murthy, Saad, Adrian Keister, Lord_Farin, José Carlos Santos Dec 21 '18 at 16:41
This question appears to be off-topic. The users who voted to close gave this specific reason:
- "This question is missing context or other details: Please provide additional context, which ideally explains why the question is relevant to you and our community. Some forms of context include: background and motivation, relevant definitions, source, possible strategies, your current progress, why the question is interesting or important, etc." – Saad, Adrian Keister, Lord_Farin, José Carlos Santos
If this question can be reworded to fit the rules in the help center, please edit the question.
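To make the question's setup concrete, here is a small numerical sketch (an editorial addition, not part of the question): it constructs the parallelogram-law sum of two plane vectors geometrically, as the intersection of the two translated sides, and checks that it agrees with the component-wise sum. The helper name `parallelogram_sum` is hypothetical.

```python
import numpy as np

def parallelogram_sum(v, w):
    # The parallelogram on 0, v, w has its fourth vertex where the line
    # through v parallel to w meets the line through w parallel to v:
    # solve v + s*w = w + t*v for (s, t).
    A = np.column_stack([w, -v])          # needs v, w linearly independent
    s, _t = np.linalg.solve(A, w - v)
    return v + s * w

v = np.array([3.0, 4.0])
w = np.array([-1.0, 2.0])
# The geometric construction agrees with adding components.
assert np.allclose(parallelogram_sum(v, w), v + w)
```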
How does the parallelogram rule define the sum of vectors? Surely you need to prove that adding two vectors forms a parallelogram to prove that addition of vectors (defined as “translate the second vector until it starts where the first ends”) is commutative? – Dan Robertson, Dec 21 '18 at 11:49
Let us say that the sum of two vectors is given by the diagonal of the parallelogram formed by the two vectors. Then this definition satisfies all the vector space axioms. But how can you guarantee that the sum of two vectors is the vector whose components equal the sums of the components with respect to some basis? – Akhil, Dec 21 '18 at 12:26
2 Answers
Let $v,\,w$ be vectors, and write $v=\sum_i (v\cdot e_i)e_i$ etc. for orthonormal basis elements $e_i$, so $$v+w=\sum_i ((v+w)\cdot e_i)e_i=\sum_i (v\cdot e_i+w\cdot e_i)e_i=\sum_i (v\cdot e_i)e_i+\sum_i (w\cdot e_i)e_i.$$ The $i$th components of the first and last expressions are the $e_i$ coefficients, respectively $(v+w)_i$ and $v_i+w_i$.

Edit: we can do without dot products as long as the components $v_i$ satisfy $v=\sum_i v_i e_i$ for linearly independent basis elements $e_i$, so that the $v_i$ in that sum are unique. Then $$\sum_i (v+w)_ie_i=v+w=\sum_i v_ie_i+\sum_i w_i e_i=\sum_i (v_i+w_i)e_i\implies (v+w)_i=v_i+w_i.$$
answered Dec 21 '18 at 11:50, edited Dec 21 '18 at 14:11 – J.G.
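As a quick numerical illustration of the first display above (an editorial sketch, not part of the answer), one can check that dot-product components against an orthonormal basis add component-wise:

```python
import numpy as np

# An orthonormal basis of R^2 (a rotation of the standard basis).
e = [np.array([0.6, 0.8]), np.array([-0.8, 0.6])]

def components(x):
    """Components x_i = x . e_i with respect to the orthonormal basis e."""
    return np.array([x @ ei for ei in e])

v = np.array([3.0, 4.0])
w = np.array([-1.0, 2.0])

# (v + w)_i equals v_i + w_i for each i, as in the answer's first display.
assert np.allclose(components(v + w), components(v) + components(w))
```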
Can you do the same for an arbitrary definition of addition on any vector space, given that it is consistent with the vector space axioms? – Akhil, Dec 21 '18 at 12:22

@Akhil If you're asking me to prove components add, the components will have to exist. If they do, we're in the above situation. Just about the only generalisation we're allowed is to change the identity from $\sum_i e_i e_i^T$ to $\sum_{ij} g_{ij} e_i e_j^T$, which just means something other than an orthonormal basis has been chosen. The proof that it still works is then an exercise. – J.G., Dec 21 '18 at 12:48

Is there a more fundamental proof that doesn't involve the dot product? – Akhil, Dec 21 '18 at 14:00

@Akhil See my edit. – J.G., Dec 21 '18 at 14:11

But the meaning of addition changes in the step $v+w=\sum_i v_ie_i+\sum_i w_ie_i=\sum_i (v_i+w_i)e_i$: it changes from the one we defined to the one that adds the components, whose validity is what I'm asking you to prove. – Akhil, Dec 21 '18 at 14:26
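J.G.'s Gram-matrix remark in the comments can also be checked numerically. The following is an editorial sketch (not code from the thread): with basis vectors as the columns of $B$ and Gram matrix $G=B^TB$, the components of $x$ are $G^{-1}B^Tx$, and they still add component-wise because the map is linear.

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 2.0]])   # columns: a deliberately non-orthonormal basis
G = B.T @ B                  # Gram matrix, entries g_ij = e_i . e_j

def components(x):
    # Solve G c = B^T x, i.e. c = G^{-1} B^T x.
    return np.linalg.solve(G, B.T @ x)

v, w = np.array([3.0, 4.0]), np.array([-1.0, 2.0])
assert np.allclose(components(v + w), components(v) + components(w))
```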
What does the vector $(1,2)$ mean? It means the sum of $1$ times the first basis vector (say $i$) and $2$ times the second (say $j$). So, for example, you want to prove that
$$(a,b)+(c,d)= (a i + b j) + (c i + d j) = (a + c) i + (b + d) j = (a+c, b+d).$$
You therefore need to prove that vector addition is commutative and associative, and that scalar multiplication distributes over (vector) addition. Then everything should fall out.
answered Dec 21 '18 at 11:54 – Dan Robertson
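The regrouping in the display above uses only commutativity, associativity, and distributivity. Here is a small symbolic check (an editorial addition, treating $i$ and $j$ as commuting symbols, so it verifies exactly the scalar algebra the answer says must be justified):

```python
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
i, j = sp.symbols('i j')   # stand-ins for the basis vectors

lhs = (a*i + b*j) + (c*i + d*j)
rhs = (a + c)*i + (b + d)*j
# The difference expands to zero, confirming the rearrangement.
assert sp.expand(lhs - rhs) == 0
```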