Linear independence
In my mind there is a conflict between the "intuitive" definition of linear dependence of vectors, i.e. that
$\vec{v_1}=k\vec{v_2},$
and the formal definition, which says that there must be at least one scalar (i.e. 1 is enough) that is non-zero in the linear combination
$\alpha_1\vec{v_1}+\alpha_2\vec{v_2}+\dots+\alpha_n\vec{v_n} = \vec{0}.$
For me that implies that the case where ALL the OTHER $\alpha_i$ are $0$ must also be taken into account, i.e. that the equation is
$0\vec{v_1}+0\vec{v_2}+\dots+\alpha_i\vec{v_i}+\dots+0\vec{v_n} = \vec{0},$
and therefore
$\alpha_i\vec{v_i} = \vec{0},$
so $\vec{v_i} = \vec{0}$, which is useless...? That is, for me the definition should say that there are at least 2 non-zero $\alpha_i$... but clearly only 1 is needed.
What is wrong in my reasoning? (I know, of course, that the definition is correct.)
Of course there are cases where more than 1 of the scalars is non-zero, in which case there is no problem; my problem is with the "at least one".
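For a concrete illustration of how the two definitions line up (a sketch with made-up vectors in $\mathbb{R}^2$): let $\vec{v_1}=(1,2)$ and $\vec{v_2}=(2,4)$, so that $\vec{v_2}=2\vec{v_1}$. Rearranged as a linear combination equal to the zero vector this reads
$$2\vec{v_1}-\vec{v_2}=\vec{0},$$
with $\alpha_1=2$ and $\alpha_2=-1$ both non-zero. In general, $\vec{v_1}=k\vec{v_2}$ with $k\neq 0$ gives the coefficients $1$ and $-k$, i.e. two non-zero coefficients, never just one.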
Tags: vector-spaces, definition

asked Dec 30 '18 at 15:54 by Machupicchu, edited Dec 30 '18 at 20:29
migrated from stats.stackexchange.com Dec 30 '18 at 16:18
This question came from our site for people interested in statistics, machine learning, data analysis, data mining, and data visualization.
– user3482749, Dec 30 '18 at 19:29: I have no idea where your intuitive definition came from, but it's just wrong. That's fundamentally your problem. Further, at least one of the $\alpha_i$ being non-zero does not in any way imply that all of the others are zero, and there's no reason to think that it would: it's not true, even for the simplest case of real numbers (take $\vec{v_1} = 1, \vec{v_2} = -1, \alpha_1 = \alpha_2 = 1$).
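Written out (same numbers as in the comment above, viewing $\mathbb{R}$ as a one-dimensional vector space), that example is
$$\alpha_1\vec{v_1}+\alpha_2\vec{v_2} = 1\cdot 1 + 1\cdot(-1) = 0,$$
so the combination vanishes even though both coefficients are non-zero.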
– Machupicchu, Dec 30 '18 at 20:24: Well, by the intuitive one I mean $\vec{u}=k\vec{v}$; I guess you agree with that one?
– Machupicchu, Dec 30 '18 at 20:26: I mean it implies that there is the situation where all the others are zero; otherwise one should say at least TWO, don't you think? From the logical point of view, "at least one" means that there can be one non-zero scalar, or more, but the case where only 1 is non-zero must be taken into account...
– user3482749, Dec 30 '18 at 20:37: No, I don't, because it's wrong, as demonstrated. And no, it shouldn't say at least 2, because the only difference between the two is the case in which one vector is the zero vector, and in that case "at least 2" gives the wrong answer.
– Machupicchu, Dec 30 '18 at 20:39: OK, I think I understand now. I was missing the case where there is only one vector.
1 Answer
It turns out that, unless one of the $v_i$'s is the null vector, it never happens that only one of the $\alpha_i$'s is $\neq 0$. In other words, if $v_1, v_2, \ldots, v_n \neq 0$ and if $v_1, v_2, \ldots, v_n$ are linearly dependent, then there are coefficients $\alpha_1, \alpha_2, \ldots, \alpha_n$, of which at least two are non-zero, such that $$\alpha_1v_1+\alpha_2v_2+\cdots+\alpha_nv_n=0.$$
On the other hand, asserting that if at least one of the $\alpha_i$'s is non-zero then all the others are zero is a non sequitur.
answered Dec 30 '18 at 16:23 – José Carlos Santos
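A short way to see the "at least two" claim (a sketch, using the same cancellation step as in the question): suppose $v_1, v_2, \ldots, v_n \neq 0$, that $\alpha_1v_1+\alpha_2v_2+\cdots+\alpha_nv_n=0$, and that some $\alpha_i \neq 0$. If $\alpha_i$ were the only non-zero coefficient, the sum would collapse to
$$\alpha_iv_i = 0 \quad\Longrightarrow\quad v_i = \alpha_i^{-1}\,0 = 0,$$
contradicting $v_i \neq 0$. Hence at least one other coefficient must also be non-zero.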
– Machupicchu, Dec 30 '18 at 20:23: Thanks, I think that answers my question. But I still don't understand why you say that "at least one is non-zero" doesn't imply that the others are zero. For me "at least one" includes the case where all but one are zero. Of course it also includes the cases where 2 are non-zero, etc.
– Machupicchu, Dec 30 '18 at 20:27: For me that means that the definition we read everywhere is at least ambiguous... why don't they say "at least two" instead of "at least one"?! E.g. here: en.wikipedia.org/wiki/Linear_independence
– José Carlos Santos, Dec 30 '18 at 20:34: You are right in general. But in the specific situation in which you have non-zero vectors $v_1,v_2,\ldots,v_n$, if there are scalars $\alpha_1,\alpha_2,\ldots,\alpha_n$ such that $\alpha_1v_1+\alpha_2v_2+\cdots+\alpha_nv_n=0$ and at least one of the scalars is non-zero, then there is necessarily another non-zero scalar. It's as if I tell you that I have $4$ integers whose sum is even and such that at least one of them is odd. Then there is no way that only one of them is odd, since, if that were the case, the sum of all $4$ of them would be odd, not even.
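A quick numerical check of that parity analogy (illustrative numbers, not from the comment): with exactly one odd term among four,
$$2+4+6+3 = 15,$$
the sum is odd, so an even sum with at least one odd term forces a second odd term.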
– José Carlos Santos, Dec 30 '18 at 20:36: You can't say "at least two", because that fails, for instance, in the case in which there is a single vector which happens to be the null vector.
– Machupicchu, Dec 30 '18 at 20:38: Oh, OK. So you mean that the case where only 1 $\alpha_i$ is $\neq 0$ is necessarily the case where there is only 1 vector and it is the $\vec{0}$ vector?