# Prove that if $c_i \ne d_i$ for some $i$, then $v_1, v_2, \ldots, v_k$ are linearly dependent
**Vectors and spaces.** Let $v_1, v_2, \ldots, v_k$ be vectors in $\mathbb{R}^n$ such that
$$v = c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = d_1 v_1 + d_2 v_2 + \cdots + d_k v_k$$
for some scalars $c_1, c_2, \ldots, c_k, d_1, d_2, \ldots, d_k$. Prove that if $c_i \ne d_i$ for some $i \in \{1, 2, \ldots, k\}$, then $v_1, v_2, \ldots, v_k$ are linearly dependent.

*Answered 2021-01-18*
You haven't mentioned what $v$ is, so I'm going to ignore it.
We know that
$$\sum_{i=1}^{k} c_i v_i = \sum_{i=1}^{k} d_i v_i$$
for some scalars $c_i, d_i$ (I'm just writing what you wrote, but in a way that saves me some space). With a little rearranging, we thus see that
$$\sum_{i=1}^{k} (c_i - d_i)\, v_i = 0.$$
Now suppose that for some $j \in \{1, 2, \ldots, k\}$ we have $c_j \ne d_j$. Then $c_j - d_j \ne 0$. Depending on how your class defines a linearly dependent set of vectors, you might be done at this point.
By supposition, there is a solution to the equation
$$\sum_{i=1}^{k} a_i v_i = 0$$
such that not all of the $a_i$'s are $0$: namely, $a_1 = c_1 - d_1, \ldots, a_k = c_k - d_k$.
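To make this concrete, here is a small numeric sketch in plain Python, using a hypothetical example of my own in $\mathbb{R}^2$ (the vectors and coefficients below are not from the problem): since $v_3 = v_1 + v_2$, the same vector $v$ has two different coordinate representations, and the differences $a_i = c_i - d_i$ give a nontrivial solution of $\sum a_i v_i = 0$.

```python
# Hypothetical example in R^2: v3 = v1 + v2, so the set is dependent.
v1, v2, v3 = (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)

# Two different representations of the same vector v = (1, 1):
c = (1.0, 1.0, 0.0)   # v = 1*v1 + 1*v2 + 0*v3
d = (0.0, 0.0, 1.0)   # v = 0*v1 + 0*v2 + 1*v3

# a_i = c_i - d_i is a candidate nontrivial solution of sum a_i v_i = 0.
a = tuple(ci - di for ci, di in zip(c, d))

# Evaluate sum a_i v_i coordinate by coordinate.
combo = tuple(
    sum(ai * vi[dim] for ai, vi in zip(a, (v1, v2, v3)))
    for dim in range(2)
)
print(a)      # (1.0, 1.0, -1.0) -- not all zero
print(combo)  # (0.0, 0.0) -- the zero vector
```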
But let's say your class defines linearly dependent as meaning that at least one of the vectors is expressible as a linear combination of the others. Then we just move all of the terms except the $j$th one to the RHS. And here's where $c_j - d_j \ne 0$ comes in: we can divide by $c_j - d_j$.
Doing so we get
$$(c_j - d_j)\, v_j = -(c_1 - d_1)\, v_1 - \cdots - (c_{j-1} - d_{j-1})\, v_{j-1} - (c_{j+1} - d_{j+1})\, v_{j+1} - \cdots - (c_k - d_k)\, v_k$$
$$v_j = \frac{d_1 - c_1}{c_j - d_j}\, v_1 + \cdots + \frac{d_{j-1} - c_{j-1}}{c_j - d_j}\, v_{j-1} + \frac{d_{j+1} - c_{j+1}}{c_j - d_j}\, v_{j+1} + \cdots + \frac{d_k - c_k}{c_j - d_j}\, v_k$$

This exhibits $v_j$ as a linear combination of the other vectors, so $v_1, v_2, \ldots, v_k$ are linearly dependent.
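As a sanity check of that last formula, here is a short Python sketch with a hypothetical example of my own in $\mathbb{R}^2$ (the vectors and coefficients are not from the problem): with $v_3 = v_1 + v_2$ and two representations of the same vector, the formula recovers $v_3$ from $v_1$ and $v_2$.

```python
# Hypothetical example in R^2 where v3 = v1 + v2.
v1, v2, v3 = (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)
c = (1.0, 1.0, 0.0)   # v = 1*v1 + 1*v2 + 0*v3
d = (0.0, 0.0, 1.0)   # v = 0*v1 + 0*v2 + 1*v3

j = 2                       # zero-based index of v_j (here v3)
pivot = c[j] - d[j]         # nonzero by assumption; here -1
# Coefficients (d_i - c_i)/(c_j - d_j) for every i except j:
coeffs = [(d[i] - c[i]) / pivot for i in range(3) if i != j]

# Rebuild v_j from the other vectors, coordinate by coordinate.
rebuilt = tuple(
    coeffs[0] * v1[dim] + coeffs[1] * v2[dim] for dim in range(2)
)
print(coeffs)   # [1.0, 1.0]
print(rebuilt)  # (1.0, 1.0) -- equals v3
```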
