# Prescribed dot products $w \cdot u_i = s_i$ over a general field

*Asked by janine83fz, 2022-08-13 (answered)*

Suppose that I have a set of $k$ values $s_1,\dots,s_k \in K$ and $k$ vectors $u_1,\dots,u_k \in K^r$ for a field $K$, not necessarily algebraically closed, and $k<r$. Under which very general circumstances can I find a vector $w \in K^r$ such that $w \cdot u_i = s_i$ for each $i$? Observe that orthogonality would correspond to $s_i = 0$ for all $i$.
**Answer** (choltas5j):
It is sufficient that $u_1,\dots,u_k$ are linearly independent. You can extend this set of vectors to a basis $u_1,\dots,u_r$ of $K^r$ and write $w=\sum_{i=1}^{r} w_i u_i$ with $w_i \in K$. Then you can define a diagonal inner product $\langle \cdot,\cdot \rangle$ via $\langle u_i, u_j \rangle = a_i \delta^i_j$, where $\delta^i_j = 0$ if $i \ne j$ and $\delta^i_j = 1$ if $i = j$. The initial equations $\langle w, u_i \rangle = w_i a_i = s_i$ for $i = 1,\dots,k$ then form a system of linear equations, in which you are free to choose $s_{k+1},\dots,s_r \in K$ arbitrarily and solve for $a_1,\dots,a_r$.
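As a concrete sketch of why linear independence suffices (my own illustration, not part of the answer; the function name is hypothetical): with the standard dot product, the conditions $w \cdot u_i = s_i$ form a linear system $Uw = s$ whose $k \times r$ coefficient matrix has full row rank $k < r$, so Gaussian elimination over $K$ always finds a solution. A minimal version over $K = \mathbb{Q}$, using exact rational arithmetic:

```python
from fractions import Fraction

def solve_underdetermined(U, s):
    """Find one w with U @ w == s over the rationals, or None if inconsistent.

    U: list of k rows of length r; s: list of k right-hand sides.
    Free (non-pivot) variables are set to 0.
    """
    k, r = len(U), len(U[0])
    # Augmented matrix [U | s] in exact rational arithmetic.
    A = [[Fraction(x) for x in row] + [Fraction(t)] for row, t in zip(U, s)]
    pivots = []          # pivot column of each reduced row
    row = 0
    for col in range(r):
        piv = next((i for i in range(row, k) if A[i][col] != 0), None)
        if piv is None:
            continue     # no pivot in this column
        A[row], A[piv] = A[piv], A[row]
        p = A[row][col]
        A[row] = [x / p for x in A[row]]
        for i in range(k):
            if i != row and A[i][col] != 0:
                f = A[i][col]
                A[i] = [a - f * b for a, b in zip(A[i], A[row])]
        pivots.append(col)
        row += 1
        if row == k:
            break
    # A zero row with a nonzero right-hand side means no solution exists;
    # this can only happen when the rows of U are linearly dependent.
    if any(A[i][r] != 0 for i in range(row, k)):
        return None
    w = [Fraction(0)] * r
    for i, col in enumerate(pivots):
        w[col] = A[i][r]
    return w

U = [[1, 0, 2], [0, 1, 1]]          # two independent vectors in Q^3
w = solve_underdetermined(U, [3, 5])
# w satisfies w·u_1 = 3 and w·u_2 = 5
```

Since the rows of $U$ are independent, elimination never produces an inconsistent zero row, which is exactly the sufficiency claim above.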
If $u_1,\dots,u_k$ are not linearly independent, the set of equations might already be overdetermined. For instance, if $u_1 = u_2$ and $s_1 \ne s_2$, there won't be an inner product satisfying the set of equations, since $\langle w, u_1 \rangle = \langle w, u_2 \rangle$ for every $w$ would force $s_1 = s_2$.
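A quick sanity check of this obstruction (my own illustration, with made-up vectors): when $u_1 = u_2$, every candidate $w$ gives the same dot product against both, so no choice of $w$ can separate $s_1 \ne s_2$.

```python
from fractions import Fraction

# With u1 == u2, w·u1 == w·u2 for every w, so s1 != s2 is unsolvable.
u1 = u2 = [Fraction(2), Fraction(-1), Fraction(3)]
s1, s2 = Fraction(4), Fraction(7)

def dot(w, u):
    return sum(a * b for a, b in zip(w, u))

# A few candidate vectors: none can satisfy both equations at once.
candidates = [[1, 0, 0], [0, 1, 2], [2, 3, -1]]
assert all(not (dot(w, u1) == s1 and dot(w, u2) == s2) for w in candidates)
```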