I know how to use methods like systems of linear equations to show that any vector in ${\mathbb{R}}^{n}$ can be expressed as a unique linear combination of a given set of n linearly independent vectors, but how do I extend this result to show that any set of exactly n linearly independent vectors must form a basis of ${\mathbb{R}}^{n}$? How can I understand it conceptually?
I saw there are similar questions, but the answers in those posts do not really convince me.
I'm not sure if this answers your question, but for a set of vectors to be a basis of a vector space, it has to be linearly independent and it has to span the entire space. So if you're given a set of n linearly independent vectors in ${\mathbb{R}}^{n}$, you only have to show that it spans ${\mathbb{R}}^{n}$. But as you said yourself, this is indeed the case, since you can express any vector as a linear combination of vectors from your set.
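To see this numerically, here is a small sketch (with made-up example vectors) of the computation you already described: the n independent vectors form an invertible matrix, so `np.linalg.solve` finds the unique coordinates of any target vector, which is exactly the spanning property.

```python
import numpy as np

# Hypothetical example: three linearly independent vectors in R^3,
# stacked as the columns of V.
V = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# Independence <=> full rank <=> V is invertible.
assert np.linalg.matrix_rank(V) == 3

# Any vector w then has unique coordinates c with V @ c == w,
# i.e. w is a linear combination of the columns of V.
w = np.array([2.0, 3.0, 5.0])
c = np.linalg.solve(V, w)
assert np.allclose(V @ c, w)
```

Uniqueness of `c` is the invertibility of `V`; existence for every `w` is the spanning property, so the columns form a basis.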
If that isn't satisfactory, you could also think about it this way: suppose you have n linearly independent vectors $\left({v}_{1},\dots ,{v}_{n}\right)$ and another vector w not in the span of $\left({v}_{1},\dots ,{v}_{n}\right)$. Then $\left({v}_{1},\dots ,{v}_{n},w\right)$ is linearly dependent, since it is a set of n+1 vectors in a vector space of dimension n. This means there are $\mu ,{\lambda }_{i}\in \mathbb{R},i=1,\dots ,n$, not all 0, with $0=\mu w+\sum _{i=1}^{n}{\lambda }_{i}{v}_{i}$. Now $\mu \ne 0$, because otherwise $\left({v}_{1},\dots ,{v}_{n}\right)$ would be linearly dependent. Therefore $w=-\frac{1}{\mu }\sum _{i=1}^{n}{\lambda }_{i}{v}_{i}$, but this means w is in the span of $\left({v}_{1},\dots ,{v}_{n}\right)$, a contradiction.
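The dependence argument above can also be checked numerically. This sketch (with hypothetical vectors) finds a nontrivial null vector of the matrix whose columns are $v_1,\dots,v_n,w$, verifies that $\mu \ne 0$, and recovers $w=-\frac{1}{\mu }\sum_i \lambda_i v_i$:

```python
import numpy as np

# Hypothetical data: n = 3 independent vectors (rows of v) plus an extra w.
v = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
w = np.array([2.0, -1.0, 4.0])

# Columns v1, v2, v3, w: a 3x4 matrix, so its null space is nontrivial.
A = np.column_stack([v.T, w])

# The right-singular vector for the smallest singular value spans the
# null space, giving coefficients (lambda_1, ..., lambda_n, mu), not all 0.
_, _, Vt = np.linalg.svd(A)
coeffs = Vt[-1]
lam, mu = coeffs[:-1], coeffs[-1]

# mu != 0: otherwise v1..vn themselves would be linearly dependent.
assert abs(mu) > 1e-12

# Recover w = -(1/mu) * sum_i lambda_i v_i, so w lies in the span after all.
w_recovered = -(1.0 / mu) * (v.T @ lam)
assert np.allclose(w_recovered, w)
```

The final assertion is the contradiction in the proof: any w you pick turns out to be in the span, so no vector outside the span can exist.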