I have an overdetermined system of linear equations and want to minimize the overall error. Up to now, not a problem: I could use least squares. The problem is that I know some equations in my system are more uncertain than others, while some are exact. Specifically, I have equations with different confidence levels ("low confidence", "medium confidence", "high confidence", and so on). In an AX=B system, the solution should take this into account: the residuals of the "high confidence" equations should stay (nearly) zero, so their B coefficients are effectively unchanged, while the B coefficients of "low confidence" equations may be changed more drastically than those of "medium confidence" equations.
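For concreteness, here is a minimal sketch of the kind of thing I have in mind, using weighted least squares: each equation is scaled by the square root of a weight that encodes its confidence, so high-weight equations dominate the fit. The system, the weights, and the helper `weighted_lstsq` below are all made up for illustration (Python/NumPy assumed).

```python
import numpy as np

def weighted_lstsq(A, b, w):
    """Minimize sum_i w_i * (A[i] @ x - b[i])**2 by scaling each
    equation by sqrt(w_i) and running ordinary least squares."""
    sw = np.sqrt(np.asarray(w, dtype=float))
    x, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
    return x

# Hypothetical 4x2 overdetermined system; the last equation is
# "high confidence", so its weight is much larger than the rest.
A = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])
b = np.array([1.0, 2.0, 2.5, -0.9])
w = np.array([1.0, 1.0, 1.0, 1e6])  # low, low, low, high confidence

x = weighted_lstsq(A, b, w)
# The high-confidence equation x0 - x1 = -0.9 ends up nearly satisfied,
# while the low-confidence equations absorb most of the residual.
print(A[3] @ x)
```

With a large enough weight this behaves like a soft version of keeping the high-confidence B coefficients fixed; truly exact equations would instead call for an equality-constrained least-squares solve.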