Suppose there is a set of n linear constraints $\{a_i^T x + b_i \le 0\}_{i=1}^n$

Addison Trujillo

Answered question

2022-07-08

Suppose there is a set of $n$ linear constraints $\{a_i^T x + b_i \le 0\}_{i=1}^n$ with $a_i \in \mathbb{R}^d$, $b_i \in \mathbb{R}$, $x \in \mathbb{R}^d$. How can I find $x$ that maximizes $|\{i \in [n] : a_i^T x + b_i \le 0\}|$? Is this problem solvable, or can it be approximated, in polynomial time?
Some reference materials would also be nice. Thanks!

Answer & Explanation

Karla Hull

Beginner · 2022-07-09 · Added 20 answers

The maximum cardinality feasible subsystem problem (MAX FS) is NP-hard.
You might also search for the equivalent minimum cardinality IIS set-cover problem.
You can solve the problem via mixed integer linear programming as follows. Let binary decision variable $z_i$ indicate whether constraint $i$ is satisfied. The problem is to maximize $\sum_i z_i$ subject to the linear (big-M) constraints
$$a_i^T x + b_i \le M_i (1 - z_i) \quad \text{for all } i.$$
Here, $M_i$ is a (small) constant upper bound on the left-hand side $a_i^T x + b_i$ when $z_i = 0$; when $z_i = 1$ the original constraint $a_i^T x + b_i \le 0$ is enforced.
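A minimal sketch of this big-M formulation using `scipy.optimize.milp` (HiGHS backend). The random data and the box bound $|x_j| \le B$ are illustrative assumptions not in the question: some bound on $x$ is needed so that finite values $M_i$ exist.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)
n, d, B = 8, 2, 10.0              # constraints, dimension, box bound on x (assumed)
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

# Big-M: upper bound on a_i^T x + b_i over the box |x_j| <= B
M = np.maximum(B * np.abs(A).sum(axis=1) + b, 0.0)

# Variable vector: [x_1..x_d, z_1..z_n]; minimize -sum(z_i) = maximize sum(z_i)
c = np.concatenate([np.zeros(d), -np.ones(n)])

# a_i^T x + b_i <= M_i (1 - z_i)  rewritten as  a_i^T x + M_i z_i <= M_i - b_i
con = LinearConstraint(np.hstack([A, np.diag(M)]), ub=M - b)

integrality = np.concatenate([np.zeros(d), np.ones(n)])   # x continuous, z binary
bounds = Bounds(lb=np.concatenate([-B * np.ones(d), np.zeros(n)]),
                ub=np.concatenate([B * np.ones(d), np.ones(n)]))

res = milp(c=c, constraints=con, integrality=integrality, bounds=bounds)
x_opt = res.x[:d]
num_satisfied = int(round(-res.fun))
print("satisfied:", num_satisfied, "of", n)
```

Each $z_i = 1$ forces $a_i^T x + b_i \le 0$; each $z_i = 0$ relaxes constraint $i$ to the vacuous bound $M_i$. For large instances a dedicated MILP solver (Gurobi, CPLEX, HiGHS standalone) will scale better than this sketch.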
