Determining a multivariate least-squares fit with constraints

One approach is to reframe it as a constrained minimization problem:

xVec = Array[x, 13];
NMinimize[{Total[(a.xVec - b)^2], Total[xVec] == 1, Thread[xVec >= 0]}, xVec]
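For reference, here is a self-contained sketch of the same call; the random `a` and `b` below are made-up stand-ins for the actual matrix and right-hand side:

```mathematica
SeedRandom[1];
a = RandomReal[1, {5, 13}];  (* stand-in data matrix *)
b = RandomReal[1, 5];        (* stand-in right-hand side *)
xVec = Array[x, 13];
{res, sol} =
  NMinimize[{Total[(a.xVec - b)^2], Total[xVec] == 1,
    Thread[xVec >= 0]}, xVec];
Total[xVec /. sol]  (* confirms the unit-sum constraint holds *)
```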

You can also obtain a smaller residual by relaxing the equality constraint into a penalty term (the sum of the x's will then only be approximately 1):

NMinimize[{Total[(a.xVec - b)^2] + (Total[xVec] - 1)^2, Thread[xVec >= 0]}, xVec]
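If the unit-sum condition matters more than the residual, a penalty weight (not part of the original post, so treat `lambda` and its value as an illustrative assumption) controls the trade-off; larger values enforce Total[xVec] == 1 more tightly:

```mathematica
lambda = 100;  (* assumed penalty weight; tune to taste *)
NMinimize[{Total[(a.xVec - b)^2] + lambda (Total[xVec] - 1)^2,
  Thread[xVec >= 0]}, xVec]
```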

Thanks to Roman for some simplifications.


This can also be set up as a linear programming problem, although the objective is then the sum of absolute residuals (an L1 fit) rather than least squares. The idea is to introduce new variables d[i] that bound the absolute values of the discrepancies between a.x and b: the constraints d[i] - r[i] >= 0 and d[i] + r[i] >= 0 force d[i] >= |r[i]|, and minimizing the total of the d[i] drives each one down to |r[i]|. This can be done as below.

xvec = Array[x, Length[a[[1]]]];
dvec = Array[d, Length[a]];
lpolys = a.xvec - b;
ineqs = Flatten[{Thread[xvec >= 0], Thread[dvec - lpolys >= 0], 
    Thread[dvec + lpolys >= 0], Total[xvec] == 1}];

Now minimize the total discrepancy with NMinimize.

{min, vals} = 
 NMinimize[{Total[dvec], ineqs}, Join[xvec, dvec], PrecisionGoal -> 10]
Total[xvec /. vals]

(* Out[90]= {0.627674050324, {x[1] -> 0.339337833732, 
  x[2] -> 0.292918812222, x[3] -> 0.195908896153, 
  x[4] -> 0.10491683141, x[5] -> 0.0591148361052, x[6] -> 0., 
  x[7] -> 0., x[8] -> 0.00181140763364, x[9] -> 0., x[10] -> 0., 
  x[11] -> 0., x[12] -> 0., x[13] -> 0.00599138274489, d[1] -> 0., 
  d[2] -> 0., d[3] -> 0.502611255236, d[4] -> 0., 
  d[5] -> 0.0178984583702, d[6] -> 0.0717916527728, d[7] -> 0., 
  d[8] -> 0., d[9] -> 0., d[10] -> 0.0353726839451}}

Out[91]= 1. *)
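As a sanity check (using the same `a`, `b`, `xvec`, and `vals` as above), the minimum of Total[dvec] should match the total absolute residual, since each d[i] equals |(a.x - b)[i]| at the optimum:

```mathematica
Total[Abs[a.xvec - b] /. vals]  (* equals min, up to solver tolerance *)
```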

Tags:

Fitting