Advice On Least Squares With Non-Negative Constraints

According to the book "Nonlinear Programming" by Dimitri P. Bertsekas, least squares with non-negative constraints (often called non-negative least squares, or NNLS) is an optimization problem that seeks a vector minimizing the sum of squared deviations from observed data — that is, minimizing ||Ax − b||² — subject to the additional constraint that certain (or all) components of the solution must be non-negative. This type of problem arises in a wide range of fields, including nonlinear regression, nonparametric kernel smoothing, linear inverse problems, data-fitting, deconvolution methods, image restoration, and other areas.
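To make the constraint's role concrete, here is a small sketch using hypothetical data: solving Ax ≈ b by ordinary least squares can return negative components even when the model requires x ≥ 0, which is exactly the situation NNLS addresses.

```python
import numpy as np

# Hypothetical data for illustration only.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, -1.0, 0.0])

# Unconstrained least squares: minimizes ||Ax - b||^2 with no sign restriction.
x_unconstrained, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_unconstrained)  # contains a negative entry, violating x >= 0
```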

To solve a least squares problem with non-negative constraints, we can use an iterative algorithm to minimize the sum of squares. One such algorithm, the projected gradient method, performs ordinary gradient descent on the unconstrained objective, but after each gradient step projects the iterate back onto the feasible set of non-negative vectors (the non-negative orthant). This algorithm has proven successful, and can be implemented as follows:

  1. Choose a feasible starting point x0 ≥ 0 and compute f(x0)
  2. For k = 0, 1, ... repeat the following steps:
    1. Compute the gradient ∇f(xk)
    2. Take a gradient step and project the result onto the box region X, obtaining xk+1 = PX(xk − α∇f(xk))
    3. Check the termination criterion (relative error, step length, etc.)
  3. Output the final point x and the value of the objective function f(x)
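The steps above can be sketched in Python/NumPy roughly as follows. The fixed step size 1/L (with L the largest eigenvalue of AᵀA) and the stopping rule are illustrative choices, not the only possibilities:

```python
import numpy as np

def projected_gradient_nnls(A, b, tol=1e-8, max_iter=5000):
    """Minimize f(x) = 0.5*||Ax - b||^2 subject to x >= 0
    via the projected gradient method sketched above."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    alpha = 1.0 / L                    # step size guaranteeing descent
    x = np.zeros(A.shape[1])           # feasible starting point x0 >= 0
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)                   # gradient of the objective
        x_new = np.maximum(x - alpha * grad, 0.0)  # gradient step, then projection
        if np.linalg.norm(x_new - x) < tol:        # termination criterion
            x = x_new
            break
        x = x_new
    return x, 0.5 * np.linalg.norm(A @ x - b) ** 2
```

Here the projection onto the non-negative orthant is simply a componentwise maximum with zero, which is what makes the method cheap per iteration.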

After obtaining an (approximate) minimizer and the value of the objective function, we can use this information to identify which non-negativity constraints are active, i.e. which components of the solution are zero. Fixing those components at zero reduces the problem to an ordinary least squares problem in the remaining free variables, which standard least squares techniques can then solve to finish the problem.
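In practice, libraries already package this entire workflow. For example, SciPy's scipy.optimize.nnls solves the non-negative least squares problem directly and returns both the solution and the residual norm:

```python
import numpy as np
from scipy.optimize import nnls

# Same hypothetical data as before, for illustration.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, -1.0, 0.0])

# Solves min ||Ax - b||_2 subject to x >= 0.
x, residual_norm = nnls(A, b)
print(x)  # every component is non-negative
```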

In summary, least squares with non-negative constraints is an optimization problem that seeks the vector minimizing the sum of squared deviations from the data, subject to the additional requirement that certain (or all) components of the solution be non-negative. Such a problem can be solved with an iterative algorithm such as the projected gradient method, followed by standard least squares techniques on the free variables to finish the problem.