According to the book "Nonlinear Programming" by Dimitri P. Bertsekas, least squares with non-negativity constraints (often called non-negative least squares, or NNLS) is an optimization problem that seeks a vector x minimizing the sum of squared residuals ||Ax - b||^2 of a linear system Ax = b, subject to the additional constraints that certain elements of the solution must be non-negative. This type of problem arises in a wide range of fields, including nonlinear regression, nonparametric kernel smoothing, linear inverse problems, data fitting, deconvolution methods, image restoration, and other areas.
One way to solve a least squares problem with non-negativity constraints is to apply an iterative algorithm that minimizes the sum of squares while respecting the constraints. One such algorithm, the projected gradient method, takes ordinary gradient descent steps on the unconstrained objective and then projects each iterate onto the feasible set. Here the feasible set is the non-negative orthant, which is convex, so the projection simply clips negative components to zero. This approach is well established and can be implemented as follows:
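Below is a minimal sketch of the projected gradient method for this problem in Python. It is an illustration rather than the book's own implementation: the function name nnls_projected_gradient, the fixed step size 1/L (with L the Lipschitz constant of the gradient), and the stopping tolerance are all choices made for this example.

```python
import numpy as np

def nnls_projected_gradient(A, b, n_iter=5000, tol=1e-10):
    """Projected gradient sketch for: minimize ||Ax - b||^2 subject to x >= 0.

    Uses a fixed step size 1/L, where L = 2 * sigma_max(A)^2 is the
    Lipschitz constant of the gradient of the objective.
    """
    L = 2.0 * np.linalg.norm(A, 2) ** 2   # 2-norm of A is its largest singular value
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ x - b)            # gradient of ||Ax - b||^2
        x_new = np.maximum(x - step * grad, 0.0)  # project onto the non-negative orthant
        if np.linalg.norm(x_new - x) < tol:       # stop once the iterates stall
            break
        x = x_new
    return x
```

The projection step is cheap precisely because the feasible set is the non-negative orthant: clipping each component at zero is the exact Euclidean projection.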
Once the projected gradient iteration has converged, the components of the solution that sit at zero identify the active non-negativity constraints. Restricting the problem to the remaining free components yields an ordinary, unconstrained least squares problem, which can then be solved with standard least squares techniques to polish the answer, as in the sketch below.
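A possible refinement step under that interpretation is sketched here; the helper name refine_free_components and the tolerance zero_tol are illustrative, and the fallback guard is a pragmatic choice for the case where the active set was not yet identified correctly.

```python
def refine_free_components(A, b, x, zero_tol=1e-8):
    """Re-solve the unconstrained least squares problem over the components
    of x that the projected gradient method left strictly positive."""
    free = x > zero_tol                   # components not pinned at zero
    x_refined = np.zeros_like(x)
    if not np.any(free):                  # every constraint is active
        return x_refined
    # Standard (unconstrained) least squares on the free columns only.
    sol, *_ = np.linalg.lstsq(A[:, free], b, rcond=None)
    x_refined[free] = sol
    # If refinement drives a component negative, the active set was not yet
    # correct; fall back to the projected gradient iterate.
    return x if np.any(x_refined < 0) else x_refined
```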
In summary, least squares with non-negativity constraints is an optimization problem that seeks a solution minimizing the sum of squared residuals, subject to the additional requirement that certain elements of the solution be non-negative. Such a problem can be solved with an iterative algorithm such as the projected gradient method, followed by standard least squares techniques applied to the free components to refine the answer.
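As a quick sanity check of the sketches above, the result can be compared against SciPy's reference NNLS solver, scipy.optimize.nnls (an implementation of the Lawson-Hanson active-set algorithm); the random test problem here is made up for illustration.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

x_pg = refine_free_components(A, b, nnls_projected_gradient(A, b))
x_ref, _ = nnls(A, b)
print(np.max(np.abs(x_pg - x_ref)))  # should be near zero
```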