Introduction
Problem statement
A Separable Nonlinear Least Squares (SNLS) criterion is of the form
\[
G(x, y) = \sum_{n=1}^{N} \bigl( w_n - \langle F_n(x), y \rangle \bigr)^2, \tag{1}
\]
where
\(x\) and \(y\) stand for two vectors in \(\mathbb{R}^K\) and \(\mathbb{R}^J\), respectively,
\(w_n\), \(n = 1, \ldots, N\), are some real data,
and \(F_n\) is a function mapping \(\mathbb{R}^K\) into \(\mathbb{R}^J\).
This package deals with the minimization over \((x, y)\) of a penalized SNLS criterion
\[
G(x, y) + \lambda_1 r_1(x) + \lambda_2 r_2(y),
\]
where \(r_1\) and \(r_2\) are penalization terms (potentially non-smooth) weighted by two positive scalars \(\lambda_1\) and \(\lambda_2\), respectively.
Denoting the residuals by
\[
\epsilon_n(x, y) = w_n - \langle F_n(x), y \rangle, \qquad n = 1, \ldots, N,
\]
the criterion can also be written as
\[
\sum_{n=1}^{N} \epsilon_n(x, y)^2 + \lambda_1 r_1(x) + \lambda_2 r_2(y).
\]
In a complete matrix form, it can further be written as
\[
\vert \epsilon(x, y) \vert^2 + \lambda_1 r_1(x) + \lambda_2 r_2(y),
\]
where \(\vert \cdot \vert\) is the Euclidean norm in \(\mathbb{R}^N\) and \(\epsilon(x, y)\) is the vector formed by the terms \(\epsilon_n(x, y)\).
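As a concrete illustration of this criterion, here is a minimal NumPy sketch (it does not use this package's API): the callables `F`, `r1` and `r2` as well as the data `w` are placeholders to be supplied by the user, and the \(\ell_1\) and squared \(\ell_2\) penalties shown are only hypothetical example choices.

```python
import numpy as np

def snls_criterion(x, y, w, F, r1, r2, lam1=0.0, lam2=0.0):
    """Evaluate the penalized SNLS criterion for given parameters x and y.

    F(x) must return the N x J matrix whose n-th row is F_n(x);
    r1 and r2 are the penalty functions on x and y, weighted by lam1 and lam2.
    """
    eps = w - F(x) @ y                              # residual vector eps(x, y)
    return eps @ eps + lam1 * r1(x) + lam2 * r2(y)  # |eps|^2 + penalties

# Hypothetical penalty choices for illustration: l1 norm on x, squared l2 norm on y.
r1 = lambda x: np.sum(np.abs(x))
r2 = lambda y: np.sum(y ** 2)
```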
Model fitting
The SNLS problem may arise when the data is to be fitted by a model mixing linear and nonlinear parts. For instance, consider a situation where the data \(w = (w_n)_{n=1}^N\) comes from the observation of a unidimensional signal at some time points \((t_n)_{n=1}^N\) of \(\mathbb{R}\). Assume that the data observation can be described by an additive model
\[
w_n = f(t_n; x, y) + \epsilon_n, \qquad n = 1, \ldots, N,
\]
where the variables \(\epsilon_n\) stand for independent Gaussian noise and the function \(f\) for a signal model depending on parameters \(x\) and \(y\). Further assume that the model \(f\) is of the form
\[
f(t; x, y) = \sum_{j=1}^{J} y_j \, \varphi_j(t; x)
\]
for some functions \(\varphi_j(\cdot; x)\) depending on parameters \(x\). Then, fitting the observed data \(w\) with the model \(f\) can be done by minimizing an SNLS criterion of the form (1) with
\[
F_n(x) = \bigl( \varphi_1(t_n; x), \ldots, \varphi_J(t_n; x) \bigr), \qquad n = 1, \ldots, N.
\]
Such a model is generic and occurs in many fields of mathematical engineering. In the gallery of examples, we present some of its applications to the statistical inference of parameters of stochastic processes.
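To make this construction concrete, the following sketch simulates data from the additive model for a hypothetical sum-of-exponentials choice \(\varphi_j(t; x) = \exp(-x_j t)\) (so that \(K = J\)); this basis is only an assumption for illustration, not one prescribed by the package.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical basis functions: phi_j(t; x) = exp(-x_j * t).
def design_matrix(x, t):
    """Return the N x J matrix whose n-th row is F_n(x) = (phi_1(t_n; x), ..., phi_J(t_n; x))."""
    return np.exp(-np.outer(t, x))

# Ground-truth parameters and observation times for the simulation.
x_true = np.array([0.5, 2.0])      # nonlinear parameters x
y_true = np.array([1.0, -0.7])     # linear parameters y
t = np.linspace(0.0, 5.0, 200)

# Additive model: w_n = f(t_n; x, y) + Gaussian noise.
w = design_matrix(x_true, t) @ y_true + 0.05 * rng.standard_normal(t.size)
```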
Variable projection method
When there is no penalization (\(\lambda_1 = \lambda_2 = 0\)), the optimization problem amounts to minimizing over \(x\) the function
\[
\tilde{G}(x) = G\bigl(x, y^{\ast}(x)\bigr),
\]
where \(y^{\ast}(x)\) is given by
\[
y^{\ast}(x) = \arg\min_{y \in \mathbb{R}^J} G(x, y).
\]
This is the so-called variable projection method, which was introduced in [3] to reduce the minimization problem to the single variable \(x\).
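For reference, here is a minimal NumPy/SciPy sketch of this classical, unpenalized variable projection (it does not rely on this package's solver): for each candidate \(x\), the linear parameter is recovered in closed form by ordinary least squares, and the reduced criterion is minimized over \(x\) alone with a generic derivative-free optimizer.

```python
import numpy as np
from scipy.optimize import minimize

def varpro_fit(w, t, x0, design_matrix):
    """Classical (unpenalized) variable projection for a separable least-squares fit."""

    def y_star(x):
        # Linear step: least-squares solution in y for a fixed x.
        return np.linalg.lstsq(design_matrix(x, t), w, rcond=None)[0]

    def reduced_criterion(x):
        # Reduced objective depending on the nonlinear variable x only.
        res = w - design_matrix(x, t) @ y_star(x)
        return res @ res

    result = minimize(reduced_criterion, x0, method="Nelder-Mead")
    return result.x, y_star(result.x)

# Usage with the simulated data from the previous sketch (hypothetical example):
# x_hat, y_hat = varpro_fit(w, t, x0=np.array([1.0, 1.0]), design_matrix=design_matrix)
```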
This package includes an extension of variable projection that handles penalization and constraints on \(x\). It is based on a proximal dual approach, called varprox, and the implemented methods are precisely described in [1] and [2].