**Constrained Maximum Likelihood** (**CML**) is a set of procedures written in the GAUSS programming language (Schoenberg, 1995) for the estimation of the parameters of models via the maximum likelihood method with general constraints on the parameters.

**CML** solves the general weighted maximum likelihood problem

$$L = \sum_{i=1}^N w_i \log P(Y_i;\theta)$$

where $N$ is the number of observations, $w_i$ is a weight, and $P(Y_i;\theta)$ is the probability of $Y_i$ given $\theta$, a vector of parameters, subject to the linear constraints

$$A\theta = B,$$
$$C\theta \geq D,$$

the nonlinear constraints

$$G(\theta) = 0,$$
$$H(\theta) \geq 0,$$

and bounds

$$\theta_l \leq \theta \leq \theta_u.$$

$G(\theta)$ and $H(\theta)$ are functions provided by the user and must be differentiable at least once with respect to $\theta$.
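To make these pieces concrete, here is a minimal Python sketch (not CML's GAUSS interface) of the objective and the user-supplied constraint functions for a hypothetical weighted normal model; the data, weights, and constraints are invented for illustration.

```python
import numpy as np

# Hypothetical example: a weighted normal model in theta = (mu, sigma).
rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=100)   # observations Y_i
w = np.ones_like(y)                            # weights w_i

def L(theta):
    """Weighted log likelihood: sum_i w_i * log P(Y_i; theta)."""
    mu, sigma = theta
    logp = -0.5 * np.log(2.0 * np.pi * sigma**2) - (y - mu)**2 / (2.0 * sigma**2)
    return np.sum(w * logp)

def G(theta):
    """Nonlinear equality constraints, G(theta) = 0 (here: fix mu = 1)."""
    return np.array([theta[0] - 1.0])

def H(theta):
    """Nonlinear inequality constraints, H(theta) >= 0 (here: sigma >= 0.1)."""
    return np.array([theta[1] - 0.1])

theta0 = np.array([1.0, 2.0])                  # a vector of starting values
print(L(theta0), G(theta0), H(theta0))
```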

**CML** finds values for the parameters in $\theta$ such that $L$ is maximized using the Sequential Quadratic Programming (SQP) method. In this method the parameters are updated in a series of iterations beginning with a vector of starting values. Let $\theta_t$ be the current parameter values. Then the succeeding values are

$$\theta_{t+1} = \theta_t + \rho d$$

where $d$ is a *direction* vector and $\rho$ is a scalar *step length*.
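The shape of the iteration can be seen in the following toy Python sketch; steepest ascent stands in for the SQP direction and a crude halving rule for the step length, neither of which is CML's actual method.

```python
def L(theta):
    """Toy concave objective standing in for a log likelihood."""
    return -(theta - 3.0)**2

def dL(theta):
    return -2.0 * (theta - 3.0)

theta = 0.0                                  # starting value theta_0
for t in range(50):
    d = dL(theta)                            # direction (steepest ascent here)
    if abs(d) < 1e-8:                        # gradient near zero: stop
        break
    rho = 1.0                                # crude halving rule for the step
    while L(theta + rho * d) <= L(theta) and rho > 1e-12:
        rho /= 2.0
    theta = theta + rho * d                  # theta_{t+1} = theta_t + rho * d
print(theta)                                 # converges to 3.0
```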

Define

$$\Sigma(\theta) = \frac{\partial^2 L}{\partial\theta\,\partial\theta'}, \qquad \Psi(\theta) = \frac{\partial L}{\partial\theta}$$

and the Jacobians

$$\dot{G}(\theta) = \frac{\partial G(\theta)}{\partial\theta}, \qquad \dot{H}(\theta) = \frac{\partial H(\theta)}{\partial\theta}.$$

For the purposes of this exposition, and without loss of generality, we may assume that the linear constraints and bounds have been incorporated into $G$ and $H$.

The direction, $d$, is the solution to the quadratic program

$$\text{minimize} \quad \tfrac{1}{2}\, d'\Sigma(\theta_t)\, d + \Psi(\theta_t)' d$$
$$\text{subject to} \quad \dot{G}(\theta_t)\, d + G(\theta_t) = 0,$$
$$\dot{H}(\theta_t)\, d + H(\theta_t) \geq 0.$$

This solution requires that $\Sigma$ be positive definite. (In practice, **CML** incorporates a slight modification in the quadratic programming solution which relaxes this requirement to non-negative definite.)
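When only the equality constraints are present, the quadratic program reduces to a linear KKT system, as the following Python sketch shows. Handling the inequality constraints as well requires an active-set strategy, which CML's solver performs internally and is omitted here.

```python
import numpy as np

def qp_direction(Sigma, Psi, Gdot, Gval):
    """Direction for the equality-constrained quadratic program
        minimize 0.5 * d' Sigma d + Psi' d   s.t.   Gdot d + Gval = 0,
    via the KKT system [Sigma Gdot'; Gdot 0] [d; kappa] = [-Psi; -Gval]."""
    n, m = Sigma.shape[0], Gdot.shape[0]
    K = np.block([[Sigma, Gdot.T], [Gdot, np.zeros((m, m))]])
    rhs = np.concatenate([-Psi, -Gval])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]                  # direction d, multipliers kappa

# tiny check: Sigma = I, Psi = (1, -2), one constraint d_0 + d_1 = 0
Sigma = np.eye(2)
Psi = np.array([1.0, -2.0])
Gdot = np.array([[1.0, 1.0]])
Gval = np.array([0.0])
d, kappa = qp_direction(Sigma, Psi, Gdot, Gval)
print(d, kappa)                              # d = (-1.5, 1.5), kappa = (0.5,)
```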

In practice, linear constraints are specified separately from *G* and *H* because their Jacobians are known and easy to compute, and the bounds are more easily handled separately from the linear inequality constraints.

The SQP method requires the calculation of a Hessian, $\Sigma$, and various gradients and Jacobians, $\Psi$, $\dot{G}$, and $\dot{H}$. **CML** computes these numerically if procedures to compute them are not supplied.
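A central-difference scheme of the following kind is the usual way such derivatives are obtained numerically (a Python sketch; CML's exact differencing rules and step sizes may differ).

```python
import numpy as np

def num_gradient(f, theta, h=1e-6):
    """Central-difference approximation to the gradient Psi(theta)."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta); e[i] = h
        g[i] = (f(theta + e) - f(theta - e)) / (2.0 * h)
    return g

def num_hessian(f, theta, h=1e-4):
    """Finite-difference approximation to the Hessian Sigma(theta),
    built by differencing the numerical gradient."""
    n = theta.size
    Hm = np.zeros((n, n))
    for i in range(n):
        e = np.zeros_like(theta); e[i] = h
        Hm[:, i] = (num_gradient(f, theta + e) - num_gradient(f, theta - e)) / (2.0 * h)
    return 0.5 * (Hm + Hm.T)                 # symmetrize

f = lambda th: -(th[0] - 1.0)**2 - 2.0 * (th[1] + 0.5)**2
print(num_gradient(f, np.array([0.0, 0.0])))  # approx [ 2, -2]
print(num_hessian(f, np.array([0.0, 0.0])))   # approx diag(-2, -4)
```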

**Descent Algorithms.**
The Hessian may be very expensive to compute at every iteration, and
poor start values may produce an ill-conditioned Hessian. For these
reasons alternative algorithms are provided in **CML** for updating
the Hessian rather than computing it directly at each iteration.
These algorithms, as well as step length methods, may
be modified during the execution of **CML**.

The most reliable method, given a good starting point, is the Gauss-Newton method, in which the Hessian is computed directly. This method is time-consuming, however, especially if the Hessian must be computed numerically, which is most often the case. **CML** also offers two quasi-Newton methods: BFGS (Broyden, Fletcher, Goldfarb, and Shanno) and DFP (Davidon, Fletcher, and Powell). These latter methods require far less computation and are generally more useful when the starting point is poor.
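Both quasi-Newton methods replace the computed Hessian with a cheap rank-two update built from successive parameter steps and gradient changes. A sketch of the standard BFGS update, in its generic minimization form (not CML's code), is:

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B from the
    parameter step s = theta_{t+1} - theta_t and the gradient change
    y = Psi(theta_{t+1}) - Psi(theta_t)."""
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# example: one update starting from the identity
B = np.eye(2)
s = np.array([0.1, -0.2])
y = np.array([0.3, -0.1])
print(bfgs_update(B, s, y))
```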

In practice a combination of the two approaches is the most successful.

**Line Search Methods.**
Define the merit function

$$m(\theta) = L + \max|\kappa| \sum_j |g_j(\theta)| - \max|\lambda| \sum_\ell \min(0, h_\ell(\theta))$$

where $g_j$ is the *j*-th row of $G$, $h_\ell$ is the $\ell$-th row of $H$, $\kappa$ is the vector of Lagrangean coefficients of the equality constraints, and $\lambda$ the Lagrangean coefficients of the inequality constraints.
The line search finds a value of $\rho$ that minimizes or decreases

$$m(\theta_t + \rho d).$$

Given $\theta_t$ and $d$, this is a function of the single variable $\rho$. Several line search methods are available in **CML**: STEPBT, a polynomial fitting method; BRENT and BHHHSTEP, golden section methods; and HALF, a step-halving method.
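A HALF-style step-halving search is the simplest of these; the following Python sketch applies it to a toy merit function (written in minimization form with $-L$ plus a penalty, with an invented penalty weight; CML's actual acceptance criteria and defaults may differ).

```python
import numpy as np

def half_search(m, theta, d, max_halvings=20):
    """Step-halving line search: start at rho = 1 and halve until the
    merit function m decreases (a HALF-style strategy)."""
    m0 = m(theta)
    rho = 1.0
    for _ in range(max_halvings):
        if m(theta + rho * d) < m0:
            return rho
        rho /= 2.0
    return rho                               # fall through with a tiny step

# toy merit function: -L plus a penalty on the equality violation |g(theta)|
L = lambda th: -(th[0] - 2.0)**2
g = lambda th: th[0] - 1.0
m = lambda th: -L(th) + 10.0 * abs(g(th))

theta = np.array([0.0])
d = np.array([4.0])                          # a (deliberately long) direction
rho = half_search(m, theta, d)
print(rho, theta + rho * d)                  # rho = 0.5, new theta = [2.0]
```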
