Next: Confidence Limits by Inversion Up: Simulation of Bayesian Posterior Previous: Introduction

CML

Constrained Maximum Likelihood (CML) is a set of procedures written in the GAUSS programming language (Schoenberg, 1995) for the estimation of the parameters of models via the maximum likelihood method, with general constraints on the parameters.

CML solves the general weighted maximum likelihood problem

\[ L \;=\; \sum_{i=1}^{N} w_i \log P(Y_i;\theta) \]

where $N$ is the number of observations, $w_i$ is a weight, and $P(Y_i;\theta)$ is the probability of $Y_i$ given $\theta$, a vector of parameters, subject to the linear constraints,

\[ A\theta \;=\; B, \qquad C\theta \;\geq\; D, \]

the nonlinear constraints

\[ G(\theta) \;=\; 0, \qquad H(\theta) \;\geq\; 0, \]

and bounds

\[ \theta_L \;\leq\; \theta \;\leq\; \theta_U . \]

$G(\theta)$ and $H(\theta)$ are functions provided by the user and must be differentiable at least once with respect to $\theta$.
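CML itself is a GAUSS package, but the same kind of constrained weighted maximum likelihood problem can be sketched with SciPy's SLSQP solver, which is also a sequential quadratic programming method. The exponential model, data, and weights below are purely illustrative and are not taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative data: 500 draws from an exponential distribution.
rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=500)
w = np.ones_like(y)                      # weights w_i (uniform here)

# Negative weighted log-likelihood, -L = -sum_i w_i log P(y_i; theta),
# for the exponential model P(y_i; theta) = theta * exp(-theta * y_i).
def negloglik(theta):
    t = theta[0]
    return -np.sum(w * (np.log(t) - t * y))

# Maximize L subject to the bounds 1e-6 <= theta <= 1, solved by SQP.
res = minimize(negloglik, x0=[0.1], method="SLSQP", bounds=[(1e-6, 1.0)])
print(res.x[0])        # close to the analytic MLE 1/mean(y)
```

Here the bound constraint plays the role of $\theta_L \leq \theta \leq \theta_U$; linear and nonlinear constraints would be passed to the solver in the same call.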

CML finds values for the parameters in $\theta$ such that $L$ is maximized, using the Sequential Quadratic Programming (SQP) method. In this method the parameters are updated in a series of iterations beginning with a vector of starting values. Let $\theta_k$ be the current parameter values. Then the succeeding values are

\[ \theta_{k+1} \;=\; \theta_k + \rho\, d \]

where $d$ is a $K \times 1$ direction vector and $\rho$ a scalar step length.

Define

\[ \Sigma(\theta) \;=\; \frac{\partial^2 L}{\partial\theta\,\partial\theta'}, \qquad \Psi(\theta) \;=\; \frac{\partial L}{\partial\theta}, \]

and the Jacobians

\[ \dot G(\theta) \;=\; \frac{\partial G(\theta)}{\partial\theta}, \qquad \dot H(\theta) \;=\; \frac{\partial H(\theta)}{\partial\theta}. \]

For the purposes of this exposition, and without loss of generality, we may assume that the linear constraints and bounds have been incorporated into $G$ and $H$.

The direction, $d$, is the solution to the quadratic program

\[ \mbox{minimize} \quad \tfrac{1}{2}\, d'\,\Sigma(\theta_k)\, d \;+\; \Psi(\theta_k)\, d, \]

subject to

\[ \dot G(\theta_k)\, d + G(\theta_k) \;=\; 0, \qquad \dot H(\theta_k)\, d + H(\theta_k) \;\geq\; 0. \]

This solution requires that $\Sigma(\theta_k)$ be positive definite. (In practice, CML incorporates a slight modification in the quadratic programming solution that relaxes this requirement to non-negative definite.)
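A minimal numerical sketch of this direction-finding step, under the simplifying assumption that only the linearized equality constraints are active, so the quadratic program reduces to a single KKT linear system. The matrices are made-up local data at $\theta_k$, not CML internals:

```python
import numpy as np

# Hypothetical local quantities at theta_k (for illustration only):
Sigma = np.array([[4.0, 1.0],     # Hessian Sigma(theta_k), positive definite
                  [1.0, 3.0]])
Psi = np.array([2.0, -1.0])       # gradient Psi(theta_k)
Gdot = np.array([[1.0, 1.0]])     # Jacobian of G at theta_k
Gval = np.array([0.5])            # G(theta_k)

# With only equality constraints active, the quadratic program
#   minimize (1/2) d' Sigma d + Psi d   s.t.   Gdot d + Gval = 0
# is solved exactly by this KKT system.
K = np.block([[Sigma, Gdot.T],
              [Gdot, np.zeros((1, 1))]])
rhs = np.concatenate([-Psi, -Gval])
sol = np.linalg.solve(K, rhs)
d, lam = sol[:2], sol[2:]         # direction vector and Lagrange multiplier
print(d)
```

Inequality constraints would turn this into a true quadratic program, typically handled by an active-set strategy that repeatedly solves systems of this form.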

In practice, linear constraints are specified separately from $G$ and $H$ because their Jacobians are known and easy to compute, and the bounds are more easily handled separately from the linear inequality constraints.

The SQP method requires the calculation of a Hessian, $\Sigma(\theta)$, and various gradients and Jacobians, $\Psi(\theta)$, $\dot G(\theta)$, and $\dot H(\theta)$. CML computes these numerically if procedures to compute them are not supplied.
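As an illustration of such numerical derivatives, here is a standard central-difference gradient and Hessian in Python (the scheme is generic; it is not CML's actual GAUSS code):

```python
import numpy as np

# Central-difference gradient of a scalar function f of a parameter vector.
def num_grad(f, theta, h=1e-5):
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = h
        g[i] = (f(theta + e) - f(theta - e)) / (2 * h)
    return g

# Central-difference Hessian, one (i, j) entry at a time.
def num_hess(f, theta, h=1e-4):
    n = theta.size
    H = np.zeros((n, n))
    for i in range(n):
        ei = np.zeros(n); ei[i] = h
        for j in range(n):
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(theta + ei + ej) - f(theta + ei - ej)
                       - f(theta - ei + ej) + f(theta - ei - ej)) / (4 * h * h)
    return H

f = lambda t: t[0]**2 + 3 * t[0] * t[1]   # simple test function
theta = np.array([1.0, 2.0])
print(num_grad(f, theta))   # analytic gradient is [2*t0 + 3*t1, 3*t0] = [8, 3]
print(num_hess(f, theta))   # analytic Hessian is [[2, 3], [3, 0]]
```

Each Hessian requires $O(K^2)$ function evaluations, which is why the quasi-Newton updates described next are attractive.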

Descent Algorithms. The Hessian may be very expensive to compute at every iteration, and poor start values may produce an ill-conditioned Hessian. For these reasons alternative algorithms are provided in CML for updating the Hessian rather than computing it directly at each iteration. These algorithms, as well as step length methods, may be modified during the execution of CML.

The most reliable method, given a good starting point, is the Gauss-Newton method, in which the Hessian is computed directly. This method is time-consuming, however, especially if the Hessian must be computed numerically, which is most often the case. CML also offers two quasi-Newton methods: BFGS (Broyden, Fletcher, Goldfarb, and Shanno) and DFP (Davidon, Fletcher, and Powell). These methods require far less computation and are generally more useful when the starting point is poor.

In practice, a combination of the two approaches (direct computation and quasi-Newton updating) is the most successful.
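The BFGS update that such quasi-Newton methods use can be sketched in a few lines; the step and gradient-change vectors below are made up for illustration:

```python
import numpy as np

# BFGS update of a Hessian approximation B, given the parameter step
# s = theta_new - theta_old and gradient change q = grad_new - grad_old.
def bfgs_update(B, s, q):
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(q, q) / (q @ s)

B = np.eye(2)                   # cheap initial approximation
s = np.array([0.5, -0.2])       # illustrative step
q = np.array([1.0, 0.3])        # illustrative gradient change
B = bfgs_update(B, s, q)
print(B @ s)                    # equals q: the secant condition holds
```

Each update costs only a rank-two correction per iteration, instead of the $O(K^2)$ function evaluations a numerical Hessian needs.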

Line Search Methods. Define the merit function

\[ m(\theta) \;=\; L \;+\; \max|\kappa| \sum_{j} \big| g_j(\theta) \big| \;-\; \max|\lambda| \sum_{\ell} \min\!\big(0,\, h_\ell(\theta)\big) \]

where $g_j$ is the $j$-th row of $G$, $h_\ell$ is the $\ell$-th row of $H$, $\kappa$ is the vector of Lagrangean coefficients of the equality constraints, and $\lambda$ the Lagrangean coefficients of the inequality constraints. The line search finds a value of $\rho$ that minimizes or decreases

\[ m(\theta_k + \rho\, d) \]

where $\rho$ is a constant. Given $\theta$ and $d$, this is a function of the single variable $\rho$. Line search methods attempt to find a value for $\rho$ that decreases $m$. Several methods are available in CML: STEPBT, a polynomial fitting method; BRENT and BHHHSTEP, golden section methods; and HALF, a step-halving method.
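A step-halving search in the spirit of the HALF method can be sketched as follows; the merit function here is a made-up penalized quadratic, not CML's:

```python
import numpy as np

# Halve rho until the merit function decreases (cf. CML's HALF method).
def half_search(m, theta, d, rho=1.0, max_halvings=30):
    m0 = m(theta)
    for _ in range(max_halvings):
        if m(theta + rho * d) < m0:
            break
        rho *= 0.5
    return rho

# Illustrative merit function: quadratic plus an equality-violation penalty.
def merit(theta):
    return theta @ theta + abs(theta[0] + theta[1] - 1.0)

theta = np.array([2.0, 0.0])
d = np.array([-1.5, 0.5])       # a descent direction, e.g. from the QP step
rho = half_search(merit, theta, d)
print(rho, merit(theta + rho * d))
```

STEPBT and the golden-section methods pursue the same goal, a decrease in $m$, but choose $\rho$ by fitting or bracketing rather than by simple halving.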



R. Schoenberg
Fri Sep 12 09:47:41 PDT 1997