Regression.pm | Michael Gage | 9/14/2004 1:56:53 PM
NAME

Regression.pm - weighted linear regression package (line+plane fitting)
DESCRIPTION

Regression.pm is a multivariate linear regression package. That is, it estimates the c coefficients for a line-fit of the type

    y = c(1)*x(1) + c(2)*x(2) + ... + c(k)*x(k)

given a data set of N observations, each with k independent x variables and one y variable. Naturally, N must be greater than k, and preferably considerably greater. Any reasonable undergraduate statistics book will explain what a regression is. Most of the time, the user will provide a constant ('1') as the first x variable of every observation, so that the regression fits an intercept (this is the "const" column in the usage example below).
USAGE

If the sample data for (x1, x2, y) includes ($x1[$i], $x2[$i], $y[$i]) for 0<=$i<=5, type

    $reg = Regression->new( 3, "y", [ "const", "x1", "x2" ] );
    for($i=0; $i<6; $i++){
        $reg->include( $y[$i], [ 1.0, $x1[$i], $x2[$i] ] );
    }
    @coeff = $reg->theta();
    $b0 = $coeff[0][0]; $b1 = $coeff[0][1]; $b2 = $coeff[0][2];
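For a fully self-contained run, here is a sketch with made-up data (the data values and the "sample" name are hypothetical; it assumes Regression.pm is reachable via @INC). The estimates should come out near 1, 2, and 3:

    use strict;
    use Regression;   # adjust to your install path, e.g. Statistics::Regression

    # Hypothetical data, generated roughly as y = 1 + 2*x1 + 3*x2 plus noise.
    my @x1 = ( 1.0, 2.0,  3.0,  4.0,  5.0,  6.0 );
    my @x2 = ( 2.0, 1.0,  4.0,  3.0,  6.0,  5.0 );
    my @y  = ( 9.1, 8.0, 19.2, 18.1, 29.1, 28.2 );

    my $reg = Regression->new( 3, "sample", [ "const", "x1", "x2" ] );
    for my $i (0 .. 5) {
        $reg->include( $y[$i], [ 1.0, $x1[$i], $x2[$i] ] );
    }

    $reg->dump();    # prints coefficients, R^2, and N (see dump below)
    my @coeff = $reg->theta();
    printf "intercept=%g b1=%g b2=%g\n",
        $coeff[0][0], $coeff[0][1], $coeff[0][2];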
ALGORITHM

Original Algorithm (ALGOL-60): W. M. Gentleman, University of Waterloo, "Basic Description For Large, Sparse or Weighted Linear Least Squares Problems (Algorithm AS 75)," Applied Statistics (1974), Vol. 23, No. 3.
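To give a feel for the method, here is a minimal stand-alone Perl sketch of Gentleman's square-root-free Givens update, the core of include(). All names here are mine, not the module's internals; the real module adds conveniences and bookkeeping on top of this:

    use strict;

    # State for k regressors: d (diagonal of D), rbar (strict upper triangle
    # of R, with implicit 1's on the diagonal), thetabar (rotated y), sse, n.
    sub reg_new {
        my ($k) = @_;
        return {
            k => $k, n => 0, sse => 0.0,
            d        => [ (0.0) x $k ],
            thetabar => [ (0.0) x $k ],
            rbar     => [ map { [ (0.0) x $k ] } 1 .. $k ],
        };
    }

    # Fold one observation (y, x[0..k-1]) with weight w into the state.
    # Each call costs O(k^2) time; no per-observation storage is kept.
    sub reg_include {
        my ($state, $y, $xref, $w) = @_;
        $w = 1.0 unless defined $w;
        my @x = @$xref;                  # local copy; destroyed below
        for my $i (0 .. $state->{k} - 1) {
            last if $w == 0.0;
            next if $x[$i] == 0.0;
            my $d_old = $state->{d}[$i];
            my $d_new = $d_old + $w * $x[$i] * $x[$i];
            my $cbar  = $d_old / $d_new;
            my $sbar  = $w * $x[$i] / $d_new;
            $w *= $cbar;
            $state->{d}[$i] = $d_new;
            for my $j ($i + 1 .. $state->{k} - 1) {
                my $xj = $x[$j];
                $x[$j] = $xj - $x[$i] * $state->{rbar}[$i][$j];
                $state->{rbar}[$i][$j] =
                    $cbar * $state->{rbar}[$i][$j] + $sbar * $xj;
            }
            my $y_old = $y;
            $y = $y_old - $x[$i] * $state->{thetabar}[$i];
            $state->{thetabar}[$i] =
                $cbar * $state->{thetabar}[$i] + $sbar * $y_old;
        }
        $state->{sse} += $w * $y * $y;   # residual sum of squares
        return ++$state->{n};
    }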
INTERNALS

R=Rbar is an upper-right triangular matrix, kept in normalized form with implicit 1's on the diagonal. D is a diagonal scaling matrix. These correspond to standard regression usage as

    X' X = R' D R

A backsubstitution routine (in thetacov) allows one to invert the R matrix (the inverse is upper-right triangular, too!). Call this matrix H, that is H=R^(-1). Then, because D^(1/2)' = D^(1/2),

    (X' X)^(-1) = [ (R' D^(1/2)') (D^(1/2) R) ]^(-1)
                = [ R^(-1) D^(-1/2) ] [ R^(-1) D^(-1/2) ]'
                = H D^(-1) H'
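Continuing the hypothetical sketch from the ALGORITHM section (again, names are mine, not the module's), the two back-substitutions look like this:

    # Coefficients: solve R*theta = thetabar bottom-up, exploiting
    # that R is unit upper triangular.
    sub reg_theta {
        my ($state) = @_;
        my $k = $state->{k};
        my @theta;
        for (my $i = $k - 1; $i >= 0; $i--) {
            my $t = $state->{thetabar}[$i];
            $t -= $state->{rbar}[$i][$_] * $theta[$_] for $i + 1 .. $k - 1;
            $theta[$i] = $t;
        }
        return @theta;
    }

    # H = R^(-1) is unit upper triangular too: for j > i,
    # H[i][j] = -( R[i][i+1]*H[i+1][j] + ... + R[i][j]*H[j][j] ).
    sub invert_unit_upper {
        my ($rbar, $k) = @_;
        my @h = map { [ (0.0) x $k ] } 1 .. $k;
        $h[$_][$_] = 1.0 for 0 .. $k - 1;
        for (my $i = $k - 2; $i >= 0; $i--) {
            for my $j ($i + 1 .. $k - 1) {
                my $s = 0.0;
                $s += $rbar->[$i][$_] * $h[$_][$j] for $i + 1 .. $j;
                $h[$i][$j] = -$s;
            }
        }
        return \@h;
    }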
Remarks

This algorithm is the statistical "standard." A new observation can be inserted at any time, one observation per call (with a weight!), and each insertion takes only O(k^2) time. The storage requirement is likewise O(k^2) in the number of independent variables and does not grow with N, so a practically infinite number of observations can easily be processed.
AUTHOR

Naturally, Gentleman invented this algorithm. Adaptation by Ivo Welch. Alan Miller (alan@dmsmelb.mel.dms.CSIRO.AU) pointed out nicer ways to compute the R^2.
Subroutines
new

Receives the number of variables in each observation (i.e., an integer) and returns the blessed data structure. Also takes an optional name for this regression to remember, as well as a reference to a k*1 array of names for the X coefficients.
dump

Used for debugging: prints the estimated coefficients, R^2, and N.
include

Receives one new observation. The call is

    $blessedregr->include( $yvariable, [ $x1, $x2, $x3 ... $xk ], 1.0 );

where 1.0 is an (optional) weight. Note that inclusion with a weight of -1 can be used to delete an observation, as the rolling-window sketch below illustrates. The function returns the number of observations included so far.
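For instance, a fixed-width rolling-window regression can be maintained by pairing each insertion with a deletion. This is only a fragment: $reg, @y, and @xrows (an array of x-vector references) are assumed from surrounding code, and deletion via negative weights is numerically delicate in floating point:

    my $window = 20;   # hypothetical window width
    for my $t (0 .. $#y) {
        $reg->include( $y[$t], $xrows[$t], 1.0 );            # newest enters
        $reg->include( $y[$t - $window], $xrows[$t - $window], -1.0 )
            if $t >= $window;                                # oldest leaves
        # $reg->theta() here reflects only the last $window observations
    }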
theta

Estimates and returns the vector of coefficients.
rsq, adjrsq, sigmasq, ybar, sst, k, n

These functions provide common auxiliary information. rsq, adjrsq, sigmasq, sst, and ybar have not been checked but are likely correct. The results are stored for later usage, although this is somewhat unnecessary because the computation is so simple anyway.
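For reference, the standard textbook definitions these accessors correspond to (not checked against the module's exact code) are:

    # Given sse, sst, n observations, and k regressors (incl. the constant):
    sub aux_stats {
        my ($sse, $sst, $n, $k) = @_;
        my $rsq     = 1.0 - $sse / $sst;                          # R^2
        my $adjrsq  = 1.0 - (1.0 - $rsq) * ($n - 1) / ($n - $k);  # adjusted R^2
        my $sigmasq = $sse / ($n - $k);                           # residual variance
        return ($rsq, $adjrsq, $sigmasq);
    }

Here $sst is the (weighted) total sum of squares around ybar, and $sse is the residual sum of squares accumulated by the update step.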
DEBUGGING = SAMPLE USAGE CODE

The sample code included with this package demonstrates regression usage. To execute it, set the constant DEBUGGING at the head of the script to 1 and run

    perl Regression.pm

The printout should begin with a banner line of asterisks, followed by the results of the sample regression.
BUGS/PROBLEMS
INSTALLATION and DOCUMENTATION

Installation consists of moving the file 'Regression.pm' into a subdirectory Statistics of your modules path (e.g., /usr/lib/perl5/site_perl/5.6.0/). The documentation was produced from the module itself:

    pod2html -noindex -title "perl weighted least squares regression package" Regression.pm > Regression.html

The documentation was slightly modified by Maria Voloshina, University of Rochester.
LICENSE

This module is released for free public use under a GPL license. (C) Ivo Welch, 2001.

File path = /ww/webwork/pg/lib/Regression.pm