
## Applied regression analysis / N. R. Draper, H. Smith.

Publisher: New York : Wiley, c1966
Description: ix, 407 p. : ill. ; 24 cm
ISBN: 0471221708
Other classification: 62Jxx

Contents:
```
1 Fitting a Straight Line by Least Squares [1]
1.0 Introduction: The Need for Statistical Analysis [1]
1.1 Linear Relationships between Two Variables [4]
1.2 Linear Regression: Fitting a Straight Line [7]
1.3 The Precision of the Estimated Regression [13]
1.4 Examining the Regression Equation [17]
1.5 Lack of Fit and Pure Error [26]
1.6 The Correlation between X and Y [33]
Exercises [35]
2 The Matrix Approach to Linear Regression [44]
2.0 Introduction [44]
2.1 Fitting a Straight Line in Matrix Terms: The Estimates of β0 and β1 [44]
2.2 The Analysis of Variance in Matrix Terms [53]
2.3 The Variances and Covariance of b0 and b1 from the Matrix Calculation [55]
2.4 The Variance of Ŷ Using the Matrix Development [56]
2.5 Summary of Matrix Approach to Fitting a Straight Line [56]
2.6 The General Regression Situation [58]
2.7 The “Extra Sum of Squares” Principle [67]
2.8 Orthogonal Columns in the X-Matrix [69]
2.9 Partial F-Tests and Sequential F-Tests [71]
2.10 Testing a General Linear Hypothesis in Regression Situations [72]
2.11 Weighted Least Squares [77]
2.12 Bias in Regression Estimates [81]
3 The Examination of Residuals [86]
3.0 Introduction [86]
3.1 Overall Plot [87]
3.2 Time Sequence Plot [88]
3.3 Plot Against Ŷi [90]
3.4 Plot Against the Independent Variables Xji, i = 1, 2, ..., n [91]
3.5 Other Residual Plots [91]
3.6 Statistics for Examination of Residuals [92]
3.7 Correlations among the Residuals [93]
3.8 Outliers [94]
3.9 Examining Runs in the Time Sequence Plot of Residuals [95]
Exercises [100]
4 Two Independent Variables [104]
4.0 Introduction [104]
4.1 Multiple Regression with Two Independent Variables as a Sequence of Straight-Line Regressions [107]
4.2 Examining the Regression Equation [115]
Exercises [124]
5 More Complicated Models [128]
5.0 Introduction [128]
5.1 Polynomial Models of Various Orders in the Xj [129]
5.2 Models Involving Transformations Other Than Integer Powers [131]
5.3 The Use of “Dummy” Variables in Multiple Regression [134]
5.4 Preparing the Input Data Matrix for General Regression Problems [142]
5.5 Orthogonal Polynomials [150]
5.6 Transforming X Matrices to Obtain Orthogonal Columns [156]
Exercises [159]
6 Selecting the “Best” Regression Equation [163]
6.0 Introduction [163]
6.1 All Possible Regressions [164]
6.2 The Backward Elimination Procedure [167]
6.3 The Forward Selection Procedure [169]
6.4 The Stepwise Regression Procedure [171]
6.5 Two Variations on the Four Previous Methods [172]
6.6 The Stagewise Regression Procedure [173]
6.7 A Summary of the Least Squares Equations from Methods Described [177]
6.8 Computational Method for Stepwise Regression [178]
Exercises [195]
7 A Specific Problem [217]
7.0 Introduction [217]
7.1 The Problem [217]
7.2 Examination of the Data [217]
7.3 Choosing the First Variable to Enter Regression [220]
7.4 Construction of New Variables [222]
7.5 The Addition of a Cross-Product Term to the Model [224]
7.6 Enlarging the Model [225]
Exercises [227]
8 Multiple Regression and Mathematical Model Building [234]
8.0 Introduction [234]
8.1 Planning the Model Building Process [236]
8.2 Development of the Mathematical Model [239]
8.3 Verification and Maintenance of the Mathematical Model [240]
9 Multiple Regression Applied to Analysis of Variance Problems [243]
9.0 Introduction [243]
9.1 The One-Way Classification [244]
9.2 Regression Treatment of the One-Way Classification Using the Original Model [245]
9.3 Regression Treatment of the One-Way Classification: Independent Normal Equations [251]
9.4 The Two-Way Classification with Equal Numbers of Observations in the Cells [253]
9.5 Regression Treatment of the Two-Way Classification with Equal Numbers of Observations in the Cells [254]
9.6 Example: The Two-Way Classification [258]
Exercises [260]
10 An Introduction to Nonlinear Estimation [263]
10.1 Introduction [263]
10.2 Least Squares in the Nonlinear Case [264]
10.3 Estimating the Parameters of a Nonlinear System [267]
10.4 An Example [275]
10.5 A Note on Reparameterization of the Model [284]
10.6 The Geometry of Linear Least Squares [285]
10.7 The Geometry of Nonlinear Least Squares [295]
Exercises [299]
Nonlinear Bibliography [301]
Percentage Points of the t-Distribution [305]
Percentage Points of the F-Distribution [306]
```
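
As a companion to the contents above, here is a minimal sketch of the book's opening topic, fitting a straight line by least squares (Chapter 1). The function and data are illustrative, not taken from the book; the estimates follow the standard formulas b1 = Sxy/Sxx and b0 = ȳ − b1·x̄.

```python
# Least-squares fit of a straight line y = b0 + b1*x.
# Illustrative sketch of the method of Draper & Smith, Chapter 1.

def fit_line(x, y):
    """Return (b0, b1), the intercept and slope of the least-squares line."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # Corrected sums of squares and cross-products.
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b1 = sxy / sxx            # slope estimate
    b0 = ybar - b1 * xbar     # intercept estimate
    return b0, b1

# Example: points lying exactly on y = 1 + 2x recover b0 = 1, b1 = 2.
b0, b1 = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```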
| Item type | Shelving location | Call number | Status | Barcode |
| --- | --- | --- | --- | --- |
| Books | Books arranged by subject | 62 D765 | Available | A-3869 |

Bibliography: p. 309-316.


MR, 35 #2415
