Applied regression analysis / N. R. Draper, H. Smith.

By: Draper, Norman Richard
Contributor(s): Smith, Harry, 1923-
Series: Wiley series in probability and mathematical statistics
Publisher: New York : Wiley, c1981
Edition: 2nd ed.
Description: xiv, 709 p. : ill. ; 24 cm
ISBN: 0471029955
Subject(s): Regression analysis
Other classification: 62-01 (62J05)
Online resources: Contributor biographical information | Publisher description | Table of contents only
Contents:
1 Fitting a Straight Line by Least Squares [1]
1.0 Introduction: The Need for Statistical Analysis [1]
1.1 Straight Line Relationships between Two Variables [5]
1.2 Linear Regression: Fitting a Straight Line [8]
1.3 The Precision of the Estimated Regression [17]
1.4 Examining the Regression Equation [22]
1.5 Lack of Fit and Pure Error [33]
1.6 The Correlation between X and Y [43]
1.7 Inverse Regression (Straight Line Case) [47]
1.8 Some Practical Implications of Chapter 1 [51]
Exercises [55]
2 The Matrix Approach to Linear Regression [70]
2.0 Introduction [70]
2.1 Fitting a Straight Line in Matrix Terms: The Estimates of β₀ and β₁ [70]
2.2 The Analysis of Variance in Matrix Terms [80]
2.3 The Variances and Covariance of b₀ and b₁ from the Matrix Calculation [82]
2.4 Variance of Ŷ Using the Matrix Development [83]
2.5 Summary of Matrix Approach to Fitting a Straight Line [84]
2.6 The General Regression Situation [85]
2.7 The “Extra Sum of Squares” Principle [97]
2.8 Orthogonal Columns in the X-Matrix [98]
2.9 Partial F-Tests and Sequential F-Tests [101]
2.10 Testing a General Linear Hypothesis in Regression Situations [102]
2.11 Weighted Least Squares [108]
2.12 Bias in Regression Estimates [117]
2.13 Restricted Least Squares [122]
2.14 Some Notes on Errors in the Predictors (As Well as in the Response) [122]
2.15 Inverse Regression (Multiple Predictor Case) [125]
Appendix 2A Selected Useful Matrix Results [126]
Appendix 2B Expected Value of Extra Sum of Squares [128]
Appendix 2C How Significant Should My Regression Be? [129]
Appendix 2D Lagrange's Undetermined Multipliers [134]
Exercises [136]
3 The Examination of Residuals [141]
3.0 Introduction [141]
3.1 Overall Plot [142]
3.2 Time Sequence Plot [145]
3.3 Plot Against Ŷi [147]
3.4 Plot Against the Predictor Variables Xji, i = 1, 2, ..., n [148]
3.5 Other Residuals Plots [149]
3.6 Statistics for Examination of Residuals [150]
3.7 Correlations among the Residuals [151]
3.8 Outliers [152]
3.9 Serial Correlation in Residuals [153]
3.10 Examining Runs in the Time Sequence Plot of Residuals [157]
3.11 The Durbin-Watson Test for a Certain Type of Serial Correlation [162]
3.12 Detection of Influential Observations [169]
Appendix 3A Normal and Half-Normal Plots [177]
Exercises [183]
4 Two Predictor Variables [193]
4.0 Introduction [193]
4.1 Multiple Regression with Two Predictor Variables as a Sequence of Straight-Line Regressions [196]
4.2 Examining the Regression Equation [204]
Exercises [212]
5 More Complicated Models [218]
5.0 Introduction [218]
5.1 Polynomial Models of Various Orders in the Xj [219]
5.2 Models Involving Transformations Other Than Integer Powers [221]
5.3 Families of Transformations [225]
5.4 The Use of “Dummy” Variables in Multiple Regression [241]
5.5 Centering and Scaling; Performing the Regression in Correlation Form [257]
5.6 Orthogonal Polynomials [266]
5.7 Transforming X Matrices to Obtain Orthogonal Columns [275]
5.8 Regression Analysis of Summary Data [278]
Exercises [280]
6 Selecting the “Best” Regression Equation [294]
6.0 Introduction [294]
6.1 All Possible Regressions [296]
6.2 “Best Subset” Regression [303]
6.3 The Backward Elimination Procedure [305]
6.4 The Stepwise Regression Procedure [307]
6.5 A Drawback to Understand but Not Be Overly Concerned About [311]
6.6 Variations on the Previous Methods [312]
6.7 Ridge Regression [313]
6.8 PRESS [325]
6.9 Principal Component Regression [327]
6.10 Latent Root Regression [332]
6.11 The Stagewise Regression Procedure [337]
6.12 Summary [341]
6.13 Computational Method for Stepwise Regression [342]
6.14 Robust Regression [342]
6.15 Some Comments on Statistical Computer Packages [344]
Appendix 6A Canonical Form of Ridge Regression [349]
Exercises [352]
7 Two Specific Problems [380]
7.0 Introduction [380]
7.1 The First Problem [380]
7.2 Examination of the Data [381]
7.3 Choosing the First Variable to Enter Regression [383]
7.4 Construction of New Variables [386]
7.5 The Addition of a Cross-Product Term to the Model [386]
7.6 Enlarging the Model [388]
7.7 The Second Problem. Worked Examples of Second-Order Surface Fitting for k = 3 and k = 2 Variables [390]
Exercises [404]
8 Multiple Regression and Mathematical Model Building [412]
8.0 Introduction [412]
8.1 Planning the Model Building Process [414]
8.2 Development of the Mathematical Model [418]
8.3 Validation and Maintenance of the Mathematical Model [419]
9 Multiple Regression Applied to Analysis of Variance Problems [423]
9.0 Introduction [423]
9.1 The One-Way Classification: An Example [424]
9.2 Regression Treatment of the One-Way Classification Example [427]
9.3 The One-Way Classification [431]
9.4 Regression Treatment of the One-Way Classification Using the Original Model [432]
9.5 Regression Treatment of the One-Way Classification: Independent Normal Equations [437]
9.6 The Two-Way Classification with Equal Numbers of Observations in the Cells: An Example [439]
9.7 Regression Treatment of the Two-Way Classification Example [441]
9.8 The Two-Way Classification with Equal Numbers of Observations in the Cells [445]
9.9 Regression Treatment of the Two-Way Classification with Equal Numbers of Observations in the Cells [447]
9.10 Example: The Two-Way Classification [451]
9.11 Comments [453]
Exercises [454]
10 An Introduction to Nonlinear Estimation [458]
10.0 Introduction [458]
10.1 Least Squares in the Nonlinear Case [459]
10.2 Estimating the Parameters of a Nonlinear System [462]
10.3 An Example [475]
10.4 A Note on Reparameterization of the Model [488]
10.5 The Geometry of Linear Least Squares [489]
10.6 The Geometry of Nonlinear Least Squares [500]
10.7 Nonlinear Growth Models [505]
10.8 Nonlinear Models: Other Work [513]
Exercises [517]
Normal Distribution [530]
Percentage Points of the t-Distribution [532]
Percentage Points of the F-Distribution [533]
Holdings:
Item type: Books
Home library: Instituto de Matemática, CONICET-UNS
Shelving location: Books arranged by subject
Call number: 62 D765-2
Status: Available
Barcode: A-5945

REGRESSION ANALYSIS


Includes index.

Bibliography: p. 675-699.

MR 82f:62002

