Applied linear regression models / John Neter, Michael H. Kutner, Christopher J. Nachtsheim, William Wasserman.
Series: The Irwin series in statistics
Publisher: Chicago ; Buenos Aires : Irwin, c1996
Edition: 3rd ed.
Description: xv, 720 p. : ill. ; 27 cm
ISBN: 025608601X
Other classification: 62J05
Item type | Home library | Shelving location | Call number | Materials specified | Status | Date due | Barcode | Course reserves |
---|---|---|---|---|---|---|---|---|
Books | Instituto de Matemática, CONICET-UNS | Books arranged by subject | 62 N469-3 (Browse shelf) | | Available | | A-7369 | |
Includes bibliographical references (p. 708-714) and index.
Part I Simple Linear Regression --
1 Linear Regression with One Predictor Variable [3] --
1.1 Relations between Variables [3] --
1.2 Regression Models and Their Uses [6] --
1.3 Simple Linear Regression Model with Distribution of Error Terms Unspecified [10] --
1.4 Data for Regression Analysis [14] --
1.5 Overview of Steps in Regression Analysis [15] --
1.6 Estimation of Regression Function [17] --
1.7 Estimation of Error Terms Variance σ² [27] --
1.8 Normal Error Regression Model [29] --
2 Inferences in Regression Analysis [44] --
2.1 Inferences concerning β1 [44] --
2.2 Inferences concerning β0 [53] --
2.3 Some Considerations on Making Inferences concerning β0 and β1 [54] --
2.4 Interval Estimation of E{Yh} [56] --
2.5 Prediction of New Observation [61] --
2.6 Confidence Band for Regression Line [67] --
2.7 Analysis of Variance Approach to Regression Analysis [69] --
2.8 General Linear Test Approach [78] --
2.9 Descriptive Measures of Association between X and Y in Regression Model [80] --
2.10 Considerations in Applying Regression Analysis [84] --
2.11 Case when X is Random [85] --
3 Diagnostics and Remedial Measures [95] --
3.1 Diagnostics for Predictor Variable [95] --
3.2 Residuals [97] --
3.3 Diagnostics for Residuals [98] --
3.4 Overview of Tests Involving Residuals [110] --
3.5 Correlation Test for Normality [111] --
3.6 Tests for Constancy of Error Variance [112] --
3.7 F Test for Lack of Fit [115] --
3.8 Overview of Remedial Measures [124] --
3.9 Transformations [126] --
3.10 Exploration of Shape of Regression Function [135] --
3.11 Case Example—Plutonium Measurement [138] --
4 Simultaneous Inferences and Other Topics in Regression Analysis [152] --
4.1 Joint Estimation of β0 and β1 [152] --
4.2 Simultaneous Estimation of Mean Responses [155] --
4.3 Simultaneous Prediction Intervals for New Observations [158] --
4.4 Regression through Origin [159] --
4.5 Effects of Measurement Errors [164] --
4.6 Inverse Predictions [167] --
4.7 Choice of X Levels [169] --
5 Matrix Approach to Simple Linear Regression Analysis [176] --
5.1 Matrices [176] --
5.2 Matrix Addition and Subtraction [180] --
5.3 Matrix Multiplication [182] --
5.4 Special Types of Matrices [185] --
5.5 Linear Dependence and Rank of Matrix [188] --
5.6 Inverse of a Matrix [189] --
5.7 Some Basic Theorems for Matrices [194] --
5.8 Random Vectors and Matrices [194] --
5.9 Simple Linear Regression Model in Matrix Terms [198] --
5.10 Least Squares Estimation of Regression Parameters [200] --
5.11 Fitted Values and Residuals [202] --
5.12 Analysis of Variance Results [205] --
5.13 Inferences in Regression Analysis [208] --
Part II Multiple Linear Regression --
6 Multiple Regression—I [217] --
6.1 Multiple Regression Models [217] --
6.2 General Linear Regression Model in Matrix Terms [225] --
6.3 Estimation of Regression Coefficients [227] --
6.4 Fitted Values and Residuals [227] --
6.5 Analysis of Variance Results [228] --
6.6 Inferences about Regression Parameters [231] --
6.7 Estimation of Mean Response and Prediction of New Observation [233] --
6.8 Diagnostics and Remedial Measures [236] --
6.9 An Example—Multiple Regression with Two Predictor Variables [241] --
7 Multiple Regression—II [260] --
7.1 Extra Sums of Squares [260] --
7.2 Uses of Extra Sums of Squares in Tests for Regression Coefficients [268] --
7.3 Summary of Tests concerning Regression Coefficients [271] --
7.4 Coefficients of Partial Determination [274] --
7.5 Standardized Multiple Regression Model [277] --
7.6 Multicollinearity and Its Effects [285] --
7.7 Polynomial Regression Models [296] --
7.8 Interaction Regression Models [308] --
7.9 Constrained Regression [315] --
8 Building the Regression Model I: Selection of Predictor Variables [327] --
8.1 Overview of Model-Building Process [327] --
8.2 Surgical Unit Example [334] --
8.3 All-Possible-Regressions Procedure for Variables Reduction [336] --
8.4 Forward Stepwise Regression and Other Automatic Search Procedures for Variables Reduction [347] --
8.5 Some Final Comments on Model Building for Exploratory Observational Studies [353] --
9 Building the Regression Model II: Diagnostics [361] --
9.1 Model Adequacy for a Predictor Variable—Partial Regression Plots [361] --
9.2 Identifying Outlying Y Observations—Studentized Deleted Residuals [368] --
9.3 Identifying Outlying X Observations—Hat Matrix Leverage Values [375] --
9.4 Identifying Influential Cases—DFFITS, Cook’s Distance, and DFBETAS Measures [378] --
9.5 Multicollinearity Diagnostics—Variance Inflation Factor [385] --
9.6 Surgical Unit Example—Continued [388] --
10 Building the Regression Model III: Remedial Measures and Validation [400] --
10.1 Unequal Error Variances Remedial Measures—Weighted Least Squares [400] --
10.2 Multicollinearity Remedial Measures—Ridge Regression [410] --
10.3 Remedial Measures for Influential Cases—Robust Regression [416] --
10.4 Remedial Measures for Unknown Response Function—Nonparametric Regression [425] --
10.5 Remedial Measures for Evaluating Precision in Nonstandard Situations—Bootstrapping [429] --
10.6 Model Validation [434] --
10.7 Case Example—Mathematics Proficiency [439] --
11 Qualitative Predictor Variables [455] --
11.1 One Qualitative Predictor Variable [455] --
11.2 Model Containing Interaction Effects [461] --
11.3 More Complex Models [464] --
11.4 Comparison of Two or More Regression Functions [468] --
11.5 Other Uses of Indicator Variables [474] --
11.6 Some Considerations in Using Indicator Variables [480] --
11.7 Case Example—MNDOT Traffic Estimation [483] --
12 Autocorrelation in Time Series Data [497] --
12.1 Problems of Autocorrelation [497] --
12.2 First-Order Autoregressive Error Model [501] --
12.3 Durbin-Watson Test for Autocorrelation [504] --
12.4 Remedial Measures for Autocorrelation [507] --
12.5 Forecasting with Autocorrelated Error Terms [517] --
Part III Nonlinear Regression --
13 Introduction to Nonlinear Regression [531] --
13.1 Linear and Nonlinear Regression Models [531] --
13.2 Example [535] --
13.3 Least Squares Estimation in Nonlinear Regression [536] --
13.4 Model Building and Diagnostics [547] --
13.5 Inferences about Nonlinear Regression Parameters [548] --
13.6 Learning Curve Example [555] --
14 Logistic Regression, Poisson Regression, and Generalized Linear Models [567] --
14.1 Regression Models with Binary Response Variable [567] --
14.2 Simple Logistic Response Function [570] --
14.3 Simple Logistic Regression [573] --
14.4 Multiple Logistic Regression [580] --
14.5 Model Building: Selection of Predictor Variables [585] --
14.6 Diagnostics [590] --
14.7 Inferences about Logistic Regression Parameters [599] --
14.8 Inferences about Mean Response [602] --
14.9 Prediction of a New Observation [605] --
14.10 Polytomous Logistic Regression [608] --
14.11 Poisson Regression [609] --
14.12 Generalized Linear Models [614] --
Part IV Correlation Analysis --
15 Normal Correlation Models [631] --
15.1 Distinction between Regression and Correlation Models [631] --
15.2 Bivariate Normal Distribution [632] --
15.3 Conditional Inferences [636] --
15.4 Inferences on Correlation Coefficients [640] --
15.5 Multivariate Normal Distribution [645] --
15.6 Spearman Rank Correlation Coefficient [651] --