
## Applied linear regression models / John Neter, Michael H. Kutner, Christopher J. Nachtsheim, William Wasserman.

Publisher: Chicago ; Buenos Aires : Irwin, c1996. Edition: 3rd ed. Description: xv, 720 p. : ill. ; 27 cm. ISBN: 025608601X. Other classification: 62J05
Contents:
```Part I
Simple Linear Regression
1 Linear Regression with One Predictor Variable 
1.1 Relations between Variables 
1.2 Regression Models and Their Uses 
1.3 Simple Linear Regression Model with Distribution of Error Terms Unspecified 
1.4 Data for Regression Analysis 
1.5 Overview of Steps in Regression Analysis 
1.6 Estimation of Regression Function 
1.7 Estimation of Error Terms Variance σ² 
1.8 Normal Error Regression Model 
2 Inferences in Regression Analysis 
2.1 Inferences concerning β₁ 
2.2 Inferences concerning β₀ 
2.3 Some Considerations on Making Inferences concerning β₀ and β₁ 
2.4 Interval Estimation of E{Yh} 
2.5 Prediction of New Observation 
2.6 Confidence Band for Regression Line 
2.7 Analysis of Variance Approach to Regression Analysis 
2.8 General Linear Test Approach 
2.9 Descriptive Measures of Association between X and Y in Regression Model 
2.10 Considerations in Applying Regression Analysis 
2.11 Case when X is Random 
3 Diagnostics and Remedial Measures 
3.1 Diagnostics for Predictor Variable 
3.2 Residuals 
3.3 Diagnostics for Residuals 
3.4 Overview of Tests Involving Residuals 
3.5 Correlation Test for Normality 
3.6 Tests for Constancy of Error Variance 
3.7 F Test for Lack of Fit 
3.8 Overview of Remedial Measures 
3.9 Transformations 
3.10 Exploration of Shape of Regression Function 
3.11 Case Example—Plutonium Measurement 
4 Simultaneous Inferences and Other Topics in Regression Analysis 
4.1 Joint Estimation of β₀ and β₁ 
4.2 Simultaneous Estimation of Mean Responses 
4.3 Simultaneous Prediction Intervals for New Observations 
4.4 Regression through Origin 
4.5 Effects of Measurement Errors 
4.6 Inverse Predictions 
4.7 Choice of X Levels 
5 Matrix Approach to Simple Linear Regression Analysis 
5.1 Matrices 
5.2 Matrix Addition and Subtraction 
5.3 Matrix Multiplication 
5.4 Special Types of Matrices 
5.5 Linear Dependence and Rank of Matrix 
5.6 Inverse of a Matrix 
5.7 Some Basic Theorems for Matrices 
5.8 Random Vectors and Matrices 
5.9 Simple Linear Regression Model in Matrix Terms 
5.10 Least Squares Estimation of Regression Parameters 
5.11 Fitted Values and Residuals 
5.12 Analysis of Variance Results 
5.13 Inferences in Regression Analysis 
Part II
Multiple Linear Regression
6 Multiple Regression—I 
6.1 Multiple Regression Models 
6.2 General Linear Regression Model in Matrix Terms 
6.3 Estimation of Regression Coefficients 
6.4 Fitted Values and Residuals 
6.5 Analysis of Variance Results 
6.6 Inferences about Regression Parameters 
6.7 Estimation of Mean Response and Prediction of New Observation 
6.8 Diagnostics and Remedial Measures 
6.9 An Example—Multiple Regression with Two Predictor Variables 
7 Multiple Regression—II 
7.1 Extra Sums of Squares 
7.2 Uses of Extra Sums of Squares in Tests for Regression Coefficients 
7.3 Summary of Tests concerning Regression Coefficients 
7.4 Coefficients of Partial Determination 
7.5 Standardized Multiple Regression Model 
7.6 Multicollinearity and Its Effects 
7.7 Polynomial Regression Models 
7.8 Interaction Regression Models 
7.9 Constrained Regression 
8 Building the Regression Model I: Selection of Predictor Variables 
8.1 Overview of Model-Building Process 
8.2 Surgical Unit Example 
8.3 All-Possible-Regressions Procedure for Variables Reduction 
8.4 Forward Stepwise Regression and Other Automatic Search Procedures for Variables Reduction 
8.5 Some Final Comments on Model Building for Exploratory Observational Studies 
9 Building the Regression Model II: Diagnostics 
9.1 Model Adequacy for a Predictor Variable—Partial Regression Plots 
9.2 Identifying Outlying Y Observations—Studentized Deleted Residuals 
9.3 Identifying Outlying X Observations—Hat Matrix Leverage Values 
9.4 Identifying Influential Cases—DFFITS, Cook’s Distance, and DFBETAS Measures 
9.5 Multicollinearity Diagnostics—Variance Inflation Factor 
9.6 Surgical Unit Example—Continued 
10 Building the Regression Model III: Remedial Measures and Validation 
10.1 Unequal Error Variances Remedial Measures—Weighted Least Squares 
10.2 Multicollinearity Remedial Measures—Ridge Regression 
10.3 Remedial Measures for Influential Cases—Robust Regression 
10.4 Remedial Measures for Unknown Response Function—Nonparametric Regression 
10.5 Remedial Measures for Evaluating Precision in Nonstandard Situations—Bootstrapping 
10.6 Model Validation 
10.7 Case Example—Mathematics Proficiency 
11 Qualitative Predictor Variables 
11.1 One Qualitative Predictor Variable 
11.2 Model Containing Interaction Effects 
11.3 More Complex Models 
11.4 Comparison of Two or More Regression Functions 
11.5 Other Uses of Indicator Variables 
11.6 Some Considerations in Using Indicator Variables 
11.7 Case Example—MNDOT Traffic Estimation 
12 Autocorrelation in Time Series Data 
12.1 Problems of Autocorrelation 
12.2 First-Order Autoregressive Error Model 
12.3 Durbin-Watson Test for Autocorrelation 
12.4 Remedial Measures for Autocorrelation 
12.5 Forecasting with Autocorrelated Error Terms 
Part III
Nonlinear Regression
13 Introduction to Nonlinear Regression 
13.1 Linear and Nonlinear Regression Models 
13.2 Example 
13.3 Least Squares Estimation in Nonlinear Regression 
13.4 Model Building and Diagnostics 
13.5 Inferences about Nonlinear Regression Parameters 
13.6 Learning Curve Example 
14 Logistic Regression, Poisson Regression, and Generalized Linear Models 
14.1 Regression Models with Binary Response Variable 
14.2 Simple Logistic Response Function 
14.3 Simple Logistic Regression 
14.4 Multiple Logistic Regression 
14.5 Model Building: Selection of Predictor Variables 
14.6 Diagnostics 
14.7 Inferences about Logistic Regression Parameters 
14.8 Inferences about Mean Response 
14.9 Prediction of a New Observation 
14.10 Polytomous Logistic Regression 
14.11 Poisson Regression 
14.12 Generalized Linear Models 
Part IV
Correlation Analysis
15 Normal Correlation Models 
15.1 Distinction between Regression and Correlation Models 
15.2 Bivariate Normal Distribution 
15.3 Conditional Inferences 
15.4 Inferences on Correlation Coefficients 
15.5 Multivariate Normal Distribution 
15.6 Spearman Rank Correlation Coefficient
```
| Item type | Shelving location | Call number | Status | Barcode |
| --- | --- | --- | --- | --- |
| Libros | Arranged by subject | 62 N469-3 | Available | A-7369 |

Includes bibliographical references (p. 708-714) and index.


