Spectral analysis and time series / M.B. Priestley.
Series: Probability and mathematical statistics
Publisher: London ; New York : Academic Press, 1981
Description: 2 v. (xvii, [45], 890 p.) : ill. ; 24 cm
ISBN: 0125649010 (v. 1); 0125649029 (v. 2)
Subject(s): Time-series analysis | Spectral theory (Mathematics)
Other classification: *CODIGO*
Item type | Home library | Shelving location | Call number | Materials specified | Status | Date due | Barcode
---|---|---|---|---|---|---|---
Books | Instituto de Matemática, CONICET-UNS | Books shelved by subject | 62 P949 | v. 1 and v. 2 | Available | | A-9346
Includes bibliographies and indexes.
v. 1. Univariate series.--
Contents --
Preface vii --
Preface to Volume 2 ix --
Contents of Volume 2 xvi --
List of Main Notation xviii --
Volume [1] --
Chapter 1. Basic Concepts [1] --
1.1. The Nature of Spectral Analysis [1] --
1.2. Types of Processes [2] --
1.3. Periodic and Non-periodic Functions [3] --
1.4. Generalized Harmonic Analysis [7] --
1.5. Energy Distribution [8] --
1.6. Random Processes [10] --
1.7. Stationary Random Processes [14] --
1.8. Spectral Analysis of Stationary Random Processes [15] --
1.9. Spectral Analysis of Non-stationary Random Processes [17] --
1.10. Time Series Analysis: Use of Spectral Analysis in Practice [18] --
Chapter 2. Elements of Probability Theory [28] --
2.1. Introduction [28] --
2.2. Some Terminology [28] --
2.3. Definition of Probability [30] --
2.3.1. Axiomatic approach to probability [31] --
2.3.2. The classical definition [33] --
2.3.3. Probability spaces [34] --
2.4. Conditional Probability and Independence [34] --
2.4.1. Independent events [36] --
2.5. Random Variables [37] --
2.5.1. Defining probabilities for random variables [38] --
2.6. Distribution Functions [39] --
2.6.1. Decomposition of distribution functions [40] --
2.7. Discrete, Continuous and Mixed Distributions [41] --
2.8. Means, Variances and Moments [47] --
2.8.1. Chebyshev’s inequality [57] --
2.9. The Expectation Operator [52] --
2.9.1. General transformations [53] --
2.10. Generating Functions [55] --
2.10.1. Probability generating functions [55] --
2.10.2. Moment generating functions [55] --
2.10.3. Characteristic functions [57] --
2.10.4. Cumulant generating functions [58] --
2.11. Some Special Distributions [59] --
2.12. Bivariate Distributions [67] --
2.12.1. Bivariate cumulative distribution functions [68] --
2.12.2. Marginal distributions [69] --
2.12.3. Conditional distributions [70] --
2.12.4. Independent random variables [71] --
2.12.5. Expectation for bivariate distributions [72] --
2.12.6. Conditional expectation [73] --
2.12.7. Bivariate moments [77] --
2.12.8. Bivariate moment generating functions and characteristic functions [82] --
2.12.9. Bivariate normal distribution [84] --
2.12.10. Transformations of variables [86] --
2.13. Multivariate Distributions [87] --
2.13.1. Mean and variance of linear combinations [89] --
2.13.2. Multivariate normal distribution [90] --
2.14. The Law of Large Numbers and the Central Limit Theorem [92] --
2.14.1. The law of large numbers [92] --
2.14.2. Application to the binomial distribution [94] --
2.14.3. The central limit theorem [95] --
2.14.4. Properties of means and variances of random samples [96] --
Chapter 3. Stationary Random Processes [100] --
3.1. Probabilistic Description of Random Processes [100] --
3.1.1. Realizations and ensembles [101] --
3.1.2. Specification of a random process [101] --
3.2. Stationary Processes [104] --
3.3. The Autocovariance and Autocorrelation Functions [106] --
3.3.1. General properties of R(τ) and ρ(τ) [108] --
3.3.2. Positive semi-definite property of R(τ) and ρ(τ) [109] --
3.3.3. Complex valued processes [110] --
3.3.4. Use of the autocorrelation function in practice [111] --
3.4. Stationary and Evolutionary Processes [112] --
3.4.1. Gaussian (normal) processes [113] --
3.5. Special Discrete Parameter Models [114] --
3.5.1. Purely random processes: “white noise” [114] --
3.5.2. First order autoregressive processes (linear Markov processes) [116] --
3.5.3. Second order autoregressive processes [123] --
3.5.4. Autoregressive processes of general order [132] --
3.5.5. Moving average processes [135] --
3.5.6. Mixed autoregressive/moving average processes [138] --
3.5.7. The general linear process [141] --
3.5.8. Harmonic processes [147] --
3.6. Stochastic Limiting Operations [150] --
3.6.1. Stochastic continuity [151] --
3.6.2. Stochastic differentiability [153] --
3.6.3. Integration [154] --
3.6.4. Interpretation of the derivatives of the autocorrelation function [155] --
3.7. Standard Continuous Parameter Models [156] --
3.7.1. Purely random processes (continuous “white noise”) [156] --
3.7.2. First order autoregressive processes [158] --
3.7.3. Examples of first order autoregressive processes [167] --
3.7.4. Second order autoregressive processes [169] --
3.7.5. Autoregressive processes of general order [174] --
3.7.6. Moving average processes [174] --
3.7.7. Mixed autoregressive/moving average processes [176] --
3.7.8. General linear processes [177] --
3.7.9. Harmonic processes [179] --
3.7.10. Filtered Poisson process, shot noise and Campbell’s theorem [179] --
Chapter 4. Spectral Analysis [184] --
4.1. Introduction [184] --
4.2. Fourier Series for Functions with Periodicity 2π [184] --
4.2.1. The L2 theory for Fourier series [189] --
4.2.2. Geometrical interpretation of Fourier series: Hilbert space [190] --
4.3. Fourier Series for Functions of General Periodicity [194] --
4.4. Spectral Analysis of Periodic Functions [194] --
4.4.1. Functions of general periodicity [197] --
4.5. Non-periodic Functions: Fourier Integral [198] --
4.5.1. Nature of conditions for the existence of Fourier series and Fourier integrals [202] --
4.6. Spectral Analysis of Non-periodic Functions [204] --
4.7. Spectral Analysis of Stationary Processes [206] --
4.8. Relationship between the Spectral Density Function and the Autocovariance and Autocorrelation Functions [210] --
4.8.1. Normalized power spectra [215] --
4.8.2. The Wiener-Khintchine theorem [218] --
4.8.3. Discrete parameter processes [222] --
4.9. Decomposition of the Integrated Spectrum [226] --
4.10. Examples of Spectra for some Simple Models [233] --
4.11. Spectral Representation of Stationary Processes [243] --
4.12. Linear Transformations and Filters [263] --
4.12.1. Filter terminology: gain and phase [270] --
4.12.2. Operational forms of filter relationships: calculation of spectra [276] --
4.12.3. Transformation of the autocovariance function [280] --
4.12.4. Processes with rational spectra [283] --
4.12.5. Axiomatic treatment of filters [285] --
Chapter 5. Estimation in the Time Domain [291] --
5.1. Time Series Analysis [291] --
5.2. Basic Ideas of Statistical Inference [292] --
5.2.1. Point estimation [295] --
5.2.2. General methods of estimation [304] --
5.3. Estimation of Autocovariance and Autocorrelation Functions [317] --
5.3.1. Form of the data [317] --
5.3.2. Estimation of the mean [318] --
5.3.3. Estimation of the autocovariance function [321] --
5.3.4. Estimation of the autocorrelation function [330] --
5.3.5. Asymptotic distribution of the sample mean, autocovariances and autocorrelations [337] --
5.3.6. The ergodic property [340] --
5.3.7. Continuous parameter processes [343] --
5.4. Estimation of Parameters in Standard Models [345] --
5.4.1. Estimation of parameters in autoregressive models [346] --
5.4.2. Estimation of parameters in moving average models [354] --
5.4.3. Estimation of parameters in mixed ARMA models [359] --
5.4.4. Confidence intervals for the parameters [364] --
5.4.5. Determining the order of the model [370] --
5.4.6. Continuous parameter models [380] --
5.5. Analysis of the Canadian Lynx Data [384] --
Chapter 6. Estimation in the Frequency Domain [389] --
6.1. Discrete spectra [390] --
6.1.1. Estimation [391] --
6.1.2. Periodogram analysis [394] --
6.1.3. Sampling properties of the periodogram [397] --
6.1.4. Tests for periodogram ordinates [406] --
6.2. Continuous Spectra [415] --
6.2.1. Finite Fourier transforms [418] --
6.2.2. Properties of the periodogram of a linear process [420] --
6.2.3. Consistent estimates of the spectral density function: spectral windows [432] --
6.2.4. Sampling properties of spectral estimates [449] --
6.2.5. Estimation of the integrated spectrum [471] --
6.2.6. Goodness-of-fit tests [475] --
6.2.7. Continuous parameter processes [494] --
6.3. Mixed Spectra [499] --
Chapter 7. Spectral Analysis in Practice [502] --
7.1. Setting up a Spectral Analysis [502] --
7.1.1. The aliasing effect [504] --
7.2. Measures of Precision of Spectral Estimates [510] --
7.3. Resolvability and Bandwidth [513] --
7.3.1. Role of spectral bandwidth [513] --
7.3.2. Role of window bandwidth [517] --
7.4. Design Relations for Spectral Estimation: Choice of Window Parameters, Record Length and Frequency Interval [528] --
7.4.1. Prewhitening and tapering [556] --
7.5. Choice of Window [563] --
7.6. Computation of Spectral Estimates: The Fast Fourier Transform [575] --
7.7. Trend Removal and Seasonal Adjustment: Regression Analysis and Digital Filters [537] --
7.8. Autoregressive, ARMA, and Maximum Entropy Spectral Estimation; CAT Criterion [600] --
7.9. Some Examples [607] --
Chapter 8. Analysis of Processes with Mixed Spectra [613] --
8.1. Nature of the Problem [613] --
8.2. Types of Models [614] --
8.3. Tests Based on the Periodogram [616] --
8.4. The P(λ) Test [626] --
8.5. Analysis of Simulated Series [642] --
8.6. Estimation of the Spectral Density Function [648] --
Appendix Ai --
v. 2. Multivariate series, prediction and control.
Volume [2] --
Chapter 9. Multivariate and Multidimensional Processes [654] --
9.1. Correlation and Spectral Properties of Multivariate Stationary Processes [655] --
9.2. Linear Relationships [669] --
9.2.1. Linear relationships with added noise [671] --
9.2.2. The Box-Jenkins “transfer function” model [676] --
9.3. Multiple and Partial Coherency [681] --
9.4. Multivariate AR, MA, and ARMA Models [685] --
9.5. Estimation of Cross-spectra [692] --
9.5.1. Estimation of co-spectra and quadrature spectra [701] --
9.5.2. Estimation of cross-amplitude and phase spectra [702] --
9.5.3. Estimation of coherency --
9.5.4. Practical considerations: alignment techniques [706] --
9.6. Numerical Examples [712] --
9.7. Correlation and Spectral Properties of Multidimensional Processes [718] --
9.7.1. Two-dimensional mixed spectra [724] --
Chapter 10. Prediction, Filtering and Control [727] --
10.1. The Prediction Problem [727] --
10.1.1. Spectral factorization and linear representations [730] --
10.1.2. Geometric representation of linear prediction [736] --
10.1.3. The Kolmogorov approach [738] --
10.1.4. The Wiener approach [748] --
10.1.5. The Wold decomposition [755] --
10.1.6. Prediction for multivariate processes [760] --
10.2. The Box-Jenkins Approach to Forecasting [762] --
10.3. The Filtering Problem [773] --
10.3.1. The Wiener filter [775] --
10.4. Linear Control Systems [780] --
10.4.1. Minimum variance control [781] --
10.4.2. System identification [786] --
10.4.3. Multivariate systems [791] --
10.4.4. State space representations [797] --
10.5. Multivariate Time Series Model Fitting [800] --
10.5.1. Identifiability of multivariate ARMA models [801] --
10.5.2. Fitting state space models via canonical correlations analysis [804] --
10.6. State Space Approach to Forecasting and Filtering Problems: Kalman Filtering [807] --
Chapter 11. Non-stationarity and Non-linearity [816] --
11.1. Spectral Analysis of Non-stationary Processes: Basic Considerations [817] --
11.2. The Theory of Evolutionary Spectra [821] --
11.2.1. Estimation of evolutionary spectra [837] --
11.2.2. Complex demodulation [848] --
11.3. Some Other Definitions of Time-dependent “Spectra” [855] --
11.4. Prediction, Filtering and Control for Non-stationary Processes [859] --
11.5. General Non-linear Models [866] --
11.5.1. Volterra expansions and polyspectra [869] --
11.6. Some Special Non-linear Models: Bilinear, Threshold, and Exponential Models [877] --
References Ri --
Author Index li --
Subject Index lvii --