Large Covariance and Autocovariance Matrices: Chapman & Hall/CRC Monographs on Statistics and Applied Probability
Authors: Arup Bose, Monika Bhattacharjee | Language: English | Paperback – 18 Dec 2020
Large Covariance and Autocovariance Matrices brings together a collection of recent results on sample covariance and autocovariance matrices in high-dimensional models and novel ideas on how to use them for statistical inference in one or more high-dimensional time series models. The prerequisites include knowledge of elementary multivariate analysis, basic time series analysis and basic results in stochastic convergence.
Part I covers different methods of estimating large covariance and autocovariance matrices and the properties of these estimators. Part II covers the relevant material on random matrix theory and non-commutative probability. Part III provides results on limit spectra and asymptotic normality of traces of symmetric matrix polynomial functions of sample autocovariance matrices in high-dimensional linear time series models. These results are used to develop graphical and significance tests for different hypotheses involving one or more independent high-dimensional linear time series.
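As a rough illustration of the objects treated in these parts, the sketch below simulates a p-dimensional MA(1) linear time series, forms its sample autocovariance matrices, and computes the eigenvalues and normalized trace of a simple symmetric matrix polynomial in them. This is only a minimal simulation sketch, assuming a diagonal MA(1) coefficient matrix A and sizes p = 200, n = 400 chosen for illustration; it does not reproduce any estimator or test from the book.

```python
import numpy as np

# Minimal simulation sketch (assumed setup, not the book's code):
# a p-dimensional MA(1) linear time series X_t = Z_t + A Z_{t-1},
# its sample autocovariance matrices, and the spectrum of a simple
# symmetric matrix polynomial built from them.

rng = np.random.default_rng(0)

p, n = 200, 400                 # dimension and sample size, p/n -> y = 0.5
A = 0.5 * np.eye(p)             # MA(1) coefficient matrix (illustrative choice)

Z = rng.standard_normal((p, n + 1))   # i.i.d. innovations
X = Z[:, 1:] + A @ Z[:, :-1]          # X_t = Z_t + A Z_{t-1}, t = 1..n

def sample_autocov(X, u):
    """Sample autocovariance matrix Gamma_hat_u = (1/n) * sum_t X_{t+u} X_t'."""
    p, n = X.shape
    return X[:, u:] @ X[:, :n - u].T / n

G0 = sample_autocov(X, 0)       # sample covariance matrix
G1 = sample_autocov(X, 1)       # lag-1 sample autocovariance matrix

# A symmetric polynomial in the sample autocovariances, here G1 * G1'.
P = G1 @ G1.T

eigvals = np.linalg.eigvalsh(P)  # empirical spectral distribution of P
trace_stat = np.trace(P) / p     # normalized trace of the polynomial

print(f"largest eigenvalue: {eigvals[-1]:.3f}, normalized trace: {trace_stat:.3f}")
```

In the regime studied in the book, the limiting behaviour of such spectra and traces is described through random matrix theory and free probability; the snippet only makes the matrices and the polynomial concrete.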
The book should be of interest to people in econometrics and statistics (large covariance matrices and high-dimensional time series), mathematics (random matrices and free probability) and computer science (wireless communication). Parts of it can be used in post-graduate courses on high-dimensional statistical inference, high-dimensional random matrices and high-dimensional time series models. It should be particularly attractive to researchers developing statistical methods in high-dimensional time series models.
All formats and editions | Price | Delivery |
---|---|---|
Paperback (1) | 311.01 lei | 6-8 weeks |
CRC Press – 18 Dec 2020 | 311.01 lei | 6-8 weeks |
Hardback (1) | 627.32 lei | 6-8 weeks |
CRC Press – 3 Jul 2018 | 627.32 lei | 6-8 weeks |
Price: 311.01 lei
Old price: 356.16 lei
-13% New
Express points: 467
Estimated price in other currencies:
59.53€ • 61.87$ • 49.31£
Printed to order
Economy delivery: 05-19 February 25
Specifications
ISBN-13: 9780367734107
ISBN-10: 0367734109
Pages: 296
Dimensions: 156 x 234 mm
Weight: 0.41 kg
Edition: 1
Publisher: CRC Press
Collection: Chapman and Hall/CRC
Series: Chapman & Hall/CRC Monographs on Statistics and Applied Probability
Contents
1. LARGE COVARIANCE MATRIX I
Consistency
Covariance classes and regularization
Covariance classes
Covariance regularization
Bandable Σp
Parameter space
Estimation in U
Minimaxity
Toeplitz Σp
Parameter space
Estimation in Gβ(M) or Fβ(M0, M)
Minimaxity
Sparse Σp
Parameter space
Estimation in Uτ(q, C0(p), M) or Gq(Cn,p)
Minimaxity
2. LARGE COVARIANCE MATRIX II
Bandable Σp
Models and examples
Weak dependence
Estimation
Sparse Σp
3. LARGE AUTOCOVARIANCE MATRIX
Models and examples
Estimation of Γ0,p
Estimation of Γu,p
Parameter spaces
Estimation
Estimation in MA(r)
Estimation in IVAR(r)
Gaussian assumption
Simulations
Part II
4. SPECTRAL DISTRIBUTION
LSD
Moment method
Method of Stieltjes transform
Wigner matrix: semi-circle law
Independent matrix: Marčenko-Pastur law
Results on Z: p/n → y > 0
Results on Z: p/n → 0
5. NON-COMMUTATIVE PROBABILITY
NCP and its convergence
Essentials of partition theory
Möbius function
Partition and non-crossing partition
Kreweras complement
Free cumulant; free independence
Moments of free variables
Joint convergence of random matrices
Compound free Poisson
6. GENERALIZED COVARIANCE MATRIX I
Preliminaries
Assumptions
Embedding
NCP convergence
Main idea
Main convergence
LSD of symmetric polynomials
Stieltjes transform
Corollaries
7. GENERALIZED COVARIANCE MATRIX II
Preliminaries
Assumptions
Centering and Scaling
Main idea
NCP convergence
LSD of symmetric polynomials
Stieltjes transform
Corollaries
8. SPECTRA OF AUTOCOVARIANCE MATRIX I
Assumptions
LSD when p/n → y ∈ (0, ∞)
MA(q), q < ∞
MA(∞)
Application to specific cases
LSD when p/n → 0
Application to specific cases
Non-symmetric polynomials
9. SPECTRA OF AUTOCOVARIANCE MATRIX II
Assumptions
LSD when p/n → y ∈ (0, ∞)
MA(q), q < ∞
MA(∞)
LSD when p/n → 0
MA(q), q < ∞
MA(∞)
10. GRAPHICAL INFERENCE
MA order determination
AR order determination
Graphical tests for parameter matrices
11. TESTING WITH TRACE
One sample trace
Two sample trace
Testing
12. SUPPLEMENTARY PROOFS
Proof of Lemma
Proof of Theorem (a)
Proof of Th
Biographical note
Arup Bose is a professor at the Indian Statistical Institute, Kolkata, India. He is a distinguished researcher in mathematical statistics and has been working on high-dimensional random matrices for the last fifteen years. He has been an editor of Sankhyā for several years and has served on the editorial boards of several other journals. He is a Fellow of the Institute of Mathematical Statistics, USA, and of all three national science academies of India, and is a recipient of the S.S. Bhatnagar Award and the C.R. Rao Award. His first book, Patterned Random Matrices, was also published by Chapman & Hall. A graduate text, U-statistics, M-estimates and Resampling (with Snigdhansu Chatterjee), is forthcoming from Hindustan Book Agency.
Monika Bhattacharjee is a post-doctoral fellow at the Informatics Institute, University of Florida. After graduating from St. Xavier's College, Kolkata, she obtained her master's degree in 2012 and her PhD in 2016 from the Indian Statistical Institute. Her thesis on high-dimensional covariance and autocovariance matrices, written under the supervision of Dr. Bose, has received high acclaim.
Reviews
" . . . the authors should be congratulated for producing two highly relevant and well-written books. Statisticians would probably gravitate to LCAM in the first instance and those working in linear algebra would probably gravitate to PRM."
~Jonathan Gillard, Cardiff University
"The book represents a monograph of the authors’ recent results about the theory of large covariance and autocovariance matrices and contains other important results from other research papers and books in this topic. It is very useful for all researchers who use large covariance and autocovariance matrices in their researches. Especially, it is very useful for post-graduate and PhD students in mathematics, statistics, econometrics and computer science. It is a well-written and organized book with a large number of solved examples and many exercises left to readers for homework. I would like to recommend the book to PhD students and researchers who want to learn or use large covariance and autocovariance matrices in their researches."
~ Miroslav M. Ristic (Niš), zbMath
"This book brings together a collection of recent results on estimation of multidimensional time series covariance matrices. In the case where the time series consists of a sequence of independent (Chapter 1) or weakly dependent (Chapter 2) random vectors, the authors call it covariance estimation, whereas in the general case where the time series is only stationary, they call it autocovariance estimation. The framework of the results presented here is the one where the dimension of the observations (as well as the observation window size, otherwise nothing can be said) is high. The prerequisites include knowledge of elementary multivariate analysis, basic time series analysis, and basic results in stochastic convergence.
In Chapter 1, the authors consider the case where we have at our disposal a large time series of iid high-dimensional observations with common covariance
Description
The material is based on very recent advances in the theory and application of large covariance and autocovariance matrices. Technologies and methods in medical sciences, image processing, and other fields generate data where the dimension is large compared to the sample size and may also increase as the next set of measurements becomes available.