Numerical Bayesian Methods Applied to Signal Processing: Statistics and Computing
Author: Joseph J.K. O Ruanaidh, William J. Fitzgerald | Language: English | Hardback – 23 Feb 1996
| All formats and editions | Price | Express |
|---|---|---|
| Paperback (1) | 1244.19 lei | 6-8 weeks |
| Springer – 23 Oct 2012 | 1244.19 lei | 6-8 weeks |
| Hardback (1) | 1250.96 lei | 6-8 weeks |
| Springer – 23 Feb 1996 | 1250.96 lei | 6-8 weeks |
Price: 1250.96 lei
Old price: 1563.70 lei
-20% New
Express points: 1876
Estimated price in other currencies:
239.43€ • 251.19$ • 198.62£
Printed on demand
Economy delivery: 29 January - 12 February 25
Order line: 021 569.72.76
Specifications
ISBN-13: 9780387946290
ISBN-10: 0387946292
Pages: 264
Illustrations: XIV, 244 p.
Dimensions: 155 x 235 x 20 mm
Weight: 0.56 kg
Edition: 1996
Publisher: Springer
Collection: Springer
Series: Statistics and Computing
Place of publication: New York, NY, United States
Target audience: Research
Contents
- 1 Introduction
- 2 Probabilistic Inference in Signal Processing: 2.1 Introduction; 2.2 The likelihood function; 2.3 Bayesian data analysis; 2.4 Prior probabilities; 2.5 The removal of nuisance parameters; 2.6 Model selection using Bayesian evidence; 2.7 The general linear model; 2.8 Interpretations of the general linear model; 2.9 Example of marginalization; 2.10 Example of model selection; 2.11 Concluding remarks
- 3 Numerical Bayesian Inference: 3.1 The normal approximation; 3.2 Optimization; 3.3 Integration; 3.4 Numerical quadrature; 3.5 Asymptotic approximations; 3.6 The Monte Carlo method; 3.7 The generation of random variates; 3.8 Evidence using importance sampling; 3.9 Marginal densities; 3.10 Opportunities for variance reduction; 3.11 Summary
- 4 Markov Chain Monte Carlo Methods: 4.1 Introduction; 4.2 Background on Markov chains; 4.3 The canonical distribution; 4.4 The Gibbs sampler; 4.5 The Metropolis-Hastings algorithm; 4.6 Dynamical sampling methods; 4.7 Implementation of simulated annealing; 4.8 Other issues; 4.9 Free energy estimation; 4.10 Summary
- 5 Retrospective Changepoint Detection: 5.1 Introduction; 5.2 The simple Bayesian step detector; 5.3 The detection of changepoints using the general linear model; 5.4 Recursive Bayesian estimation; 5.5 Detection of multiple changepoints; 5.6 Implementation details; 5.7 Multiple changepoint results; 5.8 Concluding remarks
- 6 Restoration of Missing Samples in Digital Audio Signals: 6.1 Introduction; 6.2 Model formulation; 6.3 The EM algorithm; 6.4 Gibbs sampling; 6.5 Implementation issues; 6.6 Relationship between the three restoration methods; 6.7 Simulations; 6.8 Discussion; 6.9 Concluding remarks
- 7 Integration in Bayesian Data Analysis: 7.1 Polynomial data; 7.2 Decay problem; 7.3 General model selection; 7.4 Summary
- 8 Conclusion: 8.1 A review of the work; 8.2 Further work
- A The General Linear Model: A.1 Integrating out model amplitudes (A.1.1 Least squares; A.1.2 Orthogonalization); A.2 Integrating out the standard deviation; A.3 Marginal density for a linear coefficient; A.4 Marginal density for standard deviation; A.5 Conditional density for a linear coefficient; A.6 Conditional density for standard deviation
- B Sampling from a Multivariate Gaussian Density
- C Hybrid Monte Carlo Derivations: C.1 Full Gaussian likelihood; C.2 Student-t distribution; C.3 Remark
- D EM Algorithm Derivations: D.1 Expectation; D.2 Maximization
- E Issues in Sampling Based Approaches to Integration: E.1 Marginalizing using the conditional density; E.2 Approximating the conditional density; E.3 Gibbs sampling from the joint density; E.4 Reverse importance sampling
- F Detailed Balance: F.1 Detailed balance in the Gibbs sampler; F.2 Detailed balance in the Metropolis-Hastings algorithm; F.3 Detailed balance in the Hybrid Monte Carlo algorithm; F.4 Remarks
- References
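As a rough illustration of the Markov chain Monte Carlo material listed under Chapter 4 (the Metropolis-Hastings algorithm of Section 4.5), the sketch below shows a generic random-walk sampler in Python. It is not taken from the book: the function name `metropolis_hastings`, the Gaussian proposal, the step size, and the toy standard-normal target are all illustrative assumptions.

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples=5000, step=0.5, rng=None):
    """Random-walk Metropolis-Hastings for a 1-D target (illustrative sketch).

    log_post : callable returning the log of the unnormalised posterior.
    x0       : starting point of the chain.
    step     : standard deviation of the Gaussian proposal.
    """
    rng = np.random.default_rng() if rng is None else rng
    samples = np.empty(n_samples)
    x, logp_x = x0, log_post(x0)
    for i in range(n_samples):
        x_new = x + step * rng.standard_normal()   # symmetric Gaussian proposal
        logp_new = log_post(x_new)
        # Accept with probability min(1, p(x_new)/p(x)); the Hastings
        # correction cancels because the proposal is symmetric.
        if np.log(rng.uniform()) < logp_new - logp_x:
            x, logp_x = x_new, logp_new
        samples[i] = x
    return samples

if __name__ == "__main__":
    # Toy example: sample a standard normal posterior (hypothetical target).
    chain = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0)
    print(chain[1000:].mean(), chain[1000:].std())  # roughly 0 and 1 after burn-in
```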