**509,99 zł**

- Publisher: John Wiley & Sons
- Category: Science and new technologies
- Language: English
- Publication year: 2016

Presents the Bayesian approach to statistical signal processing for a variety of useful model sets
This book aims to give readers a unified Bayesian treatment, starting from the basics (Bayes' rule) through the more advanced (Monte Carlo sampling) and on to the next-generation, model-based techniques (sequential Monte Carlo sampling). This second edition incorporates a new chapter on "Sequential Bayesian Detection," a new section on "Ensemble Kalman Filters," as well as an expansion of the case studies that detail Bayesian solutions for a variety of applications. These studies illustrate Bayesian approaches to real-world problems incorporating detailed particle filter designs, adaptive particle filters, and sequential Bayesian detectors. In addition to these major developments, a variety of sections are expanded to "fill in the gaps" of the first edition. Here, metrics for particle filter (PF) designs with emphasis on classical "sanity testing" lead to ensemble techniques as a basic requirement for performance analysis. The expansion of information-theoretic metrics and their application to PF designs is fully developed and applied. These expansions provide a more cohesive discussion of Bayesian processing, with examples and applications enabling the comprehension of alternative approaches to solving estimation/detection problems.
The second edition of **Bayesian Signal Processing** features:
* "Classical" Kalman filtering for linear, linearized, and nonlinear systems; "modern" unscented and ensemble Kalman filters; and the "next-generation" Bayesian particle filters
* Sequential Bayesian detection techniques incorporating model-based schemes for a variety of real-world problems
* Practical Bayesian processor designs including comprehensive methods of performance analysis ranging from simple sanity testing and ensemble techniques to sophisticated information metrics
* New case studies on adaptive particle filtering and sequential Bayesian detection, detailing more Bayesian approaches to applied problem solving
* MATLAB® notes at the end of each chapter help readers solve complex problems using readily available software commands and point out other software packages available
* Problem sets included to test readers' knowledge and help them put their new skills into practice

**Bayesian Signal Processing**, Second Edition is written for all students, scientists, and engineers who investigate and apply signal processing to their everyday problems.


Number of pages: 835

Wiley Series on Adaptive and Cognitive Dynamic Systems

Editor: Simon Haykin

A complete list of titles in this series appears at the end of this volume.

SECOND EDITION

James V. Candy

Lawrence Livermore National Laboratory

University of California, Santa Barbara

Copyright © 2016 by John Wiley & Sons, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data

Names: Candy, James V., author.

Title: Bayesian signal processing : classical, modern, and particle filtering methods / James V. Candy.

Description: Second edition. | Hoboken, New Jersey : John Wiley & Sons Inc., [2016] | Includes index.

Identifiers: LCCN 2016019012 | ISBN 9781119125457 (cloth) | ISBN 9781119125488 (epub)

Subjects: LCSH: Signal processing--Mathematics. | Bayesian statistical decision theory.

Classification: LCC TK5102.9 .C3187 2016 | DDC 621.382/201519542--dc23

LC record available at https://lccn.loc.gov/2016019012

“…no mere man has ever seen, heard or even imagined what wonderful things God has ready for those who love the Lord” (1 Cor. 2:9)

PREFACE TO SECOND EDITION

REFERENCES

PREFACE TO FIRST EDITION

REFERENCES

ACKNOWLEDGMENTS

LIST OF ABBREVIATIONS

1 INTRODUCTION

1.1 INTRODUCTION

1.2 BAYESIAN SIGNAL PROCESSING

1.3 SIMULATION-BASED APPROACH TO BAYESIAN PROCESSING

1.4 BAYESIAN MODEL-BASED SIGNAL PROCESSING

1.5 NOTATION AND TERMINOLOGY

REFERENCES

PROBLEMS

2 BAYESIAN ESTIMATION

2.1 INTRODUCTION

2.2 BATCH BAYESIAN ESTIMATION

2.3 BATCH MAXIMUM LIKELIHOOD ESTIMATION

2.4 BATCH MINIMUM VARIANCE ESTIMATION

2.5 SEQUENTIAL BAYESIAN ESTIMATION

2.6 SUMMARY

REFERENCES

PROBLEMS

NOTES

3 SIMULATION-BASED BAYESIAN METHODS

3.1 INTRODUCTION

3.2 PROBABILITY DENSITY FUNCTION ESTIMATION

3.3 SAMPLING THEORY

3.4 MONTE CARLO APPROACH

3.5 IMPORTANCE SAMPLING

3.6 SEQUENTIAL IMPORTANCE SAMPLING

3.7 SUMMARY

REFERENCES

PROBLEMS

NOTES

4 STATE–SPACE MODELS FOR BAYESIAN PROCESSING

4.1 INTRODUCTION

4.2 CONTINUOUS-TIME STATE–SPACE MODELS

4.3 SAMPLED-DATA STATE–SPACE MODELS

4.4 DISCRETE-TIME STATE–SPACE MODELS

4.5 GAUSS–MARKOV STATE–SPACE MODELS

4.6 INNOVATIONS MODEL

4.7 STATE–SPACE MODEL STRUCTURES

4.8 NONLINEAR (APPROXIMATE) GAUSS–MARKOV STATE–SPACE MODELS

4.9 SUMMARY

REFERENCES

PROBLEMS

NOTES

5 CLASSICAL BAYESIAN STATE–SPACE PROCESSORS

5.1 INTRODUCTION

5.2 BAYESIAN APPROACH TO THE STATE–SPACE

5.3 LINEAR BAYESIAN PROCESSOR (LINEAR KALMAN FILTER)

5.4 LINEARIZED BAYESIAN PROCESSOR (LINEARIZED KALMAN FILTER)

5.5 EXTENDED BAYESIAN PROCESSOR (EXTENDED KALMAN FILTER)

5.6 ITERATED-EXTENDED BAYESIAN PROCESSOR (ITERATED-EXTENDED KALMAN FILTER)

5.7 PRACTICAL ASPECTS OF CLASSICAL BAYESIAN PROCESSORS

5.8 CASE STUDY: RLC CIRCUIT PROBLEM

5.9 SUMMARY

REFERENCES

PROBLEMS

NOTES

6 MODERN BAYESIAN STATE–SPACE PROCESSORS

6.1 INTRODUCTION

6.2 SIGMA-POINT (UNSCENTED) TRANSFORMATIONS

6.3 SIGMA-POINT BAYESIAN PROCESSOR (UNSCENTED KALMAN FILTER)

6.4 QUADRATURE BAYESIAN PROCESSORS

6.5 GAUSSIAN SUM (MIXTURE) BAYESIAN PROCESSORS

6.6 CASE STUDY: 2D-TRACKING PROBLEM

6.7 ENSEMBLE BAYESIAN PROCESSORS (ENSEMBLE KALMAN FILTER)

6.8 SUMMARY

REFERENCES

PROBLEMS

NOTES

7 PARTICLE-BASED BAYESIAN STATE–SPACE PROCESSORS

7.1 INTRODUCTION

7.2 BAYESIAN STATE–SPACE PARTICLE FILTERS

7.3 IMPORTANCE PROPOSAL DISTRIBUTIONS

7.4 RESAMPLING

7.5 STATE–SPACE PARTICLE FILTERING TECHNIQUES

7.6 PRACTICAL ASPECTS OF PARTICLE FILTER DESIGN

7.7 CASE STUDY: POPULATION GROWTH PROBLEM

7.8 SUMMARY

REFERENCES

PROBLEMS

NOTES

8 JOINT BAYESIAN STATE/PARAMETRIC PROCESSORS

8.1 INTRODUCTION

8.2 BAYESIAN APPROACH TO JOINT STATE/PARAMETER ESTIMATION

8.3 CLASSICAL/MODERN JOINT BAYESIAN STATE/PARAMETRIC PROCESSORS

8.4 PARTICLE-BASED JOINT BAYESIAN STATE/PARAMETRIC PROCESSORS

8.5 CASE STUDY: RANDOM TARGET TRACKING USING A SYNTHETIC APERTURE TOWED ARRAY

8.6 SUMMARY

REFERENCES

PROBLEMS

NOTES

9 DISCRETE HIDDEN MARKOV MODEL BAYESIAN PROCESSORS

9.1 INTRODUCTION

9.2 HIDDEN MARKOV MODELS

9.3 PROPERTIES OF THE HIDDEN MARKOV MODEL

9.4 HMM OBSERVATION PROBABILITY: EVALUATION PROBLEM

9.5 STATE ESTIMATION IN HMM: THE VITERBI TECHNIQUE

9.6 PARAMETER ESTIMATION IN HMM: THE EM/BAUM–WELCH TECHNIQUE

9.7 CASE STUDY: TIME-REVERSAL DECODING

9.8 SUMMARY

REFERENCES

PROBLEMS

NOTES

10 SEQUENTIAL BAYESIAN DETECTION

10.1 INTRODUCTION

10.2 BINARY DETECTION PROBLEM

10.3 DECISION CRITERIA

10.4 PERFORMANCE METRICS

10.5 SEQUENTIAL DETECTION

10.6 MODEL-BASED SEQUENTIAL DETECTION

10.7 MODEL-BASED CHANGE (ANOMALY) DETECTION

10.8 CASE STUDY: REENTRY VEHICLE CHANGE DETECTION

10.9 SUMMARY

REFERENCES

PROBLEMS

NOTES

11 BAYESIAN PROCESSORS FOR PHYSICS-BASED APPLICATIONS

11.1 OPTIMAL POSITION ESTIMATION FOR THE AUTOMATIC ALIGNMENT

11.2 SEQUENTIAL DETECTION OF BROADBAND OCEAN ACOUSTIC SOURCES

11.3 BAYESIAN PROCESSING FOR BIOTHREATS

11.4 BAYESIAN PROCESSING FOR THE DETECTION OF RADIOACTIVE SOURCES

11.5 SEQUENTIAL THREAT DETECTION: AN X-RAY PHYSICS-BASED APPROACH

11.6 ADAPTIVE PROCESSING FOR SHALLOW OCEAN APPLICATIONS

REFERENCES

Appendix PROBABILITY AND STATISTICS OVERVIEW

A.1 PROBABILITY THEORY

A.2 GAUSSIAN RANDOM VECTORS

A.3 UNCORRELATED TRANSFORMATION: GAUSSIAN RANDOM VECTORS

REFERENCES

NOTE

INDEX

Wiley Series on Adaptive and Cognitive Dynamic Systems

EULA



The second edition of Bayesian Signal Processing incorporates a chapter on “Sequential Bayesian Detection” (Chapter 10) and a section on “Ensemble Kalman Filters” (Section 6.7), as well as an expansion of case studies in the final chapter (Chapter 11). These new “physics-based” studies detail Bayesian approaches to problem solving in real-world applications incorporating detailed particle filter designs, adaptive particle filters, and sequential Bayesian detection. In addition to these major developments, a variety of sections are expanded to “fill in the gaps” of the first edition. Here, metrics for particle filter (PF) designs with emphasis on classical “sanity tests,” introduced earlier in model-based processors, lead to ensemble techniques as a basic requirement for performance analysis. Next, the expansion of information theory metrics (Kullback–Leibler divergence (KD), Hellinger distance (HD)) and their application to PF designs is discussed. These “fill-in-the-gap” expansions provide a more cohesive discussion with examples and applications enabling the comprehension of these alternative approaches to solving estimation/detection problems.
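The information-theoretic metrics mentioned above are easily computed for the discrete (particle) approximations that a PF produces. The following is a minimal Python sketch (Python is used here purely for illustration; the text itself provides MATLAB notes) of the Kullback–Leibler divergence and Hellinger distance between two discrete probability mass functions; the example distributions are invented:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p = 0 contribute zero
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def hellinger_distance(p, q):
    """Hellinger distance between discrete distributions (0 <= H <= 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

p = np.array([0.5, 0.3, 0.2])   # e.g., normalized particle weights
q = np.array([0.4, 0.4, 0.2])   # e.g., a reference posterior
kd = kl_divergence(p, q)
hd = hellinger_distance(p, q)
```

Both quantities vanish when the two distributions coincide, which makes them convenient "sanity" metrics for comparing a PF posterior against a reference.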

Detection theory, and more specifically sequential detection theory, is closely coupled to the sequential estimation techniques presented in this text and is often the primary reason for constructing the estimators in the first place [1]–[14]. Sequential techniques find application in many technical areas such as radar, sonar (detection/tracking), biomedicine (anomaly detection/localization), speech (recognition/tracking), communications (real-time/obstructed environments), the sciences (e.g., seismology (earthquakes), structures (vibrations), materials (additive manufacturing/threat detection), and radiation (threat detection)), and, of course, a huge variety of military applications [3], [7]. By incorporating a new chapter on sequential detection techniques primarily aimed at the binary decision problem, we enable the extension of these estimation methods to an entire class of problems, especially when a physical model is available that can be incorporated into the algorithm [4], [6]. This new chapter will, in itself, provide wider application, since sequential detection is such a natural extension of sequential estimation and vice versa.

The ensemble Kalman filter (EnKF), added in this second edition, is an area that has been neglected in most non-specialized texts. The EnKF is a hybrid of a regression-based processor (e.g., the unscented Kalman filter (UKF)) and a particle-like (PF) “sampling” filter; it is little known in engineering but well known in the sciences. Because of its computational efficiency on very large-scale problems (supercomputer applications), the EnKF is widely used where large-scale computations are required, such as in seismology, energy systems (wind, ocean waves, etc.), weather prediction, climatology (global warming), computational biology, and large structures (vibrations). Here, the coupling of model-based Bayesian techniques to these large-scale problems is unique.
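To make the EnKF idea concrete, here is a minimal Python sketch of a single EnKF analysis (measurement-update) step using the common perturbed-observation variant; the linear measurement model, dimensions, and noise levels are illustrative assumptions, not the text's notation:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, H, R):
    """One EnKF analysis step with perturbed observations.
    X: (N, n) ensemble of state vectors; y: (m,) measurement;
    H: (m, n) observation matrix; R: (m, m) measurement covariance."""
    N = X.shape[0]
    A = X - X.mean(axis=0)          # ensemble anomalies
    P = A.T @ A / (N - 1)           # sample state covariance
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain from ensemble statistics
    # perturb the observation for each member to retain correct posterior spread
    Y = y + rng.multivariate_normal(np.zeros(len(y)), R, size=N)
    return X + (Y - X @ H.T) @ K.T

# toy example: 2-state system, scalar measurement of the first state
X = rng.normal(0.0, 1.0, size=(500, 2))
H = np.array([[1.0, 0.0]])
R = np.array([[0.1]])
Xa = enkf_update(X, np.array([1.0]), H, R)
```

The gain is computed from the ensemble sample covariance rather than a propagated covariance matrix, which is exactly what makes the EnKF attractive for very high-dimensional states.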

With this in mind, let us consider the construct of the new chapter entitled “Sequential Bayesian Detection.” Here, we develop the Bayesian approach to decision theory, primarily aimed at coupling sequential Bayesian estimation to sequential decision-making. We start with the binary decision problem for multichannel measurements and develop the usual Bayesian solutions based on probability-of-error minimization, leading to the well-known Bayes' risk criterion. Next, the Neyman–Pearson detection approach (maximize detection probability for a fixed false-alarm probability) is developed and compared to the classical Bayesian schemes, illustrating their similarities and differences. Once these “batch” schemes are developed, we introduce the Wald sequential approach to solving these problems in pseudo real time [3], [7]. We then investigate a variety of performance criteria based on the receiver operating characteristic (ROC) curve and its variants that provide the foundations for classical analysis [9], [10]. Other metrics associated with the ROC curve (e.g., area under the curve) are introduced and applied as well. With the sequential detection theory developed, we investigate the basic linear Gaussian case and demonstrate that a sequential scheme follows easily when coupled to the model-based (Kalman) processor. Next, we generalize this approach to nonlinear models and, again under Gaussian-like approximations, develop the sequential detection scheme [7]. Finally, we remove the Gaussian assumptions and show how sequential detection schemes can be developed and applied using an MCMC (particle filter) approach. A variety of applications are included in case studies on anomaly/change detection.
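As a concrete illustration of the Wald sequential approach, the following Python sketch implements a sequential probability ratio test (SPRT) for a binary Gaussian mean-shift problem. The thresholds follow Wald's classical approximations; the particular means, variance, and error probabilities are assumptions chosen for illustration:

```python
import math

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's SPRT for H0: N(mu0, sigma^2) vs. H1: N(mu1, sigma^2).
    alpha/beta are the desired false-alarm and miss probabilities."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this
    llr = 0.0
    for k, x in enumerate(samples, 1):
        # log-likelihood ratio increment for one Gaussian sample
        llr += (mu1 - mu0) * (x - 0.5 * (mu0 + mu1)) / sigma**2
        if llr >= upper:
            return "H1", k                 # decide signal present
        if llr <= lower:
            return "H0", k                 # decide signal absent
    return None, len(samples)              # thresholds not crossed: keep sampling

decision, n_used = sprt([1.2, 0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.1, 0.9, 1.2])
```

Unlike a fixed-sample (batch) test, the SPRT stops as soon as the accumulated evidence crosses either threshold, which is precisely what makes it attractive for pseudo real-time operation.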

Finally, sequential detection enables the inclusion of more relevant case studies (Chapter 11) in ocean acoustics, physics-based radiation detection, and X-ray threat material detection, offering a completely different perspective on classical problem solving by incorporating these physics-based approaches within the sequential Bayesian framework.

JAMES V. CANDY

Danville, CA

REFERENCES

1. S. Kay, *Fundamentals of Statistical Signal Processing: Detection Theory* (Englewood Cliffs, NJ: Prentice-Hall, 1998).
2. C. Therrien, *Decision, Estimation, and Classification: An Introduction to Pattern Recognition and Related Topics* (New York: John Wiley & Sons, 1989).
3. J. Melsa and D. Cohn, *Detection and Estimation Theory* (New York: McGraw-Hill, 1978).
4. M. Basseville and I. Nikiforov, *Detection of Abrupt Changes: Theory and Application* (Englewood Cliffs, NJ: Prentice-Hall, 1993).
5. F. Gustafsson, *Adaptive Filtering and Change Detection* (Hoboken, NJ: John Wiley & Sons, 2000).
6. K. Burnham and D. Anderson, *Model Selection and Multimodel Inference* (New York: Springer, 1998).
7. A. Sage and J. Melsa, *Estimation Theory with Applications to Communications and Control* (New York: McGraw-Hill, 1971).
8. L. Scharf, *Statistical Signal Processing: Detection, Estimation, and Time Series Analysis* (Reading, MA: Addison-Wesley, 1990).
9. H. Van Trees, *Detection, Estimation and Modulation Theory, Part 1* (New York: John Wiley & Sons, 1968).
10. R. Duda, P. Hart, and D. Stork, *Pattern Classification*, 2nd Ed. (Hoboken, NJ: John Wiley & Sons, 2001).
11. A. Gelman, J. Carlin, H. Stern, and D. Rubin, *Bayesian Data Analysis* (New York: Chapman & Hall, 2004).
12. J. Hancock and P. Wintz, *Signal Detection Theory* (New York: McGraw-Hill, 1966).
13. D. Middleton, *Introduction to Statistical Communication Theory* (New York: McGraw-Hill, 1960).
14. S. Kassam, *Signal Detection in Non-Gaussian Noise* (New York: Springer-Verlag, 1988).

PREFACE TO FIRST EDITION

In the real world, systems designed to extract signals from noisy measurements are plagued by errors evolving from constraints of the sensors employed, from random disturbances and noise, and, probably most common, from the lack of precise knowledge of the underlying physical phenomenology generating the process in the first place! Methods capable of extracting the desired signal from hostile environments require approaches that capture all of the a priori information available and incorporate it into a processing scheme. This approach is typically model-based [1], employing mathematical representations of the component processes involved. However, the actual implementation providing the algorithm evolves from the realm of statistical signal processing using a Bayesian approach based on Bayes' rule. Statistical signal processing is focused on the development of processors capable of extracting the desired information from noisy, uncertain measurement data. This is a text that develops the "Bayesian approach" to statistical signal processing for a variety of useful model sets. It features the next generation of processors, recently enabled by the advent of high-speed/high-throughput computers. The emphasis is on nonlinear/non-Gaussian problems, but classical techniques are included as special cases to enable the reader familiar with such methods to draw a parallel between the approaches. The common ground is the model sets. Here, the state–space approach is emphasized because of its inherent applicability to a wide variety of problems, both linear and nonlinear as well as time-invariant and time-varying, including what have become popularly termed "physics-based" models.
This text brings the reader from the classical methods of model-based signal processing, including Kalman filtering for linear, linearized, and approximate nonlinear processors as well as the recently developed unscented or sigma-point filters, to the next generation of processors that will clearly dominate the future of model-based signal processing for years to come. It presents a unique viewpoint of signal processing from the Bayesian perspective, in contrast to the pure statistical approach found in many textbooks. Although designed primarily as a graduate textbook, it will prove very useful to the practicing signal processing professional or scientist, since a wide variety of applications are included to demonstrate the applicability of the Bayesian approach to real-world problems. The prerequisites for such a text are a melding of undergraduate work in linear algebra, random processes, linear systems, and digital signal processing, as well as a minimal background in model-based signal processing as illustrated in the recent text [1]. It is unique in the sense that few texts cover the breadth of its topics; the underlying theme of this text is the Bayesian approach, which is uniformly developed and followed throughout in the algorithms, examples, applications, and case studies. It is this theme, coupled with the hierarchy of physics-based models developed, that contributes to its uniqueness. This text has evolved from three previous texts, Candy [1]–[3], coupled with a wealth of practical applications to real-world Bayesian problems.

The Bayesian approach has existed in statistical physics for a long time and can be traced back to the 1940s, with the evolution of the Manhattan Project and the work of such prominent scientists as Ulam, von Neumann, Metropolis, Fermi, Feynman, and Teller. Here the idea of Monte Carlo (MC) techniques to solve complex integrals evolved [4]. Since its birth, Monte Carlo–related methods have been the mainstay of many complex statistical computations. Many applications have evolved from this method in such areas as physics, biology, chemistry, computer science, economics/finance, materials science, statistics, and, more recently, engineering. Thus, statisticians have known about these methods for a long time, but they did not really evolve as a practical working tool until the advent of high-speed supercomputers around the 1980s. In signal processing, it is hard to pinpoint the actual starting point, but clearly the work of Handschin and Mayne in the late 1960s and early 1970s [5, 6] marked the initial evolution of Monte Carlo techniques for signal processing and control. From the real-time perspective, however, it is probably the development of the sequential Bayesian processor, made practical by the work of Gordon, Salmond, and Smith in 1993 [7], that enabled the evolution and explosion of the Bayesian sequential processor being researched today. To put this text in perspective, we must discuss the current signal processing texts available on Bayesian processing. Much has been published in the statistical literature on Bayesian techniques for statistical estimation; however, the earliest texts are probably those of Harvey [8], Kitagawa and Gersch [9], and West [10], which emphasize the Bayesian model-based approach, incorporating dynamic linear or nonlinear models into the processing scheme for additive Gaussian noise sources and leading to the classical approximate (Kalman) filtering solutions.
These works extend those results to non-Gaussian problems using Monte Carlo techniques for their eventual solution, laying the foundation for works to follow. Statistical MC techniques were also available but not as accessible to the signal processor, owing to statistical jargon and the abstractness of the discussions. Many of these texts evolved during the 1990s, such as Gilks [11], Robert [12], Tanner [13], and Tanizaki [14], with more up-to-date expositions evolving in the late 1990s and currently, such as Liu [4], Ruanaidh [15], Haykin [16], Doucet [17], Ristic [18], and Cappe [19]. Also during this last period, a sequence of tutorials and special IEEE issues evolved, exposing the MC methods to the signal processing community, such as Godsill [20], Arulampalam [21], Djuric [22], Haykin [23], Doucet [24], and Candy [25], as well as a wealth of signal processing papers (see references for details). Perhaps the most complete textbook from the statistical researcher's perspective is that of Cappe [19]. In that text, much of the statistical MC sampling theory is developed along with all of the detailed mathematics—ideal for an evolving researcher. But what about the entry-level person—the engineer, the experimentalist, the practitioner? This is what is lacking in all of this literature. Questions like: How do the MC methods relate to the usual approximate Kalman methods? How does one incorporate models (model-based methods) into a Bayesian processor? How does one judge performance compared with classical methods? These are all basically pragmatic questions that this text answers in a lucid manner by coupling the theory to real-world examples and applications. Thus, the goal of this text is to provide a bridge for practitioners, with enough theory and applications to provide the basic background to comprehend the Bayesian framework and enable the application of these powerful techniques to real-world problem solving.
Next, let us discuss the structure of the text in more detail to understand its composition and approach.

We first introduce the basic ideas and motivate the need for such processing, showing that these methods clearly represent the next generation of processors. We discuss potential application areas and motivate the requirement for such a generalization. That is, we discuss how the simulation-based approach to Bayesian processor design provides a much-needed capability that, while well known in the statistical community, was until recently not well known in the signal processing community. After introducing the basic concepts in Chapter 1, we begin with the basic Bayesian processors in Chapter 2. We start with the Bayesian "batch" processor and establish its construction by developing the fundamental mathematics required. Next we discuss the well-known maximum likelihood (ML) and minimum (error) variance (MV), or equivalently minimum mean-squared error (MMSE), processors, and we illustrate the similarities and differences between the schemes. We then launch into the sequential Bayesian processing schemes that form the foundation of the text. By examining the "full" posterior distribution, in both the dynamic variables of interest and the full data set, we are able to construct the sequential Bayesian approach and focus on the filtered or filtering distribution case of highest interest, demonstrating the fundamental prediction/update recursions inherent in the sequential Bayesian structure. Once the general Bayesian sequential processor (BSP) is established, the schemes that follow are detailed depending on the assumed distribution and a variety of model sets.
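The prediction/update recursion at the heart of the sequential approach can be sketched with a deliberately simple discrete-state example, a two-state Markov chain observed in Gaussian noise; the transition matrix and noise level below are invented purely for illustration:

```python
import numpy as np

# prediction/update recursion for a discrete-state model:
#   predict: p(x_k | y_{1:k-1}) = sum_j T[j, i] * p(x_{k-1} = j | y_{1:k-1})
#   update:  p(x_k | y_{1:k})  is proportional to  p(y_k | x_k) * p(x_k | y_{1:k-1})
T = np.array([[0.9, 0.1],        # state transition probabilities
              [0.2, 0.8]])

def likelihood(y, noise=0.5):
    """p(y | x) for states located at x = 0 and x = 1, Gaussian noise."""
    states = np.array([0.0, 1.0])
    return np.exp(-0.5 * ((y - states) / noise) ** 2)

def bayes_filter(prior, measurements):
    belief = np.asarray(prior, float)
    for y in measurements:
        belief = T.T @ belief        # prediction step (propagate the prior)
        belief *= likelihood(y)      # update step (weight by the data)
        belief /= belief.sum()       # normalize back to a distribution
    return belief

posterior = bayes_filter([0.5, 0.5], [0.9, 1.1, 1.0])
```

Every filter in the chapters that follow, Kalman, unscented, ensemble, or particle, is a different way of carrying out these same two steps for richer state and measurement models.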

We briefly review simulation-based methods, starting with sampling methods and progressing through Monte Carlo approaches to the basic iterative sampling methods using the Metropolis, Metropolis–Hastings, Gibbs, and slice samplers. Since one of the major motivations of recursive or sequential Bayesian processing is to provide a real-time or pseudo real-time processor, we investigate the idea of importance sampling as well as sequential importance sampling techniques, leading to the generic Bayesian sequential importance sampling algorithm. Here we show how the solution can be applied once the importance sampling distribution is defined.
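The core of importance sampling can be sketched in a few lines: draw from a convenient proposal q, weight each draw by p/q, and form a self-normalized estimate. The densities and the target expectation below are illustrative choices, not examples from the text:

```python
import numpy as np

rng = np.random.default_rng(42)

def normal_pdf(x, mu, sigma):
    """Gaussian density, used for both target and proposal."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# estimate E_p[x^2] for target p = N(0, 1) by sampling from proposal q = N(0, 4)
N = 200_000
x = rng.normal(0.0, 2.0, N)                      # draws from the proposal q
w = normal_pdf(x, 0, 1) / normal_pdf(x, 0, 2)    # importance weights p/q
w /= w.sum()                                     # self-normalized weights
estimate = np.sum(w * x**2)                      # true value is 1 for N(0, 1)
```

The heavier-tailed proposal keeps the weights bounded; choosing a proposal that is too narrow relative to the target is exactly the weight-degeneracy problem that sequential importance sampling must manage.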

In order to be useful, Bayesian processing techniques must be specified through a set of models that represent the underlying phenomenology driving the particular application. For example, in radar processing, we must investigate the propagation models, tracking models, geometric models, and so forth. In Chapter 4, we develop the state–space approach to signal modeling, which forms the basis of many applications such as speech, radar, sonar, acoustics, geophysics, communications, and control. Here, we investigate continuous, sampled-data, and discrete state–space signals and systems. We also discuss the underlying systems theory and extend the model set to include the stochastic case, with noise driving both process and measurements, leading to the well-known Gauss–Markov (GM) representation, which forms the starting point for the classical Bayesian processors to follow. We also discuss the equivalence of the state–space model to a variety of time series representations (ARMA, AR, MA, etc.) as well as the common engineering model sets (transfer functions, all-pole, all-zero, pole-zero, etc.). This discussion clearly demonstrates why the state–space model, with its inherent generality, is capable of capturing the essence of a broad variety of signal processing representations. Finally, we extend these ideas to nonlinear state–space models, leading to "approximate" Gauss–Markov representations evolving from nonlinear, perturbed, and linearized systems.
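A discrete-time Gauss–Markov model is easy to simulate, which is also how the "truth" data for filter sanity tests are typically generated. The following Python sketch uses an invented stable two-state system; the matrices are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(7)

# discrete Gauss-Markov model:  x_{k+1} = A x_k + w_k,   y_k = C x_k + v_k
A = np.array([[0.95, 0.10],
              [0.00, 0.90]])     # stable state-transition matrix
C = np.array([[1.0, 0.0]])       # measure only the first state
Q = 0.01 * np.eye(2)             # process-noise covariance (drives the state)
R = np.array([[0.1]])            # measurement-noise covariance

def simulate(x0, n_steps):
    """Generate state and measurement sequences from the GM model."""
    x, xs, ys = np.asarray(x0, float), [], []
    for _ in range(n_steps):
        x = A @ x + rng.multivariate_normal([0.0, 0.0], Q)
        y = C @ x + rng.multivariate_normal([0.0], R)
        xs.append(x)
        ys.append(y)
    return np.array(xs), np.array(ys)

states, measurements = simulate([1.0, 0.0], 100)
```

Feeding `measurements` to any of the processors developed later, and comparing the estimates against `states`, is the basic "sanity test" loop used throughout the text's examples.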

In the next chapter, we develop the classical Bayesian processors by first motivating the Bayesian approach to the state–space, where the required conditional distributions use the embedded state–space representation. Starting with linear, time-varying, state–space models, we show that the "optimum" classical Bayesian processor under multivariate Gaussian assumptions leads to the minimum (error) variance (MV), or equivalently the minimum mean-squared error (MMSE), estimator, which is the much-heralded Kalman filter of control theory [1]. That is, simply substituting the underlying Gauss–Markov model into the required conditional distributions leads directly to the BSP, the Kalman filter in this case. These results are then extended to nonlinear state–space representations, which are linearized about a known reference trajectory using perturbation theory and Taylor-series expansions. Starting with the linearized or approximate GM model of Chapter 4, we again calculate the required Bayesian sequential processor from the conditionals, which leads to the "linearized" BSP (or linearized Kalman filter) algorithm. Once this processor is developed, it is shown that the "extended" Bayesian processor follows directly by linearizing about the most current available estimate rather than the reference trajectory. The extended Bayesian processor (XBP), or equivalently the extended Kalman filter (EKF) of nonlinear processing theory, evolves quite naturally from the Bayesian perspective, again following the usual development: defining the required conditionals, making nonlinear approximations, and developing the posterior distributions under multivariate Gaussian assumptions. Next, we briefly investigate an iterative version of the XBP, again from the Bayesian perspective, which leads directly to the iterated-extended Bayesian processor (IX-BP) algorithm—an effective tool when nonlinear measurement models dominate the processing.
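A minimal sketch of the linear BSP (Kalman filter) predict/update cycle may help fix the ideas before the chapter develops them formally; the constant-velocity tracking model and noise levels here are illustrative assumptions:

```python
import numpy as np

def kalman_step(x, P, y, A, C, Q, R):
    """One predict/update cycle of the linear Kalman filter (the linear BSP)."""
    # prediction step
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # innovation (measurement residual) and its covariance
    e = y - C @ x_pred
    S = C @ P_pred @ C.T + R
    # gain and correction (update step)
    K = P_pred @ C.T @ np.linalg.inv(S)
    x_new = x_pred + K @ e
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

# toy example: constant-velocity state observed through noisy position data
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # position advances by velocity each step
C = np.array([[1.0, 0.0]])               # only position is measured
Q, R = 1e-4 * np.eye(2), np.array([[0.25]])
x, P = np.zeros(2), np.eye(2)
for y in [1.0, 2.0, 3.0, 4.0, 5.0]:
    x, P = kalman_step(x, P, np.array([y]), A, C, Q, R)
```

After five consistent measurements, the estimate converges toward position 5 and velocity 1, and the error covariance `P` shrinks accordingly.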

Chapter 6 focuses on statistical linearization methods leading to the modern unscented Bayesian processor (UBP), or equivalently the sigma-point Bayesian processor (SPBP). Here we show how statistical linearization techniques transform the underlying probability distribution using the sigma-point or unscented nonlinear transformation (a form of linear regression), leading to the unscented Bayesian processor or, equivalently, the unscented Kalman filter (UKF). Besides developing the fundamental theory and algorithm, we demonstrate its performance on a variety of example problems. We also briefly discuss the Gauss–Hermite quadrature (G-H) and Gaussian sum (G-S) techniques for completeness.
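The core of the sigma-point idea can be sketched as a standalone transform: propagate a small, deterministically chosen set of points through the nonlinearity and recover the transformed mean and covariance from weighted averages. The symmetric 2n+1-point set and the scalar tuning parameter `kappa` used below are one common parameterization, assumed here for illustration.

```python
import numpy as np

# Sketch of the unscented (sigma-point) transform: push a Gaussian
# (mean m, covariance P) through a nonlinear function f and recover
# the transformed mean/covariance from 2n+1 weighted sigma points.
def unscented_transform(f, m, P, kappa=1.0):
    n = len(m)
    S = np.linalg.cholesky((n + kappa) * P)     # matrix square root
    sigma = np.vstack([m, m + S.T, m - S.T])    # 2n+1 sigma points
    W = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    W[0] = kappa / (n + kappa)                  # center-point weight
    Y = np.array([f(s) for s in sigma])         # propagate each point
    mean = W @ Y
    cov = (W[:, None] * (Y - mean)).T @ (Y - mean)
    return mean, cov
```

For a linear function the transform is exact, which gives a quick correctness check before applying it to a genuinely nonlinear model.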

We reach the heart of the particle-filtering methods in Chapter 7, where we discuss the Bayesian approach to the state–space. Here the ideas of Bayesian and model-based processors are combined through the development of Bayesian state–space particle filters. Initially, it is shown how the state–space models of Chapter 4 are incorporated into the conditional probability distributions required to construct the sequential Bayesian processors through importance-sampling constructs. After investigating a variety of importance proposal distributions, the basic set of state–space particle filters (SSPF) is developed and illustrated through a set of example problems and simulations. The techniques, including the bootstrap, auxiliary, regularized, MCMC, and linearized particle filters, are developed and investigated when applied to the set of example problems used to evaluate algorithm performance.
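One cycle of the bootstrap variant can be sketched in a few lines; the scalar dynamics `f`, measurement `h`, and noise levels below are illustrative assumptions (a nonlinear growth model often used as a particle-filter benchmark), not the text's specific examples.

```python
import numpy as np

# Sketch of one bootstrap particle-filter cycle for a scalar
# state-space model with additive Gaussian noise.
rng = np.random.default_rng(1)

def bootstrap_step(particles, y, f, h, sigma_w, sigma_v):
    N = len(particles)
    # 1. Importance sampling: draw from the transition prior
    #    (the "bootstrap" proposal)
    particles = f(particles) + sigma_w * rng.standard_normal(N)
    # 2. Weight each particle by its measurement likelihood
    w = np.exp(-0.5 * ((y - h(particles)) / sigma_v) ** 2)
    w /= w.sum()
    # 3. Resample (multinomial) to combat weight degeneracy
    idx = rng.choice(N, size=N, p=w)
    return particles[idx]

# Illustrative nonlinear benchmark model (assumed, not from the text)
f = lambda x: 0.5 * x + 25 * x / (1 + x ** 2)
h = lambda x: x ** 2 / 20.0
particles = rng.standard_normal(500)
particles = bootstrap_step(particles, y=5.0, f=f, h=h,
                           sigma_w=1.0, sigma_v=1.0)
print(particles.mean())
```

The auxiliary, regularized, MCMC, and linearized variants discussed in the chapter modify steps 1–3 (the proposal, the resampling, or both) while keeping this same cycle.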

In Chapter 8, the important joint Bayesian SSPFs are investigated by first developing the joint filter, popularly known as the parametrically adaptive processor [1]. Here both states and parameters, static as well as dynamic, are estimated as solutions to this joint estimation problem. The performance of these processors is compared to that of the classical and modern processors through example problems.

In Chapter 9, hidden Markov models (HMM) are developed for event-related problems (e.g., Poisson point processes). This chapter is important in order to place purely discrete processes into perspective. HMMs arise for memoryless counting processes and become important in financial applications, communications, and biometrics, as well as radiation detection. Here we develop the fundamental ideas and discuss them in sufficient depth to provide a set of techniques for the practitioner, applying them to engineering problems of interest.
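A minimal sketch of the filtering recursion for such a counting process is given below: a two-state HMM with Poisson emissions, of the kind that arises in radiation detection (background versus source). The rates, transition matrix, and prior here are illustrative assumptions.

```python
import math
import numpy as np

# Two-state hidden Markov model with Poisson (counting) emissions.
Pi = np.array([[0.95, 0.05],      # transition matrix: background <-> source
               [0.10, 0.90]])
lam = np.array([2.0, 8.0])        # Poisson count rate per state
p0 = np.array([0.5, 0.5])         # initial state probabilities

def poisson_pmf(k, rate):
    return math.exp(-rate) * rate ** k / math.factorial(k)

def forward(counts):
    """Normalized forward recursion; returns filtered state probabilities."""
    alpha = p0 * [poisson_pmf(counts[0], r) for r in lam]
    alpha /= alpha.sum()
    for k in counts[1:]:
        # Predict through the Markov chain, then correct with the count
        alpha = (alpha @ Pi) * [poisson_pmf(k, r) for r in lam]
        alpha /= alpha.sum()
    return alpha

print(forward([2, 3, 9, 10, 11]))  # counts jump: "source" state dominates
```

Normalizing at each step keeps the recursion numerically stable and makes `alpha` directly interpretable as the filtered posterior over the hidden state.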

In the final chapter, we investigate a set of physics-based applications focusing on the Bayesian approach to solving real-world problems. By progressing through a step-by-step development of the processors, we see explicitly how to develop and analyze the performance of such Bayesian processors. We start with a practical laser alignment problem followed by a broadband estimation problem in ocean acoustics. Next, the solid-state microelectromechanical (MEMS) sensor problem for biothreat detection is investigated, followed by a discrete radiation detection problem based on counting statistics. All of these methods invoke Bayesian techniques to solve the particular problems of interest, giving the practitioner the opportunity to follow “real-world” Bayesian model-based solutions.

The place of such a text in the signal processing textbook community can best be explained by tracing the technical ingredients that make up its contents. It can be argued that it evolves from the digital signal processing area, primarily from those texts that deal with random or statistical signal processing, or, possibly more succinctly, “signals contaminated with noise.” The texts by Kay [26–28], Therrien [29], and Brown [30] all provide the basic background information in much more detail than this text, so there is little overlap at the detailed level with them.