Wiley-IEEE Press

Uncertainty and Information: Foundations of Generalized Information Theory

ISBN: 978-0-471-74867-0
518 pages
December 2005, Wiley-IEEE Press

Description

Deal with information and uncertainty properly and efficiently using tools emerging from generalized information theory

Uncertainty and Information: Foundations of Generalized Information Theory contains comprehensive and up-to-date coverage of results that have emerged from a research program begun by the author in the early 1990s under the name "generalized information theory" (GIT). This ongoing research program aims to develop a formal mathematical treatment of the interrelated concepts of uncertainty and information in all their varieties. In GIT, as in classical information theory, uncertainty (predictive, retrodictive, diagnostic, prescriptive, and the like) is viewed as a manifestation of information deficiency, while information is viewed as anything capable of reducing the uncertainty. A broad conceptual framework for GIT is obtained by expanding the formalized language of classical set theory to include more expressive formalized languages based on fuzzy sets of various types, and by expanding the classical theory of additive measures to include more expressive non-additive measures of various types.
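As a rough illustration of the two classical measures that GIT generalizes (the Hartley measure of Chapter 2 and the Shannon entropy of Chapter 3), the short Python sketch below computes both for small finite examples. The function names and sample values are illustrative assumptions only and do not come from the book.

import math

def hartley_measure(alternatives):
    # Hartley measure of nonspecificity for a finite set of alternatives: log2 |A|.
    return math.log2(len(alternatives))

def shannon_entropy(probabilities):
    # Shannon entropy of a probability distribution: -sum of p_i * log2(p_i).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally possible alternatives carry 2 bits of uncertainty ...
print(hartley_measure({"a", "b", "c", "d"}))  # 2.0
# ... and this three-outcome distribution carries 1.5 bits.
print(shannon_entropy([0.5, 0.25, 0.25]))     # 1.5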

This landmark book examines each of several theories for dealing with particular types of uncertainty at the following four levels:
* Mathematical formalization of the conceived type of uncertainty
* Calculus for manipulating this particular type of uncertainty
* Justifiable ways of measuring the amount of uncertainty in any situation formalizable in the theory
* Methodological aspects of the theory

With extensive use of examples and illustrations to clarify complex material and demonstrate practical applications, generous historical and bibliographical notes, end-of-chapter exercises to test readers' newfound knowledge, and glossaries, this is an excellent graduate-level textbook, as well as an outstanding reference for researchers and practitioners who deal with the various problems involving uncertainty and information. An Instructor's Manual presenting detailed solutions to all the problems in the book is available from the Wiley editorial department.

Table of Contents

Preface xiii

Acknowledgments xvii

1 Introduction 1

1.1. Uncertainty and Its Significance 1

1.2. Uncertainty-Based Information 6

1.3. Generalized Information Theory 7

1.4. Relevant Terminology and Notation 10

1.5. An Outline of the Book 20

Notes 22

Exercises 23

2 Classical Possibility-Based Uncertainty Theory 26

2.1. Possibility and Necessity Functions 26

2.2. Hartley Measure of Uncertainty for Finite Sets 27

2.2.1. Simple Derivation of the Hartley Measure 28

2.2.2. Uniqueness of the Hartley Measure 29

2.2.3. Basic Properties of the Hartley Measure 31

2.2.4. Examples 35

2.3. Hartley-Like Measure of Uncertainty for Infinite Sets 45

2.3.1. Definition 45

2.3.2. Required Properties 46

2.3.3. Examples 52

Notes 56

Exercises 57

3 Classical Probability-Based Uncertainty Theory 61

3.1. Probability Functions 61

3.1.1. Functions on Finite Sets 62

3.1.2. Functions on Infinite Sets 64

3.1.3. Bayes’ Theorem 66

3.2. Shannon Measure of Uncertainty for Finite Sets 67

3.2.1. Simple Derivation of the Shannon Entropy 69

3.2.2. Uniqueness of the Shannon Entropy 71

3.2.3. Basic Properties of the Shannon Entropy 77

3.2.4. Examples 83

3.3. Shannon-Like Measure of Uncertainty for Infinite Sets 91

Notes 95

Exercises 97

4 Generalized Measures and Imprecise Probabilities 101

4.1. Monotone Measures 101

4.2. Choquet Capacities 106

4.2.1. Möbius Representation 107

4.3. Imprecise Probabilities: General Principles 110

4.3.1. Lower and Upper Probabilities 112

4.3.2. Alternating Choquet Capacities 115

4.3.3. Interaction Representation 116

4.3.4. Möbius Representation 119

4.3.5. Joint and Marginal Imprecise Probabilities 121

4.3.6. Conditional Imprecise Probabilities 122

4.3.7. Noninteraction of Imprecise Probabilities 123

4.4. Arguments for Imprecise Probabilities 129

4.5. Choquet Integral 133

4.6. Unifying Features of Imprecise Probabilities 135

Notes 137

Exercises 139

5 Special Theories of Imprecise Probabilities 143

5.1. An Overview 143

5.2. Graded Possibilities 144

5.2.1. Möbius Representation 149

5.2.2. Ordering of Possibility Profiles 151

5.2.3. Joint and Marginal Possibilities 153

5.2.4. Conditional Possibilities 155

5.2.5. Possibilities on Infinite Sets 158

5.2.6. Some Interpretations of Graded Possibilities 160

5.3. Sugeno λ-Measures 160

5.3.1. Möbius Representation 165

5.4. Belief and Plausibility Measures 166

5.4.1. Joint and Marginal Bodies of Evidence 169

5.4.2. Rules of Combination 170

5.4.3. Special Classes of Bodies of Evidence 174

5.5. Reachable Interval-Valued Probability Distributions 178

5.5.1. Joint and Marginal Interval-Valued Probability Distributions 183

5.6. Other Types of Monotone Measures 185

Notes 186

Exercises 190

6 Measures of Uncertainty and Information 196

6.1. General Discussion 196

6.2. Generalized Hartley Measure for Graded Possibilities 198

6.2.1. Joint and Marginal U-Uncertainties 201

6.2.2. Conditional U-Uncertainty 203

6.2.3. Axiomatic Requirements for the U-Uncertainty 205

6.2.4. U-Uncertainty for Infinite Sets 206

6.3. Generalized Hartley Measure in Dempster–Shafer Theory 209

6.3.1. Joint and Marginal Generalized Hartley Measures 209

6.3.2. Monotonicity of the Generalized Hartley Measure 211

6.3.3. Conditional Generalized Hartley Measures 213

6.4. Generalized Hartley Measure for Convex Sets of Probability Distributions 214

6.5. Generalized Shannon Measure in Dempster–Shafer Theory 216

6.6. Aggregate Uncertainty in Dempster–Shafer Theory 226

6.6.1. General Algorithm for Computing the Aggregate Uncertainty 230

6.6.2. Computing the Aggregate Uncertainty in Possibility Theory 232

6.7. Aggregate Uncertainty for Convex Sets of Probability Distributions 234

6.8. Disaggregated Total Uncertainty 238

6.9. Generalized Shannon Entropy 241

6.10. Alternative View of Disaggregated Total Uncertainty 248

6.11. Unifying Features of Uncertainty Measures 253

Notes 253

Exercises 255

7 Fuzzy Set Theory 260

7.1. An Overview 260

7.2. Basic Concepts of Standard Fuzzy Sets 262

7.3. Operations on Standard Fuzzy Sets 266

7.3.1. Complementation Operations 266

7.3.2. Intersection and Union Operations 267

7.3.3. Combinations of Basic Operations 268

7.3.4. Other Operations 269

7.4. Fuzzy Numbers and Intervals 270

7.4.1. Standard Fuzzy Arithmetic 273

7.4.2. Constrained Fuzzy Arithmetic 274

7.5. Fuzzy Relations 280

7.5.1. Projections and Cylindric Extensions 281

7.5.2. Compositions, Joins, and Inverses 284

7.6. Fuzzy Logic 286

7.6.1. Fuzzy Propositions 287

7.6.2. Approximate Reasoning 293

7.7. Fuzzy Systems 294

7.7.1. Granulation 295

7.7.2. Types of Fuzzy Systems 297

7.7.3. Defuzzification 298

7.8. Nonstandard Fuzzy Sets 299

7.9. Constructing Fuzzy Sets and Operations 303

Notes 305

Exercises 308

8 Fuzzification of Uncertainty Theories 315

8.1. Aspects of Fuzzification 315

8.2. Measures of Fuzziness 321

8.3. Fuzzy-Set Interpretation of Possibility Theory 326

8.4. Probabilities of Fuzzy Events 334

8.5. Fuzzification of Reachable Interval-Valued Probability Distributions 338

8.6. Other Fuzzification Efforts 348

Notes 350

Exercises 351

9 Methodological Issues 355

9.1. An Overview 355

9.2. Principle of Minimum Uncertainty 357

9.2.1. Simplification Problems 358

9.2.2. Conflict-Resolution Problems 364

9.3. Principle of Maximum Uncertainty 369

9.3.1. Principle of Maximum Entropy 369

9.3.2. Principle of Maximum Nonspecificity 373

9.3.3. Principle of Maximum Uncertainty in GIT 375

9.4. Principle of Requisite Generalization 383

9.5. Principle of Uncertainty Invariance 387

9.5.1. Computationally Simple Approximations 388

9.5.2. Probability–Possibility Transformations 390

9.5.3. Approximations of Belief Functions by Necessity Functions 399

9.5.4. Transformations Between λ-Measures and Possibility Measures 402

9.5.5. Approximations of Graded Possibilities by Crisp Possibilities 403

Notes 408

Exercises 411

10 Conclusions 415

10.1. Summary and Assessment of Results in Generalized Information Theory 415

10.2. Main Issues of Current Interest 417

10.3. Long-Term Research Areas 418

10.4. Significance of GIT 419

Notes 421

Appendix A Uniqueness of the U-Uncertainty 425

Appendix B Uniqueness of Generalized Hartley Measure in the Dempster–Shafer Theory 430

Appendix C Correctness of Algorithm 6.1 437

Appendix D Proper Range of Generalized Shannon Entropy 442

Appendix E Maximum of GSa in Section 6.9 447

Appendix F Glossary of Key Concepts 449

Appendix G Glossary of Symbols 455

Bibliography 458

Subject Index 487

Name Index 494


Author Information

GEORGE J. KLIR, PhD, is currently Distinguished Professor of Systems Science at Binghamton University, SUNY. Since immigrating to the U.S. in 1966, he has held positions at UCLA, Fairleigh Dickinson University, and Binghamton University. He is a Life Fellow of IEEE, IFSA, and the Netherlands Institute for Advanced Studies. He has served as president of SGSR, IFSR, NAFIPS, and IFSA. He has published over 300 research papers and sixteen books, and has edited ten books. He has also served as Editor in Chief of the International Journal of General Systems since 1974 and of the IFSR International Book Series on Systems Science and Engineering since 1985. He has received numerous professional awards, including five honorary doctoral degrees, Bernard Bolzano's Gold Medal, Arnold Kaufmann's Gold Medal, and the SUNY Chancellor's Award for "Exemplary Contributions to Research and Scholarship." He is listed in Who's Who in America and Who's Who in the World. His current research interests include intelligent systems, soft computing, generalized information theory, systems modeling and design, fuzzy systems, and the theory of generalized measures. He has guided twenty-nine successful doctoral dissertations in these areas. Some of his research has been funded by grants from NSF, ONR, the United States Air Force, NASA, Sandia Labs, NATO, and various industries.

Reviews

"..will establish a better understanding of the complex concepts…will make significant contributions toward stimulating research in the area of generalized information theory." (Computing Reviews.com, October 17, 2006)

"…contains comprehensive and up-to-date coverage…can serve as a graduate-level text and a reference for researchers and practitioners…" (IEEE Computer Magazine, February 2006)

