A generalized maximum entropy principle for decision analysis

by Marlin U. Thomas


Published by Naval Postgraduate School in Monterey, Calif.
Written in English

Subjects:

  • Bayesian statistical decision theory
  • Statistical decision

Book details:

About the Edition

A generalized maximum entropy principle is described for dealing with decision problems involving uncertainty but with some prior knowledge about the probability space corresponding to nature. This knowledge about the probabilistic structure is expressed through known bounds on event probabilities and moments, which is incorporated into a nonlinear programming problem. The solution provides a maximum entropy distribution which is then used in treating the decision problem as one involving risk. An example application is described that involves the selection of oil spill recovery systems for inland harbor regions. Other areas of application are identified and tables of some maximum entropy distributions resulting from a variety of moment constraints are provided.
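The nonlinear program described in the abstract can be sketched numerically. The snippet below is an illustrative stand-in, not the report's own formulation: the states of nature, the probability bounds, and the moment limits are invented placeholders, and SciPy's general-purpose SLSQP solver plays the role of whatever nonlinear programming method the report uses.

# Minimal sketch of a maximum entropy distribution under bounds on event
# probabilities and on the first moment (all numbers below are hypothetical).
import numpy as np
from scipy.optimize import minimize

states = np.array([1.0, 2.0, 3.0, 4.0])      # hypothetical states of nature
p_lo = np.array([0.05, 0.05, 0.05, 0.05])    # assumed lower bounds on P(state)
p_hi = np.array([0.60, 0.60, 0.60, 0.60])    # assumed upper bounds on P(state)
mean_lo, mean_hi = 2.0, 2.8                  # assumed bounds on E[X]

def neg_entropy(p):
    # negative Shannon entropy; the small constant guards against log(0)
    return float(np.sum(p * np.log(p + 1e-12)))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},         # probabilities sum to 1
    {"type": "ineq", "fun": lambda p: states @ p - mean_lo},  # E[X] >= mean_lo
    {"type": "ineq", "fun": lambda p: mean_hi - states @ p},  # E[X] <= mean_hi
]

result = minimize(neg_entropy, x0=np.full(4, 0.25),
                  bounds=list(zip(p_lo, p_hi)),
                  constraints=constraints, method="SLSQP")
print("maximum entropy distribution:", np.round(result.x, 4))

The distribution found this way would then feed the second stage the abstract describes: treating the decision, such as choosing among oil spill recovery systems, as a problem of decision under risk with known probabilities.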

Edition Notes

Statement: by Marlin U. Thomas
Contributions: Naval Postgraduate School (U.S.)
The Physical Object
Pagination: 21 p.
Number of Pages: 21
ID Numbers
Open Library: OL25497964M
OCLC/WorldCat: 424625442


Choose the distribution that minimizes the relative entropy with respect to the default estimate q0. When q0 is uniform this is the same as maximizing the entropy. Here, as usual, the entropy of a distribution p is defined as H(p) = E_p[ln(1/p)] and the relative entropy, or Kullback-Leibler divergence, as D(p ‖ q) = E_p[ln(p/q)]. The maximum entropy principle is thus the special case of minimum relative entropy in which the default estimate is uniform.

The Maximum Entropy Principle also eliminates the mystery as to the origin of the mathematical expressions underlying all probability distributions. The MEP derivation for the Gaussian and generalized Cauchy distributions is shown in detail. The MEP is also related to Fisher information and the Kullback-Leibler measure of relative entropy.

The Principle of Maximum Entropy: let us go back to property 4, that the uncertainty is maximum when the outcomes are equally likely. The uniform distribution maximizes the entropy; the uniform distribution contains the largest amount of uncertainty. But this is just Laplace's Principle of Insufficient Reason.
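As a quick numerical check of the two definitions above, the following sketch (an illustrative example with arbitrary hand-picked distributions, not taken from the excerpted sources) computes H(p) and D(p ‖ q) and confirms that the uniform distribution attains the largest entropy over four outcomes.

# Entropy and relative entropy for small discrete distributions (illustrative values).
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    return float(np.sum(p * np.log(1.0 / p)))

def kl_divergence(p, q):
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.70, 0.10, 0.10, 0.10]

print(entropy(uniform))                # ln(4) ~= 1.3863, the maximum for 4 outcomes
print(entropy(skewed))                 # ~= 0.9404, strictly smaller
print(kl_divergence(skewed, uniform))  # ~= 0.4459 = ln(4) - H(skewed)

When q is uniform over n outcomes, D(p ‖ q) = ln n − H(p), which is why minimizing relative entropy against a uniform default is the same as maximizing entropy.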

…event probabilities (expressed as pdfs), the Generalized principle of Maximum Entropy (GME). We have not previously seen this generalization in the literature.

Before the Principle of Maximum Entropy can be used, the problem domain needs to be set up. In cases involving physical systems, this means that the various states in which the system can exist need to be identified, and all the parameters involved in the constraints determined.
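In the standard textbook form of that setup (a generic sketch, not quoted from any of the works above), one lists the states x_1, …, x_n, writes the known moment information as constraints, and maximizes the entropy; the Lagrange-multiplier conditions then give the familiar exponential-family solution:

\begin{align}
  \max_{p}\ & H(p) = -\sum_{i=1}^{n} p_i \ln p_i \\
  \text{subject to}\ & \sum_{i=1}^{n} p_i = 1, \qquad
     \sum_{i=1}^{n} p_i f_j(x_i) = F_j, \quad j = 1,\dots,m, \\
  \Rightarrow\quad & p_i =
     \frac{\exp\bigl(-\sum_{j=1}^{m}\lambda_j f_j(x_i)\bigr)}
          {\sum_{k=1}^{n}\exp\bigl(-\sum_{j=1}^{m}\lambda_j f_j(x_k)\bigr)} .
\end{align}

A Gaussian maximum entropy distribution, for instance, arises when the constraints fix the first two moments, which is the kind of derivation the MEP excerpt above alludes to.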

Generalized Maximum Entropy (GME) was first proposed by A. Golan as an extension of the well-known Maximum Entropy (ME) principle developed by E.T. Jaynes in the past century. Jaynes's idea was mainly based on the basic features of Shannon's Information Theory and the related entropy measure. Broadly speaking, entropy can be viewed as a measure of the uncertainty carried by a probability distribution.

C.E. Shannon's seminal discovery of his entropy measure in connection with communication theory has found useful applications in several other probabilistic systems. E.T. Jaynes further extended its scope by discovering the maximum entropy principle (MaxEnt), which is inherent in the process of optimizing the entropy measure when only incomplete information is given.

He has since held courses in Descriptive Statistics and Multivariate Statistics. His methodological research concerns models of multivariate analysis and structural equation models based on parametric (maximum likelihood), non-parametric (partial least squares, PLS), and semi-parametric (Generalized Maximum Entropy) estimators.

This book is dedicated to Prof. J. Kapur and his contributions to the field of entropy measures and maximum entropy applications. Eminent scholars in various fields of applied information theory have been invited to contribute to this Festschrift, collected on the occasion of his 75th birthday. The articles cover topics in the areas of the physical, biological, engineering and social sciences.
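For concreteness, Golan-style GME estimation for a linear model y = Xβ + e is usually written along the following lines; the support vectors z_k and v are chosen by the analyst, and this is a generic sketch rather than a formulation taken from any of the works mentioned above.

\begin{align}
  \beta_k &= \sum_{m=1}^{M} z_{km}\, p_{km}, \qquad
  e_i = \sum_{j=1}^{J} v_{j}\, w_{ij}, \\
  \max_{p,\,w}\ & -\sum_{k,m} p_{km}\ln p_{km} - \sum_{i,j} w_{ij}\ln w_{ij} \\
  \text{subject to}\ & y_i = \sum_{k} x_{ik}\sum_{m} z_{km} p_{km} + \sum_{j} v_{j} w_{ij},
  \qquad \sum_{m} p_{km} = 1, \qquad \sum_{j} w_{ij} = 1 .
\end{align}

The coefficients are thus recovered as expected values over their support points, and the entropy objective pulls the estimates toward the most uniform, least committed weights consistent with the data.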