Updating Probabilities: A Complex Agent Based Example
It has been shown that one can accommodate data (Bayes) and constraints (MaxEnt) in one method, the method of Maximum (relative) Entropy (ME) (Giffin 2007). In this paper we show a complex agent-based example of inference with two different forms of information: moments and data. In this example, several agents each receive partial information about a system in the form of data. In addition, each agent agrees or is informed that certain global constraints on the system always hold. The agents are then asked to make inferences about the entire system. The system becomes more complex as we add agents and allow them to share information. This system can have a geometrical form, such as a crystal structure. The shape may dictate how the agents are able to share information, such as sharing only with nearest neighbors. This method can be used to model many systems in which agents or cells have local or partial information but must adhere to global rules. It could also illustrate how the agents evolve and illuminate emergent behavior of the system.
💡 Research Summary
The paper presents a unified framework for updating probabilities when both observed data and moment‑type constraints are present, using the method of Maximum (relative) Entropy (ME). Traditional Bayesian inference handles data, while MaxEnt handles constraints; the authors show that ME can accommodate both simultaneously.
First, the authors formalize the problem: a joint prior $P_{\text{old}}(x,\theta)=P_{\text{old}}(\theta)P_{\text{old}}(x|\theta)$ is defined over the space of parameters $\theta$ and data $x$. The observed datum $x'$ is imposed as a constraint $C_1: P(x)=\delta(x-x')$, while a moment constraint $C_2: \int P(x,\theta)f(\theta)\,dx\,d\theta = F$ encodes global information (e.g., expected values). Maximizing the relative entropy
$$S[P,P_{\text{old}}] = -\int P(x,\theta)\log\frac{P(x,\theta)}{P_{\text{old}}(x,\theta)}\,dx\,d\theta$$
subject to both constraints yields the updated joint distribution; imposing only $C_1$ recovers Bayes' rule, while imposing only $C_2$ recovers MaxEnt.
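The combined update can be sketched numerically: process the data constraint as a Bayesian update, then enforce the moment constraint by exponentially tilting the result with a Lagrange multiplier $\beta$ chosen so that the expectation of $f(\theta)$ equals $F$. The grid, prior, likelihood, and target value below are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Toy model (assumed): theta on a grid, standard-normal prior,
# Gaussian likelihood with unit variance.
theta = np.linspace(-5, 5, 1001)
prior = np.exp(-0.5 * theta**2)            # P_old(theta), unnormalized
prior /= prior.sum()

x_obs = 1.0                                 # observed datum x'
lik = np.exp(-0.5 * (x_obs - theta)**2)     # P_old(x'|theta)

# Constraint C1 alone: the data constraint reproduces Bayes' rule.
bayes = prior * lik
bayes /= bayes.sum()

# Constraint C2: <f(theta)> = F, here with f(theta) = theta (assumed).
# ME tilts the Bayes result by exp(beta * f(theta)); the multiplier
# beta is fixed by requiring the constraint to hold.
F = 1.5                                     # assumed target expectation

def tilted(beta):
    p = bayes * np.exp(beta * theta)
    return p / p.sum()

def moment_gap(beta):
    return (tilted(beta) * theta).sum() - F

beta_star = brentq(moment_gap, -10.0, 10.0)
posterior = tilted(beta_star)

print("Bayes mean:", (bayes * theta).sum())
print("ME mean   :", (posterior * theta).sum())  # matches F by construction
```

The root-finding step is a design choice for the one-dimensional case; with several moment constraints one would solve for a vector of multipliers instead.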