Modeling Priors in Bayesian Inference: Insights from a Number Game Experiment

  • Tech Stack: Python, Bayesian Inference, Number Game, Prior Design, KL Divergence, Importance Sampling, Erlang Distribution
  • GitHub URL: Project Link

In this paper, we extend the number game experiment from Tenenbaum’s “A Bayesian Framework for Concept Learning” to model human priors from empirical data. We design a survey to collect participants’ judgments and analyze the responses within the Bayesian framework. We construct the prior in several ways, comparing different interpretations of the hypothesis space and alternative computations of the lambda parameter. Our results suggest that the perceived complexity of the hypothesis space influences human decision-making.
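
To make the setup concrete, below is a minimal sketch of the number-game posterior predictive in the spirit of Tenenbaum’s framework. The hypothesis set, the mixture weight `lam` between rule-like and interval hypotheses, and the Erlang scale `sigma` over interval lengths are illustrative assumptions for this sketch, not the values estimated in the paper.

```python
import math

# Illustrative constants (assumptions, not fitted values from the paper)
N_MAX = 100
lam = 0.5      # assumed mixture weight: rule-like vs. interval hypotheses
sigma = 10.0   # assumed Erlang scale for interval lengths

# Rule-like hypotheses (a small illustrative subset)
rule_hyps = {
    "even":            set(range(2, N_MAX + 1, 2)),
    "odd":             set(range(1, N_MAX + 1, 2)),
    "multiples_of_10": set(range(10, N_MAX + 1, 10)),
    "powers_of_2":     {2 ** i for i in range(1, 7)},
}

# Interval hypotheses [a, b] over 1..N_MAX
interval_hyps = {(a, b): set(range(a, b + 1))
                 for a in range(1, N_MAX + 1)
                 for b in range(a, N_MAX + 1)}

def erlang(s):
    """Erlang (shape 2) density over an interval length s."""
    return (s / sigma ** 2) * math.exp(-s / sigma)

# Normalizer so the interval hypotheses together carry prior mass (1 - lam)
_z_int = sum(erlang(b - a + 1) for (a, b) in interval_hyps)

def prior(h):
    """Mixture prior: uniform over rules, Erlang-weighted over intervals."""
    if h in rule_hyps:
        return lam / len(rule_hyps)
    a, b = h
    return (1 - lam) * erlang(b - a + 1) / _z_int

def likelihood(data, extension):
    """Size principle: each example is drawn uniformly from the extension."""
    if not set(data) <= extension:
        return 0.0
    return len(extension) ** -len(data)

def predict(data, y):
    """P(y belongs to the concept | data), marginalizing over hypotheses."""
    hyps = {**rule_hyps, **interval_hyps}
    scores = {h: prior(h) * likelihood(data, ext) for h, ext in hyps.items()}
    z = sum(scores.values())
    return sum(s for h, s in scores.items() if y in hyps[h]) / z

print(predict([16, 8, 2, 64], 32))  # high: consistent with "powers of 2"
print(predict([16, 8, 2, 64], 87))  # low: inconsistent with the likely rule
```

With more examples, the size principle concentrates posterior mass on the smallest consistent hypothesis, which is the behavior the survey responses are compared against when estimating the prior.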