Some example Pluto notebooks exploring MEANS (maximum entropy as nested sampling), an approach that uses nested sampling to solve maximum entropy (minimum Kullback-Leibler divergence) problems.
Out of the box, MEANS works only in moderate dimensions (dozens to hundreds), but in this regime it has several interesting advantages over approximate or MCMC-oriented approaches:
- Nested sampling offers clear convergence criteria and has no "burn-in" phase.
- If there is only one active constraint, a single nested sampling run can "map out" the entire problem for any value of the associated Lagrange multiplier $\lambda$.
- The associated evidence (normalizing constant) $\log Z$ and its derivatives with respect to the Lagrange multipliers, $\nabla_\lambda \log Z$, can be evaluated using automatic differentiation (that is, it is possible to differentiate through an entire nested sampling run!).
- Related quantities such as the KL divergence and the density of states can be estimated.
- Prior information can be included naturally.
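The gradient mentioned above satisfies the standard exponential-family identity $\nabla_\lambda \log Z = -\mathbb{E}_p[f]$, where $f$ is the constraint function. This can be checked with automatic differentiation on a toy discrete problem. The sketch below uses JAX rather than the Julia code in the notebooks, and the four-state prior and constraint values are made up for illustration:

```python
import jax
import jax.numpy as jnp

# Hypothetical toy problem: uniform prior p0 over 4 states,
# with constraint function f evaluated at each state.
p0 = jnp.array([0.25, 0.25, 0.25, 0.25])
f = jnp.array([0.0, 1.0, 2.0, 3.0])

def log_Z(lam):
    # log normalizing constant of the maxent distribution p ∝ p0 * exp(-lam * f)
    return jax.scipy.special.logsumexp(jnp.log(p0) - lam * f)

lam = 0.5
g = jax.grad(log_Z)(lam)

# The tilted distribution p, and the identity d/dlam log Z = -E_p[f]
p = p0 * jnp.exp(-lam * f - log_Z(lam))
print(float(g), float(-(p * f).sum()))  # the two values should agree
```

In the notebooks the same kind of gradient is obtained by differentiating through a full nested sampling run instead of an exact sum, but the identity being exploited is the one verified here.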
The .html files are rendered from the notebooks and can be viewed by downloading them or directly via raw.githack.com: