Bayes Forum Purpose.
The idea is to bring together people on the Munich Garching Campus interested in applying Bayesian methods and information theory to their research. The emphasis is on astronomy/astrophysics/cosmology, but other applications are welcome. Typical applications are model fitting/evaluation, image deconvolution, and spectral analysis. More general statistical topics will also be included. Software packages for Bayesian analysis are another focus. The initial announcement in 2011 met with a large response, and currently we have about 170 people signed up on the mailing list. The meetings are very well attended and give us the motivation to pursue this initiative, now in its 6th year. We aim for one meeting per month, but we are flexible; Friday is the preferred day. We depend on the input of the participants: your views are important and can be shared via this mailing list and at the meetings.
February 2015: We were pleased to report that we received financial support from the Excellence Cluster Universe (TUM), allowing us to invite more external speakers in future. January 2016: We were pleased to announce the continued support from the Excellence Cluster Universe (TUM) for that year. February 2017: We welcome Stella Veith (MPA), who will help with the organization. February 2017: We are pleased to announce the continued support from the Excellence Cluster Universe (TUM) for this year.
Organizers.
Fabrizia Guglielmetti (ESO), Andy Strong (MPE), Torsten Ensslin (MPA), Paul Nandra (MPE), Frederik Beaujean (LMU), Allen Caldwell (MPP), Udo von Toussaint (IPP), Stella Veith (MPA)
Mailing list. This is used to inform about meetings, and also can be used to exchange information between members. To subscribe/unsubscribe visit https://lists.mpe.mpg.de/mailman/listinfo/bayesforum
Programme
NEXT MEETING TBD
Sponsored by the Excellence Cluster Universe, TU Munich: Mar 17 2017 Friday: Ata Metin (AIP Potsdam) MPA Seminar Room E.0.11 14:00 Phase-space reconstruction of the cosmic large-scale structure Abstract: In the current cosmological understanding, the clustering and dynamics of galaxies is driven by the underlying dark matter budget. I will present a framework to jointly reconstruct the density and velocity field of the cosmological large-scale structure of dark matter. To this end, I introduce the ARGO code, a statistical reconstruction method, and will focus on the bias description I use to connect galaxy and dark matter density, as well as the perturbative description to correct for redshift-space distortions arising in galaxy redshift surveys. Finally, I will discuss the results of the application of ARGO to the SDSS BOSS galaxy catalogue.
Mar 10 2017 Friday: Reimar Leike (MPA) Field dynamics via approximate Bayesian reasoning
Feb 24 2017 Friday: Fabrizia Guglielmetti (ESO)
Surrogate minimization in high dimensions talk as pdf Abstract: Image interpretation is an ill-posed inverse problem, requiring inference
theory methods for a robust solution.
Bayesian statistics provides the proper principles of inference to resolve
the ill-posedness in astronomical images, enabling explicit declaration of the
relevant information entering the physical models. Furthermore, Bayesian
methods require the application of models that are moderately to extremely
computationally expensive. Often, the Maximum a Posteriori (MAP) solution
is used to estimate the most probable signal configuration (and its
uncertainties) from the posterior pdf of the signal given the observed
data. Locating the MAP solution becomes a numerically challenging problem,
especially when estimating a complex objective function defined over a
high-dimensional design domain. Therefore, there is a need to utilize
fast emulators for much of the required computation.
We propose to use Kriging surrogates to speed up optimization schemes
such as steepest descent. Kriging surrogate models are built and incorporated
into a sequential optimization strategy. Results are presented with
application to astronomical images, showing that the proposed method can
effectively search for the global optimum.
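The sequential strategy can be sketched in a few lines of Python. This is a toy 1-d illustration with an invented objective and a fixed kernel length scale, not code from the talk:

```python
import numpy as np

def kriging_fit(X, y, ell=0.5, jitter=1e-6):
    """Fit a simple Kriging (GP) surrogate with a squared-exponential kernel."""
    K = np.exp(-0.5 * (X[:, None] - X[None, :])**2 / ell**2)
    return np.linalg.solve(K + jitter * np.eye(len(X)), y)

def kriging_predict(Xs, X, alpha, ell=0.5):
    Ks = np.exp(-0.5 * (Xs[:, None] - X[None, :])**2 / ell**2)
    return Ks @ alpha

def expensive_objective(x):          # stand-in for a costly posterior evaluation
    return (x - 1.3)**2 + 0.3 * np.sin(5 * x)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 3, 5)            # initial design points
y = expensive_objective(X)
grid = np.linspace(-2, 3, 400)

for _ in range(10):                  # sequential optimization loop
    alpha = kriging_fit(X, y)
    mu = kriging_predict(grid, X, alpha)
    x_new = grid[np.argmin(mu)]      # minimize the cheap surrogate ...
    X = np.append(X, x_new)          # ... and spend one true evaluation there
    y = np.append(y, expensive_objective(x_new))

print("best x found:", X[np.argmin(y)])
```

Each loop iteration spends only one expensive evaluation, at the minimum of the cheap surrogate; in practice the kernel hyperparameters would be re-estimated and an exploration term added.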
Sponsored by the Excellence Cluster Universe, TU Munich: 2 December 2016 Friday: Steven Gratton (Institute of Astronomy, Cambridge) Accurate Inference with Incomplete Information
Abstract: This talk will introduce a general scheme for computing posteriors in situations
in which only a partial description of the data's sampling distribution
is available. The method is primarily calculation-based and uses maximum
entropy reasoning.
18 November 2016 Friday: David Yu (MPE): Simultaneous Bayesian Location and Spectral Analysis of Gamma-Ray Bursts talk as pdf
Abstract: We describe a novel paradigm for gamma-ray burst (GRB) location and spectral
analysis, BAyesian Location Reconstruction Of GRBs (BALROG). The Fermi Gamma-ray
Burst Monitor (GBM) is a gamma-ray photon-counting instrument. The observed GBM
photon flux is a convolution of the true energy flux with the detector response
matrices (DRMs) of all the GBM detectors. The DRMs depend on the spacecraft
orientation relative to the GRB location on the sky. Precise and accurate spectral
deconvolution thus requires accurate location; precise and accurate location also
requires accurate determination of the spectral shape. We demonstrate that BALROG
eliminates the systematics of conventional approaches by simultaneously fitting for
the location and spectrum of GRBs. It also correctly incorporates the uncertainties
in the location of a transient into the spectral parameters and produces reliable
positional uncertainties both for well-localized GRBs and for those for which the
conventional GBM analysis method cannot effectively constrain the position.
Sponsored by the Excellence Cluster Universe, TU Munich: 7 October 2016 Friday: Reinhard Prix (Albert-Einstein-Institut, Hannover) Bayesian methods in the search for gravitational waves talk as pdf
Abstract: Bayesian methods have found a number of uses in the search for
gravitational waves, ranging from parameter-estimation problems (the
most obvious application) to detection methods. I will give a short
overview of the various Bayesian applications (that I am aware of)
in this field, and will then mostly focus on a few selected examples
that I am the most familiar with and that highlight particular challenges
or interesting aspects of this approach. Sponsored by the Excellence Cluster Universe, TU Munich: 16 September 2016 Friday: Ruediger Schack (Royal Holloway College, Univ. London, UK) Max-Planck-Institut fuer Physik (Heisenberg-Institut), Foehringer Ring 6 (U-Bahn Studentenstadt), main auditorium. 14:00. An introduction to QBism Abstract: QBism is an approach to quantum theory which is grounded in the personalist conception of probability pioneered by Ramsey, de Finetti and Savage. According to QBism, a quantum state represents an agent's personal degrees of belief regarding the consequences of her actions on her external world. The quantum formalism provides consistency criteria that enable the agent to make better decisions. This talk gives an introduction to QBism and addresses a number of foundational topics from a QBist perspective, including the question of quantum nonlocality, the quantum measurement problem, the EPR (Einstein, Podolsky and Rosen) criterion of reality, and the recent no-go theorems for epistemic interpretations of quantum states.
Sponsored by the Excellence Cluster Universe, TU Munich: 29 July 2016 Friday: Henrik Junklewitz (Argelander-Institut fuer Astronomie, Bonn) Statistical Inference in Radio Astronomy: Bayesian Radio Imaging with the RESOLVE package Abstract: Imaging and data analysis have become an ever more pressing issue in modern astronomy, with ever larger and more complex data sets available to the scientist. This is particularly true in radio astronomy, where a number of new interferometric instruments are now available or will be in the foreseeable future, offering unprecedented data quality but also posing challenges to existing data analysis tools. In this talk, I present the growing RESOLVE package, a collection of newly developed radio interferometric imaging methods firmly based on Bayesian inference and information field theory. The algorithm package can handle the total-intensity image reconstruction of extended and point sources, can take multi-frequency data into account, and is in the development stage for polarization analysis as well. It is the first radio imaging method to date that can provide an estimate of the statistical image uncertainty, which is not possible with current standard methods. The presentation includes a theoretical introduction to the inference principles being used as well as a number of application examples.
17 June 2016 Friday: Maksim Greiner (MPA) Galactic Tomography talk as pdf Abstract: Tomography problems can be found in astronomy and medical imaging.
They are complex inversion problems which demand regularization. I will
demonstrate how a Bayesian setup automatically provides such a regularization
through the prior. The setup is applied to two realistic scenarios: Galactic
tomography of the free electron density and medical computed tomography.
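For a linear measurement with Gaussian noise and a Gaussian prior, the regularization supplied by the prior can be seen explicitly: the posterior mean is the Wiener filter. A minimal sketch with an invented 1-d signal and random response matrix (not the talk's actual setup):

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_data = 50, 20

# Toy linear tomography: each datum is a weighted "line integral" of the signal.
R = rng.uniform(0, 1, (n_data, n_pix))
s_true = np.sin(np.linspace(0, 3 * np.pi, n_pix))   # unknown field
noise_var = 0.1
d = R @ s_true + rng.normal(0, np.sqrt(noise_var), n_data)

# Gaussian prior s ~ N(0, S) acts as the regularizer; here S = I for simplicity.
S = np.eye(n_pix)
N = noise_var * np.eye(n_data)

# Posterior mean (Wiener filter): m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d
D = np.linalg.inv(np.linalg.inv(S) + R.T @ np.linalg.inv(N) @ R)
m = D @ (R.T @ np.linalg.inv(N) @ d)

# Without the prior the 20x50 system is underdetermined; with it, m is unique.
print("relative error of posterior mean:",
      np.linalg.norm(m - s_true) / np.linalg.norm(s_true))
```

The 20 data points alone cannot determine 50 pixels; the prior covariance S supplies exactly the missing information, which is the point made in the abstract.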
Sponsored by the Excellence Cluster Universe, TU Munich: 29 April 2016: Kevin H. Knuth (University at Albany (SUNY) )
Modern Probability Theory talk as pdf Abstract: A theory of logical inference should be all-encompassing, applying to any subject about which inferences are to be made. This includes problems ranging from the early applications to games of chance, to modern applications involving astronomy, biology, chemistry, geology, jurisprudence, physics, signal processing, sociology, and even quantum mechanics. This talk focuses on how the theory of inference has evolved in recent history: expanding in scope, solidifying its foundations, deepening its insights, and growing in calculational power.
Additional talk by Kevin H. Knuth at Excellence Cluster 28 April 2016 Rethinking the Foundations talk as pdf
Sponsored by the Excellence Cluster Universe, TU Munich: 4 April 2016 James Berger (Duke Univ.) Bayesian Multiplicity Control talk as pdf Abstract: Issues of multiplicity in testing are increasingly being encountered in a wide range of disciplines, as the growing complexity of data allows for consideration of a multitude of possible tests (e.g., look-elsewhere effects in Higgs discovery, gravitational-wave discovery, etc.). Failure to properly adjust for multiplicities is one of the major causes of the apparently increasing lack of reproducibility in science. The way that Bayesian analysis does (and sometimes does not) deal with multiplicity will be discussed. Different types of multiplicities that are encountered in science will also be introduced, along with a discussion of the current status of multiplicity control (or lack thereof).
Sponsored by the Excellence Cluster Universe, TU Munich: 4 March 2016 (Friday): Stephan Hartmann (LMU, Mathematical Philosophy Dept.) Learning Causal Conditionals talk as pdf
Abstract: Modeling how to learn an indicative conditional (“if A then B”) has been a major
challenge for Bayesian epistemologists. One proposal to meet this
challenge is to construct the posterior probability distribution by
minimizing the Kullback-Leibler divergence between the posterior
probability distribution and the prior probability distribution, taking
the learned information as a constraint (expressed as a conditional
probability statement) into account. This proposal has been criticized in
the literature based on several clever examples. In this talk, I will
revisit four of these examples and show that one obtains intuitively
correct results for the posterior probability distribution if the
underlying probabilistic models reflect the causal structure of the
scenarios in question.
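The minimization described above can be reproduced numerically on a four-atom probability space. In the sketch below the prior numbers and the learned conditional are invented; for a fixed Q(A) the optimal split inside A and inside not-A follows from the Lagrange conditions, so only a one-dimensional scan remains:

```python
import numpy as np

# Prior over the four atoms (A&B, A&~B, ~A&B, ~A&~B); illustrative numbers.
p = np.array([0.2, 0.3, 0.1, 0.4])
q = 0.9                          # learned conditional: Q(B|A) = 0.9

# For fixed t = Q(A), KL(Q||P) is minimized by splitting A as q : (1-q)
# and keeping the prior proportions inside ~A, so we scan only over t.
ts = np.linspace(1e-4, 1 - 1e-4, 200001)
kl = (ts * q * np.log(ts * q / p[0])
      + ts * (1 - q) * np.log(ts * (1 - q) / p[1])
      + (1 - ts) * np.log((1 - ts) / (p[2] + p[3])))
t_star = ts[np.argmin(kl)]

print("prior P(A)     =", p[0] + p[1])
print("posterior Q(A) =", t_star)
```

The run illustrates the much-debated feature of this update rule: the probability of the antecedent A moves away from its prior value even though only the conditional was learned, which is what the counterexamples in the literature exploit.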
19 February 2016 (Friday): Allen Caldwell (MPP)
Confidence and Credible Intervals talk as pdf
Abstract: There is considerable confusion in our community concerning the definition and
meaning of frequentist confidence intervals and Bayesian credible intervals. We
will review how they are constructed and compare and contrast what they mean and
how to use them. In particular for frequentist intervals, there are different
popular constructions; they will be defined and compared with the Bayesian
intervals using examples taken from typical situations faced in experimental work.
18 February 2016 (Thursday) Frederik Beaujean (LMU) talk at Excellence Cluster Universe, 1st floor, 12:30
Bayes vs frequentist: why should I care?
Abstract: There is considerable confusion among physicists regarding the two main
interpretations of probability theory. I will review the basics and then
discuss some common mistakes to clarify which questions can be asked and
how to interpret the numbers that come out.
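A standard toy example of the contrast: interval estimates for a binomial success probability. The sketch below (invented counts, not from the talk) compares a Wald confidence interval with a flat-prior credible interval computed on a grid:

```python
import numpy as np

k, n = 3, 20                       # observed successes out of n trials

# Frequentist: Wald 95% confidence interval for the success probability.
phat = k / n
se = np.sqrt(phat * (1 - phat) / n)
wald = (phat - 1.96 * se, phat + 1.96 * se)

# Bayesian: central 95% credible interval from a flat-prior (Beta(1,1))
# posterior, computed on a grid to avoid special functions.
p = np.linspace(1e-6, 1 - 1e-6, 100001)
log_post = k * np.log(p) + (n - k) * np.log(1 - p)
post = np.exp(log_post - log_post.max())
cdf = np.cumsum(post)
cdf /= cdf[-1]
cred = (p[np.searchsorted(cdf, 0.025)], p[np.searchsorted(cdf, 0.975)])

print(f"Wald 95% CI:     ({wald[0]:.3f}, {wald[1]:.3f})")
print(f"credible 95% CI: ({cred[0]:.3f}, {cred[1]:.3f})")
```

With few successes the Wald interval extends below zero, while the credible interval stays inside [0, 1]; which behaviour one wants depends on which question one is asking, which is precisely the topic of the talk.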
22 January 2016 (Friday): Daniel Pumpe (MPA)
Dynamic system classifier talk as pdf
Abstract: Stochastic differential equations (SDEs) describe well many physical,
biological and sociological systems, despite the simplifications often
made in their description. Here the usage of simple SDEs to characterize
and classify complex dynamical systems is proposed within a Bayesian
framework. To this end, the algorithm 'dynamic system classifier' (DSC)
is developed. The DSC first abstracts training data of a system in terms
of time-dependent coefficients of the descriptive SDE. This then permits
the DSC to identify unique features within the training data. For
definiteness we restrict ourselves to oscillation processes with a
time-varying frequency ω(t) and damping factor γ(t).
Although real systems might be more complex, this simple oscillating SDE with
time-varying coefficients can capture many of their characteristic
features. The ω and γ timelines represent the abstract system
characterization and permit the construction of efficient classifiers.
Numerical experiments show that the classifiers perform well even in the
low signal-to-noise regime.
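The descriptive SDE of the abstract can be simulated with an Euler-Maruyama scheme; the sketch below uses invented ω(t) and γ(t) timelines, not those inferred by the DSC:

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(T=20.0, dt=1e-3, sigma=0.3):
    """Euler-Maruyama simulation of a noisy damped oscillator with a drifting
    frequency omega(t) and damping gamma(t) (toy choices, not from the talk)."""
    n = int(T / dt)
    t = np.arange(n) * dt
    omega = 2.0 + 0.5 * np.sin(0.3 * t)      # time-varying frequency
    gamma = 0.1 + 0.05 * t / T               # slowly increasing damping
    x, v = np.empty(n), np.empty(n)
    x[0], v[0] = 1.0, 0.0
    for i in range(n - 1):
        x[i + 1] = x[i] + v[i] * dt
        v[i + 1] = (v[i] + (-gamma[i] * v[i] - omega[i]**2 * x[i]) * dt
                    + sigma * np.sqrt(dt) * rng.standard_normal())
    return t, x

t, x = simulate()
print("simulated", len(x), "samples; final amplitude", abs(x[-1]))
```

In the DSC setting such trajectories form the training data, and the inference runs the other way: from observed x(t) back to the ω and γ timelines used for classification.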
Special additional seminar: 21 December 2015 (Monday) : Jan Leike (Australian National University)
Bayesian Reinforcement Learning talk as pdf Abstract: Reinforcement learning (RL) is a subdiscipline of machine learning that
studies algorithms that learn to act in an unknown environment through
trial and error; the goal is to maximize a numeric reward signal. We
introduce the Bayesian RL agent AIXI, which is based on the universal
(Solomonoff) prior. This prior is incomputable, and thus our focus is not
on practical algorithms, but rather on the fundamental problems with the
Bayesian approach to RL and potential solutions.
Sponsored by the Excellence Cluster Universe, TU Munich: 11 December 2015: Will Handley (Kavli Institute, Cambridge) wh260@mrao.cam.ac.uk
PolyChord: next-generation nested sampling talk as pdf talk as tar including movies
Abstract: PolyChord is a novel Bayesian inference tool for high-dimensional
parameter estimation and model comparison. It represents the latest
advance in nested sampling technology, and is the natural successor to
MultiNest. The algorithm uses John Skilling's slice sampling, utilising
a slice-sampling Markov chain Monte Carlo approach for the generation of
new live points. It has cubic scaling with dimensionality, and is
capable of exploring highly degenerate multimodal distributions.
Further, it is capable of exploiting a hierarchy of parameter speeds
present in many cosmological likelihoods.
In this talk I will give a brief account of nested sampling and the
workings of PolyChord. I will then demonstrate its efficacy by
application to challenging toy likelihoods and real-world cosmology problems.
PolyChord website
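The core nested-sampling loop that PolyChord builds on can be sketched in a 2-d toy problem, where the constrained prior draw can be done by plain rejection instead of the slice-sampling MCMC that PolyChord uses:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy problem: uniform prior on the box [-5, 5]^2, unit-normalized Gaussian
# likelihood, so the evidence is analytically ~ 1 / (box area) = 0.01.
def loglike(theta):
    return -0.5 * np.sum(theta**2) - np.log(2 * np.pi)

nlive, niter = 200, 1200
live = rng.uniform(-5, 5, (nlive, 2))
logL = np.array([loglike(t) for t in live])
logZ, logX = -np.inf, 0.0                    # evidence, log prior volume left

for _ in range(niter):
    worst = np.argmin(logL)
    logX_new = logX - 1.0 / nlive            # volume shrinks by ~e^(-1/nlive)
    logw = logX + np.log(1 - np.exp(-1.0 / nlive))   # shell volume
    logZ = np.logaddexp(logZ, logw + logL[worst])
    # Replace the worst live point by a prior draw above the threshold.
    # (PolyChord does this step with slice-sampling MCMC; plain rejection
    # is only viable in a low-dimensional toy like this one.)
    while True:
        cand = rng.uniform(-5, 5, 2)
        if loglike(cand) > logL[worst]:
            live[worst], logL[worst] = cand, loglike(cand)
            break
    logX = logX_new

# Add the contribution of the remaining live points.
logZ = np.logaddexp(logZ, logX + np.log(np.mean(np.exp(logL))))
print("nested-sampling log-evidence:", logZ, " analytic:", np.log(0.01))
```

The rejection step is exactly what becomes hopeless in high dimensions; replacing it with slice-sampling MCMC over the constrained prior is the advance the abstract refers to.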
27 November 2015 Hans Eggers (Univ. Stellenbosch; TUM; Excellence Cluster) The Gaussian onion: priors for a spherical world talk as pdf
Hans Eggers and Michiel de Kock
Abstract: Computational approaches to model comparison are often necessary, but
they are expensive. Analytical Bayesian methods remain useful as
benchmarks and signposts. While Gaussian likelihoods with linear
parametrisations are usually considered a closed subject, more
detailed inspection reveals successive layers of issues. In
particular, comparison of models with parameter spaces of different
dimension inevitably retains an unwanted dependence on prior
meta-parameters such as the cutoffs of uniform priors. Reducing the
problem to symmetry on the hypersphere surface and its radius, we
formulate a prior that treats different parameter-space dimensions
fairly. The resulting "r-prior" yields closed forms for the evidence
for a number of choices, including three priors from the statistics
literature, which hence emerge as special cases. The r-prior may
therefore point the way towards a better understanding of the inner
mathematical structure, which is not yet fully understood. Simple
simulations show that the current interim formulation performs as well
as other model comparison approaches. However, the performance of the
different approaches varies considerably, and more numerical work is
needed to obtain a comprehensive picture.
16 October 2015: Christian Robert (Universite Paris-Dauphine) Approximate Bayesian Computation (ABC) for model choice
Abstract: Introduced in the late 1990's, the ABC method can be considered from several
perspectives, ranging from a purely practical motivation towards handling complex
likelihoods to nonparametric justifications. We propose here a different analysis of ABC techniques
and in particular of ABC model selection. Our exploration focuses on the idea that generic machine
learning tools like random forests (Breiman, 2001) can help in conducting model selection among
the highly complex models covered by ABC algorithms. Both theoretical and algorithmic output
indicate that posterior probabilities are poorly estimated by ABC.
I will describe how our search for an alternative first led us to abandon the use of posterior
probabilities of the models under comparison as evidence tools. As a first substitute, we proposed to
select the most likely model via a random forest procedure and to compute posterior predictive
performances of the corresponding ABC selection method. It is only recently that we realised that
random forest methods can also be adapted to the further estimation of the posterior probability
of the selected model. I will also discuss our recommendation towards sparse implementation of the
random forest tree construction, using severe subsampling and reduced reference tables. The
performance in terms of power in model choice and gain in computation time of the resulting
ABC random forest methodology is illustrated on several population genetics datasets.
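Basic ABC model choice, whose poor estimates of posterior probabilities motivate the random-forest substitute, can be sketched as follows (two invented Gaussian models and a single summary statistic, not an example from the talk):

```python
import numpy as np

rng = np.random.default_rng(8)

# Two toy models for the same data: M0 = N(0, 1), M1 = N(0, 2).
obs = rng.normal(0.0, 1.0, 100)            # data actually drawn from M0

def simulate(m):
    return rng.normal(0.0, 1.0 if m == 0 else 2.0, 100)

def distance(a, b):
    # Summary statistic: the sample standard deviation.
    return abs(a.std() - b.std())

# ABC model choice by rejection: draw a model index from the (uniform)
# model prior, simulate, accept if the summaries are close. The accepted
# fraction estimates the posterior model probability -- the quantity the
# talk argues is poorly approximated by ABC.
kept = []
for _ in range(20000):
    m = rng.integers(2)
    if distance(simulate(m), obs) < 0.1:
        kept.append(m)

kept = np.array(kept)
print("accepted:", len(kept), " estimated P(M0 | data):", np.mean(kept == 0))
```

A random-forest version would instead train a classifier on (summary, model-index) pairs from the reference table and predict the model for the observed summaries, rather than thresholding a single distance.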
29 September 2015: John Skilling (MEDC) Bayesian tomography talk as pdf
Abstract: Tomography measures the density of a body (usually by
its opacity to X-rays) along lines through it.
From these line integrals, we reconstruct an image of the density as a function of position, ρ(x).
The user then interprets this in terms of material classification (in medicine: bone, muscle, fat,
fluid, cavity, etc.).
Bayesian analysis does not need to pass through the intermediate step of density: it can start directly
with a prior probability distribution over plausible material models. This expresses the user's judgement
about which models are thought plausible, and which are not.
That prior is modulated by the data (through the likelihood function) to give the posterior distribution of
those images that remain plausible after the data are used.
Usually, only a tiny proportion, O(e^{-(size of dataset)}), of prior possibilities survives into the posterior, so
that Bayesian analysis was essentially impossible before computers. Even with computers, direct search is
impossible in large dimensions, and we need numerical methods such as nested sampling to guide the exploration.
But this can now be done, easily and generally. The exponential curse of dimensionality is removed. Starting
directly from material models enables data fusion, where different modalities (such as CT, MRI, ultrasound)
can all be brought to bear on the same material model, with results that are clearer and more informative than
any individual analysis. The aim is to use sympathetic prior models along with modern algorithms to advance the
state of the art.
24 July 2015 Maria Bergemann (MPIA Heidelberg)
Stellar and Galactic Archaeology with Bayesian Methods talk as pdf
Abstract: Spectroscopic stellar observations have shaped our understanding of stars
and galaxies. This is because spectra of stars are the only way to
determine their chemical composition, which is the fundamental resource to
study cosmic nucleosynthesis in different environments and on different
timescales. Research in this field has never been more exciting and
important to astronomy: the ongoing and future large-scale stellar
spectroscopic surveys are making gigantic steps along the way towards
high-precision stellar, Galactic, and extragalactic archaeology.
However, the data we extract from stellar spectra are not,
strictly speaking, 'observational'. These data (fundamental parameters and
chemical abundances) heavily rely upon physical models of stars and
statistical methods to compare the model predictions with raw
observations. I will describe our efforts to provide the most realistic
models of stellar spectra, based upon 3D non-local thermodynamic
equilibrium physics, and the new Bayesian spectroscopy approach for the
quantitative analysis of the data. I will show how our new methods and
models transform quantitative spectroscopy, and discuss implications for
stellar and Galactic evolution.
17 July 2015 Igor Ovchinnikov (UCLA) Supersymmetry in a classical world: new insights on stochastic dynamics
from topological field theory talk as pdf
Abstract: Prominent long-range phenomena in natural dynamical systems, such
as the 1/f and flicker noises and the power-law statistics observed,
e.g., in solar flares, gamma-ray bursts, and neuro-avalanches,
are still not fully understood. A new framework sheds light on the
mechanism of emergence of these rich phenomena from the point of
view of the stochastic dynamics of their underlying systems. This framework
exploits an exact correspondence of any stochastic differential
equation (SDE) with a dual (topological) supersymmetric model. This
talk introduces the basic ideas of the framework. It will show that
the emergent long-range dynamical behavior in Nature can be
understood as the result of the spontaneous breakdown of the
topological supersymmetry that all SDEs possess. Surprisingly
enough, the concept of supersymmetry, devised mainly for particle
physics, has direct applicability and predictive power, e.g., over the
electrochemical dynamics in a human brain. The mathematics of the
framework will be detailed in a separate lecture series on "Dynamical
Systems via Topological Field Theory" at MPA from July 27th to July 31st.
26 June 2015 Volker Schmid (Statistics Department of LMU Munich) Bayesian regularization for compartment models in imaging
Abstract: Compartment models are used as biological models for the analysis of a
variety of imaging data. Perfusion imaging, for example, aims to
investigate the kinetics in human tissue in vivo via a contrast agent.
Using a series of images, the exchange of contrast agent between different
compartments in the tissue over time is of interest. Using the analytic
solution of the system of differential equations, nonlinear parametric
functions are obtained and can be fitted to the data at the voxel level.
However, the estimation of the parameters is unstable, in particular in
models with more compartments.
To this end, we use Bayesian regularization to obtain stable estimators
along with credible intervals. Prior knowledge about the context of the
local compartment models is incorporated. Here, context can refer to
spatial information, potentially including edges in the tissue.
Additionally, patient- or study-specific information can be used, in order
to develop a comprehensive model for the analysis of a set of images at
once.
I will show the application of fully Bayesian multi-compartment models in
dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) and
fluorescence recovery after photobleaching (FRAP) microscopy.
Additionally, I will discuss some alternative nonBayesian approaches
using the same or similar prior information.
19 June 2015 Emille Ishida (MPA) Approximate Bayesian Computation in Astronomy talk as pdf
Abstract: Approximate Bayesian Computation (ABC) enables parameter
inference for complex physical systems in cases where the true
likelihood function is unknown, unavailable, or computationally too
expensive. It relies on the forward simulation of mock data and comparison between
observed and synthetic catalogues. In this talk I will go through the basic principles
of ABC and show how it has been used in astronomy recently. As a practical example,
I will present 'cosmoabc', a Python ABC sampler featuring a Population Monte Carlo
variation of the original ABC algorithm, which uses an adaptive importance sampling scheme.
References:
Paper: Ishida et al, 2015  http://arxiv.org/abs/1504.06129
Code: https://pypi.python.org/pypi/cosmoabc/1.0.1
http://ascl.net/code/v/1129
Documentation: http://cosmoabc.readthedocs.org/en/latest/
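The basic rejection-ABC loop underlying such samplers can be sketched as follows (an invented one-parameter Gaussian toy with the sample mean as summary statistic; cosmoabc itself replaces this loop by the adaptive Population Monte Carlo scheme mentioned above):

```python
import numpy as np

rng = np.random.default_rng(4)

# "Observed" catalogue: draws from the model with true parameter mu = 2.
obs = rng.normal(2.0, 1.0, 200)

def distance(sim, obs):
    # Catalogues are compared through a summary statistic (here: the mean).
    return abs(sim.mean() - obs.mean())

# Rejection ABC: draw mu from the prior, forward-simulate a mock catalogue,
# keep mu if the mock is close to the data.
accepted = []
while len(accepted) < 500:
    mu = rng.uniform(-5, 5)                  # flat prior on mu
    sim = rng.normal(mu, 1.0, 200)           # forward simulation
    if distance(sim, obs) < 0.1:             # fixed tolerance epsilon
        accepted.append(mu)

accepted = np.array(accepted)
print("ABC posterior: mean %.3f, std %.3f" % (accepted.mean(), accepted.std()))
```

The accepted parameter values approximate draws from the posterior without the likelihood ever being evaluated; the PMC variation replaces the fixed tolerance by a sequence of shrinking tolerances with importance-sampling weights.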
8 May 2015 Allen Caldwell (MPP) p-Values for Model Evaluation talk as pdf
Abstract: Deciding whether a model provides a good description of data is often based on a
goodness-of-fit criterion summarized by a p-value. Although there is considerable
confusion concerning the meaning of p-values, leading to their misuse, they are
nevertheless of practical importance in common data analysis tasks. We motivate
their application using a Bayesian argument. We then describe commonly and less
commonly known discrepancy variables and how they are used to define p-values. The
distributions of these are then extracted for examples modeled on typical data
analysis tasks, and comments on their usefulness for determining goodness-of-fit are
given.
(based on paper by Frederik Beaujean, Allen Caldwell, Daniel Kollar, Kevin Kroeninger.)
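A minimal illustration of a discrepancy-variable p-value (invented Poisson counts, Pearson chi-square discrepancy, with the null distribution obtained by simulation rather than from an asymptotic chi-square form):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy task: 20 Poisson-distributed bins, model = constant expectation.
counts = rng.poisson(10.0, 20)
lam_hat = counts.mean()                      # fitted model expectation

def discrepancy(c, lam):
    """Pearson chi-square discrepancy variable."""
    return np.sum((c - lam)**2 / lam)

d_obs = discrepancy(counts, lam_hat)

# Distribution of the discrepancy under the fitted model, by simulation
# (refitting on each replica, so the p-value accounts for the fitted mean).
d_rep = np.empty(5000)
for i in range(5000):
    rep = rng.poisson(lam_hat, 20)
    d_rep[i] = discrepancy(rep, rep.mean())

p_value = np.mean(d_rep >= d_obs)
print(f"observed discrepancy {d_obs:.1f}, p-value {p_value:.3f}")
```

Since the toy data really do come from the model, a small p-value here would be a chance fluctuation; the talk's point is how the choice of discrepancy variable changes the power of such a check against genuine misfits.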
24 April 2015 Udo von Toussaint (IPP) MPA lecture hall Room E.0.11 at 14:00
Uncertainty quantification for computer models
Abstract: The quantification of uncertainty for complex simulations is of increasing
importance. However, standard approaches like Monte Carlo sampling quickly become
infeasible. Therefore, Bayesian and non-Bayesian probabilistic
uncertainty quantification methods like polynomial chaos (PC) expansion
methods or Gaussian processes have found increasing use in recent
years. This presentation focuses on the concepts and use of non-intrusive
collocation methods for the propagation of uncertainty in computational
models, using illustrative examples as well as real-world problems, i.e., the
Vlasov-Poisson equation with uncertain inputs.
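Non-intrusive collocation can be sketched in one dimension: the "simulation code" is evaluated only at Gauss-Hermite nodes, and moments of the output follow from a weighted sum. The model and numbers below are invented for illustration:

```python
import numpy as np

# Model with one uncertain input x ~ N(mu, sigma^2); a stand-in for an
# expensive simulation code that we can only evaluate pointwise.
def model(x):
    return np.exp(0.3 * x) + x**2

mu, sigma = 1.0, 0.5

# Non-intrusive collocation: evaluate the model only at 8 Gauss-Hermite
# nodes (probabilists' convention, weight function exp(-x^2/2)).
nodes, weights = np.polynomial.hermite_e.hermegauss(8)
x_nodes = mu + sigma * nodes
w = weights / np.sqrt(2 * np.pi)             # normalize to a probability measure
mean_colloc = np.sum(w * model(x_nodes))
var_colloc = np.sum(w * model(x_nodes)**2) - mean_colloc**2

# Brute-force Monte Carlo reference (what collocation lets us avoid).
rng = np.random.default_rng(6)
samples = model(rng.normal(mu, sigma, 200000))
print(f"mean: collocation {mean_colloc:.4f}  MC {samples.mean():.4f}")
print(f"std:  collocation {np.sqrt(var_colloc):.4f}  MC {samples.std():.4f}")
```

Eight model evaluations reproduce the moments that Monte Carlo needs hundreds of thousands of runs for, which is the efficiency argument behind collocation and PC methods.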
27 February 2015: Maximilian Imgrund (USM/LMU Munich and MPIfR Bonn) A Bayesian method for high-precision pulsar timing: obstacles,
narrow pathways and chances
Abstract: Pulsars' extremely predictable timing data are used to detect signals
from fundamental physics such as the general theory of relativity. To
fit for the actual parameters of interest, the rotational phase of these
compact and fast rotating objects is modeled and then matched with
observational data by using the times of arrival (ToAs) of a certain
rotational phase tracked over years of observations. While there exist
Bayesian methods to infer the parameters' pdfs from the ToAs, the
standard methods to generate these ToAs rely on simply averaging the
data received by the telescope. Our novel Bayesian method, acting on
the single-pulse level, may give both more precise results and more
accurate error estimation.
30 January 2015: Torsten Ensslin (MPA) Information field dynamics
Abstract: How to optimally describe an evolving field on a finite computer?
How to assimilate measurements and statistical knowledge about a field into a simulation?
Information field dynamics is a novel conceptual framework to address such questions from an
informationtheoretical perspective. It has already proven to provide superior simulation
schemes in a number of applications.
19 December 2014: Max Knoetig (Institute for Particle Physics, ETH Zürich) On the on/off problem talk as pdf Abstract: The on/off problem consists of two
counting measurements, one with and one without a possible
contribution from a signal process. The task is to infer the strength
of the signal, e.g., of a gamma-ray source. While this sounds pretty
basic, Max will present a first analytical Bayesian solution valid for
any count (incl. zero) that supersedes frequentist results valid only
for large counts. The assumptions he makes have to be reconciled with
our intuition of the foundations of Bayesian probability theory.
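The structure of the problem is easy to state in code. The brute-force sketch below marginalizes the background on a grid (invented counts; the talk's point is precisely that this can instead be done analytically for any count):

```python
import numpy as np

N_on, N_off, alpha = 10, 12, 3.0   # alpha = off/on exposure ratio

# Likelihood: N_on ~ Poisson(s + b), N_off ~ Poisson(alpha * b).
# Marginalize the background rate b numerically (flat priors on s and b).
s = np.linspace(0.0, 30.0, 601)
b = np.linspace(1e-3, 30.0, 600)
S, B = np.meshgrid(s, b, indexing="ij")

logL = (N_on * np.log(S + B) - (S + B)
        + N_off * np.log(alpha * B) - alpha * B)
post = np.exp(logL - logL.max()).sum(axis=1)    # marginal posterior of s
ds = s[1] - s[0]
post /= post.sum() * ds

mean_s = (s * post).sum() * ds
print(f"posterior mean signal: {mean_s:.2f} (naive estimate: {N_on - N_off/alpha})")
```

Unlike the naive background-subtracted estimate, the marginal posterior is well defined even when the on-counts fluctuate below the expected background, which is where the frequentist large-count formulae break down.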
24 October 2014: Frederik Beaujean (LMU) Bayesian analyses in B-meson physics
Abstract: In measurements of rare B-meson decays, experiments often see only a few,
say 100, interesting events in their detector. This may not be sufficient to determine
the full angular distribution with 10 or more parameters (the object of interest
to new-physics searches) in a likelihood fit. I will discuss the somewhat surprising result
that the frequentist method of moments can be used to better convey the information in the data
into Bayesian global fits. In the second part, I will briefly discuss how to perform such fits
with a new sampling algorithm that Stephan Jahn and I recently developed. It is based on
importance sampling and variational Bayes, and can be used to sample from and compute the evidence
of multimodal distributions in up to 40-50 dimensions, while most of the (costly) likelihood
evaluations can be done in parallel. We just released the first version of pypmc, a Python package
that implements the algorithm.
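The importance-sampling core of such a sampler can be sketched with a fixed mixture proposal (pypmc's contribution is adapting the proposal via PMC updates or variational Bayes; the target and proposal below are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

# Unnormalized bimodal target; its evidence is analytically 2*sqrt(2*pi).
def log_target(x):
    return np.logaddexp(-0.5 * (x - 3.0)**2, -0.5 * (x + 3.0)**2)

# Fixed Gaussian-mixture proposal covering both modes.
means = np.array([-3.0, 3.0])
sigmas = np.array([2.0, 2.0])
weights = np.array([0.5, 0.5])

n = 100000
comp = rng.choice(2, size=n, p=weights)      # pick a mixture component ...
x = rng.normal(means[comp], sigmas[comp])    # ... and draw from it

# Mixture log-density of the proposal at the sampled points.
log_comp = (np.log(weights) - 0.5 * ((x[:, None] - means) / sigmas)**2
            - np.log(sigmas * np.sqrt(2 * np.pi)))
log_q = np.logaddexp(log_comp[:, 0], log_comp[:, 1])

w = np.exp(log_target(x) - log_q)            # importance weights
print("evidence estimate:", w.mean(), " analytic:", 2 * np.sqrt(2 * np.pi))
```

All target evaluations here are independent, which is what makes the likelihood calls trivially parallelizable, in contrast to a Markov chain.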
Friday 23 May 2014 Johannes Buchner (MPE) Calibrated Bayes for X-ray spectra talk as pdf associated paper
Abstract: Bayesian inference provides an all-purpose recipe to draw parameter and
model inferences based on knowledge of the stochastic processes at work.
But is the model used the correct one (model verification)? Have I found
all relevant effects in my data (model discovery)? Is it more reliable
to do Bayesian model comparison than likelihood ratio tests or BIC/AIC,
which are much simpler to compute?
In my recently submitted paper, we introduce Bayesian parameter
estimation and model selection for X-ray spectral analysis. As an
application, we infer the geometry of the "torus" in obscured AGN
spectra within the Chandra Deep Field South, using 10 physically
motivated models. The data are strong enough to rule out 7 of them in
individual spectra, and another two if we combine all spectra. In this
manner, we find that an obscuring toroid with a finite opening angle is
preferred over an entirely closed or disk-like obscurer. Additionally, we
find a dense reflection component in obscured AGN, possibly the
accretion disk.
My work does not stop at just applying model selection. The results are
vetted against outliers from individual spectra. We numerically compare
the frequentist properties of model selection methods (evidence,
likelihood ratio, AIC/BIC) and look at the p-values associated with
different model priors. We showcase the Q-Q plot in combination with the
AIC as a generic tool for model discovery. And finally, we show that in
our problem, multimodal parameter spaces cause commonly used ML fitting
methods to underestimate parameter uncertainties.
Friday 4 April 2014 Rafael de Souza (Korea Astronomy and Space Science Institute) Unraveling general relationships in multidimensional datasets
Abstract: I will present a set of tools for mining multidimensional datasets. In
particular, an application to the Millennium-II Simulation dataset, where
we automatically recover all groups of linear and nonlinear associations
within a set of ~30 parameters from thousands of galaxies.
Also, I will show alternative ways to visualize general correlations in
huge datasets and how dimensionality reduction techniques and graph
theory can help us to understand the redshift evolution of galaxy
properties.
Finally, I will give a short introduction to the recently created
cosmostatistics working group within the International Astrostatistics
Association.
Tuesday 25 Mar 2014 Paul Sutter (IAP Paris) Probabilistic image reconstruction with radio interferometers
Abstract: I will present a new, general-purpose method for reconstructing images
from radio interferometric observations using the Bayesian method of
Gibbs sampling. The method automatically takes into account incomplete
coverage and mode coupling. Using a set of mock images with realistic
observing scenarios, I will show that this approach performs better
than traditional methods. In addition, Gibbs sampling scales well and
provides complete statistical information. I will discuss the
application of this method to upcoming 21-cm and CMB polarization
experiments.
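The core of Gibbs sampling is drawing each unknown in turn from its conditional distribution. A toy two-variable sketch (a bivariate Gaussian, not the interferometric implementation from the talk) illustrates the idea:

```python
import numpy as np

# Toy Gibbs sampler for a bivariate Gaussian with correlation rho:
# alternately draw x | y and y | x, each of which is an exact
# one-dimensional Gaussian.  The same alternate-conditional-sampling
# idea underlies the image/signal sampling described in the talk.
rng = np.random.default_rng(0)
rho = 0.8
x = y = 0.0
samples = []
for _ in range(20000):
    x = rng.normal(rho * y, np.sqrt(1.0 - rho**2))  # draw x | y
    y = rng.normal(rho * x, np.sqrt(1.0 - rho**2))  # draw y | x
    samples.append((x, y))
samples = np.array(samples[2000:])  # discard burn-in
print(round(np.corrcoef(samples.T)[0, 1], 2))  # close to rho = 0.8
```

Because each step is an exact conditional draw, there is no accept/reject stage, which is part of why the method scales well.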
Thursday 27 Feb 2014 Bernd Noack (PPRIME Institute, Univ. Poitiers) Turbulence modelling and control using maximum entropy principles talk as pdf
Abstract: Fluid turbulence is a ubiquitous phenomenon in nature and technology.
Lungs, for instance, need turbulence for effective transport of oxygen and
COx. Other examples include rivers, atmospheric boundary layers and the
corona of the Sun. Most technologically relevant flows are turbulent:
flows in oil pipelines, combustors and mixers, as well as flows around
cars, trains, ships and airplanes.
The manipulation of turbulent flows is an important means of engineering
optimization. Examples are drag reduction of cars and trucks, lift
increase of wings, increase of pressure recovery in diffusers, and
efficiency increase of wind and water energy. One rapidly evolving
technique towards this goal is closed-loop turbulence control. Here, the
flow is manipulated with actuators which may, for instance, blow or suck
air, sensors which monitor the flow state, and a control logic feeding
sensor information back into efficient actuation commands.
Closed-loop turbulence control has applications of epic proportions for
the optimization of aerodynamic forces, noise and mixing processes.
The key mathematical challenge, and control opportunity, is the strong
nonlinearity of the actuation response. In this talk, we present examples
of such nonlinear control effects and propose a modelling and control
strategy building on reduced-order models and maximum entropy principles.
MaxEnt principles guide (1) low-dimensional representations of the
large-scale coherent structures, (2) low-dimensional dynamical models of
their temporal evolution, (3) closure schemes for the statistical
moments, and (4) control laws exploiting the nonlinear actuation response. We
present turbulence control examples hitherto not accessible to
state-of-the-art strategies, or significantly outperforming them.
Previous meetings (reordering in reverse chronological order in progress)
13 Dec 2013 Sebastian Dorn (MPA)
DIP: Diagnostics for insufficiencies of posterior calculations – a CMB application talk as pdf
Abstract: An error-diagnostic validation method for posterior distributions (DIP test) in Bayesian signal inference is presented.
It transfers deviations from the correct posterior into characteristic deviations from a uniform distribution of a quantity constructed for this
purpose. I show that this method is able to reveal and discriminate several kinds of numerical and approximation errors, as well as their
impact on the posterior distribution. I further illustrate the DIP test with the help of analytic examples and an application to an actual problem in
precision cosmology.
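The underlying idea can be sketched with a toy self-consistency check (a conjugate Gaussian model chosen purely for illustration; the actual DIP construction is more general):

```python
import numpy as np
from math import erf, sqrt

# Toy calibration check in the spirit of the DIP test: if the posterior
# is computed correctly, the posterior CDF evaluated at the true signal
# value is uniformly distributed over repeated experiments.
# Assumed model for this sketch: Gaussian prior s ~ N(0, 1) and
# Gaussian noise, d = s + n with n ~ N(0, 0.5^2).
rng = np.random.default_rng(2)
sigma_s, sigma_n = 1.0, 0.5
var_post = 1.0 / (1.0 / sigma_s**2 + 1.0 / sigma_n**2)  # conjugate result
p_values = []
for _ in range(2000):
    s = rng.normal(0.0, sigma_s)           # true signal drawn from prior
    d = s + rng.normal(0.0, sigma_n)       # simulated data
    mean_post = var_post * d / sigma_n**2  # posterior mean for s
    # Posterior CDF at the true s; should look U(0, 1) over many runs:
    p_values.append(0.5 * (1.0 + erf((s - mean_post) / sqrt(2.0 * var_post))))
p_values = np.array(p_values)
print(round(p_values.mean(), 2))  # ~0.50 for a correct posterior
```

An approximation error in the posterior (e.g. a wrong variance) would show up as a systematic departure of these values from uniformity, which is the deviation the DIP test is designed to expose.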
15 Nov 2013 Simona Vegetti (MPA) Bayesian modelling of regularized and gravitationally lensed sources talk as pdf Abstract: Strong gravitational lens systems with extended sources are particularly interesting because they provide additional constraints on the lens mass distribution. In order to infer the lens properties one needs to simultaneously determine the lens potential and the source surface brightness distribution by modeling the lensed images. In this talk, I will present a linear inversion technique to reconstruct pixellated sources and lens potentials. This technique makes use of Bayesian analysis to determine the best lens mass model parameters and the best level of source regularization. Finally, I will discuss the MultiNest technique and how this is used to compare different lens models.
26 July 2013: David Hogg (MPIA Heidelberg) Hierarchical modeling for astronomers talk as pdf Abstract: I will show a few applications (some real and some toy examples) that demonstrate that there are astrophysics questions that can be answered using hierarchical Bayesian inference better than they can be answered in any other way. In particular I will talk about problems in which there are many noisily observed instances (e.g., exoplanets or quasars) and the goal is to infer the distribution from which the (noiseless) instances are drawn, and problems in which the goal is to make predictions for high signal-to-noise data when only low signal-to-noise data are at hand. I will particularly emphasize that hierarchical methods can combine information from multiple noisy data sets even in situations in which the popular methods of "stacking" or "coadding" necessarily fail.
2011 12 December: First meeting.
Fabrizia Guglielmetti (MPE Garching) Introduction to Bayes Forum talk as pdf Torsten Ensslin (MPA Garching) Information field theory: turning data into images talk as pdf Abstract: Astronomical image reconstruction, cosmography of the large-scale structure and investigations of the CMB have to deal with complex inverse problems for spatially distributed quantities. To tackle such problems in a systematic way, I present information field theory (IFT) as a means of Bayesian, data-based inference on spatially distributed signals. IFT is a statistical field theory, which permits the construction of optimal signal recovery algorithms. The theory can be used in many fields, and a few applications in cosmography, CMB research, and cosmic magnetism studies are presented.
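For Gaussian signal and noise, the optimal signal recovery that IFT formalizes reduces to the classic Wiener filter, m = S (S + N)^(-1) d. A minimal sketch with toy diagonal covariances (an illustration of the principle, not code from the talk):

```python
import numpy as np

# Minimal Wiener filter m = S (S + N)^(-1) d for a pixelized signal,
# with toy diagonal signal and noise covariances S and N (assumed
# values chosen for illustration only).
rng = np.random.default_rng(3)
n_pix = 200
S = 2.0 * np.eye(n_pix)                              # signal covariance
N = 0.5 * np.eye(n_pix)                              # noise covariance
s = rng.multivariate_normal(np.zeros(n_pix), S)      # true signal
d = s + rng.multivariate_normal(np.zeros(n_pix), N)  # noisy data
m = S @ np.linalg.solve(S + N, d)                    # Wiener reconstruction
# The reconstruction is closer to the truth than the raw data:
print(np.mean((m - s) ** 2) < np.mean((d - s) ** 2))
```

IFT extends this linear, Gaussian case to non-Gaussian and nonlinear inference problems on such spatially distributed signals.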
2012 25 January: Udo von Toussaint (IPP Garching) Bayesian inference in physics talk as pdf
Abstract:
Bayesian inference provides a consistent method for the
extraction of information from physics experiments, even in
ill-conditioned circumstances.
10 February: Allen Caldwell (Max Planck Institute for Physics, Munich) Signal Discovery in Sparse Spectra: a Bayesian Analysis talk as pdf
Abstract: A Bayesian analysis of the probability of a signal in the presence of background is developed. The method is general and, in particular, applicable to sparsely populated spectra. As a concrete example, we consider the search for neutrinoless double beta decay with the GERDA experiment.
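A stripped-down version of such a counting analysis (a flat signal prior and a known background, chosen for illustration; not the GERDA analysis itself) can be written in a few lines:

```python
import numpy as np

# Toy signal-plus-background inference in a counting experiment:
# observed counts n_obs, known expected background b, flat prior on
# the signal expectation s >= 0.  The posterior is evaluated on a
# grid (a simplified sketch of the general approach in the talk).
def signal_posterior(n_obs, b, s_grid):
    # Poisson log-likelihood, up to an s-independent constant:
    log_like = n_obs * np.log(s_grid + b) - (s_grid + b)
    post = np.exp(log_like - log_like.max())
    return post / post.sum()                # normalize over the grid

s = np.linspace(0.0, 30.0, 3001)
post = signal_posterior(n_obs=8, b=3.0, s_grid=s)
s_mean = (s * post).sum()                   # posterior mean signal
print(round(s_mean, 1))
```

Even with only 8 observed counts on an expected background of 3, the posterior cleanly quantifies how much signal the data support, which is what makes the approach attractive for sparsely populated spectra.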
16 March: Volker Dose (IPP Garching) Beyond least squares talk as pdf
13 April: Kevin Kroeninger (Universitaet Goettingen) The Bayesian Analysis Toolkit: a C++ tool for Bayesian inference talk as pdf
Abstract: Bayesian inference is typically used to estimate the values of free parameters of a model, to test the validity of the model under study
and to compare predictions of different models with data. The Bayesian Analysis Toolkit, BAT, is a software package which addresses the
points above. It is designed to help solve statistical problems encountered in Bayesian inference. BAT is based on Bayes' Theorem and
is realized with the use of Markov Chain Monte Carlo. This gives access to the full posterior probability distribution and enables
straightforward parameter estimation, limit setting and uncertainty propagation.
BAT is implemented in C++ and allows for a flexible definition of mathematical models and applications, while keeping in mind the
reliability and speed requirements of the numerical operations. It provides a set of algorithms for numerical integration, optimization
and error propagation. Predefined models exist for standard cases. In addition, methods to judge the goodness-of-fit of a model are
implemented. An interface to ROOT allows for further analysis and graphical display of results. BAT can also be run from within a RooStats
analysis.
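The Markov Chain Monte Carlo machinery that BAT wraps can be illustrated with a minimal random-walk Metropolis-Hastings loop (a generic Python sketch of the algorithm, not BAT's actual C++ API):

```python
import numpy as np

# Minimal random-walk Metropolis-Hastings sampler.  Given any
# log-posterior, the chain gives access to the full posterior
# distribution -- the core idea behind BAT's MCMC engine.
def metropolis(log_post, x0, n_steps, step=2.0, seed=0):
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        prop = x + rng.normal(0.0, step)          # propose a move
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept or reject
            x, lp = prop, lp_prop
        chain.append(x)
    return np.array(chain)

# Example: sample a standard normal posterior.
chain = metropolis(lambda x: -0.5 * x**2, x0=0.0, n_steps=50000)
print(round(chain[5000:].mean(), 1), round(chain[5000:].std(), 1))
```

From such a chain, parameter estimates, credible intervals and propagated uncertainties all follow directly as summaries of the samples.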
25 May: Niels Oppermann (MPA Garching) The uncertain uncertainties in the Galactic Faraday sky talk as pdf MPE 14:00 Abstract: The reconstruction of spatially or temporally distributed signals from a data set can benefit from using information on the
signal's correlation structure. I will present a method, developed within Information Field Theory, that uses this correlation information even
though it is not a priori available. The method is furthermore robust against outliers in the data, since it allows for errors
in the observational error estimate. The performance of the method will be demonstrated on an astrophysical map-making problem,
the reconstruction of an all-sky image of the Galactic Faraday depth.
15 June: Roberto Saglia (MPE Garching) Photometric redshifts talk as pdf MPE 14:00 Abstract: Wide-field surveys of the sky like SDSS, Pan-STARRS or DES provide multi-band photometry of millions of galaxies down to faint magnitudes. Redshifts derived from such datasets, i.e. photometric redshifts, deliver galaxy distances well beyond the limits of spectroscopic surveys. The ESA mission EUCLID will rely on photometric redshifts (combined with lensing measurements) to constrain the properties of Dark Energy. I will review the current methods of photometric redshift determination. Empirical strategies like polynomial fitting, kernel regression, support vector machines, and especially artificial neural networks deliver excellent results, but require extensive spectroscopic training sets down to faint magnitudes. Template fitting in the Bayesian framework can provide a competitive alternative, especially for the class of luminous red galaxies.
6 July: Fabrizia Guglielmetti (MPE Garching) Bayesian mixture models for background-source separation talk as pdf
MPE
Abstract: A method to solve the long-standing problem of disentangling the background from the sources has been developed using Bayesian mixture modelling.
The technique employs a joint estimate of the background and detection of the sources in astronomical images.
Since the interpretation of observational data implies the solution of an ill-posed inverse problem, the technique makes use of Bayesian probability theory to ensure that the solution is stable,
unique and close to the exact solution of the inverse problem. The technique is general and applicable to any counting detector.
So far, it has been applied to X-ray data from ROSAT and Chandra, and it is under a feasibility study for the forthcoming eROSITA mission.
11 September: Jens Jasche (IAP, Paris) Large Scale Bayesian Inference in Cosmology talk as pdf MPE Abstract:
The last decade has already witnessed unprecedented progress in
the collection of cosmological data. Presently proposed and
designed future cosmological probes and surveys permit us to
anticipate the upcoming avalanche of cosmological information
during the next decades.
12 October: Rainer Fischer (IPP Garching) talk as pdf
Recent Developments and Applications of Bayesian Data Analysis in Fusion Science
MPE
Abstract: In fusion science a large set of measured data has to be analysed to obtain the most reliable results. This data set usually suffers from statistical as well as systematic uncertainties,
from lack of information, from data inconsistencies, and from interdependencies of the various physical models describing the data from the heterogeneous measurements.
A joint analysis of the combined data sets makes it possible to improve the reliability of the results, to improve spatial and temporal resolution, to achieve synergies, and to find and resolve data inconsistencies
by exploiting the complementarity and redundancy of the various measured data. The concept and benefits of Integrated Data Analysis in the framework of Bayesian probability theory
will be shown with examples from various diagnostic measurements at ASDEX Upgrade.
7 November: Tamas Budavari (Johns Hopkins University) talk as pdf
Bayesian Cross-Identification in Astronomy
MPE
Abstract: The cross-identification of objects in separate observations is one of the most fundamental problems in astronomy. Scientific analyses typically
build on combined, multi-color and/or multi-epoch datasets, and heavily rely on the quality of their associations. Cross-matching, however, is a
hard problem both statistically and computationally. We will discuss a probabilistic approach, which yields intuitive and easily calculable
formulas for point sources, but also generalizes to more complex situations. It naturally accommodates sophisticated physical and
geometric models, such as that of the spectral energy distribution of galaxies, radio jets or the proper motion of stars. Building on this new
mathematical framework, new tools are being developed to enable automated associations.
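For point sources with Gaussian positional errors, the probabilistic approach yields a closed-form Bayes factor for the match hypothesis. A flat-sky sketch of the simplest two-catalog case (a simplified illustration of the framework, with assumed astrometric errors):

```python
import numpy as np

# Flat-sky Bayes factor for matching two point sources with Gaussian
# astrometric errors sigma1, sigma2 (in radians) separated by angle
# psi (radians) -- the simplest two-catalog case of the probabilistic
# cross-identification framework (a simplified sketch).
def bayes_factor(psi, sigma1, sigma2):
    s2 = sigma1**2 + sigma2**2
    return (2.0 / s2) * np.exp(-psi**2 / (2.0 * s2))

arcsec = np.pi / (180.0 * 3600.0)  # radians per arcsecond
# Same 0.3" error in both catalogs; compare a close and a distant pair:
b_close = bayes_factor(0.2 * arcsec, 0.3 * arcsec, 0.3 * arcsec)
b_far = bayes_factor(3.0 * arcsec, 0.3 * arcsec, 0.3 * arcsec)
print(b_close > b_far)  # evidence for a match drops with separation
```

Large Bayes factors favor the hypothesis that the two detections are the same object; generalizing the likelihoods brings in the physical and geometric models mentioned in the abstract.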
7 December 2012: Christoph Raeth (MPE Garching) talk as pdf
Surrogates
MPE
Abstract: The method of surrogates represents one of the key concepts of nonlinear data analysis. The technique, originally developed
for nonlinear time series analysis, makes it possible to test for weak nonlinearities in data sets in a model-independent way.
The method has found numerous applications in many fields of research, ranging from geophysical and physiological time series analysis to econophysics
and astrophysics. In the talk, the basic idea of this approach and the most common ways to generate surrogates are reviewed.
Further, more recent applications of the method of surrogates in the field of cosmology, namely for testing for scale-dependent non-Gaussianities
in maps of the cosmic microwave background (CMB), are outlined. Open questions about a possible combination of the surrogate approach
and Bayesian-inspired component separation techniques for further improvements of the CMB map-making procedure will be pointed out for a common discussion.
25 January 2013: Marcel Reginatto (Physikalisch-Technische Bundesanstalt, Braunschweig)
Data analysis for neutron spectrometry with liquid scintillators: applications to fusion diagnostics talk as pdf
MPE
Abstract: Neutron spectrometry is a useful diagnostic tool for fusion science. Measurements of the neutron energy spectrum provide information about the
reactions that take place in the plasma, information that is relevant to understanding the consequences of choosing different experimental setups; e.g.,
when evaluating the effectiveness of different heating schemes.
In this talk, I will focus on neutron spectrometry with liquid scintillators. The analysis of such spectrometric measurements is not straightforward,
in part because the neutron energy spectrum is not measured directly but must be inferred from the data provided by the spectrometer.
It requires both the use of methods of data analysis that are well suited to the questions that are being asked and an understanding of the energy resolution
achievable when these methods are used under particular measurement conditions.
After a short overview of neutron spectrometry with liquid scintillation detectors, I will present an analysis of the resolving power and the super-resolution
that is theoretically achievable with this spectrometer, based on the work of Backus and Gilbert and of Kosarev.
I will then discuss the analysis of measurements made at the Physikalisch-Technische Bundesanstalt (PTB) accelerator facility
and at the Joint European Torus (JET), in particular the application of maximum entropy deconvolution and Bayesian parameter estimation to these data.
6 February 2013: Giulio D'Agostini (Univ. Rome “La Sapienza”)
Probability, propensity and probability of propensity values talk as pdf
MPE
Abstract: Assuming that the attendees of the MPE Bayes Forum are familiar with `Bayesian computation', I will focus
on some fundamental questions. We shall play a little interactive game, supported by a Bayesian network
implemented in Hugin, from which some of the basic, debated issues concerning probability (issues on which there is disagreement
even among `Bayesians') will arise spontaneously.
22 March 2013: Marco Selig (MPA)
The NIFTY way of Bayesian signal inference talk as pdf
MPE
Abstract: Although a large number of Bayesian methods for 1D signal reconstruction, 2D imaging and 3D tomography appear formally similar, one often
finds individualized implementations that are neither flexible nor transferable.
The open source Python library “NIFTY” ("Numerical Information Field Theory": http://www.mpa-garching.mpg.de/ift/nifty ) allows its user to write signal reconstruction
algorithms independent of the underlying spatial grid and its resolution. In this talk, a number of 1D, 2D, 3D and spherical examples of NIFTY's versatile
application areas, involving astrophysical, medical and abstract inference problems, are presented. These include the use of spectral smoothness priors,
the tomography of the Galactic electron density, and more.
19 April 2013: Mikko Tuomi (University of Hertfordshire, UK)
Bayesian search for other Earths: low-mass planets around nearby M dwarfs talk as pdf
MPE
Abstract: Bayes' rule of conditional probabilities can be used to construct a data analysis framework in the broadest possible sense. No assumptions
remain hidden in the methodology, and the results can be presented conditioned solely on the selected prior and likelihood models. When searching
for the small fingerprints of low-mass planets orbiting nearby stars, the optimality of statistical techniques is of the essence. We apply such
techniques to precision radial velocities of nearby M dwarfs and reveal the existence of a diverse population of low-mass planets right next door.
7 June 2013: Alex Szalay (Johns Hopkins University)
Photometric Redshifts Using Random Forests talk as pdf
MPE Abstract: The talk will describe how Breiman's Random Forest technique provides a simple and elegant approach to photometric redshifts. We show through a simple toy model how the estimator's uncertainty can be quantified, and how the uncertainty scales with sample size, forest size and sampling rate. We then show how the RF is really a computationally convenient form of a kernel density estimator, and as such maps onto Bayesian techniques extremely naturally.
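The kernel-estimator view mentioned in the abstract can be sketched with a toy Nadaraya-Watson kernel regression on synthetic data (the single "color", the linear color-redshift relation and the bandwidth h are all hypothetical choices for this illustration):

```python
import numpy as np

# Toy kernel-regression photo-z estimator: the abstract's point is
# that a Random Forest behaves like such a weighted-neighbor
# estimator.  All data here are synthetic; this is not a real
# photometric catalog or the talk's actual method.
rng = np.random.default_rng(1)
color = rng.uniform(0.0, 2.0, 500)                 # synthetic "color"
z_true = 0.5 * color + rng.normal(0.0, 0.02, 500)  # synthetic redshifts

def photo_z(c, h=0.1):
    # Gaussian kernel weights around the query color c:
    w = np.exp(-0.5 * ((color - c) / h) ** 2)
    return np.sum(w * z_true) / np.sum(w)          # weighted mean redshift

print(round(photo_z(1.0), 2))  # near 0.5 for this synthetic relation
```

A Random Forest effectively replaces the fixed Gaussian kernel with an adaptive, data-driven neighborhood, which is what makes it both convenient and naturally Bayesian in interpretation.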
Conferences Summer school in statistics for astronomers VIII
Related Sites ASAIP: Astrostatistics and Astroinformatics Portal
Literature A selection by the organizers; suggestions welcome.
Books
E.T. Jaynes: "Probability Theory: The Logic of Science" (2003), Cambridge Univ. Press
D. MacKay: "Information Theory, Inference and Learning Algorithms" (2003), Cambridge Univ. Press
G. D'Agostini: "Bayesian Reasoning in Data Analysis: A Critical Introduction" (2003), World Scientific Publishing
C. P. Robert and G. Casella: "Monte Carlo Statistical Methods" (2004), Springer Verlag
P.C. Gregory: "Bayesian Logical Data Analysis for the Physical Sciences" (2005), Cambridge Univ. Press
D.S. Sivia and J. Skilling: "Data Analysis: A Bayesian Tutorial" (2006), Oxford Univ. Press
C. Bishop: "Pattern Recognition and Machine Learning" (2006), Springer Verlag
C. P. Robert and G. Casella: "Introducing Monte Carlo Methods with R" (2010), Springer Verlag
J.V. Stone: "Bayes' Rule: A Tutorial Introduction to Bayesian Analysis" (2013), Sebtel Press
J.V. Stone: "Information Theory: A Tutorial Introduction" (2013), Sebtel Press
W. von der Linden, V. Dose, U. von Toussaint: "Bayesian Probability Theory: Applications in the Physical Sciences" (2014), Cambridge Univ. Press
Papers
Kevin H. Knuth & John Skilling (2012): 'Foundations of Inference', Axioms, 1, 38-73, http://arxiv.org/abs/1008.4831
Udo von Toussaint (2011): 'Bayesian Inference in Physics', Rev. Mod. Phys., 83, 943-999
F. Beaujean, A. Caldwell, D. Kollár, and K. Kröninger (2011): 'p-values for model evaluation', Phys. Rev. D 83, 012004
Torsten A. Ensslin, Mona Frommert, Francisco S. Kitaura (2009): 'Information field theory for cosmological perturbation reconstruction and nonlinear signal analysis', Phys. Rev. D 80, 105005, http://arxiv.org/abs/0806.3474
Volker Dose (2003): 'Bayesian inference in physics: case studies', Reports on Progress in Physics, 66, Number 9

