Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images

#pwlkyiv on a seminal paper by Stuart Geman and Donald Geman

http://ieeexplore.ieee.org/document/4767596/

Volodymyr Kyrylov

October 22, 2016

Transcript

  1. → Bayesian (an inference framework) → MRFs & Images (modeling images for restoration) → Gibbs Sampler (a computationally tractable representation for MRFs defined over graphs) → Relaxation (computing the best Gibbs)
  2. Bayes → originally published to make theological arguments → made cool by Laplace → inference and decision come as plugins → works recursively (Kalman, particle filters & more, aka sequential estimation) → AI without a GPU: just refine your priors
  3. → y: signal (measurement, evidence, input) → x: state (unknown, hypothesis, output); think: model parameters vs data
  4. Estimation problems → MAP: maximum a posteriori → MLE: maximum likelihood (aka MAP when you have no prior)
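    In symbols (a standard restatement, not verbatim from the slides; P(y) drops out of the maximization because it does not depend on x):

      \hat{x}_{\mathrm{MAP}} = \arg\max_x P(x \mid y) = \arg\max_x P(y \mid x)\, P(x)
      \hat{x}_{\mathrm{MLE}} = \arg\max_x P(y \mid x)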
  5. logs are like probability buffs: slay exps, turn ugly products into neat sums; negation turns maximization into minimization
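    Concretely (the standard log-trick, spelled out here for reference):

      \arg\max_x P(y \mid x)\, P(x) \;=\; \arg\min_x \big[ -\log P(y \mid x) - \log P(x) \big]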
  6. cliques are subsets of sites such that every pair in the set is a neighbor; the set of all cliques is written 𝒞
  7. X is a MRF wrt a neighborhood system if all probabilities are positive and the probability of a site given all the others is the same as the probability of the site given its neighbors (Markov Property)
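    In symbols (reconstructed in the paper's notation, with G_s the neighborhood of site s):

      P(X = \omega) > 0 \quad \text{for all } \omega \in \Omega
      P(X_s = x_s \mid X_r = x_r,\ r \ne s) \;=\; P(X_s = x_s \mid X_r = x_r,\ r \in G_s)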
  8. Markov Property → the probability at a site depends only on the values of a finite neighborhood → G_s is the neighborhood of site s
  9. Noisy Image Prior → G: observed pixel intensities → φ: some nonlinearity → H: blur → N: invertible noise
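    These plug into the paper's degradation model (⊙ stands for any invertible operation, e.g. addition or multiplication):

      G = \varphi(H(F)) \odot N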
  10. Original Image is a MRF over a graph that contains original intensities (F) and image edges (L). Ω is the set of all possible configurations of X = (F, L)
  11. Hammersley-Clifford Theorem → any probability measure that satisfies a Markov property is a Gibbs distribution for an appropriate choice of (locally defined) energy function: MRF ⟺ Gibbs
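    The Gibbs form, as defined in the paper (T is temperature, Z the partition function, V_C the clique potentials):

      \pi(\omega) = \frac{1}{Z}\, e^{-U(\omega)/T}, \qquad U(\omega) = \sum_{C \in \mathcal{C}} V_C(\omega), \qquad Z = \sum_{\omega \in \Omega} e^{-U(\omega)/T}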
  12. Gibbs Sampler → introduced in the subject paper → produces a Markov chain with π as its equilibrium distribution → part of the MCMC algorithm family
  13. P(new state of site) = Gibbs(site visited now | others) · P(everyone else before) → this MAP computation is a stochastic process itself (hence MCMC) → Parallel!
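    A minimal runnable sketch of that site-by-site update, in Python. Assumptions not in the slides: an Ising-style binary prior with coupling beta, i.i.d. Gaussian noise with standard deviation sigma, and a raster-scan visiting scheme; the paper's full model (line process, annealing) is omitted.

      # Gibbs sampler sketch: binary image restoration under an Ising prior.
      # beta, sigma, and the 4-neighborhood are illustrative assumptions.
      import numpy as np

      def gibbs_restore(y, beta=1.0, sigma=0.5, sweeps=30, rng=None):
          if rng is None:
              rng = np.random.default_rng(0)
          x = np.where(y > 0, 1, -1)        # init labels by thresholding the data
          H, W = y.shape
          for _ in range(sweeps):           # raster-scan visiting scheme
              for i in range(H):
                  for j in range(W):
                      # sum over the 4-neighborhood (pair cliques)
                      s = sum(x[a, b]
                              for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                              if 0 <= a < H and 0 <= b < W)
                      # local energy of a candidate label: prior term + data term
                      def e(v):
                          return -beta * v * s + (y[i, j] - v) ** 2 / (2 * sigma ** 2)
                      p_plus = 1.0 / (1.0 + np.exp(e(1) - e(-1)))  # P(x_ij = +1 | rest)
                      x[i, j] = 1 if rng.random() < p_plus else -1
          return x

      # usage: restore a noisy square
      # clean = -np.ones((32, 32)); clean[8:24, 8:24] = 1
      # y = clean + 0.5 * np.random.default_rng(1).normal(size=clean.shape)
      # x_hat = gibbs_restore(y)

    Non-neighboring sites could be updated simultaneously (the "Parallel!" above), since each conditional only looks at a site's neighborhood.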
  14. Theorem A (Relaxation) → no matter where we start (X(0)), our state will converge to the Gibbs distribution if we keep sampling infinitely, visiting every site infinitely often
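    The statement, reconstructed from the paper (η is the starting configuration; each site must appear in the visiting sequence infinitely often):

      \lim_{t \to \infty} P\big(X(t) = \omega \mid X(0) = \eta\big) = \pi(\omega) \quad \text{for every } \omega \in \Omega \text{ and every } \eta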
  15. Runnable Demo → http://www.inf.u-szeged.hu/~kato/software/ → "Supervised Image Segmentation Using Markov Random Fields" → "Supervised Color Image Segmentation in a Markovian Framework" → usable on https://github.com/proger/mrf
  16. → Do more things with MRFs! (like segmentation) → MAP using Graph Cuts (Ford-Fulkerson for max-flow/min-cut) → CRF (learning potentials by conditioning on training data)
  17. Next steps → Probabilistic Graphical Models by Daphne Koller (a MRF is an "undirected probabilistic graphical model") → Pattern Recognition and Machine Learning by Christopher Bishop (more theory on everything)