Artificial Super Intelligence
· Classifying disease, self-driving cars, playing Go, ...
· Using the knowledge of driving a car and applying it to another domain-specific task; in general, transcending domains
· Scaling intelligence and moving beyond human capabilities in all fields
· Far away?
that?
· Data: data is not available in the cardinality needed for many interesting real-world applications
· Structure: problem structure is hard to detect without domain knowledge
· Identifiability: for any given data set there are many possible models that fit it really well, yet with fundamentally different interpretations
· Priors: the ability to add prior knowledge about a problem is crucial, as it is the only way to do science
· Uncertainty: machine learning applications based on maximum likelihood cannot express uncertainty about their model (see the small sketch below)
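To make the last point concrete, here is a minimal R sketch of my own (hypothetical data, not from the slides): a maximum-likelihood fit of a normal mean returns a single number, while even the simplest Bayesian treatment of the same data returns a posterior that expresses how uncertain that number is.

# Illustration only: point estimate vs. posterior spread on hypothetical data.
set.seed(42)
x <- rnorm(20, mean = 1, sd = 2)          # a small, noisy sample

# Maximum likelihood: one point estimate, no notion of model uncertainty.
mle_mean <- mean(x)

# Bayesian: with a flat prior on the mean (and the sd assumed known for simplicity),
# the posterior for the mean is Normal(mean(x), sd / sqrt(n)).
post_mean <- mean(x)
post_sd   <- 2 / sqrt(length(x))
ci <- qnorm(c(0.025, 0.975), post_mean, post_sd)   # 95% credible interval

cat("MLE point estimate:", round(mle_mean, 2), "\n")
cat("Posterior 95% interval:", round(ci, 2), "\n")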
· Instead of throwing a lot of generic nonlinear functions at this beast, we could do something different
· From just looking at the data we can see that the generating functions must look like

  x ∼ N(μ_x, σ_x)
  y ∼ N(μ_y, σ_y)
  μ_x = (r + δ) cos(2π t)
  μ_y = (r + δ) sin(2π t)
  δ ∼ N(0.5, 0.1)

· which fortunately can be done by putting this knowledge into the model and solving for the mathematical structure
· A generative model can be realized (a Stan sketch of it follows below)
· Direct measures of uncertainty come out of the model
· No crazy, purely statistical results due to identifiability problems
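A minimal sketch of how that generative model could be written in Stan and fitted from R with rstan. The data names, the simulated inputs, and the priors on r and the sigmas are my own assumptions, not taken from the slides; only the structural equations above come from the deck.

# Hedged sketch: the circular generative model above, expressed in Stan and run via rstan.
library(rstan)

model_code <- "
data {
  int<lower=1> N;
  vector[N] t;        // known position along the circle, in [0, 1]
  vector[N] x;
  vector[N] y;
}
parameters {
  real<lower=0> r;
  vector[N] delta;
  real<lower=0> sigma_x;
  real<lower=0> sigma_y;
}
model {
  r ~ normal(1, 1);             // assumed prior
  delta ~ normal(0.5, 0.1);     // delta ~ N(0.5, 0.1) as on the slide
  sigma_x ~ normal(0, 1);       // assumed half-normal priors
  sigma_y ~ normal(0, 1);
  x ~ normal((r + delta) .* cos(2 * pi() * t), sigma_x);
  y ~ normal((r + delta) .* sin(2 * pi() * t), sigma_y);
}
"

# Simulated data matching the generative story (purely illustrative).
set.seed(1)
N <- 100
t_obs <- runif(N)
delta_true <- rnorm(N, 0.5, 0.1)
x_obs <- rnorm(N, (1 + delta_true) * cos(2 * pi * t_obs), 0.05)
y_obs <- rnorm(N, (1 + delta_true) * sin(2 * pi * t_obs), 0.05)

fit <- stan(model_code = model_code,
            data = list(N = N, t = t_obs, x = x_obs, y = y_obs),
            chains = 4, iter = 2000)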
· Look at the tiles to the left and judge the color of the A and B tiles
· To a human this task is easy because the brain has a strong anticipation about what will come
decisions in the face of uncertainty. Probabilistic reasoning combines knowledge of a situation with the laws of probability. Until recently, probabilistic reasoning systems have been limited in scope, and have not successfully addressed real-world situations.
· It allows us to specify the models as we see fit
· The curse of dimensionality is gone
· We get uncertainty measures for all parameters
· We can stay true to the scientific principle
· We do not need to be experts in MCMC to use it!
Users specify log density functions in Stan's probabilistic programming language and get:
· full Bayesian statistical inference with MCMC sampling (NUTS, HMC)
· approximate Bayesian inference with variational inference (ADVI)
· penalized maximum likelihood estimation with optimization (L-BFGS)
Stan's math library provides differentiable probability functions & linear algebra (C++ autodiff). Additional R packages provide expression-based linear modeling, posterior visualization, and leave-one-out cross-validation.
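As an illustration of those three modes, here is a small rstan sketch of my own (the toy model and data are placeholders, not from the slides): one compiled model, then MCMC sampling, ADVI, and L-BFGS optimization on the same object.

# Hedged example: one model, three inference modes in rstan (assumed toy model and data).
library(rstan)

toy_code <- "
data {
  int<lower=1> N;
  vector[N] y;
}
parameters {
  real mu;
  real<lower=0> sigma;
}
model {
  mu ~ normal(0, 10);
  sigma ~ normal(0, 5);
  y ~ normal(mu, sigma);
}
"

toy_data <- list(N = 50, y = rnorm(50, mean = 2, sd = 1))
mod <- stan_model(model_code = toy_code)

fit_mcmc <- sampling(mod, data = toy_data, chains = 4, iter = 2000)   # full Bayes (NUTS/HMC)
fit_advi <- vb(mod, data = toy_data)                                  # variational inference (ADVI)
fit_map  <- optimizing(mod, data = toy_data)                          # penalized MLE (L-BFGS)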
Suppose I gave the two of you the task of investing 1 million USD in either Radio or TV advertising
· The average ROI for Radio and TV is 0.5
· How would you invest?
· Now I will tell you that the ROIs are actually distributions
· Radio and TV both have a minimum value of 0
· Radio and TV have maxima of 8 and 1.2 respectively
· Where do you invest?
· How to think about this? You need to ask the following question: what is p(ROI > 0.3)?
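A hedged R sketch of that question: the actual ROI distributions are not given on the slide, so the gamma shapes below are invented purely to match the stated intuition (both averaging around 0.5, Radio far more spread out, TV tightly concentrated). The point is only that, once you have draws from each distribution, the decision reduces to comparing p(ROI > 0.3) across channels.

# Illustrative only: assumed ROI distributions, both with mean ~0.5,
# Radio with a long right tail and TV concentrated around its mean.
set.seed(7)
roi_radio <- rgamma(1e5, shape = 0.5, rate = 1.0)   # mean 0.5, heavy tail
roi_tv    <- rgamma(1e5, shape = 5.0, rate = 10.0)  # mean 0.5, narrow

# The question that matters for the decision:
p_radio <- mean(roi_radio > 0.3)
p_tv    <- mean(roi_tv > 0.3)

cat("p(ROI > 0.3) for Radio:", round(p_radio, 2), "\n")
cat("p(ROI > 0.3) for TV:   ", round(p_tv, 2), "\n")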
· Docker image freely available with an up-to-date R version installed and the most common packages: https://hub.docker.com/r/drmike/r-bayesian/
· R: well, you know
· RStan: run the Bayesian model
· OpenCPU: immediately turn your R packages into REST APIs
learning and inference machines
· Don't get stuck in patterns using existing model structures
· Stay true to the scientific principle
· Always state your mind!
· Be free, be creative, and most of all have fun!