This talk was presented as part of JuliaCon 2021.
Abstract:
ReactiveMP.jl is a native Julia implementation of reactive message passing-based Bayesian inference in probabilistic graphical models. The package supports a wide range of standard probabilistic models and can be extended with custom nodes and message update rules. In contrast to non-reactive (imperatively coded) Bayesian inference packages, ReactiveMP.jl scales easily, supporting inference on a standard laptop for large models with tens of thousands of variables and millions of nodes.
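To give a flavor of the message-passing idea behind the package, here is a minimal toy sketch in plain Julia (deliberately *not* the ReactiveMP.jl API): a Beta–Bernoulli model where each incoming observation sends a "message" that immediately updates the posterior, rather than waiting for the full dataset — the reactive, streaming style the talk describes.

```julia
# Toy illustration of streaming conjugate updates (not the ReactiveMP.jl API).
# Beta–Bernoulli model: each observation updates the Beta posterior as it arrives.
mutable struct BetaPosterior
    α::Float64
    β::Float64
end

# One Bernoulli observation arrives: fold it into the posterior immediately.
update!(p::BetaPosterior, y::Int) = y == 1 ? (p.α += 1) : (p.β += 1)

posterior = BetaPosterior(1.0, 1.0)    # uniform Beta(1, 1) prior
for y in [1, 0, 1, 1, 0, 1, 1, 1]      # a stream of coin-flip observations
    update!(posterior, y)              # reactive update per observation
end

posterior_mean = posterior.α / (posterior.α + posterior.β)
println(posterior_mean)                # 0.7 (7 successes+prior vs 3 failures+prior)
```

In ReactiveMP.jl itself, models are declared declaratively and updates propagate over a factor graph via Rocket.jl observables; this sketch only mirrors the per-observation update pattern, not the package's graph machinery.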
For more info on the Julia Programming Language, follow us on Twitter: https://twitter.com/JuliaLanguage and consider sponsoring us on GitHub: https://github.com/sponsors/JuliaLang
Contents
0:00 Welcome!
0:25 Overview of Julia probabilistic programming libraries: Gen.jl, Turing.jl, etc.
0:37 Evolution of ideas behind ForneyLab.jl
1:04 Brief introduction to ForneyLab.jl ideas
2:22 Example: estimating hidden state of moving object
4:41 Benefits of reactive implementation
5:42 Performance comparison
6:34 Plan for the future: combining ReactiveMP.jl and GraphPPL.jl
6:46 The scope of the package
7:35 Other usage examples
7:53 Quick overview of ReactiveMP.jl
Shout-out to https://github.com/KZiemian for the video timestamps!
Want to help add timestamps to our YouTube videos to help with discoverability? Find out more here: https://github.com/JuliaCommunity/YouTubeVideoTimestamps
Interested in improving the auto generated captions? Get involved here: https://github.com/JuliaCommunity/YouTubeVideoSubtitles