This talk was presented as part of JuliaCon 2021.
Abstract:
Kernel methods are widely used in statistics, machine learning, and physical simulations. These methods give rise to dense matrices that are naïvely expensive to multiply or invert. Herein, we present CovarianceFunctions.jl, a package that automatically detects and exploits low rankness, hierarchical structure, and approximate sparsity. We highlight applications of this technology in Bayesian optimization and physical simulations.
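The structure the abstract alludes to can be seen in a tiny numerical experiment: a kernel matrix built from a smooth radial kernel is dense, yet numerically low-rank, which is exactly what structured solvers exploit. The sketch below (plain Python/NumPy, not the CovarianceFunctions.jl API; kernel choice and tolerance are illustrative assumptions) measures the numerical rank of a Gaussian kernel matrix.

```python
import numpy as np

# Hedged illustration, not CovarianceFunctions.jl: build a dense
# Gaussian (RBF) kernel matrix on n random 1-D points and count
# how many singular values are numerically significant.
rng = np.random.default_rng(0)
n = 128
x = rng.normal(size=n)  # sample points

# Dense n x n kernel matrix K[i, j] = exp(-(x_i - x_j)^2 / 2)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2)

# Numerical rank: singular values above a relative tolerance.
s = np.linalg.svd(K, compute_uv=False)
num_rank = int(np.sum(s > 1e-10 * s[0]))

print(K.shape, num_rank)  # num_rank is far smaller than n
```

Because the numerical rank is much smaller than n, matrix-vector products and solves can be approximated in far fewer operations than the naïve O(n²) or O(n³), which is the motivation for the lazy, structure-exploiting representations discussed in the talk.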
For more info on the Julia Programming Language, follow us on Twitter: https://twitter.com/JuliaLanguage and consider sponsoring us on GitHub: https://github.com/sponsors/JuliaLang
Contents
0:00 Welcome!
0:23 Why do we care about Kernel Methods
1:02 Motivation
1:59 Lazy Representation of a Kernel Matrix
2:59 Structure of Fast Kernel Matrices
4:42 Fast Kernel Transform
6:23 Experiments
Shout-out to https://github.com/Sov-trotter for the video timestamps!
Want to help add timestamps to our YouTube videos to help with discoverability? Find out more here: https://github.com/JuliaCommunity/YouTubeVideoTimestamps
Interested in improving the auto generated captions? Get involved here: https://github.com/JuliaCommunity/YouTubeVideoSubtitles