# KernelFunctions.jl

*Kernel functions for machine learning*
KernelFunctions.jl is a general-purpose kernel package. It provides a flexible framework for creating and composing kernel functions, together with an extensive collection of implementations. The main goals of this package are:
- **Flexibility**: operations between kernels should be fluid and easy, with a user-friendly API.
- **Plug-and-play**: the package is model-agnostic, so plugging kernels in before or after other steps should be straightforward, and it should interoperate well with generic packages for handling parameters, such as ParameterHandling.jl and FluxML's Functors.jl.
- **Automatic differentiation compatibility**: every kernel function that ought to be differentiable with AD packages such as ForwardDiff.jl or Zygote.jl should be.
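As a minimal sketch of the AD goal (assuming KernelFunctions.jl and ForwardDiff.jl are installed), a kernel evaluation can be differentiated with respect to a transform parameter; the helper `kval` below is purely illustrative:

```julia
using KernelFunctions
using ForwardDiff

# Evaluate a squared-exponential kernel at two fixed points, as a function of
# the inverse lengthscale applied via ScaleTransform.
kval(invℓ) = (SqExponentialKernel() ∘ ScaleTransform(invℓ))(1.0, 2.0)

# ForwardDiff propagates dual numbers straight through the kernel evaluation.
ForwardDiff.derivative(kval, 2.0)
```

The same function composes with Zygote.jl for reverse-mode AD, which is how gradient-based hyperparameter optimisation is typically done downstream.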
## Examples
```julia
using KernelFunctions
using Plots

x = range(-3.0, 3.0; length=100)

# A simple standardised squared-exponential / exponentiated-quadratic kernel.
k₁ = SqExponentialKernel()
K₁ = kernelmatrix(k₁, x)

# Apply a function transformation to the data.
k₂ = Matern32Kernel() ∘ FunctionTransform(sin)
K₂ = kernelmatrix(k₂, x)

# Apply a matrix premultiplication to the data.
k₃ = PolynomialKernel(; c=2.0, degree=2) ∘ LinearTransform(randn(4, 1))
K₃ = kernelmatrix(k₃, x)

# Take products and sums of kernels.
k₄ = 0.5 * SqExponentialKernel() * LinearKernel(; c=0.5) + 0.4 * k₂
K₄ = kernelmatrix(k₄, x)

plot(
    heatmap.([K₁, K₂, K₃, K₄]; yflip=true, colorbar=false)...;
    layout=(2, 2), title=["K₁" "K₂" "K₃" "K₄"],
)
```
<p align=center>
<img src="docs/src/assets/heatmap_combination.png" width=400px>
</p>
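Beyond square kernel matrices on a single input collection, `kernelmatrix` also computes cross-covariances between two collections of inputs. A small sketch (the inputs `x` and `y` are illustrative):

```julia
using KernelFunctions

x = range(-3.0, 3.0; length=100)
y = range(-1.0, 1.0; length=50)

k = SqExponentialKernel()

# Cross-covariance between the two input collections: a 100×50 matrix
# with entries k(x[i], y[j]).
Kxy = kernelmatrix(k, x, y)
```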
## Related Work
This package replaces the now-defunct MLKernels.jl. It incorporates much excellent existing work from packages such as GaussianProcesses.jl, and is used in downstream packages such as AbstractGPs.jl, ApproximateGPs.jl, Stheno.jl, and AugmentedGaussianProcesses.jl.

See the JuliaGaussianProcesses GitHub organisation and website for more information.
## Issues/Contributing
If you notice a problem, or would like to contribute by adding more kernel functions or features, please submit an issue or open a PR (please see the ColPrac contribution guidelines).