
AI Glossary

The complete dictionary of Artificial Intelligence

162 categories · 2,032 subcategories · 23,060 terms

Covariance Function

Kernel function that defines the correlation between two input points in a Gaussian process, determining the regularity and structure of the modeled function.


Matérn Kernel

Family of covariance functions parameterized by a roughness factor ν, offering fine control over the differentiability of the modeled Gaussian process.
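A minimal NumPy sketch (illustrative, not from any particular library) of the Matérn kernel for the common case ν = 3/2, which has a closed form and yields once-differentiable sample paths:

```python
import numpy as np

def matern32_kernel(X1, X2, length_scale=1.0, signal_variance=1.0):
    """Matérn covariance for nu = 3/2:
    k(r) = sigma_f^2 * (1 + sqrt(3) r / l) * exp(-sqrt(3) r / l)
    """
    # Pairwise Euclidean distances via broadcasting.
    r = np.sqrt(np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1))
    scaled = np.sqrt(3.0) * r / length_scale
    return signal_variance * (1.0 + scaled) * np.exp(-scaled)

X = np.array([[0.0], [1.0]])
K = matern32_kernel(X, X)
# K[0, 0] == 1.0; correlation decays with distance more slowly
# near r = 0 than the RBF kernel, reflecting the rougher samples.
```

Smaller ν (e.g. 1/2 or 3/2) produces rougher functions; as ν → ∞ the Matérn kernel converges to the RBF kernel.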


RBF (Gaussian) Kernel

Infinitely differentiable radial basis function covariance, assuming very smooth functions and widely used for standard Gaussian processes.
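A minimal NumPy sketch of the RBF (squared-exponential) kernel, written out directly rather than taken from any library:

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, signal_variance=1.0):
    """Squared-exponential (RBF) covariance:
    k(x, x') = sigma_f^2 * exp(-||x - x'||^2 / (2 l^2))
    """
    # Pairwise squared Euclidean distances via broadcasting.
    sq_dists = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return signal_variance * np.exp(-0.5 * sq_dists / length_scale ** 2)

X = np.array([[0.0], [1.0]])
K = rbf_kernel(X, X)
# K[0, 0] == 1.0 (a point is fully correlated with itself);
# K[0, 1] == exp(-0.5) ≈ 0.607 at unit distance with l = 1.
```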


Kernel Hyperparameters

Parameters of the covariance function (such as length scale and variance) that control the behavior of the Gaussian process and are optimized by maximum likelihood.


Length Scale

Kernel hyperparameter determining the distance over which input points are correlated, controlling the variability of the function modeled by the Gaussian process.


Signal Variance

Kernel hyperparameter σ_f² giving the variance of the modeled function's values (its squared amplitude), controlling the typical size of fluctuations in the Gaussian process.


Observational Noise

Parameter σ² modeling the uncertainty of observations, added to the diagonal of the covariance matrix to handle noisy data in Gaussian processes.


Conditional Distribution Prediction

Calculation of the posterior distribution of the Gaussian process at a new point, conditioned on existing observations to provide predictive mean and variance.
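The posterior computation can be sketched in a few lines of NumPy; this is an illustrative implementation under a zero-mean prior with an RBF kernel (function and variable names are my own, not from any library):

```python
import numpy as np

def rbf(X1, X2, l=1.0):
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / l ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and variance of a zero-mean GP at test points,
    conditioned on noisy observations (X_train, y_train)."""
    K = rbf(X_train, X_train) + noise * np.eye(len(X_train))  # noisy Gram matrix
    K_s = rbf(X_train, X_test)    # train/test cross-covariance
    K_ss = rbf(X_test, X_test)    # test covariance
    K_inv_y = np.linalg.solve(K, y_train)
    mean = K_s.T @ K_inv_y
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.diag(cov)

X_train = np.array([[0.0], [1.0], [2.0]])
y_train = np.sin(X_train).ravel()
mean, var = gp_posterior(X_train, y_train, np.array([[1.0]]))
# At a training point the posterior mean is close to the observed value
# and the predictive variance collapses toward the noise level.
```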


Evidence Maximization (Type II Maximum Likelihood)

Procedure for optimizing Gaussian process hyperparameters by maximizing the marginal log-likelihood of the observed data under the model.
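The objective being maximized can be written out directly; this is a hedged NumPy sketch for a zero-mean GP with an RBF kernel (not any library's official implementation):

```python
import numpy as np

def log_marginal_likelihood(X, y, length_scale, signal_var, noise_var):
    """Log marginal likelihood of y under a zero-mean GP with an RBF kernel:
    log p(y|X) = -1/2 y^T K^-1 y - 1/2 log|K| - n/2 log(2 pi)
    """
    n = len(X)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = signal_var * np.exp(-0.5 * d2 / length_scale ** 2) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)                    # K = L L^T
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # alpha = K^-1 y
    log_det = 2.0 * np.sum(np.log(np.diag(L)))   # log|K| from the Cholesky factor
    return -0.5 * y @ alpha - 0.5 * log_det - 0.5 * n * np.log(2 * np.pi)

X = np.array([[0.0], [1.0], [2.0]])
y = np.sin(X).ravel()
lml = log_marginal_likelihood(X, y, length_scale=1.0, signal_var=1.0, noise_var=0.1)
```

In practice this quantity is maximized over (length_scale, signal_var, noise_var) with a gradient-based optimizer.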


Karhunen-Loève Theorem

Decomposition of a Gaussian process into a series of orthogonal functions with independent Gaussian coefficients, enabling a compact representation of the process.


Dot-Product Kernel

Covariance function k(x,x') = σ² + xᵀx' used to model linear or polynomial functions in Gaussian processes.


Deep Gaussian Process

Hierarchical composition of Gaussian processes in which the output of one GP layer serves as the input to the next, allowing complex, non-stationary functions to be modeled.


Sparse Gaussian Process

Computational approximation using inducing points to reduce the cubic complexity O(n³) of standard Gaussian processes for large datasets.


Cholesky Decomposition

Factorization of the covariance matrix K = LLᵀ used to efficiently solve linear systems and compute the log-likelihood in Gaussian processes.
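A short NumPy illustration of how the factorization is used in practice: two triangular solves replace an explicit matrix inverse, and log|K| falls out of the factor's diagonal (a sketch on a toy matrix, not tied to any particular GP library):

```python
import numpy as np

K = np.array([[2.0, 0.6],
              [0.6, 1.0]])           # a small SPD covariance matrix
y = np.array([1.0, -1.0])

L = np.linalg.cholesky(K)            # lower-triangular L with K = L L^T
# Solve K alpha = y via L z = y, then L^T alpha = z (never forming K^-1).
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

# log|K| = 2 * sum(log(diag(L))), as needed by the log-likelihood.
log_det = 2.0 * np.sum(np.log(np.diag(L)))
```

Triangular solves are O(n²) once the O(n³) factorization is done, and the Cholesky route is numerically more stable than inverting K directly.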
