AI Glossary
The complete AI glossary
Rotation Forest
Ensemble method that builds decision trees on sets of features transformed by Principal Component Analysis (PCA) to maximize diversity between base classifiers.
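The core of the method is building a distinct rotation for each tree. Below is a minimal numpy sketch of that step (function names such as `rotation_matrix` are illustrative, not from any library; a full Rotation Forest would also bootstrap samples and drop random classes before each PCA):

```python
import numpy as np

def pca_components(X):
    """Principal component directions of X (rows = samples) via SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt  # rows are orthonormal component directions

def rotation_matrix(X, rng, n_groups=3):
    """Build one tree's rotation: partition the features into groups,
    run PCA on each group, and assemble the per-group components into
    a (permuted) block-diagonal orthogonal matrix."""
    n_features = X.shape[1]
    perm = rng.permutation(n_features)
    groups = np.array_split(perm, n_groups)
    R = np.zeros((n_features, n_features))
    for g in groups:
        R[np.ix_(g, g)] = pca_components(X[:, g]).T  # columns = components
    return R

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 9))
R = rotation_matrix(X, rng)
X_rotated = X @ R  # each tree in the forest is trained on its own X @ R
```

Because every block of `R` comes from an orthonormal PCA basis, the whole matrix is orthogonal, so the rotation preserves all information while giving each tree a different view of the data.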
PCA on subsets
Application of Principal Component Analysis to random partitions of the feature space, creating distinct projection axes for each classifier in the forest.
Rotation matrix
Orthogonal matrix resulting from PCA, used to project data into a new feature space, ensuring decorrelation and diversity of tree predictions.
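The decorrelation claim can be checked directly: projecting centered data onto the PCA rotation yields features with a diagonal sample covariance. A small numpy sketch on deliberately correlated data:

```python
import numpy as np

rng = np.random.default_rng(1)
# Correlated 2-feature data: the second feature is a noisy copy of the first.
x = rng.normal(size=(500, 1))
X = np.hstack([x, x + 0.1 * rng.normal(size=(500, 1))])

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
R = Vt.T                      # orthogonal rotation matrix from PCA
Z = Xc @ R                    # data projected into the rotated space

cov_Z = np.cov(Z, rowvar=False)
# The off-diagonal covariance vanishes: the rotated features are decorrelated.
```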
Classifier diversity
Fundamental principle of ensemble methods aiming to maximize prediction differences between base models to reduce overall variance and improve generalization.
Bagging with transformation
Extension of Bootstrap Aggregating where bootstrapped samples undergo feature space transformation before training each base model.
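A minimal sketch of the "transform after resampling" idea, assuming PCA as the transformation (the base-model training step is elided):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))

n = X.shape[0]
boot = rng.integers(0, n, size=n)        # bootstrap sample (with replacement)
X_boot = X[boot]

# Fit the transformation (here: PCA) on the bootstrap sample only ...
Xc = X_boot - X_boot.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)

# ... then train the base model on the transformed bootstrap sample.
X_transformed = Xc @ Vt.T
```

Fitting the transformation on each bootstrap sample separately (rather than once on the full data) is what makes the learned projections, and hence the base models, differ from one another.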
Group feature selection
Technique of partitioning variables into disjoint subsets on which independent transformations are applied, increasing classifier heterogeneity.
K-fold feature splitting
Strategy of dividing features into K groups for Rotation Forest, where each group is transformed separately before being recombined to form the final feature set.
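The splitting step itself is simple to sketch in numpy: shuffle the feature indices and cut them into K disjoint groups that together cover every feature, so recombining the per-group transforms loses nothing:

```python
import numpy as np

n_features, K = 10, 3
rng = np.random.default_rng(3)

# Shuffle the feature indices, then split them into K (near-)equal disjoint groups.
perm = rng.permutation(n_features)
groups = np.array_split(perm, K)

# The groups are disjoint and together cover every original feature,
# so concatenating the per-group components preserves the feature count.
covered = sorted(np.concatenate(groups).tolist())
```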
Augmented feature space
New representation space created by concatenating principal components from each feature subset, preserving all original information while increasing diversity.
Decision tree on projected data
Base classifier in Rotation Forest trained on data previously projected into a transformed space by PCA, where decision nodes operate on linear combinations of original features.
Rotation coefficients
Parameters of the orthogonal transformation matrix that define how each original feature contributes to the new principal components for a specific tree in the forest.
Variance explained by component
Metric from PCA indicating the proportion of total data variance captured by each principal component, influencing the quality of transformation in Rotation Forest.
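The explained-variance ratios follow directly from the singular values of the centered data; a short numpy sketch on synthetic data with deliberately unequal feature variances:

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 3)) * np.array([3.0, 1.0, 0.1])  # unequal variances

Xc = X - X.mean(axis=0)
_, S, _ = np.linalg.svd(Xc, full_matrices=False)

# Each component's variance is its squared singular value over (n - 1);
# dividing by the total gives the proportion of variance explained.
explained_variance = S**2 / (X.shape[0] - 1)
explained_variance_ratio = explained_variance / explained_variance.sum()
# Ratios are sorted in decreasing order and sum to 1.
```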
Feature orthogonalization
Mathematical process ensuring linear independence between newly created features, essential to avoid redundancy and maximize diversity in ensembles.
Heterogeneous ensemble
Collection of base classifiers operating on different feature spaces, as in Rotation Forest where each tree sees a unique rotation of input data.
Covariance reduction
Objective of Rotation Forest aiming to minimize covariance between errors of different classifiers by forcing them to operate on decorrelated data representations.
Local linear projection
Transformation specific to each tree in the forest, applied only to a subset of features, creating a unique perspective on the data for that particular classifier.
PCA stability
Robustness of principal component decomposition against variations in training data, a critical factor for the performance and consistency of Rotation Forest.
Group hyperparameter
Parameter controlling the number of features per group in the Rotation Forest, directly influencing the balance between classifier diversity and information preserved in each transformation.