AI Glossary
The complete glossary of Artificial Intelligence
Random Forest
Ensemble method that combines multiple decision trees trained on data subsets to improve predictive accuracy and reduce overfitting.
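A minimal sketch of fitting and querying a random forest, assuming scikit-learn is available; the dataset values are purely illustrative:

```python
from sklearn.ensemble import RandomForestClassifier

# Toy dataset: two features, binary label (illustrative values only)
X = [[0, 0], [0, 1], [1, 0], [1, 1], [0, 0], [1, 1]]
y = [0, 0, 1, 1, 0, 1]

# 10 trees, each trained on a bootstrap sample of (X, y)
clf = RandomForestClassifier(n_estimators=10, random_state=0)
clf.fit(X, y)
preds = clf.predict([[0, 1], [1, 0]])
```

Each tree votes on the class of a new sample, and the forest returns the majority decision.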
Bagging
Bootstrap aggregating technique where multiple models are trained on different bootstrap samples and their predictions are combined by majority vote.
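The bootstrap step can be sketched with the standard library alone; `bootstrap_sample` is a hypothetical helper name:

```python
import random

def bootstrap_sample(data, rng):
    # Draw len(data) items with replacement: some repeat, some are left out
    return [rng.choice(data) for _ in data]

rng = random.Random(42)
data = list(range(10))
sample = bootstrap_sample(data, rng)  # same size as data, duplicates allowed
```

Each model in the ensemble is trained on a different such sample, which is what makes their errors partly independent.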
Decision Trees
Predictive models that build a tree-like structure of decisions based on data features to arrive at a final prediction.
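A decision tree is, at its core, a nested sequence of feature tests; a hand-written two-level example with illustrative, made-up thresholds:

```python
def tiny_tree(x):
    # Hand-written two-level decision tree over a dict of features.
    # Thresholds are illustrative, not learned from data.
    if x["petal_len"] < 2.5:
        return "setosa"
    elif x["petal_width"] < 1.7:
        return "versicolor"
    return "virginica"

label = tiny_tree({"petal_len": 1.0, "petal_width": 0.2})
```

A learned tree has the same structure, except the split features and thresholds are chosen automatically from training data.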
Out-of-Bag
Samples not drawn into the bootstrap sample of a given tree, used as an internal validation set to estimate generalization error without a separate hold-out set.
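For a bootstrap sample of size n, roughly a 1/e ≈ 37% fraction of indices is never drawn; a small stdlib sketch (`oob_indices` is a hypothetical helper):

```python
import random

def oob_indices(n, rng):
    # Indices never drawn into the bootstrap sample are "out-of-bag"
    drawn = {rng.randrange(n) for _ in range(n)}
    return [i for i in range(n) if i not in drawn]

rng = random.Random(0)
n = 10_000
oob = oob_indices(n, rng)
frac = len(oob) / n  # expected to be close to 1/e ≈ 0.368
```

Evaluating each tree only on its own out-of-bag samples yields an error estimate that costs no extra data.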
Feature Sampling
Random selection of a subset of features at each node split, increasing diversity among forest trees.
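Sampling features without replacement at a split can be sketched with the standard library; `sample_features` is a hypothetical helper:

```python
import random

def sample_features(n_features, k, rng):
    # Choose k of the n_features columns, without replacement, for one split
    return rng.sample(range(n_features), k)

rng = random.Random(1)
cols = sample_features(9, 3, rng)  # e.g. 3 of 9 columns considered at this node
```

A common default is k ≈ sqrt(n_features) for classification, so different trees end up splitting on different features.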
Majority Vote
Aggregation method where the predicted class is the one receiving the most votes among all trees for classification problems.
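The vote itself is a one-liner with the standard library; `majority_vote` is a hypothetical helper name:

```python
from collections import Counter

def majority_vote(predictions):
    # Pick the class predicted by the largest number of trees
    return Counter(predictions).most_common(1)[0][0]

winner = majority_vote(["cat", "dog", "cat"])  # → "cat"
```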
Average of Predictions
Aggregation technique for regression where the final predicted value is the average of predictions from all forest trees.
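For regression the aggregation is a plain mean over per-tree outputs; `average_prediction` is a hypothetical helper name:

```python
from statistics import mean

def average_prediction(tree_outputs):
    # Regression forests average the per-tree numeric predictions
    return mean(tree_outputs)

value = average_prediction([2.0, 3.0, 4.0])  # → 3.0
```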
Terminal Node
Leaf of the decision tree where no further split is performed, containing the final prediction for the samples that reach it.
Splitting Criterion
Metric used to evaluate the quality of a node split, such as the Gini index or entropy for classification tasks.
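The Gini index, for instance, is one minus the sum of squared class proportions in a node; a small sketch (`gini` is a hypothetical helper):

```python
from collections import Counter

def gini(labels):
    # Gini impurity: 1 - sum over classes of (class proportion)^2
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

gini([1, 1, 1, 1])  # → 0.0, a pure node
gini([0, 0, 1, 1])  # → 0.5, a maximally mixed binary node
```

A split is chosen to reduce this impurity as much as possible in the child nodes.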
Hyperparameters
Parameters set before training that control the behavior of the random forest, such as the number of trees or the maximum tree depth.
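In scikit-learn, assuming it is the library in use, these knobs are passed to the constructor; the values below are illustrative, not recommendations:

```python
from sklearn.ensemble import RandomForestClassifier

# Common hyperparameters (illustrative values only)
clf = RandomForestClassifier(
    n_estimators=200,    # number of trees in the forest
    max_depth=8,         # cap on tree depth to limit overfitting
    max_features="sqrt", # number of features considered at each split
)
params = clf.get_params()
```

Tuning is typically done with cross-validation, since the best values depend on the dataset.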