
AI Glossary

The Complete Artificial Intelligence Dictionary

162 categories · 2,032 subcategories · 23,060 terms
📖 C4.5

Supervised learning algorithm developed by Quinlan in 1993, an extension of ID3 capable of handling continuous attributes and missing data, using gain ratio as the splitting criterion.

📖 C5.0

Improved version of C4.5 developed by Quinlan, offering superior performance, more efficient handling of large datasets, and the ability to generate ensembles of trees (boosting).

📖 Gain ratio

Splitting criterion used in C4.5 to correct the bias of information gain towards attributes with many values, calculated as the information gain divided by the intrinsic entropy of the attribute.
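The definition above can be sketched directly from class and attribute-value counts. A minimal illustration (function names are my own, not from any particular library):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain_ratio(values, labels):
    """Gain ratio of a categorical attribute: information gain
    divided by the intrinsic entropy (split information)."""
    n = len(labels)
    groups = {}
    for v, y in zip(values, labels):
        groups.setdefault(v, []).append(y)
    # Information gain: parent entropy minus weighted child entropy.
    gain = entropy(labels) - sum(
        len(g) / n * entropy(g) for g in groups.values())
    # Intrinsic entropy of the attribute's value distribution.
    split_info = -sum(
        len(g) / n * log2(len(g) / n) for g in groups.values())
    return gain / split_info if split_info > 0 else 0.0
```

An attribute that splits the data into two pure halves gets both gain and split information equal to 1 bit, hence a gain ratio of 1.0.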

📖 Intrinsic entropy

Measure used in the calculation of gain ratio to penalize attributes with a large number of values, representing the amount of potential information contained in the distribution of an attribute's values.
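Written out (using informal notation, for an attribute A that partitions a sample S into subsets S_1, …, S_v), the quantity described above and its role in gain ratio are:

```latex
\mathrm{SplitInfo}_A(S) = -\sum_{i=1}^{v} \frac{|S_i|}{|S|}\,\log_2 \frac{|S_i|}{|S|},
\qquad
\mathrm{GainRatio}_A(S) = \frac{\mathrm{Gain}_A(S)}{\mathrm{SplitInfo}_A(S)}
```

The more evenly an attribute's values spread the sample, the larger the denominator, which is how attributes with many values are penalized.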

📖 Binary discretization

Technique used by C4.5 to transform continuous attributes into binary categorical attributes by identifying the optimal splitting point that maximizes information gain.
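The search described above can be sketched as a scan over midpoints between consecutive sorted values, scoring each candidate threshold by information gain (an illustrative O(n²) version; real implementations update class counts incrementally):

```python
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_cut_point(values, labels):
    """Return (threshold, gain): the midpoint between consecutive
    sorted values that maximizes information gain when splitting
    into <= threshold / > threshold."""
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    base = entropy(labels)
    best = (None, 0.0)
    for i in range(1, n):
        lo, hi = pairs[i - 1][0], pairs[i][0]
        if lo == hi:
            continue  # no boundary between equal values
        threshold = (lo + hi) / 2
        left = [y for _, y in pairs[:i]]
        right = [y for _, y in pairs[i:]]
        gain = base - (len(left) / n * entropy(left)
                       + len(right) / n * entropy(right))
        if gain > best[1]:
            best = (threshold, gain)
    return best
```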

📖 Missing value handling

C4.5's ability to handle instances with missing attribute values, either by probabilistic weighting or by distributing the instance fractionally across the possible branches.
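The fractional-distribution idea can be shown in a few lines: when a test attribute's value is missing, the instance's weight is split across branches in proportion to how often known-valued training instances took each branch (an illustrative sketch, not Quinlan's full bookkeeping):

```python
def distribute_missing(weight, branch_counts):
    """Split an instance's weight across branches in proportion to
    the number of training instances with known values that went
    down each branch."""
    total = sum(branch_counts.values())
    return {b: weight * c / total for b, c in branch_counts.items()}
```

For example, with branch counts `{'a': 3, 'b': 1}`, a unit-weight instance contributes 0.75 to branch `a` and 0.25 to branch `b`.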

📖 Pessimistic pruning

Complexity reduction method in C4.5 that eliminates non-essential branches using a pessimistic statistical estimate of error based on the binomial distribution.

📖 C5.0 Boosting

Ensemble learning technique implemented in C5.0 that combines multiple weak decision trees to create a strong classifier, significantly improving prediction accuracy.
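One round of the underlying idea can be sketched in AdaBoost style: score the weak tree by its weighted error, give it a vote, and upweight the instances it got wrong so the next tree focuses on them. This illustrates the principle behind C5.0's boosting, not Quinlan's exact scheme, and assumes the weighted error is strictly between 0 and 1:

```python
from math import exp, log

def boost_round(weights, predictions, labels):
    """One AdaBoost-style round: return the weak classifier's vote
    (alpha) and the renormalized instance weights, with misclassified
    instances upweighted for the next round."""
    total = sum(weights)
    err = sum(w for w, p, y in zip(weights, predictions, labels)
              if p != y) / total            # weighted error rate
    alpha = 0.5 * log((1 - err) / err)      # vote of this weak tree
    new = [w * exp(alpha if p != y else -alpha)
           for w, p, y in zip(weights, predictions, labels)]
    z = sum(new)
    return alpha, [w / z for w in new]      # normalized weights
```

After one round with error 1/4, the single misclassified instance ends up carrying half of the total weight.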

📖 Optimal cut point

Threshold value determined by C4.5 to split a continuous attribute into two intervals, selected to maximize the information gain of the resulting split.

📖 Normalized information gain

Variant of information gain used in some contexts to avoid bias, similar to gain ratio but with a slightly different mathematical approach to normalization.

📖 C4.5 decision tree

Hierarchical structure produced by the C4.5 algorithm where each internal node represents a test on an attribute, each branch represents a test outcome, and each leaf represents a class label.
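The structure just described maps naturally onto a recursive node type — internal nodes hold a test attribute and one child per outcome, leaves hold a class label. A minimal sketch (the `Node` type is illustrative, not from any library):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Node:
    """C4.5-style tree node: internal nodes test an attribute and
    branch on its outcomes; leaves carry a class label."""
    attribute: Optional[str] = None               # None for leaves
    children: dict = field(default_factory=dict)  # outcome -> Node
    label: Optional[str] = None                   # class at a leaf

    def classify(self, instance):
        """Follow the branch matching the instance's attribute value
        until a leaf is reached."""
        if self.attribute is None:
            return self.label
        return self.children[instance[self.attribute]].classify(instance)
```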

📖 C5.0 sliding window

Optimization in C5.0 to efficiently process large datasets using a window of samples that moves through the complete dataset during tree construction.
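Schematically, windowing fits a model on a subset, then grows the window with instances the current model misclassifies, repeating until the model is consistent with the rest of the data. A sketch of that loop under a placeholder `train_fn` (which stands in for tree construction and returns a `predict(x)` callable):

```python
def window_train(data, labels, train_fn, initial=100, max_rounds=10):
    """Windowing loop: fit on a window of indices, then add every
    instance outside the window that the current model misclassifies."""
    window = set(range(min(initial, len(data))))
    model = None
    for _ in range(max_rounds):
        model = train_fn([data[i] for i in window],
                         [labels[i] for i in window])
        wrong = {i for i in range(len(data))
                 if i not in window and model(data[i]) != labels[i]}
        if not wrong:
            break  # consistent with all remaining instances
        window |= wrong
    return model
```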

📖 Confidence factor

Parameter in C4.5 (typically 25%) used in error estimation for pruning, controlling the level of pessimism in evaluating tree branch performance.
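The estimate in question is commonly given as the upper confidence limit of the binomial error rate under a normal approximation; the sketch below uses that textbook form, with z ≈ 0.69 standing in for the default 25% confidence factor (an assumption on my part, not pulled from C4.5's source):

```python
from math import sqrt

def pessimistic_error(errors, n, z=0.69):
    """Upper confidence bound on the true error rate at a node with
    `errors` mistakes out of `n` instances, using the normal
    approximation to the binomial."""
    f = errors / n  # observed error rate at the node
    return ((f + z * z / (2 * n)
             + z * sqrt(f / n - f * f / n + z * z / (4 * n * n)))
            / (1 + z * z / n))
```

Note the bound exceeds the observed rate even for an error-free node, and tightens toward it as `n` grows — exactly the pessimism that drives pruning decisions.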

📖 C4.5 IF-THEN rules

Alternative representation of decision trees generated by C4.5 where each path from root to leaf is converted into a conditional classification rule.
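The conversion is a walk over root-to-leaf paths, collecting each path's attribute tests as the rule's condition. A sketch, representing the tree as nested dicts of the illustrative form `{attribute: {value: subtree_or_label}}`:

```python
def tree_to_rules(tree, path=()):
    """Convert a nested-dict decision tree into IF-THEN rules,
    one rule per root-to-leaf path."""
    if not isinstance(tree, dict):
        # Leaf reached: emit the accumulated tests as one rule.
        cond = " AND ".join(f"{a} = {v}" for a, v in path) or "TRUE"
        return [f"IF {cond} THEN class = {tree}"]
    (attr, branches), = tree.items()
    rules = []
    for value, subtree in branches.items():
        rules.extend(tree_to_rules(subtree, path + ((attr, value),)))
    return rules
```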

📖 C4.5 computational complexity

Algorithmic cost of C4.5, O(n * m * log n) where n is the number of instances and m the number of attributes, kept practical by sorting and incremental computation techniques.

📖 Multi-way split

Ability of C4.5 to create nodes with more than two branches for categorical attributes, unlike algorithms that are restricted to binary splits.
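The split itself is just a partition of the data by attribute value — one branch per distinct value. A minimal sketch (CART is a well-known example of a binary-only splitter, named here for contrast):

```python
def multiway_split(rows, attribute):
    """Partition rows (dicts) into one subset per distinct value of
    a categorical attribute: one branch per value."""
    parts = {}
    for row in rows:
        parts.setdefault(row[attribute], []).append(row)
    return parts
```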
