
AI Glossary

The Complete Artificial Intelligence Dictionary

162 categories · 2,032 subcategories · 23,060 terms

Active Learning for NLP

Learning paradigm where the model intelligently selects the most informative text examples to annotate, thereby optimizing the use of human annotation resources in natural language processing tasks.

Density-Weighted Active Learning

Approach combining model uncertainty with example density in the feature space, favoring uncertain text samples located in dense data regions.
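This trade-off can be sketched in a few lines of Python. The uncertainty and similarity functions below are illustrative toy assumptions (a 1-D feature with the decision boundary at 0, and a Gaussian similarity kernel), not part of any standard library:

```python
import math

def density_weighted_scores(pool, uncertainty, similarity, beta=1.0):
    """Information-density scoring: each candidate's uncertainty is weighted
    by its average similarity to the rest of the pool, so uncertain points
    in dense regions outrank uncertain outliers."""
    scores = {}
    for i, x in enumerate(pool):
        density = sum(similarity(x, y) for j, y in enumerate(pool) if j != i)
        density /= (len(pool) - 1)
        scores[x] = uncertainty(x) * density ** beta
    return scores

# Toy setup: points near 0 sit near the decision boundary, hence are uncertain.
pool = [-0.9, 0.0, 0.1, 0.95]
unc = lambda x: 1.0 - abs(x)
sim = lambda x, y: math.exp(-(x - y) ** 2)
scores = density_weighted_scores(pool, unc, sim)
best = max(scores, key=scores.get)   # 0.0: uncertain *and* in the dense region
```

Note how the isolated point -0.9 scores low despite being in the pool: density weighting deliberately discounts outliers that plain uncertainty sampling would not.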

Pool-Based Active Learning

Framework where the algorithm has access to a fixed pool of unlabeled text examples and iteratively selects the most informative instances for human annotation.
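A minimal sketch of one such round, assuming a least-confidence criterion and a toy single-feature classifier (both illustrative, not a fixed part of the framework):

```python
def least_confidence(proba):
    """Uncertainty as one minus the highest class probability."""
    return 1.0 - max(proba)

def pool_based_round(pool, predict_proba, k=1):
    """One round of pool-based active learning: score every unlabeled
    example in the fixed pool and return the k most uncertain."""
    ranked = sorted(pool, key=lambda x: least_confidence(predict_proba(x)),
                    reverse=True)
    return ranked[:k]

# Toy classifier on a single feature: P(class 1) grows linearly with x,
# so the decision boundary sits at x = 5.
def toy_proba(x):
    p1 = min(max(x / 10.0, 0.0), 1.0)
    return [1.0 - p1, p1]

pool = [0, 2, 5, 8, 10]
picked = pool_based_round(pool, toy_proba, k=2)   # 5 is maximally uncertain
```

In practice the selected examples are sent to annotators, added to the labeled set, the model is retrained, and the round repeats on the shrunken pool.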

Stream-Based Active Learning

Paradigm where textual data arrives sequentially and the model must decide in real-time whether to annotate or reject each instance without the possibility of going back.
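The one-pass constraint can be sketched as follows; the threshold, budget, and toy uncertainty function are illustrative assumptions:

```python
def stream_based_query(stream, uncertainty, threshold=0.4, budget=2):
    """Stream-based selective sampling: each instance is seen exactly once,
    and the learner must immediately query or discard it; discarded items
    never return."""
    queried = []
    for x in stream:
        if budget > 0 and uncertainty(x) >= threshold:
            queried.append(x)
            budget -= 1
    return queried

unc = lambda x: 1.0 - abs(x)            # toy: near 0 means near the boundary
stream = [0.9, 0.1, -0.05, 0.8, 0.0]
picked = stream_based_query(stream, unc)
```

Note that 0.0 arrives after the budget is spent and is lost even though it is the most uncertain item, which is exactly the no-going-back constraint the definition describes.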

Batch Mode Active Learning

Variant selecting multiple text samples simultaneously for annotation, optimizing human annotation processes in batches while maintaining diversity among chosen instances.
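One common way to keep a batch diverse is a greedy selection that penalizes similarity to items already chosen. The sketch below uses toy uncertainty and similarity functions as illustrative assumptions:

```python
import math

def greedy_diverse_batch(pool, uncertainty, similarity, k=2, lam=1.0):
    """Greedy batch-mode selection: each pick maximizes uncertainty minus a
    redundancy penalty (similarity to items already in the batch)."""
    batch, candidates = [], list(pool)
    while candidates and len(batch) < k:
        def gain(x):
            redundancy = max((similarity(x, b) for b in batch), default=0.0)
            return uncertainty(x) - lam * redundancy
        best = max(candidates, key=gain)
        batch.append(best)
        candidates.remove(best)
    return batch

unc = lambda x: 1.0 - abs(x)
sim = lambda x, y: math.exp(-abs(x - y) / 0.1)
batch = greedy_diverse_batch([0.0, 0.05, 0.5], unc, sim)
# Pure uncertainty would pick 0.0 then 0.05; the diversity term picks 0.5 instead.
```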

Active Learning for Sequence Labeling

Specialization of active learning for sequence labeling tasks such as NER or POS tagging, where selection occurs at the level of entire sequences or tokens.
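At the sequence level, one widely used criterion is least confidence over the whole tag sequence, length-normalized so longer sentences are not automatically ranked as more uncertain. A minimal sketch, with made-up per-token tag distributions:

```python
import math

def sequence_least_confidence(token_probs):
    """Sequence-level uncertainty for tagging tasks (NER, POS tagging):
    one minus the length-normalized probability of the model's best tag
    at each token."""
    log_p = sum(math.log(max(p)) for p in token_probs)
    return 1.0 - math.exp(log_p / len(token_probs))

confident = [[0.99, 0.01], [0.98, 0.02]]   # per-token tag distributions
hesitant  = [[0.55, 0.45], [0.60, 0.40]]
```

Token-level variants instead score and query individual tokens, which can cut annotation cost further at the price of fragmented annotator context.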

Core-Set Selection

Theoretical approach ensuring that the selected set well approximates the complete dataset, often used in the context of active learning for NLP with performance guarantees.
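A standard core-set construction is greedy k-center selection, which repeatedly adds the point farthest from the current selection and yields a 2-approximation of the optimal covering radius. A sketch on toy 1-D data:

```python
def greedy_k_center(points, k, dist):
    """Greedy k-center selection: repeatedly add the point farthest from
    the current selection, so the chosen set covers the whole dataset
    within a provably bounded radius."""
    centers = [points[0]]                     # arbitrary seed point
    while len(centers) < k:
        farthest = max(points, key=lambda p: min(dist(p, c) for c in centers))
        centers.append(farthest)
    return centers

points = [0.0, 0.1, 0.2, 5.0, 5.1, 10.0]      # three loose clusters on a line
centers = greedy_k_center(points, 3, lambda a, b: abs(a - b))
```

The selection lands one representative in each cluster, which is exactly the "well approximates the complete dataset" property the definition refers to; in NLP the distance is typically computed between sentence embeddings rather than raw scalars.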

Annotation Cost Modeling

Technique integrating variable annotation costs (time, expertise required) into the sample selection process, optimizing cost-effectiveness in NLP projects.
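In its simplest form this amounts to ranking by informativeness per unit cost rather than informativeness alone. The item names and per-item numbers below are invented for illustration:

```python
def cost_sensitive_rank(pool, uncertainty, cost):
    """Rank candidates by informativeness per unit annotation cost, so a
    cheap, moderately uncertain example can beat an expensive, very
    uncertain one."""
    return sorted(pool, key=lambda x: uncertainty(x) / cost(x), reverse=True)

# Toy per-item scores: (uncertainty, annotation cost in minutes).
stats = {"tweet": (0.5, 1.0), "contract": (0.9, 6.0), "email": (0.8, 4.0)}
ranked = cost_sensitive_rank(stats, lambda x: stats[x][0], lambda x: stats[x][1])
```

Here the short tweet outranks the long contract despite being less uncertain, because its information-per-minute ratio is higher.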

Active Learning for Low-Resource Languages

Specialized application of active learning to languages with little available data, using transfer and selection strategies adapted to the linguistic constraints involved.

Active Learning with Pre-trained Models

Integration of active learning with pre-trained language models such as BERT, exploiting contextual representations to improve the selection of informative samples.

Multi-Task Active Learning

Framework in which a single set of annotations is used to improve several NLP tasks simultaneously, optimizing the selection of samples that benefit all of the tasks at once.

Active Learning for Text Classification

Specialization of active learning for document classification tasks, using strategies adapted to the particularities of high-dimensional textual data.

Cold Start Problem in Active Learning

Initial challenge in which the model lacks the labeled data needed to make reliable predictions, requiring initialization strategies such as random sampling or semi-supervised learning.
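The random-sampling remedy is simple to sketch: label a small random seed set before any model exists, then hand the rest of the pool to the usual uncertainty-driven loop. A minimal sketch, with a fixed random seed for reproducibility:

```python
import random

def cold_start_seed(pool, n_seed, rng=None):
    """Break the cold start: before any model exists, label a small random
    sample so a first classifier can be trained, after which later rounds
    can rely on its uncertainty estimates."""
    rng = rng or random.Random(0)
    pool = list(pool)
    rng.shuffle(pool)
    return pool[:n_seed], pool[n_seed:]

seed, remaining = cold_start_seed(range(10), n_seed=3)
```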
