Latent Diffusion Models
Prompt Tokenization
A preprocessing step in which the input text prompt is split into units and mapped to numerical identifiers (token IDs), which the text encoder (e.g., CLIP) then transforms into embeddings used to condition the diffusion model.
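The pipeline can be sketched with a toy example. Everything below is hypothetical for illustration: real text encoders such as CLIP use byte-pair encoding over a vocabulary of roughly 49k tokens and learned embedding weights, not the tiny hand-written vocabulary and random table used here.

```python
import numpy as np

# Toy vocabulary (hypothetical; real tokenizers use byte-pair encoding).
VOCAB = {"<start>": 0, "<end>": 1, "a": 2, "photo": 3, "of": 4, "cat": 5}
EMBED_DIM = 8

# Stand-in for the encoder's learned embedding weights.
rng = np.random.default_rng(0)
embedding_table = rng.standard_normal((len(VOCAB), EMBED_DIM))

def tokenize(prompt: str) -> list[int]:
    """Map a text prompt to a sequence of token IDs, with special markers."""
    words = prompt.lower().split()
    return [VOCAB["<start>"]] + [VOCAB[w] for w in words] + [VOCAB["<end>"]]

def embed(token_ids: list[int]) -> np.ndarray:
    """Look up one embedding vector per token ID."""
    return embedding_table[token_ids]

ids = tokenize("a photo of a cat")
print(ids)                # [0, 2, 3, 4, 2, 5, 1]
print(embed(ids).shape)   # (7, 8) — one embedding per token
```

The resulting sequence of embeddings is what the diffusion model attends to (via cross-attention) as its conditioning signal.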