LLM Engineer Interview Questions: Transformer Architecture, Self-Attention, and Modern LLM Foundations

What is the difference between pretraining and fine-tuning?

Pretraining is the initial training of a language model on a massive corpus of unlabeled text using self-supervised objectives such as next-token prediction; the model learns broad language and world knowledge without any human-labeled examples. Fine-tuning takes a pretrained model and adapts it to a specific task, domain, or instruction-following style using a smaller, labeled dataset (for example, prompt-response pairs). Pretraining a frontier model costs millions of dollars in compute; fine-tuning is typically orders of magnitude cheaper.
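The key mechanical difference is where the labels come from. In pretraining, targets are derived from the text itself by shifting the token sequence one position left, so no annotation is needed. A minimal sketch (the function name and token values are illustrative, not from any library):

```python
def next_token_pairs(token_ids):
    """Build a self-supervised (input, target) pair for next-token prediction.

    Inputs are all tokens except the last; targets are the same sequence
    shifted left by one, so each position predicts the following token.
    """
    return token_ids[:-1], token_ids[1:]

# Hypothetical token IDs for a short sequence.
tokens = [5, 12, 7, 9]
inputs, targets = next_token_pairs(tokens)
# inputs  -> [5, 12, 7]
# targets -> [12, 7, 9]
```

In fine-tuning, by contrast, the targets come from an external labeled dataset (e.g., human-written responses), and typically only the response tokens contribute to the loss.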