LLM Engineer Interview Questions: Transformer Architecture, Self-Attention, and Modern LLM Foundations

What is autoregressive language modeling?

Autoregressive language modeling means generating text one token at a time, where each new token is conditioned on all the previous tokens. The model predicts the probability distribution over the vocabulary for the next token, then samples or selects from it. This is how all decoder-only LLMs generate text.
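The decoding loop described above can be sketched as a minimal, self-contained toy in Python. The `toy_logits` function is a hypothetical stand-in for a trained decoder (a real model would score the next token from learned weights); the loop itself — score the vocabulary given the full context, turn scores into a probability distribution, sample one token, append it, repeat — is the autoregressive pattern the answer describes.

```python
import math
import random

random.seed(0)

VOCAB = ["<eos>", "the", "cat", "sat", "on", "mat"]

def toy_logits(context):
    # Hypothetical stand-in for a trained decoder: assigns a score to
    # every vocabulary token given the full context so far.
    return [float((len(context) + i) % 3) for i in range(len(VOCAB))]

def softmax(logits):
    # Convert raw scores into a probability distribution over the vocab.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def generate(prompt, max_new_tokens=5):
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        # Each new token is conditioned on ALL previous tokens.
        probs = softmax(toy_logits(tokens))
        next_tok = random.choices(VOCAB, weights=probs)[0]
        if next_tok == "<eos>":
            break
        tokens.append(next_tok)
    return tokens

print(generate(["the"]))
```

Swapping `random.choices` for `max(...)` over the probabilities would give greedy decoding instead of sampling — the "samples or selects" choice mentioned in the answer.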