RAG & Vector DB Interview: Embeddings, Cosine Similarity, Dimensions, Models Compared

What is the difference between OpenAI text-embedding-3-small and text-embedding-3-large?

text-embedding-3-small produces 1536-dimensional vectors and costs roughly one-fifth as much per token as text-embedding-3-large. text-embedding-3-large produces 3072-dimensional vectors with measurably higher accuracy on the MTEB benchmark, scoring around 64.6 versus 62.3 for the small model. Both were trained with Matryoshka-style representation learning, so you can truncate the large model's output (via the API's `dimensions` parameter) to 256 or 1024 dimensions and still beat the small model at the same vector size. Choose small for cost-sensitive production at scale, and large when retrieval quality is the bottleneck and the extra storage cost is acceptable.
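The Matryoshka truncation described above can be sketched in plain Python: keep the leading components of an embedding, re-normalize to unit length, and compare with cosine similarity as usual. The 8-dimensional vectors below are toy stand-ins for real model output, not actual embeddings.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity: dot product over the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def truncate_and_renormalize(vec, dims):
    # Matryoshka-style shortening: keep the first `dims` components,
    # then re-normalize so cosine scores remain comparable.
    head = vec[:dims]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head]

# Toy vectors standing in for query/document embeddings.
doc = [0.5, 0.4, 0.3, 0.2, 0.1, 0.05, 0.02, 0.01]
query = [0.45, 0.42, 0.28, 0.22, 0.09, 0.06, 0.01, 0.02]

full_score = cosine_similarity(doc, query)
truncated_score = cosine_similarity(
    truncate_and_renormalize(doc, 4),
    truncate_and_renormalize(query, 4),
)
```

Because Matryoshka training front-loads the most informative components, the truncated score tracks the full-dimension score closely, which is what makes shrinking text-embedding-3-large to 256 or 1024 dimensions viable in practice.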
platform.openai.com