Text Embeddings Quiz

Complete this assessment with a 100% score to master this chapter.

01. What does 'Cosine Similarity' measure between two vectors?
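As a quick refresher before answering, cosine similarity can be computed in a few lines. This is a minimal sketch using NumPy; the function name `cosine_similarity` and the example vectors are illustrative, not from the chapter.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: measures direction, not magnitude."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([2.0, 4.0, 6.0])  # same direction as v1, twice the magnitude
print(cosine_similarity(v1, v2))  # → 1.0 (identical orientation)
```

Note that the score ignores vector length entirely, which is why `v2` still scores 1.0 against `v1`.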

02. What famous semantic relationship was discovered in Word2Vec?
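The relationship in question can be demonstrated with vector arithmetic. The sketch below uses hand-made 2-D toy vectors (not real Word2Vec weights) whose axes loosely encode gender and royalty, purely to show the mechanics of an analogy query.

```python
import numpy as np

# Toy vectors, illustrative only: axis 0 ≈ gender, axis 1 ≈ royalty.
vocab = {
    "man":   np.array([ 1.0, 0.0]),
    "woman": np.array([-1.0, 0.0]),
    "king":  np.array([ 1.0, 1.0]),
    "queen": np.array([-1.0, 1.0]),
}

def nearest(vec: np.ndarray, exclude: set) -> str:
    """Return the vocabulary word closest to vec by cosine similarity."""
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return max((w for w in vocab if w not in exclude),
               key=lambda w: cos(vocab[w], vec))

# The classic analogy: king - man + woman ≈ ?
result = vocab["king"] - vocab["man"] + vocab["woman"]
print(nearest(result, exclude={"king", "man", "woman"}))  # → queen
```

In real Word2Vec embeddings the match is approximate rather than exact, which is why a nearest-neighbour search is used instead of an equality check.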

03. How does 'Retrieval-Augmented Generation' (RAG) help a language model?
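The shape of a RAG pipeline, retrieve relevant text, then feed it to the model as context, can be sketched briefly. The `embed` function below is a deliberately crude stand-in (a normalized character-frequency vector); a real system would call a learned embedding model here, and the corpus is invented for illustration.

```python
import numpy as np

docs = [
    "The Eiffel Tower is in Paris.",
    "Python is a programming language.",
    "Embeddings map text to vectors.",
]

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: unit-norm letter-frequency vector.
    # A real RAG system would use a trained embedding model instead.
    v = np.zeros(26)
    for ch in text.lower():
        if "a" <= ch <= "z":
            v[ord(ch) - ord("a")] += 1.0
    return v / (np.linalg.norm(v) or 1.0)

doc_vecs = np.stack([embed(d) for d in docs])

def retrieve(query: str, k: int = 1) -> list:
    """Retrieval step: rank stored documents by cosine similarity to the query."""
    scores = doc_vecs @ embed(query)  # vectors are unit-norm, so dot = cosine
    return [docs[i] for i in np.argsort(scores)[::-1][:k]]

question = "Where is the Eiffel Tower?"
context = retrieve(question)[0]
prompt = f"Context: {context}\nQuestion: {question}"  # passed to the language model
```

The generation step then answers from `prompt`, grounding the model in retrieved facts rather than relying only on its training data.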

04. Why do you need a specialized 'Vector Database' for large AI apps?

05. Which of these is a common open-source library for generating high-quality sentence embeddings?

06. What is a 'Vector' in the context of embeddings?

07. What does a Cosine Similarity score of 1.0 indicate?

08. In a RAG pipeline, what is 'Retrieval'?

09. What is the advantage of using 'Dense' embeddings over 'Sparse' methods like keyword matching?

10. What does it mean if two word vectors are 'close' to each other in vector space?