Chapter 5. Ranking search results with word embeddings

 

This chapter covers

  • Statistical and probabilistic retrieval models
  • Working with the ranking algorithm in Lucene
  • Neural information retrieval models
  • Using averaged word embeddings to rank search results

Since chapter 2, we’ve been building neural-network-based components that can improve a search engine. These components aim to help the search engine better capture user intent by expanding queries with synonyms, generating alternative representations of a query, and giving smarter suggestions while the user is typing. As these approaches show, a query can be expanded, adapted, and transformed before it is matched against the terms stored in the inverted indexes. Then, as mentioned in chapter 1, the terms of the query are used to find matching documents.
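
To make the chapter’s end goal concrete before we look at retrieval models in detail, here is a minimal sketch of ranking with averaged word embeddings, the technique named in the chapter-covers list. It is only an illustration under simple assumptions: the word_vectors dictionary, the example documents, and the query are all made up, and in practice the vectors would come from a pretrained model such as word2vec, GloVe, or fastText. Each document and the query are reduced to the average of their word vectors, and documents are ranked by cosine similarity to the query vector.

import numpy as np

# Hypothetical pretrained word vectors (in practice, load these from a
# word2vec, GloVe, or fastText model).
word_vectors = {
    "neural":  np.array([0.9, 0.1, 0.0]),
    "network": np.array([0.8, 0.2, 0.1]),
    "search":  np.array([0.1, 0.9, 0.2]),
    "engine":  np.array([0.2, 0.8, 0.1]),
    "ranking": np.array([0.1, 0.7, 0.6]),
}

def average_embedding(text):
    # Average the vectors of the in-vocabulary terms of a text.
    vectors = [word_vectors[t] for t in text.lower().split() if t in word_vectors]
    if not vectors:
        return np.zeros(3)
    return np.mean(vectors, axis=0)

def cosine_similarity(a, b):
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

documents = [
    "neural network search",
    "search engine ranking",
    "neural ranking",
]

query = "ranking with a search engine"
query_vec = average_embedding(query)

# Rank documents by cosine similarity between each document's averaged
# embedding and the query's averaged embedding.
ranked = sorted(documents,
                key=lambda d: cosine_similarity(average_embedding(d), query_vec),
                reverse=True)
for doc in ranked:
    print(doc, cosine_similarity(average_embedding(doc), query_vec))

Unlike pure term matching against an inverted index, this kind of scoring can rank a document highly even when it shares few exact terms with the query, because semantically related words have nearby vectors. The rest of the chapter builds up to this idea, starting from classical statistical and probabilistic retrieval models.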

5.1. The importance of ranking

 
 

5.2. Retrieval models

 
 

5.3. Neural information retrieval

 

5.4. From word to document vectors

 
 
 

5.5. Evaluations and comparisons

 
 
 

Summary

 