6 Reasoning with word embeddings (word vectors)
This chapter covers
- Understanding word embeddings or word vectors
- Representing meaning with a vector
- Customizing word embeddings to create domain-specific nessvectors
- Reasoning with word embeddings
- Visualizing the meaning of words
Word embeddings are perhaps the most approachable and generally useful tools in your NLP toolbox. They can give your NLP pipeline a general understanding of words. In this chapter you will learn how to apply word embeddings to real-world applications. Just as importantly, you'll learn where not to use word embeddings. And hopefully these examples will help you dream up new and interesting applications, in business as well as in your personal life.
You can think of word vectors as something like lists of attributes for Dota 2 heroes or role-playing game (RPG) characters and monsters. Now imagine that there was no text on these character sheets or profiles. You would want to keep all the numbers in a consistent order so you knew what each number meant. That's how word vectors work. The numbers aren't labeled with their meaning. They are just put in a consistent slot or location in the vector. That way, when you add, subtract, or multiply two word vectors, the attribute for "strength" in one vector lines up with the strength attribute in the other vector. Likewise for "agility" and "intelligence", and for alignment or philosophy attributes in Dungeons & Dragons (D&D).
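The character-sheet analogy can be sketched in a few lines of NumPy. The attribute names and stat values here are made up for illustration; the point is that each slot in the vector always holds the same attribute, so element-wise arithmetic lines up like with like, just as it does for the unlabeled dimensions of a word vector.

```python
import numpy as np

# A hypothetical, fixed attribute ordering shared by every character vector.
# The vectors themselves carry no labels, only consistent slot positions.
attributes = ["strength", "agility", "intelligence"]

warrior = np.array([9.0, 4.0, 2.0])  # strong, not very bookish
wizard = np.array([2.0, 3.0, 9.0])   # frail, but clever

# Because slot 0 means "strength" in both vectors, element-wise
# arithmetic is meaningful: averaging blends each attribute separately.
hybrid = (warrior + wizard) / 2

print(dict(zip(attributes, hybrid)))
# strength and intelligence average to 5.5, agility to 3.5
```

If the slots were shuffled differently for each character, the same arithmetic would mix strength with intelligence and produce nonsense, which is why a consistent dimension ordering matters for word vectors too.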