📚 Module 1: Word Embeddings (How Computers Read)

Course ID: GENAI-501
Subject: The Meaning Map

Computers cannot read. They only understand numbers. To teach a computer to read, we must turn every word into a list of numbers (a Vector). We call this a Word Embedding.
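Here is a minimal sketch of the idea in Python. The three-number vectors below are invented for illustration; real models learn hundreds of numbers per word:

```python
# Toy word embeddings: each word becomes a short list of numbers.
# These values are made up for illustration; real embeddings are
# learned automatically and usually have 100+ dimensions.
embeddings = {
    "apple":   [0.9, 0.1, 0.0],
    "pear":    [0.8, 0.2, 0.0],
    "bicycle": [0.0, 0.1, 0.9],
}

print(embeddings["apple"])  # [0.9, 0.1, 0.0]: this is what the computer "reads"
```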


🏗️ Step 1: The Meaning Map

🍎 The Analogy: The Fruit Map

Imagine plotting every word as a point on a map, where distance reflects meaning (the sketch after this list computes these distances):

  • Apple is at (5, 5).
  • Pear is at (5, 6): close together, because pears and apples are similar.
  • Bicycle is at (-10, -10): far away, because it has nothing to do with fruit.
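To see the map in action, here is a quick sketch using the coordinates above, with straight-line (Euclidean) distance standing in for “how far apart two meanings are”:

```python
import math

# Coordinates from the fruit map above.
fruit_map = {"apple": (5, 5), "pear": (5, 6), "bicycle": (-10, -10)}

def distance(word_a, word_b):
    """Straight-line (Euclidean) distance between two points on the map."""
    (x1, y1), (x2, y2) = fruit_map[word_a], fruit_map[word_b]
    return math.hypot(x2 - x1, y2 - y1)

print(distance("apple", "pear"))     # 1.0   -> close together: similar meaning
print(distance("apple", "bicycle"))  # ~21.2 -> far apart: unrelated
```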

🏗️ Step 2: The “King - Man + Woman = Queen” Trick

Because words are now numbers, we can do math on their meaning. Take the vector for King, subtract the vector for Man (removing “maleness”), and add the vector for Woman (adding “femaleness”): the result lands on, or very near, the vector for Queen.
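A toy demonstration with made-up 2-dimensional vectors, where one number tracks “royalty” and the other tracks “maleness”. In a real model the result only lands *near* Queen (we then pick the closest known word), but the toy numbers make it land exactly:

```python
import numpy as np

# Made-up vectors: dimension 0 tracks "royalty", dimension 1 tracks "maleness".
vec = {
    "king":  np.array([0.9,  0.9]),
    "man":   np.array([0.1,  0.9]),
    "woman": np.array([0.1, -0.9]),
    "queen": np.array([0.9, -0.9]),
}

result = vec["king"] - vec["man"] + vec["woman"]
print(result)        # [ 0.9 -0.9]
print(vec["queen"])  # [ 0.9 -0.9]  (the arithmetic lands on "queen")
```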


🥅 Module 1 Review

  1. Word Embedding: A list of numbers representing a word’s meaning.
  2. Semantic Space: The “Map” where similar words live close to each other.
  3. Cosine Similarity: A score for how similar two words are, based on the angle between their vectors (see the sketch below).
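Here is a minimal sketch of that measurement, reusing the toy fruit vectors from earlier (in a real system these vectors would come from a trained model):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors:
    1.0 = same direction, 0.0 = unrelated, -1.0 = opposite."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

apple   = np.array([0.9, 0.1, 0.0])
pear    = np.array([0.8, 0.2, 0.0])
bicycle = np.array([0.0, 0.1, 0.9])

print(cosine_similarity(apple, pear))     # ~0.99: nearly identical meaning
print(cosine_similarity(apple, bicycle))  # ~0.01: barely related
```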

:::tip Slow Learner Note
You don’t need to define the numbers yourself. Models like BERT have already “read” the internet to learn these numbers for us!
:::