Daniel Lyons' Notes

Vectoring Words (Word Embeddings) - Computerphile

Description

How do you represent a word in AI? Rob Miles reveals how words can be formed from multi-dimensional vectors - with some unexpected results.

08:06 - Yes, it's a rubber egg :)

Unicorn AI:
EXTRA BITS: https://youtu.be/usthqKtw2LA
AI YouTube Comments: https://youtu.be/XyMdpcAPnZc

More from Rob Miles: http://bit.ly/Rob_Miles_YouTube

Thanks to Nottingham Hackspace for providing the filming location: http://bit.ly/notthack

This video was filmed and edited by Sean Riley.

Computer Science at the University of Nottingham: https://bit.ly/nottscomputer

Computerphile is a sister project to Brady Haran's Numberphile. More at http://www.bradyharan.com

My Notes

  • Subject: embeddings
  • 05:07: Two words are similar if they are often used in similar contexts.
  • 05:39: The problem word embeddings try to solve: how do you represent words as vectors such that two similar words map to two similar vectors?
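
The idea in the notes above can be sketched with a toy example: hand-made vectors (values invented for illustration, not from the video or any trained model) where similar words get nearby vectors, compared with cosine similarity.

```python
import math

# Hypothetical 3-dimensional embedding table; values are made up
# purely to illustrate "similar words -> similar vectors".
vectors = {
    "cat":   [0.9, 0.8, 0.1],
    "dog":   [0.8, 0.9, 0.2],
    "table": [0.1, 0.0, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: close to 1.0 means
    the vectors point in nearly the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "cat" and "dog" occur in similar contexts, so their vectors are close;
# "cat" and "table" are not, so theirs are far apart.
print(cosine_similarity(vectors["cat"], vectors["dog"]))    # high
print(cosine_similarity(vectors["cat"], vectors["table"]))  # low
```

Real embeddings are learned from context statistics over large corpora and have hundreds of dimensions, but the similarity measure works the same way.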

Transcript
