Embeddings are one of the most successful applications of neural networks. They first shot to prominence with Word2vec, a model for distributed representations of words developed at Google by Tomas Mikolov et al. They are a key component of speech recognition systems (Siri, Alexa, etc.) and modern search engines, and improve performance on a wide range of NLP tasks. Recently, embeddings have been applied in a broad array of other domains, including representing social media users (e.g. Facebook's social graph), Amazon products, and ASOS customers.
In this session, I will explain what neural embeddings are, why they are so useful in practice and how they work.
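As a taste of the idea, an embedding maps each item (a word, a user, a product) to a dense vector, and related items end up close together in that vector space. A minimal sketch, using made-up (not learned) vectors and a hypothetical `cosine_similarity` helper:

```python
import math

# Toy 3-dimensional embedding vectors (illustrative values, not learned).
embeddings = {
    "king":  [0.8, 0.3, 0.1],
    "queen": [0.7, 0.4, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words sit close together in embedding space,
# so "king" is far more similar to "queen" than to "apple".
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

Real embedding vectors typically have hundreds of dimensions and are learned from data; the payoff is that similarity in the vector space tracks similarity in meaning or behaviour.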
You can view Ben’s slides below:
Ben Chamberlain: Embeddings Explained
You can also view Ben’s presentation below: