Embeddings Explained: The master algorithm to represent words, networks and customers

Embeddings are one of the most successful applications of neural networks. They first shot to prominence with Word2vec, a model for distributed representations of words developed at Google by Tomas Mikolov et al. They are a key part of modern speech recognition systems (Siri, Alexa, etc.) and search engines, and seem to improve almost any imaginable NLP task. Recently, embeddings have been applied across a broad array of domains, including representations of social media users (e.g. Facebook’s social graph), Amazon products and ASOS customers.

In this session, I will explain what neural embeddings are, why they are so useful in practice and how they work.
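As a toy illustration of the core idea (not taken from the talk): an embedding maps each discrete item, such as a word, to a dense vector so that related items end up close together in the vector space. The 3-dimensional vectors below are made up for the example; real models like Word2vec learn hundreds of dimensions from data.

```python
import numpy as np

# Hypothetical, hand-picked 3-d vectors; a trained model would learn these.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.1]),
    "apple": np.array([0.1, 0.0, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)  # related words score higher → True
```

The same similarity-in-vector-space trick is what lets embeddings represent users, products or customers, not just words.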

You can view Ben’s slides below:

Ben Chamberlain: Embeddings Explained

You can view Ben’s presentation below:

Track 3
Date: October 9, 2017
Time: 4:30 pm – 5:15 pm
Speaker: Benjamin Chamberlain, Imperial College London