Recurrent neural networks (RNNs) are finding increasing popularity in the machine learning field, demonstrating notable advantages over vanilla feed-forward networks in the NLP space. However, RNNs have a reputation for being much more difficult to both implement and reason about. Thankfully, TensorFlow is here to help! With no assumptions about your previous knowledge of either RNNs or TensorFlow, this talk will demonstrate how to use TensorFlow to implement an RNN that can perform text generation.
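To give a flavor of the underlying theory the talk covers, the recurrence at the heart of an RNN can be sketched in plain NumPy. All sizes, weights, and inputs below are illustrative stand-ins; in a real TensorFlow model the weights would be learned from training data:

```python
import numpy as np

# Illustrative dimensions and randomly initialized weights (hypothetical
# values; a trained model would learn these from data).
rng = np.random.default_rng(0)
input_size, hidden_size = 4, 8
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One time step: the new hidden state combines the current input
    with the previous hidden state, which is what lets an RNN carry
    context along a sequence."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Feed a toy sequence of five one-hot "characters" through the cell.
h = np.zeros(hidden_size)
for t in range(5):
    x = np.zeros(input_size)
    x[t % input_size] = 1.0  # fake one-hot input for illustration
    h = rnn_step(x, h)

print(h.shape)
```

For text generation, the hidden state at each step would additionally be projected onto a vocabulary-sized output and sampled to pick the next character; TensorFlow packages this loop and the training machinery for you.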
Required audience experience: Basic machine learning knowledge required.
Objective of the talk: The talk will introduce the audience to the underlying theory of recurrent neural networks (RNNs) and how RNNs can be implemented using TensorFlow.
Keywords: TensorFlow, Machine Learning, Python, Recurrent Neural Networks, Deep Learning
You can view Marcel’s slides below:
Marcel Tilly: An introduction to recurrent neural networks