
Understanding and Implementing Recurrent Neural Networks using Python

Formal Metadata

Title
Understanding and Implementing Recurrent Neural Networks using Python
Series Title
Number of Parts
132
Author
License
CC Attribution - NonCommercial - ShareAlike 3.0 Unported:
You are free to use, adapt, copy, distribute and transmit the work or content in adapted or unchanged form for any legal, non-commercial purpose, provided that you attribute the work in the manner specified by the author or rights holder and that you distribute the work or content, including in adapted form, only under the terms of this license.
Identifiers
Publisher
Publication Year
Language

Content Metadata

Subject Area
Genre
Abstract
Recurrent Neural Networks (RNNs) have become popular because of their ability to retain an internal memory. These networks are widely used for recognizing patterns in sequences of data, such as numerical time series, images, handwritten text, spoken words, genome sequences, and much more. Since these networks possess memory, a loose analogy to the human brain can help in understanding how RNNs work: an RNN can be thought of as a network of neurons with feedback connections, unlike the purely feedforward connections found in other types of Artificial Neural Networks. The talk will proceed as follows:
- Self Introduction
- Introduction to Deep Learning
- Artificial Neural Networks (ANNs)
- Diving DEEP into Recurrent Neural Networks (RNNs)
- Comparing Feedforward Networks with Feedback Networks
- Quick walkthrough: Implementing RNNs using Python (Keras), as sketched below
- Understanding Backpropagation Through Time (BPTT) and the Vanishing Gradient Problem
- Towards more sophisticated RNNs: Gated Recurrent Units (GRUs) / Long Short-Term Memory (LSTM) networks
- End of talk
- Questions and Answers Session
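
Since the abstract promises a quick Keras walkthrough, here is a minimal sketch of what such an implementation could look like. The toy task, tensor shapes, layer sizes, and training settings below are illustrative assumptions, not material taken from the talk itself.

```python
# A minimal sketch (not the speaker's actual code) of a recurrent network in
# Keras, assuming a toy binary sequence-classification task: 100 random
# sequences of 10 timesteps with 8 features each, with random 0/1 labels.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical toy data just to make the example runnable end to end.
x = np.random.rand(100, 10, 8).astype("float32")
y = np.random.randint(0, 2, size=(100, 1))

model = keras.Sequential([
    # SimpleRNN feeds its hidden state at step t back in as input at step
    # t+1; this feedback connection is what distinguishes RNNs from
    # feedforward networks.
    layers.SimpleRNN(16, input_shape=(10, 8)),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=2, batch_size=16)
```

Swapping `layers.SimpleRNN(16)` for `layers.LSTM(16)` or `layers.GRU(16)` gives the gated variants mentioned at the end of the outline; their gating mechanisms mitigate the vanishing gradient problem that plain RNNs suffer from when trained with BPTT.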