Understanding and Implementing Recurrent Neural Networks using Python
Recurrent Neural Networks (RNNs) have become popular due to their ability to retain internal memory. These neural nets are widely used for recognizing patterns in sequential data, such as numerical time series, images, handwritten text, spoken words, genome sequences, and much more. Since these nets possess memory, a certain analogy can be drawn to the human brain in order to understand how RNNs work. RNNs can be thought of as networks of neurons with feedback connections, unlike the purely feedforward connections found in other types of Artificial Neural Networks.
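As a small illustration of the feedback idea (a sketch of my own, not part of the talk materials), the hidden state produced at one time step can be fed back in at the next, giving the network its memory; all dimensions and weight names below are illustrative assumptions:

```python
import numpy as np

# Illustrative sizes (assumptions, not from the talk)
input_size, hidden_size = 3, 4
rng = np.random.default_rng(0)

# Weights: input-to-hidden, hidden-to-hidden (the feedback connection), and bias
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_forward(inputs):
    """Run a simple RNN over a sequence: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b)."""
    h = np.zeros(hidden_size)  # initial hidden state (the network's "memory")
    states = []
    for x_t in inputs:
        # Feedback: the previous hidden state h re-enters the computation
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return states

sequence = rng.normal(size=(5, input_size))  # a toy sequence of 5 time steps
states = rnn_forward(sequence)
print(len(states), states[-1].shape)
```

Unrolling this loop over time is exactly what Backpropagation Through Time operates on, and the repeated multiplication by `W_hh` is where the vanishing gradient problem originates.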
The flow of the talk will be as follows:
- Self Introduction
- Introduction to Deep Learning
- Artificial Neural Networks (ANNs)
- Diving DEEP into Recurrent Neural Networks (RNNs)
- Comparing Feedforward Networks with Feedback Networks
- Quick walkthrough: Implementing RNNs using Python (Keras)
- Understanding Backpropagation Through Time (BPTT) and Vanishing Gradient Problem
- Towards more sophisticated RNNs: Gated Recurrent Units (GRUs)/Long Short-Term Memory (LSTMs)
- End of talk
- Questions and Answers Session
Prerequisites for attending the talk:
- Familiarity with programming in Python.
- Basic knowledge of Linear Algebra, Probability Theory, and Statistics.
- A basic idea of how Artificial Neural Networks work.
- Some experience with Keras or TensorFlow will be good but not necessary.
I delivered a talk on Recurrent Neural Networks at GeoPython 2018, Switzerland. The proposed talk will be an enhanced version of my previous talk, covering more topics in greater detail. Link to my previous talk: https://github.com/greatdevaks/GeoPython_Basel_2018
About the speaker:
- Former Software Developer Intern at IBM and a full-stack developer capable of designing and developing solutions for Mobile, Web, Embedded Systems, and Desktop. Areas of interest are Computational Neuroscience, Deep Learning, and Cloud Computing.
- Represented India at International Hackathons like Hack Junction’16, Finland, and Hack the North’16, Canada. Invited to more than a dozen prestigious International Hackathons (PennApps’17, HackNY’17, HackPrinceton’17, and many more) and Conferences.
- Recently spoke about "Understanding and Implementing Recurrent Neural Networks using Python" at GeoPython 2018, Basel, Switzerland.
- Will be speaking about Artificial Neural Networks at EuroPython 2018, Edinburgh, Scotland.
- A Microsoft Certified Professional, Microsoft Technology Associate, IBM Certified Web Developer, and Hewlett Packard Certified Developer.
- Has 8+ International Publications. [Latest work got published in ACM CHI 2018. The project was exhibited in Montreal, Canada.]
- Received 6 Honours and Awards (International and National level).
My compact biography: My name is Anmol Krishan Sachdeva. I am currently pursuing an MSc in Advanced Computing at the University of Bristol, United Kingdom, specializing in AI, ML, Applied Data Science, Computer Vision, and Computational Neuroscience. I am also doing research on Neural Networks and Computational Neuroscience and hence understand the nitty-gritty of the subject. Deep Learning is a black art, and I want to impart knowledge of it to people who are willing to learn. This conference is the right place to deliver that knowledge. Looking forward to speaking at the conference.