Deep (Neural) Language Models (Keras/TensorFlow)

Kashyap Raval (~kashyap32)

Description:

Neural language models (NLMs) have improved machine translation (MT) thanks to their ability to generalize well over long contexts. Our language model, partly inspired by human memory, is built upon the powerful deep-learning-based Long Short-Term Memory (LSTM) architecture, which is capable of learning long-term dependencies. Things we will learn: the basics of RNNs, Keras, and building a Q/A chatbot.

Abstract:

Deep learning and Recurrent Neural Networks (RNNs) have fueled language modeling research in recent years, allowing researchers to explore tasks for which strong conditional independence assumptions are unrealistic. Deep neural networks have shown promising results on problems involving vision, speech, and text, with varying degrees of success. The analogous neural network for text data is the recurrent neural network (RNN). This kind of network is designed for sequential data and applies the same function to each word or character of the text. These models are successful in translation (Google Translate), speech recognition, and language generation. In this workshop, we will write an RNN in Keras that can 1) classify the intent of a question and 2) give an appropriate answer.
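As a taste of what we will build, here is a minimal sketch of an intent classifier in Keras: a question, encoded as a padded sequence of word indices, goes through an Embedding layer and an LSTM, and a softmax layer predicts one of a handful of intents. The vocabulary size, sequence length, and number of intents below are illustrative assumptions, not workshop-specific values.

    from keras.models import Sequential
    from keras.layers import Embedding, LSTM, Dense

    VOCAB_SIZE = 1000   # size of a toy vocabulary (assumed placeholder)
    MAX_LEN = 20        # questions padded/truncated to 20 tokens (assumed placeholder)
    NUM_INTENTS = 5     # e.g. greeting, weather, booking, ... (assumed placeholder)

    model = Sequential([
        Embedding(VOCAB_SIZE, 64, input_length=MAX_LEN),  # word indices -> dense vectors
        LSTM(128),                                        # summarise the whole question
        Dense(NUM_INTENTS, activation='softmax'),         # one probability per intent
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    model.summary()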

  1. Introduction to neural networks.
  2. An introduction to Keras using the TensorFlow backend, including a live demo of TensorBoard (see the training sketch after this outline).
  3. RNNs in Keras.
  4. A Q/A chatbot using Keras.
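Continuing the sketch above (and reusing its placeholder names), the snippet below illustrates items 2 and 4 of the outline: training with Keras's TensorBoard callback gives the live TensorBoard view, and a small lookup table turns the predicted intent into a canned reply, i.e. a simple retrieval-style chatbot. The training arrays and the ANSWERS table are random/illustrative placeholders, not workshop data.

    import numpy as np
    from keras.callbacks import TensorBoard

    # Toy training data: integer-encoded, padded questions and their intent labels
    # (random placeholders standing in for a real, labelled Q/A dataset).
    x_train = np.random.randint(1, VOCAB_SIZE, size=(500, MAX_LEN))
    y_train = np.random.randint(0, NUM_INTENTS, size=(500,))

    tensorboard = TensorBoard(log_dir='./logs')  # inspect with: tensorboard --logdir=./logs
    model.fit(x_train, y_train,
              epochs=5, batch_size=32,
              validation_split=0.1,
              callbacks=[tensorboard])

    # One canned answer per intent (assumption: a retrieval-style chatbot).
    ANSWERS = ['Hello!', 'It looks sunny today.', 'Booking confirmed.',
               'Sorry, I did not get that.', 'Goodbye!']

    def reply(encoded_question):
        """Return the canned answer for the most likely predicted intent."""
        probs = model.predict(np.array([encoded_question]))[0]
        return ANSWERS[int(np.argmax(probs))]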

Prerequisites:

Participants should have an interest in ML/DL.

  1. Basics of Linear Algebra
  2. Python

Speaker Info:

Python lover and machine learning/deep learning enthusiast. My main work is focused on ML, DL, NLP, and the web. I am also an open source contributor.

Work:

    Intern as a Python Developer at LetsNurture.

Previous Talks:

    GDG (Google Developer Group) Ahmedabad - AI/ML in chatbot
    Alpha College of Engineering - Python

Section: Data Analysis and Visualization
Type: Workshops
Target Audience: Advanced