Toward Theoretical Understanding of Deep Learning





We survey progress in recent years toward developing a theory of deep learning. Recent works have started addressing issues such as: (a) the effect of architecture choices on the optimization landscape, training speed, and expressiveness; (b) quantifying the true "capacity" of a net, as a step toward understanding why nets with hugely more parameters than training examples nevertheless do not overfit; (c) understanding the inherent power and limitations of deep generative models, especially (various flavors of) generative adversarial nets (GANs); and (d) understanding properties of simple RNN-style language models and some of their solutions (word embeddings and sentence embeddings).
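As a small illustration of point (b) above — this sketch is not from the talk, and the layer sizes are hypothetical — even a modest fully connected net can have far more parameters than a standard training set has examples:

```python
# Hypothetical illustration: a small fully connected net on MNIST-sized
# inputs (784 pixels) already has far more parameters than the 60,000
# MNIST training examples, yet such nets famously do not overfit badly.
layer_sizes = [784, 1024, 1024, 10]  # input, two hidden layers, output

# Each layer contributes a weight matrix (m x n) plus a bias vector (n).
n_params = sum(m * n + n for m, n in zip(layer_sizes, layer_sizes[1:]))

n_train = 60_000  # size of the MNIST training set
print(n_params)            # 1863690 parameters
print(n_params > n_train)  # True: roughly 31x more parameters than examples
```

Classical learning theory would predict severe overfitting in this regime, which is why measuring the "true" capacity of trained nets is an open question.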


This talk will be general in nature. Anyone who has been following the recent wave of interest in AI should be able to follow it. Basic Python knowledge is assumed.

Content URLs:

Speaker Info:

Parthiban Srinivasan is the CEO of VINGYANI, a data science company that deals with Informatics 2.0, that is, deep learning, natural language processing, and machine learning for drug discovery and health. An experienced data scientist, he earned his PhD from the Indian Institute of Science, specializing in computational chemistry, and holds dual Master's degrees, one in science and the other in engineering. After his PhD, he continued his research at NASA Ames Research Center (USA) and the Weizmann Institute of Science (Israel). He then worked at AstraZeneca on computer-aided drug design for tuberculosis. Later, he headed informatics business units at Jubilant Biosys and then at GvkBio before floating the company Parthys Reverse Informatics. His most recent venture is VINGYANI.

Speaker Links:

Section: Others
Type: Talks
Target Audience: Intermediate
Last Updated:

Description of the talk is vague!

jatin raj (~jatin85)
