Differentiation Engines: The Elves behind the AI Christmas

rajdeep (~rajdeep1008)

Description:

What do all recent advancements in Large Language Models (LLMs), Computer Vision, and state-of-the-art deep learning models have in common? They rely on frameworks like TensorFlow, PyTorch, and JAX. And what do all these frameworks have in common? They are all underpinned by differentiation engines, which are crucial for minimizing loss functions during training.

This talk will provide a comprehensive view of:

  • The different mathematical differentiation techniques and libraries within the Python ecosystem.
  • An in-depth look at what powers modern machine learning frameworks.
  • An exploration of Automatic Differentiation engines and their rise to dominance in recent years.

I will survey the available techniques, such as numerical and symbolic differentiation, and explain why automatic differentiation is increasingly preferred for machine learning applications. This discussion, tailored for the Python ecosystem, will adopt a code-first approach without delving deeply into the underlying mathematics.
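
To give a flavour of that code-first approach, here is a minimal sketch (illustrative only, not an excerpt from the talk) that computes the derivative of f(x) = x**2 * sin(x) with each of the three techniques; the sample point x0 = 1.5 and the finite-difference step h are arbitrary choices:

    import math

    import sympy   # symbolic differentiation
    import torch   # automatic differentiation

    def f(x):
        return x ** 2 * math.sin(x)

    x0 = 1.5

    # 1. Numerical: central finite differences. Accuracy is limited by the
    #    step size h and floating-point round-off.
    h = 1e-6
    numerical = (f(x0 + h) - f(x0 - h)) / (2 * h)

    # 2. Symbolic: SymPy rewrites the expression tree into a closed-form
    #    derivative, which can then be evaluated at any point.
    xs = sympy.symbols("x")
    symbolic = float(sympy.diff(xs ** 2 * sympy.sin(xs), xs).subs(xs, x0))

    # 3. Automatic: PyTorch records the operations performed on the tensor
    #    and applies the chain rule to that recorded graph.
    t = torch.tensor(x0, requires_grad=True)
    y = t ** 2 * torch.sin(t)
    y.backward()
    automatic = t.grad.item()

    exact = 2 * x0 * math.sin(x0) + x0 ** 2 * math.cos(x0)
    print(numerical, symbolic, automatic, exact)  # all four ≈ 3.1516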

Outline

  • Introduction [2 minutes]
    • Overview of the agenda.
    • What attendees will learn from this talk and what will not be covered (specifically, intricate mathematical theories).
  • Quick mathematical refresher [4 minutes]
    • Basics of calculus, including how to differentiate expressions built from operations like multiplication and addition.
    • Understanding gradients and their role in the machine learning ecosystem.
  • Algorithmic Differentiation [7 minutes]
    • Various methods to perform mathematical differentiation in code: Numerical, Symbolic, and Automatic.
    • Implementation techniques: operator overloading vs. source-code transformation (a toy operator-overloading sketch follows this outline).
    • Pros and cons of each method.
  • Automatic Differentiation [12 minutes]
    • Mechanisms of Automatic Differentiation (AD).
    • Exploring forward- and reverse-mode AD (see the JAX sketch after this outline).
    • What the famous kids are using: PyTorch, TensorFlow, Google JAX.
    • What the not-so-famous kids are using: SymPy, Google Tangent, Autograd.
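
As a preview of the operator-overloading approach from the outline, here is a toy forward-mode engine built on dual numbers; this is a sketch of the idea only, not how any production framework is implemented. Each Dual carries a value together with its derivative, and each overloaded operator applies the matching differentiation rule, so a single forward pass yields both f and f':

    import math

    class Dual:
        """A number that carries its value and its derivative together."""

        def __init__(self, value, deriv=0.0):
            self.value = value  # f(x)
            self.deriv = deriv  # f'(x)

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # sum rule: (f + g)' = f' + g'
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # product rule: (f * g)' = f' * g + f * g'
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)

        __rmul__ = __mul__

    def sin(d):
        # chain rule: sin(f)' = cos(f) * f'
        return Dual(math.sin(d.value), math.cos(d.value) * d.deriv)

    # Differentiate f(x) = x**2 * sin(x) at x = 1.5 by seeding deriv = 1.
    x = Dual(1.5, 1.0)
    y = x * x * sin(x)
    print(y.value, y.deriv)  # value ≈ 2.2444, derivative ≈ 3.1516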
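
And to preview why reverse mode dominates machine learning, a short JAX sketch (the least-squares loss and data below are invented purely for illustration): a single jax.grad call returns the gradient of a scalar loss with respect to an entire parameter vector in roughly one backward pass, whereas forward mode would need one pass per parameter:

    import jax
    import jax.numpy as jnp

    def loss(w, x, y):
        pred = jnp.dot(x, w)          # a tiny linear model
        return jnp.mean((pred - y) ** 2)

    w = jnp.array([0.5, -1.0, 2.0])   # three "parameters"
    x = jnp.array([[1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])
    y = jnp.array([1.0, 2.0])

    # Reverse mode: the full gradient of the scalar loss with respect to w,
    # computed in roughly one extra pass, no matter how many parameters.
    print(jax.grad(loss)(w, x, y))

    # Forward mode (e.g. jax.jacfwd) would instead need one pass per input
    # dimension, which is why reverse mode wins for ML-sized models.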

Expected Takeaways:

Attendees will gain an understanding of the mathematical operations that underpin ML frameworks and how they are implemented behind the scenes.

Prerequisites:

  • Basic familiarity with deep learning frameworks such as PyTorch, TensorFlow, etc.
  • Basic understanding of high school calculus

Speaker Info:

Rajdeep is a Research Associate at the Distributed Research on Emerging Applications and Machines Lab (DREAM:Lab) at the Indian Institute of Science (IISc). Prior to this role, he led the Data Engineering team at Loco, one of India's largest esports platforms. Rajdeep is also a contributing author at Kodeco and a writer on Medium.

In his free time, he likes to explore the fields of robotics and computer vision, applying both traditional methods and advanced deep learning techniques to broaden his expertise in these areas.

Speaker Links:

LinkedIn: https://www.linkedin.com/in/rajdeep1008/

GitHub: https://github.com/singhsegv

Medium: https://medium.com/@rajdeepsingh

Kodeco: https://www.kodeco.com/library?q=rajdeep

Section: Artificial Intelligence and Machine Learning
Type: Talk
Target Audience: Beginner