Data Engineering and Machine Learning using Snowpark - Snowflake's Developer Framework

Prathamesh (~prathamesh6)



Description:

The Snowpark library provides an intuitive API for querying and processing data at scale in Snowflake. Using the library in any of its three supported languages, you can build applications that process data in Snowflake without moving it to the system where your application code runs, and run that processing at scale on Snowflake's elastic, serverless engine.

Snowflake currently provides Snowpark libraries for three languages: Java, Python, and Scala. This workshop will focus on Snowpark Python.
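As a rough sketch of what the Snowpark Python DataFrame API looks like (the connection parameters, table, and column names below are placeholders, not part of the workshop material), a query like this is translated to SQL and executed inside Snowflake rather than on the client:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

# Placeholder credentials - replace with your own account details.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# DataFrame operations are built lazily; no data leaves Snowflake.
orders = session.table("ORDERS").filter(col("STATUS") == "SHIPPED")
summary = orders.group_by("REGION").agg(avg(col("AMOUNT")).alias("AVG_AMOUNT"))

# show() triggers execution as SQL on Snowflake's engine and prints a sample.
summary.show()
```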

Outcomes:

1. Ingest and process structured, semi-structured, and unstructured data from cloud storage.
2. Build end-to-end data engineering pipelines using Snowpark Python.
3. Perform model training and scoring in Snowflake using Snowpark.
4. Migrate code from PySpark to Snowpark (a short comparison sketch follows this list).
5. Build interactive visualisations with Streamlit in Snowflake.
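Because the Snowpark DataFrame API intentionally mirrors the PySpark DataFrame API, much of the migration in outcome 4 is mechanical. A minimal, hypothetical comparison (table and column names are made up for illustration, and a Snowpark session like the one created above is assumed):

```python
# PySpark (original):
#   from pyspark.sql.functions import col
#   sales = spark.table("SALES").filter(col("YEAR") == 2023)
#   by_product = sales.groupBy("PRODUCT").count()

# Snowpark Python (migrated) - same shape, executed inside Snowflake:
from snowflake.snowpark.functions import col

sales = session.table("SALES").filter(col("YEAR") == 2023)  # assumes an existing session
by_product = sales.group_by("PRODUCT").count()
by_product.show()
```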

Prerequisites:

Snowflake basics (helpful but not mandatory), Python basics, and working knowledge of pandas or Spark DataFrame APIs.

Video URL:

https://www.youtube.com/watch?v=_33pkWTa4BE

Speaker Info:

Prathamesh Nimkar - Senior Data Cloud Architect, Snowflake
Phani Raj - Senior Data Cloud Architect, Snowflake

Speaker Links:

https://medium.com/@prathamesh.nimkar, https://medium.com/@phaniraj2112

Section: Data Science, AI & ML
Type: Workshops
Target Audience: Intermediate