Batch and Streaming Data Ingestion Best Practices with Snowflake

The surge of real-time data streams puts pressure on data engineering teams that want to derive analytical value by easily combining streaming data with historical data. Join us for a product deep-dive session on what's new with data ingestion in Snowflake for both batch and streaming.

By the end of this session, you will:

  • Know how to leverage COPY, Snowpipe, and Snowpipe Streaming to build batch and streaming data pipelines at scale
  • Have a better understanding of how schema detection, schema evolution, and error notifications can help you manage pipelines in Snowflake
  • Get a preview of dynamic tables and how to use them to easily combine data streams with historical tables
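The ingestion paths above can be sketched in Snowflake SQL. This is an illustrative outline only, not the session's material; all object names (`raw_events`, `events_stage`, `transform_wh`, and so on) are hypothetical, and Snowpipe Streaming (which uses a client SDK rather than SQL) is omitted:

```sql
-- Batch ingestion: load staged files with COPY
COPY INTO raw_events
  FROM @events_stage
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

-- Continuous micro-batch ingestion: Snowpipe auto-ingests newly arriving files
CREATE PIPE events_pipe
  AUTO_INGEST = TRUE
  AS COPY INTO raw_events
     FROM @events_stage
     FILE_FORMAT = (TYPE = PARQUET);

-- Dynamic table: declaratively combine fresh stream data with a historical table
CREATE DYNAMIC TABLE enriched_events
  TARGET_LAG = '1 minute'
  WAREHOUSE = transform_wh
  AS
    SELECT e.*, h.customer_segment
    FROM raw_events e
    JOIN customer_history h
      ON e.customer_id = h.customer_id;
```

With a dynamic table, Snowflake refreshes the joined result automatically within the declared `TARGET_LAG`, so no separate orchestration job is needed to keep streaming and historical data in sync.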