Best Practices for Optimizing Your dbt and Snowflake Deployment

October 7, 2021

Companies in every industry acknowledge that data is one of their most important assets, yet consistently fall short of realizing the potential of their data. One key reason is the proliferation of data silos, which create expensive and time-consuming bottlenecks, erode trust, and render governance and collaboration nearly impossible. 

This is where Snowflake and dbt come in. By combining dbt with Snowflake, data teams can collaborate on data transformation workflows while operating out of a central source of truth. As part of the analytics engineering workflow, Snowflake customers can operationalize and automate Snowflake's hallmark scalability within dbt, paying only for the resources they need, when they need them. The result is maximum efficiency with minimal waste and cost.

In this paper, we provide some best practices for optimizing dbt with Snowflake, including:

  • Automated resource optimization for dbt query tuning

  • Resource management and monitoring

  • Role-based access controls (RBAC)

  • Individual dbt workload elasticity, and more.
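To illustrate the last point, dbt's Snowflake adapter supports a `snowflake_warehouse` model config that routes an individual model to a specific virtual warehouse, so a single heavy model can use a larger warehouse while the rest of the project stays on the default. A minimal sketch (the model, column, and warehouse names below are hypothetical):

```sql
-- models/marts/fct_orders.sql
-- Route this heavy model to a larger warehouse for its build only;
-- other models in the project keep using the profile's default warehouse.
{{ config(
    materialized='table',
    snowflake_warehouse='transforming_xl'  -- hypothetical XL warehouse
) }}

select
    order_id,
    customer_id,
    sum(amount) as total_amount
from {{ ref('stg_orders') }}
group by 1, 2
```

Because the warehouse override applies per model, you pay for the larger compute only during that model's run.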

Download this comprehensive white paper to learn everything you need to know about optimizing your dbt and Snowflake implementation and unlock the most value from your data while minimizing resources.
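As a taste of the RBAC practice covered in the paper, Snowflake grants privileges to roles rather than directly to users, and dbt typically connects with a dedicated transformation role. A sketch under hypothetical names (role `transformer`, warehouse `transforming`, database `analytics`, user `dbt_service` are all illustrative):

```sql
-- Create a dedicated role for dbt transformations (names are illustrative).
create role if not exists transformer;

-- Allow the role to use compute and write to the marts schema.
grant usage on warehouse transforming to role transformer;
grant usage on database analytics to role transformer;
grant usage, create table, create view on schema analytics.marts to role transformer;

-- Grant the role to the service user that dbt connects as.
grant role transformer to user dbt_service;
```

Scoping dbt's privileges to a single role keeps access auditable and makes it easy to revoke or resize permissions in one place.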
