Analytics workloads are moving from on-premises environments to the cloud in substantial numbers.
According to industry research firm IDC’s latest big data and analytics software forecast (IDC #US44803719, September 2019), revenue from public cloud deployments is expanding at a 29.7% compound annual growth rate (CAGR), compared with a 3.7% CAGR for on-premises deployments. IDC further projects that by 2023, cloud deployments will account for 50% of all big data and analytics deployments, overtaking on-premises deployments from that point forward.
This comes as no surprise to us here at Snowflake, a platform proudly built 100% in the cloud since our inception.
It’s not just well-regarded industry research firms validating Snowflake’s long view of the data and analytics industry. Forward-looking, innovative, and data-driven organizations also recognize that cloud-based strategies most effectively provide the agility companies need to meet their data and analytics challenges, as well as their future business growth objectives. Snowflake’s dramatic growth over the last four years, from zero to well over 2,300 customers, is a clear indication of this.
However, if you think achieving faster business results and dramatically greater agility is just a matter of shifting an on-premises analytics and data platform deployment model to the cloud—think again.
Whether it’s a legacy massively parallel processing (MPP) appliance or some other vintage architecture for analytics, data warehousing, or data lakes, if scaling users, compute, or storage means hitting a brick wall, shifting that architecture to the cloud simply migrates the brick wall along with your data.
In the cloud, architecture matters. Our new white paper, “How Snowflake’s Cloud Architecture Scales Modern Data and Analytics,” walks you through the fundamental differences between legacy architecture approaches and Snowflake’s multi-cluster shared data architecture. Snowflake customers routinely execute millions of queries a day, supporting and empowering many thousands of users, with tables storing multiple petabytes of structured and semi-structured data (such as JSON), all within a single implementation of Snowflake. On top of this, when you need to scale Snowflake to add storage, users, and compute resources (for reading or writing data), Snowflake scales instantly without disruption.
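As a concrete illustration, here is a brief sketch in Snowflake SQL. The warehouse and table names (analytics_wh, events, and its VARIANT column v) are hypothetical examples, but multi-cluster warehouses, online resizing, and path syntax for querying JSON are standard Snowflake features.

```sql
-- Hypothetical multi-cluster virtual warehouse: Snowflake adds or
-- removes clusters between MIN and MAX as query concurrency changes.
CREATE WAREHOUSE analytics_wh
  WITH WAREHOUSE_SIZE = 'MEDIUM'
       MIN_CLUSTER_COUNT = 1
       MAX_CLUSTER_COUNT = 4;

-- Resizing compute does not require downtime or data redistribution.
ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Semi-structured JSON stored in a VARIANT column (hypothetical table)
-- can be queried directly with path syntax, with no upfront schema.
SELECT v:customer.name::STRING AS customer_name,
       v:order.total::NUMBER  AS order_total
FROM   events
WHERE  v:order.status = 'shipped';
```

Because storage is shared and compute clusters are independent, additional warehouses can be created for new workloads without disrupting the ones already running.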
The white paper also explains how Snowflake’s multi-cluster shared data architecture eliminates scaling barriers and disruption, and it highlights the relevant business values.
To learn more about Snowflake’s multi-cluster shared data architecture, download “How Snowflake’s Cloud Architecture Scales Modern Data and Analytics.”
I will continue to elaborate on how cloud-built Snowflake and our multi-cluster shared data architecture uniquely drive your data-driven initiatives forward, faster and further. Stay tuned to our blog and the Snowflake Twitter feed (@SnowflakeDB), or follow me (@miclnixon1).