Snowflake and Net Zero: The Case for Data Decarbonisation (Part Three)

February 2, 2021 Zac Aghion

This is the final in a series of three posts (see Part One and Part Two) examining the concept of net zero data and how advances in technology can help the world’s largest organisations—especially those which are particularly emissions-intensive, like oil and gas—reduce the carbon emissions footprint of their data.

BP is one of the largest energy companies in the world. It is currently undergoing a historic business transformation to meet the world’s future energy and sustainability demands. The company plans to invest $5 billion per year in renewable and low-carbon energy projects, central to a broader net zero ambition: becoming carbon neutral in its own operations and halving the carbon impact of its products by 2050 or sooner.

As we’ll see, data is central to BP’s business. In fact, the company continues to run the largest commercial supercomputer in the world. Decarbonising this data at the data centre and database level can dramatically help BP meet its net zero goals and build the BP of the future.  

Mapping the Immediate Practical Applications for Net Zero Data at BP

Let’s consider a few particularly data-intensive areas of BP’s operations:

  • Subsurface imaging and upstream exploration analysis. BP employs the world’s largest commercial supercomputer and Microsoft Azure’s AI suite to locate new hydrocarbon reservoirs with subsurface imaging, estimate how much can be retrieved from those reservoirs, and then calculate the optimal extraction path by simulating different drilling scenarios. 

The exploration process leverages enormous geospatial data sets and intensive analytical methods which are likely targets for energy and emissions reduction.

  • Digital twinning of upstream production sites. The oil and gas (O&G) industry has adopted digital twinning to monitor well-site production facilities, equipment, and operations and prevent unplanned downtime. BP pioneered its Plant Operations Advisor in collaboration with Baker Hughes GE in the Gulf of Mexico and has been rolling it out to upstream assets worldwide, expecting a 2% increase in operational efficiency. The digital twin system is enabled by large amounts of operational data coming from well-site sensors. 

The capture, transformation, movement, and analysis of this data are potential targets for cleaner computing. 

  • Monitoring methane emission leaks during production, refinery, and shipping operations. The O&G industry is increasingly adopting advanced analytical techniques to monitor and address undesirable emission leaks, especially methane leaks, across the value chain, including rigs, refineries, ships, and forecourts. BP has designed a comprehensive programme for methane leak detection using gas cloud imaging enabled by drones with on-board sensors, and spectral radiometry that monitors the combustion efficiency of flaring. 

The various aspects of the methane detection programme are underpinned by accurate, timely streaming of data from the field, and are additional target areas for reducing the emissions impact of computing operations. 

  • Supply and trading of energy products and related instruments. Most O&G majors operate trading businesses specialised in buying and selling of energy products, their derivatives, and related financial instruments which help hedge against volatility such as fluctuations in currencies and interest rates. BP’s energy trading business forecasts market supply and demand dynamics for energy products, aggregating large internal data sets including operational data coming from well-sites, refineries, and forecourts, as well as external data sets representing weather patterns, government-issued gas and power market data, and maritime and trade port information. 

The acquisition and analysis of this diverse set of data presents opportunities to reduce the energy and emissions footprint of BP’s computing operations. 

  • Marketing and distribution of retail and commercial end products. BP’s portfolio of fuels, lubricants, electricity, and convenience products, for both B2B and B2C customers, necessitates a complex distribution network and marketing infrastructure to ensure that the right product is offered to the right customer at the right time. The BPme digital rewards programme, for example, collects data about customers as they transact at forecourts. BP uses this data to reward customer loyalty and personalise marketing promotions. The Polar Plus app at BP’s EV network operator, BP Chargemaster, also collects data to drive customer loyalty and personalise offers. 

Collecting, storing, and analysing these large customer transaction and loyalty data sets represents yet another opportunity to optimise data and analytical operations for energy and emissions efficiency. 

  • Optimising the asset and operations management practices of solar and wind power plants. As with any complex infrastructure, the management of gas and renewable power plants increasingly requires large amounts of sensor-based IoT monitoring equipment to collect accurate, timely information on hardware performance and ambient conditions. That data is then used to reduce downtime and extend the lifetime value of power plant assets through an operational management practice known as predictive maintenance. The core of the practice is focused on preempting failures by automating the requisition of replacement parts and engineering labour. 

Data volumes can be unprecedentedly large, with tens of millions of sensors passing information every second. This presents an opportunity to reduce the energy, emissions, and cost of data operations. 
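To make the predictive-maintenance idea above concrete, here is a minimal, illustrative sketch of the kind of streaming anomaly check such a pipeline might apply to a single sensor feed. It flags readings that deviate sharply from a rolling baseline; the window size, threshold, and sample values are assumptions for illustration, not BP's actual parameters or methods.

```python
from collections import deque

def make_anomaly_detector(window=20, threshold=3.0):
    """Flag readings that deviate from a rolling mean by more than
    `threshold` standard deviations -- a common first step before
    triggering maintenance workflows."""
    history = deque(maxlen=window)

    def check(reading):
        if len(history) >= window:
            mean = sum(history) / len(history)
            var = sum((x - mean) ** 2 for x in history) / len(history)
            std = var ** 0.5
            anomalous = std > 0 and abs(reading - mean) > threshold * std
        else:
            anomalous = False  # not enough history to judge yet
        history.append(reading)
        return anomalous

    return check

# A steady temperature signal with one sudden spike at the end
detector = make_anomaly_detector(window=10, threshold=3.0)
readings = [70.0, 70.1, 69.9, 70.2, 70.0, 69.8, 70.1, 70.0, 69.9, 70.1, 95.0]
flags = [detector(r) for r in readings]
```

In a production setting this logic would run close to the data (or inside the warehouse as a scheduled query) so that raw sensor streams need not be shipped and stored in full, which is exactly where the energy and emissions savings come from.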

In partnership with AWS, Microsoft Azure, and GCP, Snowflake has developed a cloud technology and management practice which we believe can enable BP to realise the full potential of cloud computing to reduce the energy impact and carbon footprint of its enterprise data. To review the full blog series, read Part One and Part Two.

