More than 70 percent of the world’s current oil and gas production comes from mature fields, many of which are in the secondary or tertiary phases of production, according to Halliburton. The easy oil and gas has already been extracted, and sophisticated discovery and extraction techniques are needed to extend production. In many cases, companies reanalyze raw survey data and run in-depth scenario analyses to better understand the underlying structure and dynamics of aging fields.
To do so, companies rely heavily on complex, proprietary algorithms and code that make exploration more efficient, producing higher-quality images for precise interpretation of subsurface data. Each generation of these algorithms yields more effective interpretation, allowing exploration teams to locate otherwise hidden deposits and to extract a greater portion of the available oil and gas.
However, new algorithms often demand significantly more processing power and responsive storage, pushing the limits of HPC platforms. Oil and gas companies must process data at higher speeds to create models of ever-increasing fidelity.
That demand is clearly visible in the market. According to IDC, supercomputer deployments are dominating growth in the HPC sector. While sales of all HPC servers grew 7.7 percent from 2011 to 2012, sales of supercomputers over the same period grew by 30 percent. Supercomputer revenues accounted for $5.6 billion of the total $11 billion HPC server market in 2012. The oil and gas industry, along with the financial sector, ranks among the premier commercial consumers of supercomputing systems and solutions.
One area that sets the oil and gas sector apart is its consumption of data. Success in energy exploration ultimately boils down to the quality — and quantity — of raw data. The greater the volume and fidelity of the data, the better the chances that a targeted drill site or prospective deposit will pay out.
Companies have rapidly improved survey techniques and technologies, deploying ever-larger arrays of advanced sensors to gain the most accurate possible picture of the underlying geology. Techniques such as 3D imaging and wide-azimuth (WAZ) surveys have multiplied the amount of data companies must capture and process.
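The arithmetic behind those growing volumes is straightforward: raw trace data scales with the number of recording channels, the sampling rate, the record length and the number of shots. The sketch below is purely illustrative; every parameter is a hypothetical assumption, not a figure from any actual survey.

```python
def survey_volume_bytes(channels, sample_rate_hz, bytes_per_sample,
                        record_seconds, num_shots):
    """Rough raw-data estimate: channels x samples per record x shots."""
    samples_per_record = sample_rate_hz * record_seconds
    return channels * samples_per_record * bytes_per_sample * num_shots

# Hypothetical wide-azimuth survey: 100,000 recording channels,
# 500 Hz sampling, 4-byte samples, 12-second records, 1,000,000 shots.
volume = survey_volume_bytes(100_000, 500, 4, 12, 1_000_000)
print(f"{volume / 1e15:.1f} PB")  # → 2.4 PB
```

Even with modest assumptions, the raw trace data alone reaches petabyte scale before any processing or derived products are stored, which is consistent with the multi-petabyte projects described below.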
Today, it is not unusual for a large exploration project to produce multiple petabytes of raw data, and for energy companies to possess a total data portfolio of tens or even hundreds of petabytes. Moving, processing and interpreting these vast stores of information have put oil and gas firms firmly at the leading edge of Big Data analytics and operations. Spending on storage overall is growing fast — faster, in fact, than any other technology area at HPC sites, according to an Intersect360 survey. And IDC projects that HPC storage revenue will grow at an annual rate of 8.2 percent from 2012 to 2017, to $6 billion.
Big Data analytics are a target of investment across all sectors, according to the 2013 High Performance Data Analysis Report from IDC. It found that 67 percent of sites surveyed perform Big Data analysis on HPC systems, with an average of nearly one-third of available computing cycles devoted to the task.
Spending on high-performance data analysis (HPDA) servers will also continue to grow through 2017, according to IDC, from $743.8 million in 2012 to nearly $1.4 billion in 2017. HPDA storage, meanwhile, will approach $1 billion in revenue by 2017.
In the oil and gas sector, HPC budgets are shifting to account for Big Data and HPDA. An IDC survey, HPC Market Update, HPC Trends In the Oil/Gas Sector and IDC’s Top 10 Predictions for 2014, found that companies in the oil and gas sector are budgeting an average of 12 percent of all HPC spending to support Big Data analytics — more than is spent on storage hardware, middleware or server maintenance. Only server hardware (30 percent) and application software (about 15 percent) consumed a greater share of budgeted spending.
Want to learn more? Check out CDW’s white paper, “High-Performance Computing In Oil and Gas.”