Jul 31 2014
Hardware

Supercomputing Techniques for Finding Energy Reserves

Increasingly powerful computing resources enable new techniques for exploration.

The energy industry is a leading consumer of supercomputing resources in the commercial space, equaled only by the financial sector in its investment in cutting-edge processing, storage and network capabilities. Furthermore, a 2011 study by Intersect360 Research found that the oil and gas sector has the highest overall rate of internal software usage; companies in the energy sector produce and maintain more in-house applications and algorithms than any other commercial sector.

New techniques and algorithms, such as full-wavefield inversion (FWI), are enabled by increasingly powerful computing resources as well as more advanced and extensive sensor data capture. Today, HPC build-outs rely on powerful co-processors — such as the Intel Xeon Phi accelerator and NVIDIA Tesla GPU — to offload specific, mathematically intensive tasks, which then can be resolved far more rapidly than they could on a general-purpose CPU.
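What "offloading" looks like in practice can be pictured in a few lines of code. The sketch below is an illustration of the general pattern rather than code from any vendor or operator: a math-heavy array kernel (here a 2-D Laplacian, the kind of stencil that dominates wave-equation solvers) is written once, and the same expression can run either on the host CPU with NumPy or on an accelerator through a GPU array library such as CuPy. The function names, array sizes and the choice of CuPy are illustrative assumptions.

```python
import numpy as np

def laplacian(field, dx):
    # Five-point 2-D Laplacian over interior points: the kind of
    # math-intensive stencil that co-processors resolve far faster
    # than a general-purpose CPU.
    return (
        field[2:, 1:-1] + field[:-2, 1:-1] +
        field[1:-1, 2:] + field[1:-1, :-2] -
        4.0 * field[1:-1, 1:-1]
    ) / dx ** 2

# CPU path: plain NumPy on the host.
cpu_field = np.random.rand(2048, 2048)
cpu_result = laplacian(cpu_field, dx=10.0)

# Offload path: if a CUDA GPU and the CuPy library are available, the
# identical kernel runs on the co-processor via the same array API.
try:
    import cupy as cp
    gpu_field = cp.asarray(cpu_field)           # copy data to the accelerator
    gpu_result = laplacian(gpu_field, dx=10.0)  # stencil executes on the GPU
    back_on_host = cp.asnumpy(gpu_result)       # copy the result back
except ImportError:
    pass  # no accelerator stack installed; the CPU result stands
```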

The importance of co-processing in HPC is reflected in 2013 market figures published by IDC. According to IDC's Processors/Co-processors/Accelerators Report (one of six reports in the IDC Worldwide HPC End-User Study), the percentage of HPC sites employing co-processors or accelerators in HPC systems increased to 76.9 percent in 2013, up from 28.2 percent in 2011.

Intel Xeon Phi co-processors and NVIDIA GPUs were identified as the most widely deployed co-processors. The massive increases in the amount of data that needs to be processed, analyzed, stored and updated are driving these advances in computing power. Similarly, the advanced algorithms needed to handle this high-volume and high-velocity data place increased pressure on storage infrastructure.

The Techniques Driving Energy Exploration

The stakes continue to rise in the energy sector, as emerging techniques are developed to produce imagery of subsurface structures at ever higher fidelity. Time and depth migration techniques, for instance, take raw data gathered by acoustic sensors and apply complex mathematics to pinpoint the location of the subsurface features they detect. As computing and storage capabilities have improved, energy companies have adopted more effective, and more computing-intensive, techniques for capturing subsurface images.
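To make that "complex mathematics" concrete, here is a deliberately tiny sketch of the computation at the heart of wave-based migration: stepping a discretized 2-D acoustic wave equation forward in time through a velocity model. The grid sizes, velocities and source wavelet are invented for illustration; real migration codes add absorbing boundaries, higher-order stencils and far larger 3-D grids.

```python
import numpy as np

def step_wavefield(p, p_prev, vel, dt, dx):
    # Advance the 2-D acoustic wave equation one time step with a
    # second-order finite-difference scheme (toy model, rigid boundaries).
    lap = np.zeros_like(p)
    lap[1:-1, 1:-1] = (
        p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:] + p[1:-1, :-2]
        - 4.0 * p[1:-1, 1:-1]
    ) / dx ** 2
    return 2.0 * p - p_prev + (vel * dt) ** 2 * lap, p

# Illustrative setup: a 1 km x 1 km grid with a faster layer at depth.
nz = nx = 101
dx, dt, nt = 10.0, 0.001, 800
vel = np.full((nz, nx), 2000.0)   # background velocity, m/s
vel[60:, :] = 2500.0              # faster layer below 600 m

p = np.zeros((nz, nx))
p_prev = np.zeros((nz, nx))
wavelet = np.exp(-((np.arange(nt) * dt - 0.05) / 0.01) ** 2)  # simple pulse

for it in range(nt):
    p[1, nx // 2] += wavelet[it]  # inject the source just below the surface
    p, p_prev = step_wavefield(p, p_prev, vel, dt, dx)

# 'p' now holds a snapshot of the propagating wavefield; migration and
# inversion workflows repeat loops like this across thousands of shots.
```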

Among the techniques being used:

Kirchhoff migration: A long-established method that traces acoustic rays to estimate travel times between the surface and subsurface reflectors. In general, Kirchhoff migrations work best where the geology is layered rather than structurally complex (a simplified diffraction-stack sketch appears after this list).

Wave-equation migration: A depth migration technique that handles multiple wave paths during wavefield extrapolation, which is especially valuable in regions with salt bodies. It delivers more accurate imaging than Kirchhoff migration in these complex settings.

Reverse-time migration (RTM): Employed on GPUs since 2009, RTM takes two passes through captured data, simulating the behavior of waves propagating both downward and upward through the earth. Complex wave models allow the two passes to be correlated, yielding a clearer image that reveals subsurface structures that would otherwise remain hidden (see the RTM sketch after this list).

Full-wavefield inversion: A technique that develops high-resolution subsurface models from seismic data by iteratively comparing observed and modeled seismic waveforms. The iterative FWI method has become affordable with the advent of GPU and co-processor acceleration (see the FWI sketch after this list).
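The diffraction-stack sketch below illustrates the Kirchhoff idea under the simplest possible assumptions: a single shot, straight rays and a constant velocity, so travel times reduce to straight-line distances divided by velocity. Every name and number is invented for illustration; production Kirchhoff codes use ray-traced travel-time tables and amplitude weights.

```python
import numpy as np

def kirchhoff_migrate(traces, rec_x, src_x, dt, velocity, img_x, img_z):
    # Diffraction-stack (Kirchhoff-style) migration for one shot under a
    # constant-velocity assumption: each image point sums the recorded
    # amplitudes found along its source -> image point -> receiver
    # travel-time curve.
    nt = traces.shape[1]
    image = np.zeros((len(img_z), len(img_x)))
    for iz, z in enumerate(img_z):
        for ix, x in enumerate(img_x):
            t_src = np.hypot(x - src_x, z) / velocity   # source-side leg
            t_rec = np.hypot(rec_x - x, z) / velocity   # receiver-side legs
            samples = np.round((t_src + t_rec) / dt).astype(int)
            valid = samples < nt
            image[iz, ix] = traces[np.nonzero(valid)[0], samples[valid]].sum()
    return image

# Illustrative use with stand-in data (all values are invented):
dt, velocity = 0.002, 2000.0
rec_x = np.linspace(0.0, 1000.0, 51)        # receiver positions, metres
traces = np.random.rand(51, 1500) * 0.01    # stand-in for recorded traces
image = kirchhoff_migrate(traces, rec_x, src_x=500.0, dt=dt, velocity=velocity,
                          img_x=np.linspace(0.0, 1000.0, 101),
                          img_z=np.linspace(10.0, 800.0, 80))
```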
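The RTM sketch below follows the two-pass description above: the source wavefield is propagated forward and stored, the recorded data are propagated backward in reverse time, and the two wavefields are cross-correlated at zero lag to build the image. The finite-difference propagator, grid and stand-in shot record are simplifying assumptions; real RTM runs on 3-D grids with absorbing boundaries and checkpointing instead of storing every time step.

```python
import numpy as np

def fd_step(p, p_prev, vel, dt, dx):
    # One second-order acoustic finite-difference time step (toy model
    # with rigid boundaries; real codes use absorbing boundaries).
    lap = np.zeros_like(p)
    lap[1:-1, 1:-1] = (
        p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:] + p[1:-1, :-2]
        - 4.0 * p[1:-1, 1:-1]
    ) / dx ** 2
    return 2.0 * p - p_prev + (vel * dt) ** 2 * lap, p

def rtm_image(shot, vel, src_pos, rec_row, dt, dx):
    # Pass 1: propagate the source wavefield forward and store each step.
    nz, nx = vel.shape
    nt = shot.shape[0]
    wavelet = np.exp(-((np.arange(nt) * dt - 0.05) / 0.01) ** 2)
    src = np.zeros((nz, nx))
    src_prev = np.zeros((nz, nx))
    src_history = np.zeros((nt, nz, nx))
    for it in range(nt):
        src[src_pos] += wavelet[it]
        src, src_prev = fd_step(src, src_prev, vel, dt, dx)
        src_history[it] = src
    # Pass 2: inject the recorded traces in reverse time and cross-correlate
    # the receiver wavefield with the stored source wavefield (zero lag).
    rec = np.zeros((nz, nx))
    rec_prev = np.zeros((nz, nx))
    image = np.zeros((nz, nx))
    for it in reversed(range(nt)):
        rec[rec_row, :] += shot[it]
        rec, rec_prev = fd_step(rec, rec_prev, vel, dt, dx)
        image += src_history[it] * rec
    return image

# Illustrative use with stand-in data (sizes and values are invented):
nz = nx = 101
nt, dt, dx = 600, 0.001, 10.0
vel = np.full((nz, nx), 2000.0)
shot = np.random.rand(nt, nx) * 0.01   # stand-in for one recorded shot gather
image = rtm_image(shot, vel, src_pos=(1, nx // 2), rec_row=1, dt=dt, dx=dx)
```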
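Finally, the FWI-style sketch below shows the iterative compare-and-update loop in miniature. To keep it runnable in a few lines, the wave-equation simulator is replaced by a toy layered-earth forward model and the gradient is estimated numerically; production FWI computes the gradient with adjoint wavefields, which is what drives its petaflop-scale appetite. All parameters here are invented for illustration.

```python
import numpy as np

def forward_model(slowness, dt, nt=1000, pulse_width=0.01):
    # Toy stand-in for a wave-equation simulator: each layer boundary in a
    # 1-D layered earth contributes a Gaussian pulse at its two-way travel
    # time, scaled by the slowness contrast across the boundary.
    times = np.arange(nt) * dt
    trace = np.zeros(nt)
    layer_thickness = 100.0                       # metres (illustrative)
    t = 0.0
    for k in range(len(slowness) - 1):
        t += 2.0 * layer_thickness * slowness[k]  # two-way time in layer k
        contrast = slowness[k] - slowness[k + 1]
        trace += contrast * np.exp(-((times - t) / pulse_width) ** 2)
    return trace

def fwi_loop(observed, start_model, dt, n_iter=150, step=2e-6):
    # FWI-style iteration: compare observed and modeled waveforms, estimate
    # the misfit gradient (numerically here, via adjoint wavefields in real
    # codes), and nudge the slowness model downhill.
    model = start_model.copy()
    eps = 1e-7
    for _ in range(n_iter):
        residual = forward_model(model, dt) - observed
        misfit = 0.5 * np.dot(residual, residual)
        grad = np.zeros_like(model)
        for i in range(len(model)):
            trial = model.copy()
            trial[i] += eps
            r = forward_model(trial, dt) - observed
            grad[i] = (0.5 * np.dot(r, r) - misfit) / eps
        model -= step * grad / (np.abs(grad).max() + 1e-20)
    return model

# Illustrative use: recover a four-layer slowness model from synthetic data.
dt = 0.002
true_model = np.array([1 / 2000, 1 / 2200, 1 / 2600, 1 / 3000])  # s/m
observed = forward_model(true_model, dt)
start = np.full(4, 1 / 2400)          # smooth starting guess
recovered = fwi_loop(observed, start, dt)
```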

The evolution of increasingly effective seismic imaging techniques is driven by gains in computing power. In a presentation at the 2014 Oil and Gas HPC Workshop at Rice University in Houston, Peter Breunig, general manager of Technology Management and Architecture at Chevron, showed the processing power required to drive contemporary data analytics in oil and gas.

In 2002, Kirchhoff migrations required roughly 1 teraflop of computing performance. Just two years later, that figure leapt with the adoption of wave-equation migration. By 2010, two-pass reverse-time migration was demanding 150 teraflops of performance. The emergence of acoustic full-wavefield inversion represents another massive increase, demanding roughly 1.5 petaflops of processing power, about 1,500 times the Kirchhoff requirement cited for 2002.

Want to learn more? Check out CDW’s white paper, “High-Performance Computing in Oil and Gas.”
