Generating Sketch-Based Synthetic Seismic Images with Generative Adversarial Networks
Abstract
The characterization of the subsurface is paramount throughout the exploration and production life cycle and, more specifically, for the identification of potential hydrocarbon accumulations. In recent years, the oil and gas industry has shown increasing interest in applying machine learning to accelerate seismic interpretation, a time-consuming task that relies heavily on human experts. Although machine learning has been successfully applied in many settings, from stratigraphic segmentation to salt dome detection, a common bottleneck is the need for large amounts of high-quality annotated data. To overcome this, data augmentation approaches are commonly used, and one of the most powerful is synthetic data generation. In addition, sketch-based synthetic seismic images can support image retrieval applications that help oil companies leverage petabytes of seismic data. In this context, this letter investigates the generation of synthetic seismic images from sketches using generative adversarial networks (GANs). To the best of our knowledge, this is the first work to propose such an approach. Experiments with five different sketch types on a public seismic data set indicate that realistic seismic images can be synthesized from rather simple sketches.
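To make the sketch-conditioned generation task concrete, the following is a minimal, illustrative sketch of a pix2pix-style conditional GAN in PyTorch that maps a one-channel line sketch to a one-channel seismic image. The architecture, loss weights, and names (Generator, Discriminator, train_step) are assumptions for exposition only and do not reflect the letter's actual implementation.

```python
# Illustrative pix2pix-style conditional GAN for sketch-to-seismic translation.
# All architectural choices and hyperparameters below are assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Tiny encoder-decoder mapping a 1-channel sketch to a 1-channel seismic image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, sketch):
        return self.net(sketch)

class Discriminator(nn.Module):
    """PatchGAN-style critic that scores (sketch, image) pairs as real or synthetic."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.BatchNorm2d(128), nn.LeakyReLU(0.2),
            nn.Conv2d(128, 1, 4, stride=1, padding=1),  # patch-level real/fake logits
        )

    def forward(self, sketch, image):
        return self.net(torch.cat([sketch, image], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))
adv_loss, l1_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()

def train_step(sketch, seismic, lambda_l1=100.0):
    """One adversarial update: D learns to separate real from synthesized pairs,
    while G learns to fool D and stay close (L1) to the paired seismic image."""
    fake = G(sketch)

    # Discriminator update
    opt_d.zero_grad()
    d_real = D(sketch, seismic)
    d_fake = D(sketch, fake.detach())
    loss_d = adv_loss(d_real, torch.ones_like(d_real)) + \
             adv_loss(d_fake, torch.zeros_like(d_fake))
    loss_d.backward()
    opt_d.step()

    # Generator update
    opt_g.zero_grad()
    d_fake = D(sketch, fake)
    loss_g = adv_loss(d_fake, torch.ones_like(d_fake)) + lambda_l1 * l1_loss(fake, seismic)
    loss_g.backward()
    opt_g.step()
    return loss_d.item(), loss_g.item()

# Random tensors stand in for a (sketch, seismic) training pair of 64x64 patches.
sketch = torch.randn(4, 1, 64, 64)
seismic = torch.randn(4, 1, 64, 64)
print(train_step(sketch, seismic))
```

The L1 term keeps the synthesized image aligned with the structures drawn in the sketch, while the adversarial term pushes its texture toward realistic seismic amplitudes; this trade-off is the usual rationale for conditional image-to-image GANs of this kind.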