True 2D-to-3D Reconstruction of Heterogenous Porous Media via Deep Generative Adversarial Networks (GANs)

Vogel, Hannah; Amiri, Hamed;

2024 || YoDa Data Repository, Utrecht University, Netherlands

We imaged samples of Berea sandstone from Ohio (USA) using two 2D imaging techniques, backscattered electron (BSE) microscopy and optical microscopy, as well as 3D X-ray (micro-)computed tomography (XCT). The goal is to employ a deep-learning-based generative model, a generative adversarial network (GAN), to reconstruct statistically equivalent 3D microstructures from exclusively 2D training images. To evaluate the reconstruction accuracy, we conduct a visual and statistical analysis comparing the reconstructions with a 3D X-ray tomogram of the same sample. Unlike previous research, our method trains the model on true 2D images from three orthogonally oriented planes. The data are organized into 10 folders: three contain the original segmented (binary) images of the Berea sandstone samples, and the other seven contain the data and individual figures used to create the figures in the main publication.

Link to the GitHub repository containing the code: https://github.com/hamediut/2D-to3D-recon
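The statistical analysis described above compares reconstructions against the segmented XCT data using descriptors such as porosity and two-point correlation. The sketch below is a minimal, hypothetical illustration of two such descriptors computed on a binary (pore = 1, grain = 0) image with NumPy; the function names and the synthetic input are assumptions, not taken from the repository.

```python
import numpy as np

def porosity(binary_img):
    """Pore-volume fraction: mean of a binary image where pores are 1."""
    return binary_img.mean()

def two_point_probability(binary_img, max_lag):
    """S2(r) along the last axis: probability that two points a distance r
    apart both lie in the pore phase. Hypothetical helper for illustration."""
    s2 = []
    for r in range(max_lag + 1):
        a = binary_img[..., : binary_img.shape[-1] - r]
        b = binary_img[..., r:]
        s2.append((a * b).mean())
    return np.array(s2)

# Synthetic stand-in for a segmented Berea slice (not real data).
rng = np.random.default_rng(0)
img = (rng.random((64, 64)) < 0.2).astype(np.uint8)

phi = porosity(img)
s2 = two_point_probability(img, max_lag=10)
# By definition, S2(0) equals the porosity.
```

In practice these descriptors would be averaged over many slices of both the GAN reconstruction and the reference tomogram before comparison.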

MSL enriched sub domains
  • microscopy and tomography
  • rock and melt physics
Source: http://dx.doi.org/10.24416/uu01-do6lt4
Source publisher: YoDa Data Repository, Utrecht University, Netherlands
DOI: 10.24416/uu01-do6lt4
Citation: Vogel, H., & Amiri, H. (2024). True 2D-to-3D Reconstruction of Heterogenous Porous Media via Deep Generative Adversarial Networks (GANs) (Version 1.0) [Data set]. Utrecht University. https://doi.org/10.24416/UU01-DO6LT4
Collection period(s)
  • 2022-03-01 - 2024-02-09