Artificial Intelligence Applications in Petroleum Engineering - Part I
1. Artificial Intelligence Applications in Petroleum Engineering
“Where oil is first found, in the final analysis, is in the
minds of men”
(Pratt, 1952)
By: Ramez M. Aziz Zaky (Part I)
2. “We let the wells and the reservoir
speak for themselves and impose
their will on the model, instead of
imposing our current understanding
of the geology and physics on the
model. The model is then validated
by testing it with blind data during
post-modeling analysis”
(Mohaghegh, S.)
3. Agenda
Introduction: Neural Network Papers in OnePetro
Evolutionary Algorithms and Artificial Neural Networks
Reservoir Engineering Applications
Production Technologies Applications
Oil Well Drilling Applications
4. Neural Networks papers in
OnePetro
The number of papers on neural networks contributed by the
petroleum engineering community is remarkable: a total of
2,918 papers mention the keyword "neural networks", and
those are conference papers alone.
Source: the "petro.One" platform for searching papers on the
OnePetro website.
5. Neural Networks papers in
OnePetro
They originate from different institutions: SPE,
OTC, IPTC, SPWLA, PETSOC, SEG, ARMA,
WPC, ISOPE, ISRM, NACE, BHR, URTEC,
OMC, PSIG, CMTC, ASSE and SUT.
The institution with the most contributions is SPE
with 1,527 papers, followed by SEG (439),
ISOPE (217), and ISRM (143).
6. Neural Networks papers in
OnePetro
There is still some work to do with the tagging, labelling
and classification, so these are rough numbers.
7. Artificial Neural Networks Intuition
• A biological neuron has three
main components: dendrites,
the soma (or cell body)
and the axon.
• Dendrites receive signals from
other neurons.
• The soma sums the incoming signals. When sufficient
input is received, the cell fires; that is, it transmits a signal
over its axon to other cells.
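As a sketch of this analogy, a single artificial neuron can be written as a weighted sum plus a firing threshold (the weights, inputs and threshold below are illustrative, not from the slides):

```python
# A minimal artificial neuron mirroring the biological analogy:
# inputs arrive via "dendrites" (the weights), the "soma" sums them,
# and the neuron "fires" along its "axon" when the sum crosses a threshold.
def neuron(inputs, weights, bias, threshold=0.0):
    """Return 1 if the weighted sum of inputs exceeds the threshold."""
    soma = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if soma > threshold else 0

print(neuron([1.0, 0.5], [0.8, -0.2], bias=0.1))  # 0.8 - 0.1 + 0.1 = 0.8 > 0, prints 1
```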
9. Neural Networks: Training
Process
1. Initialize the weights and biases with random values; set Epoch = 1.
2. Present the input patterns and calculate the output values.
3. Calculate the mse.
4. If mse ≤ mse_min, stop training the network.
5. If Epoch > Epoch_max, stop training the network.
6. Otherwise, update the weights and biases, set Epoch = Epoch + 1, and go back to step 2.
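The training loop above can be sketched for a single linear neuron; the toy data, learning rate and stopping criteria below are illustrative assumptions:

```python
import random

random.seed(0)

# Toy data: learn y = 2x (illustrative, not from the slides).
data = [(x, 2.0 * x) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]

w = random.uniform(-1, 1)            # initialize weight with a random value
b = random.uniform(-1, 1)            # initialize bias with a random value
lr, mse_min, epoch_max = 0.1, 1e-6, 10_000

epoch = 1
while epoch <= epoch_max:
    # Present every input pattern and calculate the outputs
    errors = [(w * x + b) - y for x, y in data]
    mse = sum(e * e for e in errors) / len(data)
    if mse <= mse_min:               # converged: stop training
        break
    # Update weight and bias (gradient of the MSE for a linear unit)
    w -= lr * 2 * sum(e * x for e, (x, _) in zip(errors, data)) / len(data)
    b -= lr * 2 * sum(errors) / len(data)
    epoch += 1

print(f"epoch={epoch}, w={w:.3f}, b={b:.3f}, mse={mse:.2e}")
```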
10. Tuned Parameters in Neural
Networks
Learning Rate and Momentum:
First: how far to step. Steps must be proportional to the size of the gradient
vector; the constant of proportionality is called the learning rate.
θ_new = θ_old − α∇f(θ_old)
Second: an ANN can easily get stuck in a local minimum, where the algorithm may
appear to have reached the global minimum, leading to sub-optimal results. To avoid
this situation, a momentum term is added to the weight update: a value between 0
and 1 that increases the size of the steps taken towards the minimum, helping the
search jump out of a local minimum.
The right values of momentum and learning rate can be found either by trial and
error or through cross-validation.
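As a minimal sketch of these two parameters, here is gradient descent with momentum on a simple one-dimensional function (α, β and f below are illustrative choices, not from the slides):

```python
# Gradient descent with momentum on f(θ) = (θ - 3)²:
# α is the learning rate, β is the momentum term in [0, 1].
def grad(theta):
    return 2 * (theta - 3.0)        # ∇f(θ) for f(θ) = (θ - 3)²

theta, velocity = 0.0, 0.0
alpha, beta = 0.1, 0.9              # illustrative values; tune by cross-validation

for _ in range(200):
    velocity = beta * velocity - alpha * grad(theta)
    theta += velocity               # momentum lets steps "roll through" shallow minima

print(round(theta, 4))              # converges near the minimum at θ = 3
```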
11. Genetic Algorithm
A genetic algorithm (GA) is a search heuristic that mimics
the process of natural evolution. This heuristic is
routinely used to generate useful solutions to
optimization and search problems.
Genetic algorithms belong to the larger class of
evolutionary algorithms (EA), which generate solutions
to optimization problems using techniques inspired by
natural evolution, such as inheritance, mutation,
selection, and crossover.
Genetic algorithms continuously "explore" and "exploit"
the search space in order to achieve their objectives.
12. GA Mechanism
1. Generate the initial population.
2. Evaluate the fitness function of each individual in the population.
3. Rank individuals based on their fitness.
4. Select individuals to produce the next generation based on their fitness.
5. Use genetic operations, such as crossover, inversion and mutation, to generate a new population.
6. Continue the process by going back to step 2 until the problem's objectives are satisfied.
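The six steps above can be sketched on a toy problem; the OneMax fitness (maximize the number of 1-bits), population size and mutation rate below are illustrative assumptions:

```python
import random

random.seed(1)

# Minimal GA for the classic OneMax problem: maximize the count of 1-bits.
def fitness(ind):
    return sum(ind)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.05):
    return [bit ^ 1 if random.random() < rate else bit for bit in ind]

N, BITS = 30, 20
pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(N)]  # step 1

for gen in range(100):
    pop.sort(key=fitness, reverse=True)                 # steps 2-3: evaluate and rank
    if fitness(pop[0]) == BITS:                         # step 6: objective satisfied
        break
    parents = pop[: N // 2]                             # step 4: fitness-based selection
    children = [mutate(crossover(*random.sample(parents, 2)))
                for _ in range(N - len(parents))]       # step 5: crossover + mutation
    pop = parents + children

print(gen, fitness(max(pop, key=fitness)))
```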
13. Neural Networks: Architecture
Optimization with Genetic Algorithm
Starting from an initial neural network architecture, a GA
iteratively searches for a better architecture that maximizes
a fitness function. The GA generates different architectures
by breeding a population of them, uses each one for the task
(for example, playing a game), and selects those yielding
higher scores according to the fitness function. It then uses
the best candidate architectures (the parents, in GA
terminology) for breeding, and repeats the process of
generating a new population of architectures. Of course,
breeding includes mutation too.
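A minimal sketch of this idea, assuming a genome of two hidden-layer widths and a hypothetical stand-in fitness function; a real implementation would train each candidate network and score it on blind validation data:

```python
import random

random.seed(2)

# GA over network "architectures", simplified to a fixed two hidden layers
# so a genome is just [width1, width2].
def fitness(arch):
    # Hypothetical stand-in that pretends widths near 32 generalize best;
    # in practice this would be a trained network's validation score.
    return -sum(abs(w - 32) for w in arch)

def breed(a, b):
    child = [random.choice(pair) for pair in zip(a, b)]       # crossover
    i = random.randrange(len(child))                          # mutation
    child[i] = max(1, child[i] + random.randint(-8, 8))
    return child

pop = [[random.randint(1, 64), random.randint(1, 64)] for _ in range(20)]

for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                                        # keep best candidates
    pop = parents + [breed(*random.sample(parents, 2)) for _ in range(10)]

best = max(pop, key=fitness)
print(best, fitness(best))
```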
14. Support Vector Machine Intuition
[Scatter plot: points labelled +1 and −1, with several candidate separating lines.]
How would you classify this data? Any of these lines would be fine... but which is best?
15. Support Vector Machines
Distance between xₙ and the plane wᵀx + b = 0:
Take any point x on the plane, and let ŵ = w / ‖w‖.
The distance is the projection of (xₙ − x) on ŵ:
distance = |ŵᵀ(xₙ − x)|
= (1/‖w‖) |wᵀxₙ − wᵀx|
= (1/‖w‖) |wᵀxₙ + b − wᵀx − b|
= (1/‖w‖) |wᵀxₙ + b|
= 1/‖w‖ for the nearest point, under the normalization below.
The Optimization Problem:
Maximize 1/‖w‖
subject to min over n = 1, 2, …, N of |wᵀxₙ + b| = 1
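The margin formula can be checked numerically: for a plane normalized so that the nearest point satisfies |wᵀxₙ + b| = 1, the distance from that point to the plane equals 1/‖w‖ (the plane and points below are illustrative):

```python
import math

# Plane wᵀx + b = 0 with w = [2, 0], b = -2, i.e. the line x1 = 1.
w, b = [2.0, 0.0], -2.0
points = [[1.5, 0.3], [2.0, -1.0], [0.5, 0.7], [0.0, 0.0]]

norm_w = math.hypot(*w)

# Distance of a point from the plane: |wᵀx + b| / ‖w‖
def dist(x):
    return abs(sum(wi * xi for wi, xi in zip(w, x)) + b) / norm_w

# The normalization min_n |wᵀxₙ + b| = 1 holds for this data
assert min(abs(sum(wi * xi for wi, xi in zip(w, x)) + b) for x in points) == 1.0

margin = min(dist(x) for x in points)
print(margin, 1 / norm_w)          # both print 0.5
```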
16. Kernels
The linear kernel is the simplest kernel function. It is given by the
inner product of w and x plus an optional constant c:
k(w, x) = wᵀx + c
The polynomial kernel is a non-stationary kernel. Polynomial
kernels are well suited for problems where all the training data
is normalized:
k(w, x) = (αwᵀx + c)ᵈ
The Gaussian kernel is an example of a radial basis function (RBF)
kernel:
k(w, x) = exp(−‖w − x‖² / (2σ²))
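The three kernels can be written as plain functions; the vectors and hyperparameter defaults below are illustrative:

```python
import math

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def linear_kernel(w, x, c=0.0):
    return dot(w, x) + c

def polynomial_kernel(w, x, alpha=1.0, c=1.0, d=2):
    return (alpha * dot(w, x) + c) ** d

def gaussian_kernel(w, x, sigma=1.0):
    sq_dist = sum((wi - xi) ** 2 for wi, xi in zip(w, x))
    return math.exp(-sq_dist / (2 * sigma ** 2))

w, x = [1.0, 2.0], [3.0, 0.5]
print(linear_kernel(w, x))         # wᵀx = 4.0
print(polynomial_kernel(w, x))     # (4 + 1)² = 25.0
print(gaussian_kernel(w, x))       # exp(-6.25 / 2) ≈ 0.0439
```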
17. Adjustable Parameters Optimization
The kernel's adjustable parameter (for example, σ in the
Gaussian kernel) plays a major role in the performance of
the kernel, and should be carefully tuned to the problem at hand.
If overestimated, the exponential will behave almost
linearly and the higher-dimensional projection will start
to lose its non-linear power.
If underestimated, the function will lack regularization and
the decision boundary will be highly sensitive to noise
in the training data.
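A quick numeric sketch of this behavior for the Gaussian kernel's σ (values illustrative): when σ is overestimated, every pair of points looks almost equally similar; when it is underestimated, every pair looks completely dissimilar:

```python
import math

def gaussian_kernel(a, b, sigma):
    sq = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return math.exp(-sq / (2 * sigma ** 2))

a, b = [0.0, 0.0], [1.0, 1.0]
for sigma in [0.1, 1.0, 10.0]:
    print(sigma, round(gaussian_kernel(a, b, sigma), 4))
# sigma=0.1  -> 0.0    (underestimated: all points look dissimilar)
# sigma=1.0  -> 0.3679
# sigma=10.0 -> 0.99   (overestimated: all points look similar)
```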
18. Reservoir Engineering Applications
• Pseudo logs generation
• Reservoir characterization
• Well test analysis and Identification of the Well
Test Model
• Permeability Prediction from Well Logs Using
ANN
• Predicting PVT Data
• Data Driven Reservoir modeling
19. Reservoir Characterization
Neural networks have been utilized to predict
formation characteristics such as porosity,
permeability and fluid saturation from
conventional well logs.
Using well logs as input data coupled with core
analysis at the corresponding depths, these
reservoir characteristics were successfully
predicted for heterogeneous formations in
different areas.
20. Reservoir Modeling
● Uncertainty facing reservoir exploitation is high
when trying to figure out how a tight rock formation
will respond to an induced hydraulic fracture
treatment. Uncertainty quantification can be
better achieved by making appropriate use of
complex or hyperdimensional reservoir data
through AI.
● For example, this enables the optimization of fracture
spacing and fracture design models.
21. Production Technologies
Applications
• Dynamic system diagnosis:
Sucker rod pumps
Progressive cavity pumps (PCP)
Electrical submersible pumps (ESP)
• Gas Lift Optimization
• Hydraulic Fracturing Design and Optimization
• Production Monitoring
22. Oil Well Drilling Applications
• Drilling operation optimization
• Drill Bit Diagnosis using ANN
• Stuck Pipe Prediction
23. Presentation Series
Other Technologies Will Be Discussed
(Kriging, Fuzzy Logic, Deep Learning, etc.)
Going Deep Inside Each Application in Reservoir,
Production and Drilling Relevant Work.
How To Code Each Problem, with an Explanation of
Various Frameworks That May Be Helpful.
Stay Tuned