Artificial Neural Network
1. Artificial intelligence
Artificial intelligence is intelligence demonstrated by
machines, unlike the natural intelligence displayed by
humans and animals, which involves consciousness and
emotionality.
2. The history of Artificial Intelligence
The history of Artificial Intelligence began in antiquity, with myths, stories and rumors of artificial beings
endowed with intelligence or consciousness by master craftsmen. The seeds of modern AI were planted by
classical philosophers who attempted to describe the process of human thinking as the mechanical
manipulation of symbols. This work culminated in the invention of the programmable digital computer in the
1940s, a machine based on the abstract essence of mathematical reasoning. This device and the ideas behind
it inspired a handful of scientists to begin seriously discussing the possibility of building an electronic brain.
3. Definition of AI from the time when it was first coined
• The official idea and definition of AI was first coined by John McCarthy in 1955 in the proposal for the Dartmouth conference; of course, plenty of research on AI had already been done by others, such as Alan Turing.
• In an article published in 1936, Alan Turing introduced a theoretical machine based on the principle that one machine can emulate any other machine: the so-called “Turing machine.”
4. The original 7 aspects of AI (1955)
• Simulating higher functions of the human brain.
• Programming a computer to use general language.
• Arranging hypothetical neurons in a manner enabling
them to form concepts.
• A way to determine and measure problem
complexity.
• Self-improvement.
• Abstraction defined as the quality of dealing with
ideas rather than events.
• Randomness and creativity.
5. Intelligence
According to Jack Copeland, who has written several books on AI, some of the most important factors of intelligence are:
• Generalization learning: learning that enables the learner to perform better in situations not previously encountered.
• Reasoning: to reason is to draw conclusions appropriate to the situation at hand.
• Problem solving: given such-and-such data, find X.
• Perception: analyzing an environment and the features of, and relationships between, its objects; self-driving cars are an example.
• Language understanding: understanding language by
following syntax and other rules similar to a human.
6. Types of Artificial Intelligence
• Strong AI
It is simulating the human brain by building systems that
think and, in the process, give us an insight into how the
brain works.
(Ultron from Avengers is an ideal example of a strong AI because it is self-aware and eventually even develops emotions; this makes the AI's responses unpredictable.)
• Weak AI (narrow AI)
It focuses solely on one task, and it is a system that
behaves like a human but doesn't give us an insight into
how the brain works
(For example, AlphaGo is a maestro of the game Go, but you can't expect it to be even remotely good at anything else; this makes AlphaGo a weak AI.)
7. Examples of Artificial Intelligence
Machine learning
Knowledge management
Computer vision
Natural language processing
Robotics
8. How is artificial intelligence different from machine learning and deep learning
• Machine learning is a technique to achieve AI.
• Deep learning in turn is a subset of machine learning. Machine learning provides a machine with the capability to learn from data and experience through algorithms; deep learning does this learning through ways inspired by the human brain.
• This means through deep learning data and patterns can be better perceived.
• Neural networks form the base of deep learning, a subfield of machine learning where the algorithms are inspired by the structure of the human brain.
• Neural networks take in data, train themselves to recognize the patterns in this data, and then predict the outputs for a new set of similar data.
9. Artificial neural networks
• Artificial neural networks, usually simply called neural networks,
are computing systems vaguely inspired by the biological neural
networks that constitute animal brains. An ANN is based on a
collection of connected units or nodes called artificial neurons,
which loosely model the neurons in a biological brain.
• A neural network is a series of algorithms that endeavors to
recognize underlying relationships in a set of data through a
process that mimics the way the human brain operates. In this
sense, neural networks refer to systems of neurons, either organic
or artificial in nature.
• Today, neural networks are used for solving many business
problems such as sales forecasting, customer research, data
validation, and risk management. We apply neural networks for
time-series predictions, anomaly detection in data, and natural
language understanding.
10. Artificial Neural Network and How It Works
Let us construct a neural
network that differentiates
between a square, a circle, and
a triangle. Neural networks are
made up of layers of neurons.
• These neurons are the core
processing units of the
network.
• First, we have the input layer
which receives the input.
• The output layer predicts our
final output.
• In between exist the hidden
layers which perform most of
the computations required by
our network .
11. Consider an image of a circle composed of 28 by 28 pixels, which makes 784 pixels in total.
• Each pixel is fed as input to each neuron of the first layer.
• Neurons of one layer are connected to neurons of the next layer through channels.
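As a minimal sketch of this step (assuming NumPy and a dummy image in place of a real photo), feeding a 28-by-28 image into the first layer amounts to flattening it into 784 input values:

```python
import numpy as np

# A dummy 28-by-28 grayscale image with values in [0, 1];
# a real one would come from a dataset such as MNIST.
image = np.random.rand(28, 28)

# Flatten the 2-D pixel grid into a single vector of 784 values,
# one input per neuron in the first layer.
inputs = image.flatten()
print(inputs.shape)  # (784,)
```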
12. What is Weight ?
• Each of these channels is assigned a numerical value known as
weight
• The inputs are multiplied by the corresponding weights, and their sum is sent as input to the neurons in the hidden layer.
• The weight shows the effectiveness of a particular input: the greater the weight of an input, the more impact it has on the network.
• The weight within an ANN is the parameter that transforms input data in the hidden layers of the network.
• We can say the artificial neural network is a series of nodes/neurons, where each node has a set of inputs, weights, and a bias value.
• When an input passes into a node, it gets multiplied by a weight value, and the result is either observed as output or passed on to the next layer in the neural network. Usually the weights of a neural network are contained within the hidden layers of the network.
S = Σᵢ₌₁ⁿ Xᵢ ∗ Wᵢ
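This weighted sum can be illustrated in one line of Python; the input and weight values below are made up:

```python
# Weighted sum S = sum over i of X_i * W_i for a single neuron.
X = [0.5, 0.3, 0.2]   # inputs (illustrative values)
W = [0.4, 0.7, 0.2]   # weights on the corresponding channels
S = sum(x * w for x, w in zip(X, W))  # ≈ 0.45
```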
13. What is Bias?
• Each of these neurons is associated with a
numerical value called the bias which is then
added to the input sum.
• Bias is like the intercept added in a linear
equation. It is an additional parameter in the
Neural Network which is used to adjust the
output along with the weighted sum of the
inputs to the neuron. Therefore, Bias is a
constant which helps the model in a way that
it can fit best for the given data.
output = sum (weights * inputs) + bias
S = (Σᵢ₌₁ⁿ Xᵢ ∗ Wᵢ) + B
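The same sketch extended with a bias term, matching output = sum(weights * inputs) + bias; the numbers are again made up:

```python
# output = sum(weights * inputs) + bias, for one neuron.
X = [0.5, 0.3, 0.2]   # inputs (illustrative values)
W = [0.4, 0.7, 0.2]   # weights
b = 0.1               # the bias shifts the weighted sum, like an intercept
S = sum(x * w for x, w in zip(X, W)) + b  # ≈ 0.55
```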
14. The activation function
• The input sum value is then passed through a
threshold function called the activation
function.
• The result of the activation function
determines if the particular neuron will get
activated or not.
• The activation function is generally non-linear.
Linear functions are limited because the
output is simply proportional to the input.
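Two common non-linear activation functions, sketched in plain Python:

```python
import math

def sigmoid(s):
    # Squashes any real-valued input sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-s))

def relu(s):
    # Passes positive values through unchanged, zeroes out the rest.
    return max(0.0, s)

print(sigmoid(0.0))  # 0.5: an input sum of 0 sits on the threshold
print(relu(-2.0))    # 0.0: a negative sum does not activate the neuron
```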
15. Forward propagation
• An activated neuron transmits data to the neurons of the next layer over the channels; in this manner the data is propagated through the network. This is called forward propagation.
• Forward propagation means we are moving in
only one direction, from input to the output,
in a neural network.
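Forward propagation can be sketched as repeatedly applying weights, bias, and activation, layer by layer. The layer sizes and random weights below are purely illustrative (assuming NumPy):

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

def forward(x, layers):
    # Propagate the input through each (weights, bias) pair in turn:
    # weighted sum, plus bias, through the activation function.
    for W, b in layers:
        x = sigmoid(W @ x + b)
    return x

rng = np.random.default_rng(0)
# 4 inputs -> hidden layer of 3 neurons -> 2 output neurons.
layers = [(rng.normal(size=(3, 4)), np.zeros(3)),
          (rng.normal(size=(2, 3)), np.zeros(2))]
out = forward(rng.normal(size=4), layers)
print(out.shape)  # (2,): one activation per output neuron
```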
16. The output layer
• In the output layer, the neuron with the highest value fires and determines the output; the values are basically probabilities.
• For example, here the neuron associated with the square has the highest probability, hence that's the output.
17. Wrong prediction
• Of course, just by looking at it we know our neural network has made a wrong prediction.
• But how does the network figure this out, given that our network is yet to be trained?
• During the training process, along with the input, our network also has the expected output fed to it.
• The predicted output is compared against the
actual output to realize the error in the
prediction.
18. Backpropagation
• The magnitude of the error indicates how wrong we are, and the sign suggests whether our predicted values are higher or lower than expected. The arrows here give an indication of the direction and magnitude of the change needed to reduce the error.
• This information is then transferred backwards through our network; this is known as backpropagation.
• Now, based on this information, the weights are adjusted. This cycle of forward propagation and backpropagation is iteratively performed with multiple inputs.
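One cycle of forward propagation and backpropagation can be sketched for a single sigmoid neuron with a squared-error loss; all the starting values below are made up:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# One neuron, one input: illustrative starting values.
x, target = 1.5, 0.0
w, b, lr = 0.8, 0.1, 0.5   # weight, bias, learning rate

# Forward pass.
y = sigmoid(w * x + b)
error = 0.5 * (y - target) ** 2

# Backward pass: the chain rule gives the gradient of the error
# with respect to the weight and the bias; its sign and magnitude
# tell us the direction and size of the adjustment.
dy = (y - target) * y * (1.0 - y)   # dError / d(weighted sum)
w -= lr * dy * x
b -= lr * dy

# The adjusted weights reduce the error on the same input.
y_new = sigmoid(w * x + b)
```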
19. Correct output
• A set of examples for training the network is assembled. Each case consists of a
problem statement (which represents the input into the network) and the
corresponding solution (which represents the desired output from the network).
• The input data is entered into the network via the input layer.
• Each neuron in the network processes the input data, with the resultant values steadily passing through the network, layer by layer, until a result is generated by the output layer.
• The actual output of the network is compared with the expected output for that particular input; the difference gives an error value.
• The connection weights in the network are then steadily adjusted, working backwards from the output layer, over the hidden layers, to the input layer, until the correct output is produced. In this way, fine-tuning the weights has the effect of teaching the network to produce the correct output for a particular input (i.e. the network learns).
• This process continues until the weights are assigned such that the network can predict the shapes correctly in most cases; this brings our training process to an end.
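The whole training procedure above can be sketched as a loop over a toy set of training cases; here a single sigmoid neuron learns the logical OR function, a stand-in for the shape examples rather than the original network:

```python
import math
import random

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# Toy training set: each case pairs an input with its desired output.
cases = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(2)]  # random initial weights
b = 0.0
lr = 1.0

for epoch in range(2000):
    for x, target in cases:
        # Forward pass: the input percolates through to an output.
        y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        # Compare actual vs expected output, then adjust the
        # weights backwards in proportion to the error gradient.
        dy = (y - target) * y * (1 - y)
        w = [wi - lr * dy * xi for wi, xi in zip(w, x)]
        b -= lr * dy

# After training, the neuron predicts every case correctly.
```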
20. Facial recognition
• One of the prime applications of neural networks is facial recognition: cameras on smartphones these days can estimate the age of a person based on their facial features.
• This is neural networks at play, first differentiating the face from the background and then correlating the lines and spots on your face to a possible age.
21. Forecasting
• Neural networks are trained to understand patterns and can detect the possibility of rainfall, or a rise in stock prices, with high accuracy.
22. Music Composition
• Neural networks can even learn patterns in music and train themselves enough to compose a fresh tune.