Neural Network: A Brief Overview


               Presented by
               Ashraful Alam
               02/02/2004
Outline

   Introduction
   Background
   How the human brain works
   A Neuron Model
   A Simple Neuron
   Pattern Recognition example
   A Complicated Perceptron
Outline Continued

   Different types of Neural Networks
   Network Layers and Structure
   Training a Neural Network
   Learning process
   Neural Networks in use
   Use of Neural Networks in C.A.I.S. project
   Conclusion
Introduction

   What are Neural Networks?
    –   Neural networks are a new method of programming
        computers.
    –   They are exceptionally good at performing pattern
        recognition and other tasks that are very difficult to program
        using conventional techniques.
    –   Programs that employ neural nets are also capable of
        learning on their own and adapting to changing conditions.
Background
   An Artificial Neural Network (ANN) is an information processing
    paradigm that is inspired by the biological nervous systems,
    such as the human brain’s information processing mechanism.
   The key element of this paradigm is the novel structure of the
    information processing system. It is composed of a large
    number of highly interconnected processing elements
    (neurons) working in unison to solve specific problems. NNs,
    like people, learn by example.
   An NN is configured for a specific application, such as pattern
    recognition or data classification, through a learning process.
    Learning in biological systems involves adjustments to the
    synaptic connections that exist between the neurons. This is
    true of NNs as well.
How the Human Brain learns




   In the human brain, a typical neuron collects signals from others
    through a host of fine structures called dendrites.
   The neuron sends out spikes of electrical activity through a long, thin
     strand known as an axon, which splits into thousands of branches.
   At the end of each branch, a structure called a synapse converts the
    activity from the axon into electrical effects that inhibit or excite
    activity in the connected neurons.
A Neuron Model
   When a neuron receives excitatory input that is sufficiently large
    compared with its inhibitory input, it sends a spike of electrical activity
    down its axon. Learning occurs by changing the effectiveness of the
    synapses so that the influence of one neuron on another changes.




   We construct these neural networks by first trying to deduce the essential
    features of neurons and their interconnections.
   We then typically program a computer to simulate these features.
A Simple Neuron




   An artificial neuron is a device with many inputs and
    one output.
   The neuron has two modes of operation:
    the training mode and
    the using mode.
A Simple Neuron (Cont.)
   In the training mode, the neuron can be trained to fire (or not),
    for particular input patterns.
   In the using mode, when a taught input pattern is detected at
    the input, its associated output becomes the current output. If
    the input pattern does not belong in the taught list of input
    patterns, the firing rule is used to determine whether to fire or
    not.
   The firing rule is an important concept in neural networks and
    accounts for their high flexibility. A firing rule determines how
    one calculates whether a neuron should fire for any input
     pattern. It relates to all the input patterns, not only the ones on
     which the node was previously trained.
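
As a rough illustration of such a firing rule, the sketch below implements a Hamming-distance rule for a single neuron: the neuron fires for an unseen pattern if that pattern is closer to the taught "fire" patterns than to the taught "don't fire" patterns. The patterns and values are made up for the example.

```python
# Hamming-distance firing rule for a single artificial neuron (illustrative sketch).
def hamming(a, b):
    """Number of positions where two binary patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def should_fire(pattern, fire_patterns, no_fire_patterns):
    d_fire = min(hamming(pattern, p) for p in fire_patterns)
    d_no_fire = min(hamming(pattern, p) for p in no_fire_patterns)
    if d_fire < d_no_fire:
        return 1        # closer to a taught firing pattern
    if d_fire > d_no_fire:
        return 0        # closer to a taught non-firing pattern
    return None         # tie: the output is undefined (often chosen at random)

# Hypothetical taught patterns
fire = [(1, 1, 1, 1), (1, 1, 1, 0)]
no_fire = [(0, 0, 0, 0), (0, 0, 0, 1)]

print(should_fire((1, 1, 0, 0), fire, no_fire))   # 1: closer to the firing patterns
```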
Pattern Recognition
   An important application of neural networks is pattern
    recognition. Pattern recognition can be implemented by using a
    feed-forward neural network that has been trained accordingly.
   During training, the network is trained to associate outputs with
    input patterns.
   When the network is used, it identifies the input pattern and
    tries to output the associated output pattern.
   The power of neural networks comes to life when a pattern that
    has no output associated with it, is given as an input.
   In this case, the network gives the output that corresponds to a
    taught input pattern that is least different from the given pattern.
Pattern Recognition (cont.)




   Suppose a network is trained to recognize
    the patterns T and H. The associated
    patterns are all black and all white
    respectively as shown above.
Pattern Recognition (cont.)




 Since the input pattern looks more like a ‘T’, when
 the network classifies it, it sees the input closely
 resembling ‘T’ and outputs the pattern that
 represents a ‘T’.
Pattern Recognition (cont.)




 The input pattern here closely resembles ‘H’
 with a slight difference. The network in this
 case classifies it as an ‘H’ and outputs the
 pattern representing an ‘H’.
Pattern Recognition (cont.)



   Here the top row is 2 errors away from a ‘T’ and 3 errors away
     from an ‘H’, so the top output is black.
   The middle row is 1 error away from both T and H, so the
    output is random.
   The bottom row is 1 error away from T and 2 away from H.
    Therefore the output is black.
   Since the input resembles a ‘T’ more than an ‘H’ the output of
    the network is in favor of a ‘T’.
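
The row-by-row reasoning above can be written out directly. The sketch below assumes small 3x3 T and H templates (the original slide figures are not reproduced here), counts each input row's errors against both templates, and colours the corresponding output row accordingly.

```python
# Row-by-row comparison of an input against assumed 'T' and 'H' templates (1 = black, 0 = white).
import random

T = [[1, 1, 1],
     [0, 1, 0],
     [0, 1, 0]]   # assumed 3x3 'T' template

H = [[1, 0, 1],
     [1, 1, 1],
     [1, 0, 1]]   # assumed 3x3 'H' template

def errors(row, template_row):
    return sum(a != b for a, b in zip(row, template_row))

def classify_rows(pattern):
    output = []
    for row, t_row, h_row in zip(pattern, T, H):
        d_t, d_h = errors(row, t_row), errors(row, h_row)
        if d_t < d_h:
            output.append("black")   # the row resembles 'T' more
        elif d_h < d_t:
            output.append("white")   # the row resembles 'H' more
        else:
            output.append(random.choice(["black", "white"]))   # tie: output is random
    return output

noisy_input = [[1, 1, 0],
               [0, 1, 1],
               [0, 1, 0]]
print(classify_rows(noisy_input))
```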
A Complicated Perceptron




   A more sophisticated neuron is known as the McCulloch and
    Pitts model (MCP).
   The difference is that in the MCP model, the inputs are
     weighted, and the effect that each input has on decision making
     depends on the weight of that particular input.
   The weight of the input is a number which is multiplied with the
    input to give the weighted input.
A Complicated Perceptron (cont.)

   The weighted inputs are then added together and if
    they exceed a pre-set threshold value, the
    perceptron / neuron fires.
   Otherwise it will not fire and the inputs tied to that
    perceptron will not have any effect on the decision
    making.
   In mathematical terms, the neuron fires if and only if;
              X1W1 + X2W2 + X3W3 + ... > T
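
A minimal sketch of this firing condition; the example inputs, weights, and threshold are made up.

```python
# McCulloch-Pitts style neuron: fire when the weighted sum of inputs exceeds a threshold.
def mcp_fires(inputs, weights, threshold):
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return weighted_sum > threshold          # X1*W1 + X2*W2 + X3*W3 + ... > T

X = [1, 0, 1]            # hypothetical inputs
W = [0.5, -0.3, 0.8]     # hypothetical weights
T = 1.0                  # hypothetical threshold

print(mcp_fires(X, W, T))   # 0.5 + 0.8 = 1.3 > 1.0, so the neuron fires (True)
```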
A Complicated Perceptron

   The MCP neuron has the ability to adapt to a
    particular situation by changing its weights and/or
    threshold.
   Various algorithms exist that cause the neuron to
     'adapt'; the most commonly used are the Delta rule and
     error back-propagation.
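
As a hedged illustration of the Delta rule, the sketch below applies the Widrow-Hoff update w_i <- w_i + eta * (target - output) * x_i to a single linear neuron; the learning rate, data, and "true" weights are assumptions for the example.

```python
# Delta rule (Widrow-Hoff): nudge each weight in proportion to the error and its input.
import numpy as np

rng = np.random.default_rng(0)
w = rng.uniform(-0.5, 0.5, size=2)       # initial weights (assumed range)
eta = 0.05                               # learning rate (assumed)

# Tiny illustrative training set: targets are a linear function of the inputs.
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0], [0.5, 0.5]])
targets = X @ np.array([0.7, -0.2])      # "true" weights the rule should recover

for epoch in range(200):
    for x, target in zip(X, targets):
        output = x @ w                    # the neuron's weighted-sum output
        w += eta * (target - output) * x  # delta-rule weight update

print(w)   # converges toward [0.7, -0.2]
```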
Different types of Neural Networks

   Feed-forward networks
    –   Feed-forward NNs allow signals to travel one way
        only; from input to output. There is no feedback
        (loops) i.e. the output of any layer does not affect
        that same layer.
    –   Feed-forward NNs tend to be straightforward
        networks that associate inputs with outputs. They
        are extensively used in pattern recognition.
    –   This type of organization is also referred to as
        bottom-up or top-down.
Continued

   Feedback networks
    –   Feedback networks can have signals traveling in both
        directions by introducing loops in the network.
    –   Feedback networks are dynamic; their 'state' is changing
        continuously until they reach an equilibrium point.
    –   They remain at the equilibrium point until the input changes
        and a new equilibrium needs to be found.
    –   Feedback architectures are also referred to as interactive or
        recurrent, although the latter term is often used to denote
        feedback connections in single-layer organizations.
Diagram of an NN




         Fig: A simple Neural Network
Network Layers

   Input Layer - The activity of the input units
    represents the raw information that is fed into the
    network.
   Hidden Layer - The activity of each hidden unit is
    determined by the activities of the input units and the
    weights on the connections between the input and
    the hidden units.
   Output Layer - The behavior of the output units
    depends on the activity of the hidden units and the
    weights between the hidden and output units.
Continued

   This simple type of network is interesting because
    the hidden units are free to construct their own
    representations of the input.
   The weights between the input and hidden units
    determine when each hidden unit is active, and so by
    modifying these weights, a hidden unit can choose
    what it represents.
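
A minimal sketch of how activity flows through such a layered network. The layer sizes, random weights, and sigmoid activation are assumptions for illustration.

```python
# Forward pass through a small input -> hidden -> output network.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
n_in, n_hidden, n_out = 4, 3, 2                   # assumed layer sizes

W_ih = rng.uniform(-0.5, 0.5, (n_hidden, n_in))   # input -> hidden weights
W_ho = rng.uniform(-0.5, 0.5, (n_out, n_hidden))  # hidden -> output weights

x = np.array([1.0, 0.0, 1.0, 1.0])                # raw information fed into the input layer
hidden = sigmoid(W_ih @ x)                        # hidden activity: inputs and input->hidden weights
output = sigmoid(W_ho @ hidden)                   # output activity: hidden units and hidden->output weights

print(hidden, output)
```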
Network Structure

   The number of layers and of neurons depend on the
    specific task. In practice this issue is solved by trial
    and error.
   Two types of adaptive algorithms can be used:
    –   start from a large network and successively remove some
        neurons and links until network performance degrades.
    –   begin with a small network and introduce new neurons until
        performance is satisfactory.
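
A rough sketch of the second strategy (grow the network until performance is satisfactory), using scikit-learn for brevity; the dataset, the hidden-layer sizes tried, and the "satisfactory" accuracy threshold are all assumptions.

```python
# Grow the hidden layer until test accuracy reaches an assumed target.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

target_accuracy = 0.95                       # assumed "satisfactory" level
for n_hidden in (1, 2, 4, 8, 16):
    net = MLPClassifier(hidden_layer_sizes=(n_hidden,), max_iter=5000, random_state=0)
    net.fit(X_train, y_train)
    score = net.score(X_test, y_test)
    print(f"{n_hidden} hidden neurons: test accuracy {score:.2f}")
    if score >= target_accuracy:
        break                                # smallest satisfactory network found
```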
Network Parameters

   How are the weights initialized?
   How many hidden layers and how many
    neurons?
   How many examples in the training set?
Weights

   In general, initial weights are randomly
    chosen, with typical values between -1.0 and
    1.0 or -0.5 and 0.5.
   There are two types of NNs:
     –   Fixed Networks – where the weights are fixed
     –   Adaptive Networks – where the weights are
         changed to reduce prediction error.
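
A tiny sketch of this initialization, using the ranges quoted above (the array sizes are arbitrary):

```python
# Random weight initialization in the typical ranges mentioned on the slide.
import numpy as np

rng = np.random.default_rng(42)
w_small = rng.uniform(-0.5, 0.5, size=10)   # weights drawn from [-0.5, 0.5]
w_large = rng.uniform(-1.0, 1.0, size=10)   # weights drawn from [-1.0, 1.0]
print(w_small)
print(w_large)
```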
Size of Training Data

   Rule of thumb:
    –   the number of training examples should be at least five to
        ten times the number of weights of the network.




   Other rule:
             N > |W| / (1 - a)
     where |W| = number of weights and a = expected accuracy on the test set.
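
A quick worked check of both rules, using made-up numbers for the network size and expected accuracy:

```python
# Worked example of the training-set-size rules of thumb.
n_weights = 120                        # |W|: assumed number of weights in the network
a = 0.9                                # assumed expected accuracy on the test set

rule_of_thumb = (5 * n_weights, 10 * n_weights)   # 5-10 times the number of weights
other_rule = n_weights / (1 - a)                  # N > |W| / (1 - a)

print(rule_of_thumb)   # (600, 1200)
print(other_rule)      # 1200.0, so N should exceed 1200 examples
```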
Training Basics

   The most basic method of training a neural
    network is trial and error.
   If the network isn't behaving the way it
    should, change the weighting of a random
    link by a random amount. If the accuracy of
    the network declines, undo the change and
    make a different one.
   It takes time, but the trial and error method
    does produce results.
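
A minimal sketch of this trial-and-error procedure (essentially random hill-climbing on the weights); the tiny task, step size, and number of trials are assumptions.

```python
# Trial-and-error training: change one randomly chosen weight by a random amount,
# keep the change if the error does not get worse, otherwise undo it.
import numpy as np

rng = np.random.default_rng(0)

# Tiny assumed task: a single linear neuron should map these inputs to these targets.
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
targets = np.array([0.4, 0.6, 1.0])

def error(w):
    return float(np.mean((X @ w - targets) ** 2))   # mean squared error

w = rng.uniform(-0.5, 0.5, size=2)
best = error(w)
for _ in range(5000):
    i = rng.integers(len(w))            # pick a random link
    old = w[i]
    w[i] += rng.normal(scale=0.05)      # change its weight by a random amount
    new = error(w)
    if new < best:
        best = new                      # improvement: keep the change
    else:
        w[i] = old                      # got worse: undo the change and try another

print(w, best)   # w drifts toward [0.6, 0.4] and the error shrinks toward 0
```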
Training: Backprop algorithm
   The Backprop algorithm searches for weight values that
    minimize the total error of the network over the set of training
    examples (training set).
   Backprop consists of the repeated application of the following
    two passes:
     – Forward pass: in this step the network is activated on one
       example and the error of (each neuron of) the output layer
       is computed.
     – Backward pass: in this step the network error is used for
       updating the weights. Starting at the output layer, the error
       is propagated backwards through the network, layer by
       layer. This is done by recursively computing the local
       gradient of each neuron.
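
A compact sketch of these two passes for a one-hidden-layer network with sigmoid units. The XOR task, layer sizes, learning rate, and number of epochs are assumptions for illustration, not details from the slides.

```python
# Minimal backpropagation: the forward pass computes activities and the output error,
# the backward pass propagates local gradients back and updates the weights layer by layer.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Assumed toy task: learn XOR with 2 inputs, 4 hidden units, 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
targets = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.uniform(-0.5, 0.5, (4, 2)); b1 = np.zeros(4)
W2 = rng.uniform(-0.5, 0.5, (1, 4)); b2 = np.zeros(1)
eta = 0.5                                            # learning rate (assumed)

for epoch in range(10000):
    for x, t in zip(X, targets):
        # Forward pass: activate the network on one example.
        h = sigmoid(W1 @ x + b1)                     # hidden activities
        y = sigmoid(W2 @ h + b2)                     # output activities
        # Backward pass: local gradients, output layer first, then hidden layer.
        delta_out = (y - t) * y * (1 - y)
        delta_hid = (W2.T @ delta_out) * h * (1 - h)
        # Gradient-descent weight updates on the squared error.
        W2 -= eta * np.outer(delta_out, h); b2 -= eta * delta_out
        W1 -= eta * np.outer(delta_hid, x); b1 -= eta * delta_hid

preds = sigmoid(W2 @ sigmoid(W1 @ X.T + b1[:, None]) + b2[:, None])
print(np.round(preds.ravel(), 2))   # should approach [0, 1, 1, 0]
```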
Back Propagation

   Back-propagation training algorithm

          Fig: The back-propagation training algorithm, with a forward step
          (network activation) followed by a backward step (error propagation)
   Backprop adjusts the weights of the NN in order to
    minimize the network total mean squared error.
The Learning Process

   The memorization of patterns and the
    subsequent response of the network can be
    categorized into two general paradigms:
    –   Associative mapping
    –   Regularity detection
Associative Mapping

   Associative mapping is a type of NN in which
    the network learns to produce a particular
    pattern on the set of input units whenever
    another particular pattern is applied on the
    set of input units.
   This allows the network to complete a pattern
    given parts of a pattern that is similar to a
    previously learned pattern.
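
One classical way to realize this kind of pattern completion is a Hebbian auto-associative memory. The sketch below is only an illustration under assumptions (bipolar patterns, a simple sign-threshold recall loop); it is not necessarily the architecture the slides had in mind.

```python
# Hebbian auto-associative memory: store patterns in a weight matrix,
# then recall a stored pattern from a corrupted version of it.
import numpy as np

patterns = np.array([
    [ 1,  1,  1,  1, -1, -1, -1, -1],   # stored pattern A (bipolar +1/-1)
    [ 1,  1, -1, -1,  1,  1, -1, -1],   # stored pattern B
])

# Hebbian storage: sum of outer products, with self-connections removed.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

def recall(x, steps=5):
    x = x.astype(float)
    for _ in range(steps):
        x = np.sign(W @ x)     # each unit takes the sign of its weighted input
        x[x == 0] = 1
    return x

probe = np.array([-1, 1, 1, 1, -1, -1, -1, -1])   # pattern A with its first unit flipped
print(recall(probe))                              # completes back to pattern A
```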
Regularity Detection

   Regularity detection is a type of NN in
     which units learn to respond to particular
     properties of the input patterns, whereas in
     associative mapping the network stores the
     relationships among patterns.
   In regularity detection the response of each
    unit has a particular 'meaning'. This means
    that the activation of each unit corresponds
    to different input attributes.
The Learning Process (cont.)

   Every neural network possesses knowledge
    which is contained in the values of the
    connection weights.
   Modifying the knowledge stored in the
    network as a function of experience implies a
    learning rule for changing the values of the
    weights.
The Learning Process (cont.)

   Recall: Adaptive networks are NNs that allow
     the change of weights in their connections.
   The learning methods can be classified in
    two categories:
    –   Supervised Learning
    –   Unsupervised Learning
Supervised Learning
   Supervised learning incorporates an external
     teacher, so that each output unit is told what its
     desired response to input signals ought to be.
   An important issue concerning supervised learning is
     the problem of error convergence, i.e. the
     minimization of error between the desired and
     computed unit values.
   The aim is to determine a set of weights which
     minimizes the error. One well-known method, which
     is common to many learning paradigms, is least
     mean square (LMS) convergence.
Supervised Learning

   In this sort of learning, the human teacher’s
    experience is used to tell the NN which outputs are
    correct and which are not.
   This does not mean that a human teacher needs to
     be present at all times; only the correct
     classifications gathered from the human teacher on a
     domain need to be present.
   The network then learns from its error, that is, it
    changes its weight to reduce its prediction error.
Unsupervised Learning

   Unsupervised learning uses no external teacher
    and is based upon only local information. It is also
    referred to as self-organization, in the sense that it
    self-organizes data presented to the network and
    detects their emergent collective properties.
   The network is then used to construct clusters of
    similar patterns.
   This is particularly useful in domains where new
     instances are checked against previous scenarios.
     For example, detecting credit card fraud.
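
As a hedged illustration of self-organization, the sketch below uses simple competitive learning to cluster synthetic 2-D points with no external teacher; the data, learning rate, and number of cluster units are made up (a real fraud-detection system would use domain-specific features).

```python
# Competitive learning: the nearest prototype "wins" each input and is nudged toward it,
# so the prototypes self-organize into clusters of similar patterns.
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic clusters of 2-D points (assumed data).
data = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.1, size=(50, 2)),
    rng.normal(loc=[1.0, 1.0], scale=0.1, size=(50, 2)),
])
rng.shuffle(data)

prototypes = data[rng.choice(len(data), size=2, replace=False)].copy()   # two cluster units
eta = 0.05                                                               # learning rate (assumed)

for _ in range(20):                          # a few passes over the data
    for x in data:
        winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
        prototypes[winner] += eta * (x - prototypes[winner])   # move the winner toward x

print(prototypes)   # the prototypes should settle near the cluster centres (0,0) and (1,1)
```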
Neural Network in Use
Since neural networks are best at identifying patterns or trends
in data, they are well suited for prediction or forecasting
needs including:
    – sales forecasting
    – industrial process control
    – customer research
    – data validation
    – risk management


   ANNs are also used in the following specific paradigms:
   recognition of speakers in communications; diagnosis of
   hepatitis; undersea mine detection; texture analysis; three-
   dimensional object recognition; hand-written word recognition;
   and facial recognition.
Neural networks in Medicine

   Artificial Neural Networks (ANN) are currently a 'hot'
    research area in medicine and it is believed that they
    will receive extensive application to biomedical
    systems in the next few years.
   At the moment, the research is mostly on modeling
    parts of the human body and recognizing diseases
    from various scans (e.g. cardiograms, CAT scans,
    ultrasonic scans, etc.).
Continued
   Neural networks are ideal in recognizing diseases
    using scans since there is no need to provide a
    specific algorithm on how to identify the disease.
   Neural networks learn by example so the details of
    how to recognize the disease are not needed.
   What is needed is a set of examples that are
    representative of all the variations of the disease.
   The quantity of examples is not as important as the
    'quality'. The examples need to be selected very
    carefully if the system is to perform reliably and
    efficiently.
Modeling and Diagnosing the
Cardiovascular System

   Neural Networks are used experimentally to model
    the human cardiovascular system.
   Diagnosis can be achieved by building a model of
    the cardiovascular system of an individual and
    comparing it with the real time physiological
    measurements taken from the patient.
   If this routine is carried out regularly, potentially
     harmful medical conditions can be detected at an
     early stage, making the process of combating
     the disease much easier.
Continued
   A model of an individual's cardiovascular system
    must mimic the relationship among physiological
    variables (i.e., heart rate, systolic and diastolic blood
    pressures, and breathing rate) at different physical
    activity levels.
   If a model is adapted to an individual, then it
    becomes a model of the physical condition of that
    individual. The simulator will have to be able to adapt
    to the features of any individual without the
    supervision of an expert. This calls for a neural
    network.
Continued
   Another reason that justifies the use of NN technology is the
     ability of NNs to provide sensor fusion, which is the combining
     of values from several different sensors.
   Sensor fusion enables the NNs to learn complex relationships
    among the individual sensor values, which would otherwise be
    lost if the values were individually analyzed.
   In medical modeling and diagnosis, this implies that even
    though each sensor in a set may be sensitive only to a specific
    physiological variable, NNs are capable of detecting complex
    medical conditions by fusing the data from the individual
    biomedical sensors.
Use of Neural Networks in C.A.I.S.
Project

   In the C.A.I.S. project, a Neural Network implementing
     supervised learning should be used.
   This would allow the system to first predict an
     output and then compare its output with the gold
     standard to calculate its error.
   By implementing a Neural Network utilizing the back-
     propagation algorithm, we can ensure that the
     network will be able to reduce its prediction error
     given the right training examples.
Continued

   However, to do this, the network needs to have
     enough information/variables to make a connection to
     the diagnosis of different variations of heart murmur.
   The signal-processed sound alone may give us an
     output, but it is likely to be error-prone.
   Hence the network will need a supply of
     contextual data, gathered from experts during training,
     to make strong and correct predictions.
Disadvantage of Neural Network

   The individual relations between the input
    variables and the output variables are not
    developed by engineering judgment so that
    the model tends to be a black box or
    input/output table without analytical basis.
   The sample size has to be large.
   Requires a lot of trial and error, so training can
     be time-consuming.
Conclusion

   A Neural Network can be successfully used
    in the C.A.I.S. project, given that we have
    access to enough training data.
   Since we need it to be highly accurate, we
    need to carefully gather contextual variables
    that closely relate to diagnosing heart
    murmur and its various grades.
