Phoneme Serialisation in Speech Production

Supervisor: Dr Andrew Olson
NOT
Serial Order in Speech
   Common Task Demands:
       System designed to recall a sequence of words
       System designed to recall a sequence of sounds or
        phonemes

   Speech
       the human brain is required to complete both tasks:
            sequencing phonemes in order to form words
           sequencing words in order to form coherent sentences

   Romani (Dec 2010)
     Parsimonious argument
           Same mechanisms underlie both operations.
Short Term Memory Solutions
   Substantial literature investigating our ability to
    recall word series from short term memory (STM)
           Henson 1998


   Relatively little is known about the mechanisms
    underlying our ability to recall and order phonemes
    in word production

   Aim:
       Use solutions to the problem of serial order within
        the phonological short term memory literature, to
        assess evidence for the presence of similar
        mechanisms in the sequencing of phonemes within
        word production.
Error Patterns
   Powerful means by which models can be assessed
             (Glasspool, 1998; Glasspool & Houghton, 2005)


        Intuitive that an accurate model should produce error
         patterns that fit experimental data


   However:
       control groups do not produce errors when performing
        familiar word repetition tasks


   Romani (2010):
       Aphasic patients do provide error patterns in such tasks
Candidate Models
   Three distinct solutions (Henson, 1998)

       Chaining

       Ordinal

       Positional




   Compare the strength of evidence, within aphasic phonological error
    data, for underlying chaining-based and positional-based mechanisms

       Information Theoretic approach (Burnham & Anderson, 1998)
Formalisation
   Constraints:

        P(X) = probability of the next phoneme (x) in the sequence being
         correct

        Covariates limited to values derived from the patient data in
         Romani et al (2010)

        Representative of the assigned theoretical perspective

        Parameters limited to a minimally sufficient set
Simple Chaining

   Chaining

       Simple & intuitive solution

       Influential in earlier literature

       Largely unsuccessful in modelling serial recall (Henson, 1998).
Simple Chaining



               P(X) = β1·x1 + β2·x2 + β3·x3

where:
         x1 = binary value representing successful production of previous
         phoneme


         x2 = binary value indicating initial phoneme


         x3 = word frequency taken from Barcelona corpus, 1998.
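
As a minimal sketch (not the study's R implementation; the function and argument names are illustrative, and the default coefficients are the generating "True" values for the artificial Basic Chaining data in Table 1), the simple-chaining predictor is a linear combination of the three covariates:

```python
def p_simple_chain(prev_correct, is_initial, word_freq,
                   b1=0.300, b2=0.000, b3=0.070):
    """P(X): probability that the next phoneme is produced correctly.

    prev_correct: 1 if the previous phoneme was produced correctly, else 0
    is_initial:   1 for the word-initial phoneme, else 0
    word_freq:    word frequency (Barcelona corpus, 1998)
    """
    return b1 * prev_correct + b2 * is_initial + b3 * word_freq
```

Note that the only sequence information is whether the immediately preceding phoneme was correct, which is what makes recovery after an error difficult for simple chaining.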
Compound Chaining
   Cue:
       Association between element and preceding elements

   Avoids recovery limitations of simple chaining

   Limited success in modelling serial recall
       (Baddeley, 1996; Henson, 1996).
Compound Chaining


             P(X) = β1 · Σ_{i=1..n} [ x1,i / (1 + x2,i)^β2 ] + β3·x3 + β4·x4

where:

n = position within word of previously produced phoneme

x1 = value representing production of correct phoneme at position i

x2 = distance of phoneme i from current position

x3 = binary value indicating initial phoneme

x4 = word frequency taken from Barcelona corpus, 1998.
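
A sketch of the compound-chaining predictor under the same assumptions (illustrative names; defaults are the generating "True" coefficients for the artificial Compound Chaining data set): every previously produced phoneme contributes a cue whose strength decays with distance, governed by the exponent β2.

```python
def p_compound_chain(correct_at, dist, is_initial, word_freq,
                     b1=0.300, b2=2.000, b3=0.000, b4=0.070):
    """P(X) under compound chaining.

    correct_at[i]: 1 if the phoneme at position i was produced correctly
    dist[i]:       distance of position i from the current position
    """
    # sum cues from all preceding positions, weighted by decaying distance
    cue = sum(x1 / (1 + x2) ** b2 for x1, x2 in zip(correct_at, dist))
    return b1 * cue + b3 * is_initial + b4 * word_freq
```

Because the cue sums over all preceding positions, a single error does not sever the chain, which is how compound chaining avoids the recovery limitation of simple chaining.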
Positional
   Capture key characteristics of serial recall

   Positional mechanism + competitive queuing framework
       Common to group of models with success in modelling
        experimental data across multiple domains
           (Glasspool et al, 2005).
Positional



 P(X) = β1 · x1 / (1 + x2)^β2 + β3 / (1 + x3)^β4 + β5 / (1 + x4)^β6 + β7·x5


where:
x1 = phoneme of same type previously displayed within response
x2 = distance from current position to previous occurrences of phoneme in
response
x3 = distance from first phoneme
x4 = distance from final phoneme
x5 = word frequency taken from Barcelona corpus, 1998.
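
A sketch of the positional predictor (illustrative names; defaults are the generating "True" coefficients for the artificial Positional data set). Here the cues are anchored to the word edges rather than to preceding phonemes:

```python
def p_positional(repeat, d_repeat, d_first, d_final, word_freq,
                 b1=-0.175, b2=2.000, b3=0.270, b4=2.000,
                 b5=0.270, b6=2.000, b7=0.070):
    """P(X) under the positional model.

    repeat:   1 if a phoneme of the same type already occurred in the response
    d_repeat: distance back to that previous occurrence
    d_first:  distance from the first phoneme (start anchor)
    d_final:  distance from the final phoneme (end anchor)
    """
    return (b1 * repeat / (1 + d_repeat) ** b2   # repetition term (b1 < 0)
            + b3 / (1 + d_first) ** b4           # start-anchored positional cue
            + b5 / (1 + d_final) ** b6           # end-anchored positional cue
            + b7 * word_freq)
```

The two edge-anchored terms give most support near the word boundaries, the kind of serial-position profile positional and competitive-queuing models are designed to capture.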
Fitting
   Exponent relationships

       required to capture defining characteristics of theoretical
        perspectives

   Log Likelihood function Maximised

        optimisation algorithm:
            a variant of simulated annealing (Belisle, 1992)


        Likelihood: comparison of model output with observed response

            Model output = probability of the next phoneme in sequence being
             correct

            Response = 1 if the phoneme in sequence was correctly produced,
             0 otherwise
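
A toy version of the fitting step, assuming the per-phoneme responses enter a Bernoulli log-likelihood (the study maximised the log-likelihood with a simulated-annealing variant, Belisle 1992, as in R's optim; the annealer below is only an illustrative stand-in):

```python
import math
import random

def log_likelihood(params, data, model):
    """Bernoulli log-likelihood over (covariates, response) pairs, where
    response is 1 if the phoneme was produced correctly, else 0, and
    model(params, covariates) returns P(next phoneme correct)."""
    ll = 0.0
    for covariates, response in data:
        p = min(max(model(params, covariates), 1e-9), 1 - 1e-9)  # clamp into (0, 1)
        ll += math.log(p) if response == 1 else math.log(1 - p)
    return ll

def anneal(start, data, model, steps=20000, temp=1.0, cooling=0.999, seed=0):
    """Illustrative simulated-annealing maximiser of the log-likelihood."""
    rng = random.Random(seed)
    best, cur = list(start), list(start)
    best_ll = cur_ll = log_likelihood(cur, data, model)
    for _ in range(steps):
        cand = [c + rng.gauss(0, 0.05) for c in cur]  # perturb parameters
        cand_ll = log_likelihood(cand, data, model)
        # accept uphill moves always, downhill moves with Boltzmann probability
        if cand_ll > cur_ll or rng.random() < math.exp((cand_ll - cur_ll) / temp):
            cur, cur_ll = cand, cand_ll
            if cur_ll > best_ll:
                best, best_ll = list(cur), cur_ll
        temp *= cooling  # cool the temperature
    return best, best_ll
```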
Patient Data
   Romani et al (2010)

       All patients were Italian speaking

       Selected for study due to phonological errors made in speech
        production.

           articulatory planning difficulties (apraxia)

            difficulty retrieving phonological representations (phonological)


       Word repetition task.

           Response:

                successfully produced (1) = matches target

               unsuccessfully produced (0).
Artificial Data
   Models assessed on their ability to identify the underlying
    mechanism used to produce artificially generated data.

   Three artificial data sets produced using a Monte Carlo method

       Simple Chaining, Compound Chaining & Positional



   Akaike’s Information Criterion:
       selects the ‘best’ model irrespective of the quality of the models
        within the candidate set


   Can models produce distinct data sets?

   Can Information Theoretic approach correctly identify underlying
    mechanism?
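
The Monte Carlo generation can be sketched as below (the p_model interface is hypothetical; any of the three fitted predictors could be plugged in to produce its data set):

```python
import random

def simulate_response(word_len, p_model, rng=None):
    """Generate one artificial response for a word of word_len phonemes:
    each position is scored correct (1) with probability P(X) from the model.
    p_model(history, pos) is a hypothetical interface returning P(X) given
    the outcomes produced so far."""
    rng = rng or random.Random(0)
    history = []
    for pos in range(word_len):
        p = min(max(p_model(history, pos), 0.0), 1.0)  # keep a valid probability
        history.append(1 if rng.random() < p else 0)
    return history
```

Repeating this over the stimulus set, once per candidate model, yields the three artificial data sets.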
Selection
   Information Theoretic approach

       Burnham & Anderson (1998)

       Allows models to be ranked according to AIC

           An estimate of the fitted model’s expected distance
            from the unknown true mechanism that underlies the
            observed data

           Effective measure when models are not hierarchically
            related
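
The criterion itself is AIC = 2K − 2 ln(L), with lower values preferred. A quick check against the fits to the artificial Basic Chaining data (log-likelihoods and K copied from Table 2):

```python
def aic(log_lik, k):
    """Akaike's Information Criterion: 2K - 2 ln(L); lower is better."""
    return 2 * k - 2 * log_lik

# (log-likelihood, K) for models fit to the artificial Basic Chaining data
fits = {
    "Basic Chaining":    (-3031.740, 3),
    "Compound Chaining": (-3029.349, 4),
    "Positional":        (-3267.722, 7),
}
ranking = sorted(fits, key=lambda m: aic(*fits[m]))  # best model first
```

On these numbers Compound Chaining (AIC 6066.70) narrowly outranks Basic Chaining (6069.48), illustrating that AIC orders whatever is in the candidate set, regardless of absolute quality.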
Results

 Results of fitting models to patient data are not currently available
Results
             Table 1: Regression coefficients for models fit to artificial data sets

                                                                  Coefficients
Data Set              Model                  β1       β2       β3        β4       β5      β6      β7
Basic Chaining        True*                0.300    0.000    0.070
                      Basic Chaining       0.305   -0.095    0.069
                      Compound Chaining    0.541    5.609   -0.068     0.072
                      Positional          -0.109    4.288   -0.065     2.733     0.024   2.600   0.097
Compound Chaining     True*                0.300    2.000    0.000     0.070
                      Basic Chaining       0.039   -2.127    0.089
                      Compound Chaining    0.205    1.881    0.042     0.078
                      Positional          -0.026    5.024   -0.090     0.621     0.057   4.038   0.091
Positional            True*               -0.175    2.000    0.270     2.000     0.270   2.000   0.070
                      Basic Chaining       0.045    4.469    0.080
                      Compound Chaining   -0.070    4.463   -1.103     0.090
                      Positional          -0.018    2.018    0.260     2.165     0.263   4.103   0.071



       * ‘True’ indicates the coefficients used in the creation of the
       artificial data sets; due to the randomising procedures used in the
       data creation process, they do not represent the true values for the
       artificially created data sets.
Results
  Table 2: Likelihood and AIC values for models fit to artificial data sets

                 Data Set            Model        Likelihood   K     AIC
          Basic Chaining      True*               -3025.597    3   6057.19
                              Basic Chaining      -3031.740    3   6069.48
                              Compound Chaining   -3029.349    4    6066.7
                              Positional (SEM)    -3267.722    7   6549.44
          Compound Chaining   True*               -3389.640    4   6787.28
                              Basic Chaining      -3447.349    3    6900.7
                              Compound Chaining   -3396.606    4   6801.21
                              Positional (SEM)    -3481.528    7   6977.06
          Positional          True*               -3691.895    7   7397.79
                              Basic Chaining      -3849.396    3   7704.79
                              Compound Chaining   -3896.782    4   7801.56
                              Positional (SEM)    -3701.070    7   7416.14




    * ‘True’ indicates the coefficients used in the creation of the
    artificial data sets; due to the randomising procedures used in the
    data creation process, they do not represent the true values for the
    artificially created data sets.
Results
[Figure: Accuracy of phoneme production in relative word locations for
models fit to artificial Basic Chaining data. x-axis: location of phoneme
as a percentage of word length (0-20, 21-40, 41-60, 61-80, 81-100);
y-axis: probability of producing correct phoneme. Series: Artificial Data,
Basic Chain, Comp Chain, Positional.]
Results
[Figure: Accuracy of phoneme production in relative word locations for
models fit to artificial Compound Chaining data. x-axis: location of
phoneme as a percentage of word length (0-20, 21-40, 41-60, 61-80, 81-100);
y-axis: probability of producing correct phoneme. Series: Artificial Data,
Basic Chain, Comp Chain, Positional.]
Results
[Figure: Accuracy of phoneme production in relative word locations for
models fit to artificial Positional data. x-axis: location of phoneme
as a percentage of word length (0-20, 21-40, 41-60, 61-80, 81-100);
y-axis: probability of producing correct phoneme. Series: M.C. Data,
Basic Chain, Comp Chain, Positional.]
Discussion
 Fitting models to artificial data:

     Models based on positional and chaining
      processes are able to produce distinct data
      sets

     Model fitting and selection process applied
      provides valid insight into the mechanisms that
      underlie data production
Discussion
 ‘Best’   model selected:

      Do models capture the theoretical perspective?

     Applying similar models and procedures to
      analyse STM data may provide an insightful
      means of assessing the quality of models
      defined in this study.
Discussion
   Improvements to fitting procedure:

       Re-writing R scripts to exploit the most efficient functions and
        data structures within the language


       Examining the true function to inform selection of initial
        parameters


       Condense patient data, e.g. calculate probabilities for
        accuracy at each position within a word for each word length


       Simplify models
Discussion
 Extend analysis to the entire patient data set

    Inform our understanding of variation in the
     mechanisms underlying patient error both
     within and between patient categories.
Skills Developed
   Competence in formalisation

   Fitting and evaluation of mathematical models

   Proficiency in the use of statistics program R to
    construct and analyse model performance

   Ability to interpret and evaluate model performance
    in the context of behavioural data and assess wider
    theoretical implications

Más contenido relacionado

Destacado

Production and Comprehension Process of Language
Production and Comprehension Process of LanguageProduction and Comprehension Process of Language
Production and Comprehension Process of LanguageRiska Daenangsari
 
How children learn language
How children learn languageHow children learn language
How children learn languageArash Yazdani
 
Intro to phonology lectr 2
Intro to phonology lectr 2Intro to phonology lectr 2
Intro to phonology lectr 2Hina Honey
 
Phoneme and feature theory
Phoneme and feature theoryPhoneme and feature theory
Phoneme and feature theoryHina Honey
 
Language production
Language productionLanguage production
Language productionWinda Widia
 
Introduction to linguistic (6)
Introduction to linguistic (6)Introduction to linguistic (6)
Introduction to linguistic (6)Florizqul Shodiq
 
LANGUAGE PRODUCTION IN PSYCOLINGUISTIC
LANGUAGE PRODUCTION IN PSYCOLINGUISTICLANGUAGE PRODUCTION IN PSYCOLINGUISTIC
LANGUAGE PRODUCTION IN PSYCOLINGUISTICAnisa Asharie
 
Allophone and phoneme. persentation
Allophone and phoneme. persentationAllophone and phoneme. persentation
Allophone and phoneme. persentationDessy Restu Restu
 
The production of speech
The production of speechThe production of speech
The production of speechHiroshi Sakae
 
Production of Speech
Production of SpeechProduction of Speech
Production of SpeechImas Niŋʃih
 
Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)
Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)
Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)Edi Brata
 
The Articulation Of Consonants
The Articulation Of ConsonantsThe Articulation Of Consonants
The Articulation Of ConsonantsIsabel gutierrez
 
speech production in psycholinguistics
speech production in psycholinguistics speech production in psycholinguistics
speech production in psycholinguistics Aseel K. Mahmood
 
English vowel sounds. Classification
English vowel sounds. ClassificationEnglish vowel sounds. Classification
English vowel sounds. ClassificationIsrael Reyes Alvarez
 
Anatomy of speech production
Anatomy of speech productionAnatomy of speech production
Anatomy of speech productionbethfernandezaud
 
GENERAL CLASSIFICATION OF VOWELS
GENERAL CLASSIFICATION OF VOWELSGENERAL CLASSIFICATION OF VOWELS
GENERAL CLASSIFICATION OF VOWELSnorielr
 
Vowel sounds
Vowel soundsVowel sounds
Vowel soundssrare
 

Destacado (20)

Production and Comprehension Process of Language
Production and Comprehension Process of LanguageProduction and Comprehension Process of Language
Production and Comprehension Process of Language
 
How children learn language
How children learn languageHow children learn language
How children learn language
 
Intro to phonology lectr 2
Intro to phonology lectr 2Intro to phonology lectr 2
Intro to phonology lectr 2
 
Speech sounds
Speech sounds Speech sounds
Speech sounds
 
Phoneme and feature theory
Phoneme and feature theoryPhoneme and feature theory
Phoneme and feature theory
 
Language production
Language productionLanguage production
Language production
 
Introduction to linguistic (6)
Introduction to linguistic (6)Introduction to linguistic (6)
Introduction to linguistic (6)
 
LANGUAGE PRODUCTION IN PSYCOLINGUISTIC
LANGUAGE PRODUCTION IN PSYCOLINGUISTICLANGUAGE PRODUCTION IN PSYCOLINGUISTIC
LANGUAGE PRODUCTION IN PSYCOLINGUISTIC
 
Allophone and phoneme. persentation
Allophone and phoneme. persentationAllophone and phoneme. persentation
Allophone and phoneme. persentation
 
The production of speech
The production of speechThe production of speech
The production of speech
 
Production of Speech
Production of SpeechProduction of Speech
Production of Speech
 
Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)
Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)
Intro. to Linguistics_6 Phonetics (Organ of Speech, Segment, Articulation)
 
Phonemes and allophones
Phonemes and allophonesPhonemes and allophones
Phonemes and allophones
 
The Articulation Of Consonants
The Articulation Of ConsonantsThe Articulation Of Consonants
The Articulation Of Consonants
 
speech production in psycholinguistics
speech production in psycholinguistics speech production in psycholinguistics
speech production in psycholinguistics
 
English vowel sounds. Classification
English vowel sounds. ClassificationEnglish vowel sounds. Classification
English vowel sounds. Classification
 
Anatomy of speech production
Anatomy of speech productionAnatomy of speech production
Anatomy of speech production
 
GENERAL CLASSIFICATION OF VOWELS
GENERAL CLASSIFICATION OF VOWELSGENERAL CLASSIFICATION OF VOWELS
GENERAL CLASSIFICATION OF VOWELS
 
Vowel sounds
Vowel soundsVowel sounds
Vowel sounds
 
The sounds of english
The sounds of englishThe sounds of english
The sounds of english
 

Similar a Phoneme Serialisation In Speech Production

Experiments in genetic programming
Experiments in genetic programmingExperiments in genetic programming
Experiments in genetic programmingLars Marius Garshol
 
NEURAL Network Design Training
NEURAL Network Design  TrainingNEURAL Network Design  Training
NEURAL Network Design TrainingESCOM
 
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...cscpconf
 
Standard Statistical Feature analysis of Image Features for Facial Images usi...
Standard Statistical Feature analysis of Image Features for Facial Images usi...Standard Statistical Feature analysis of Image Features for Facial Images usi...
Standard Statistical Feature analysis of Image Features for Facial Images usi...Bulbul Agrawal
 
Comparing Machine Learning Algorithms in Text Mining
Comparing Machine Learning Algorithms in Text MiningComparing Machine Learning Algorithms in Text Mining
Comparing Machine Learning Algorithms in Text MiningAndrea Gigli
 
Comparison of hybrid pso sa algorithm and genetic algorithm for classification
Comparison of hybrid pso sa algorithm and genetic algorithm for classificationComparison of hybrid pso sa algorithm and genetic algorithm for classification
Comparison of hybrid pso sa algorithm and genetic algorithm for classificationAlexander Decker
 
B.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignmentB.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignmentRai University
 
B.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignmentB.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignmentRai University
 
11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...
11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...
11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...Alexander Decker
 
Improving Classifier Accuracy using Unlabeled Data..doc
Improving Classifier Accuracy using Unlabeled Data..docImproving Classifier Accuracy using Unlabeled Data..doc
Improving Classifier Accuracy using Unlabeled Data..docbutest
 
Final Semester Project
Final Semester ProjectFinal Semester Project
Final Semester ProjectDebraj Paul
 
Data Trend Analysis by Assigning Polynomial Function For Given Data Set
Data Trend Analysis by Assigning Polynomial Function For Given Data SetData Trend Analysis by Assigning Polynomial Function For Given Data Set
Data Trend Analysis by Assigning Polynomial Function For Given Data SetIJCERT
 
Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...
Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...
Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...Sri Ambati
 
Convolutional neural network
Convolutional neural networkConvolutional neural network
Convolutional neural networkItachi SK
 
Ann model and its application
Ann model and its applicationAnn model and its application
Ann model and its applicationmilan107
 
Cs221 lecture5-fall11
Cs221 lecture5-fall11Cs221 lecture5-fall11
Cs221 lecture5-fall11darwinrlo
 
Artificial Intelligence and Optimization with Parallelism
Artificial Intelligence and Optimization with ParallelismArtificial Intelligence and Optimization with Parallelism
Artificial Intelligence and Optimization with ParallelismOlivier Teytaud
 

Similar a Phoneme Serialisation In Speech Production (20)

Experiments in genetic programming
Experiments in genetic programmingExperiments in genetic programming
Experiments in genetic programming
 
NEURAL Network Design Training
NEURAL Network Design  TrainingNEURAL Network Design  Training
NEURAL Network Design Training
 
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
VALIDATION METHOD OF FUZZY ASSOCIATION RULES BASED ON FUZZY FORMAL CONCEPT AN...
 
Standard Statistical Feature analysis of Image Features for Facial Images usi...
Standard Statistical Feature analysis of Image Features for Facial Images usi...Standard Statistical Feature analysis of Image Features for Facial Images usi...
Standard Statistical Feature analysis of Image Features for Facial Images usi...
 
Matching Conceptual Models Using Multivariate Analysis
Matching Conceptual Models Using Multivariate AnalysisMatching Conceptual Models Using Multivariate Analysis
Matching Conceptual Models Using Multivariate Analysis
 
Comparing Machine Learning Algorithms in Text Mining
Comparing Machine Learning Algorithms in Text MiningComparing Machine Learning Algorithms in Text Mining
Comparing Machine Learning Algorithms in Text Mining
 
Comparison of hybrid pso sa algorithm and genetic algorithm for classification
Comparison of hybrid pso sa algorithm and genetic algorithm for classificationComparison of hybrid pso sa algorithm and genetic algorithm for classification
Comparison of hybrid pso sa algorithm and genetic algorithm for classification
 
14078956.ppt
14078956.ppt14078956.ppt
14078956.ppt
 
B.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignmentB.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignment
 
B.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignmentB.sc biochem i bobi u 3.1 sequence alignment
B.sc biochem i bobi u 3.1 sequence alignment
 
11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...
11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...
11.comparison of hybrid pso sa algorithm and genetic algorithm for classifica...
 
Improving Classifier Accuracy using Unlabeled Data..doc
Improving Classifier Accuracy using Unlabeled Data..docImproving Classifier Accuracy using Unlabeled Data..doc
Improving Classifier Accuracy using Unlabeled Data..doc
 
Final Semester Project
Final Semester ProjectFinal Semester Project
Final Semester Project
 
Data Trend Analysis by Assigning Polynomial Function For Given Data Set
Data Trend Analysis by Assigning Polynomial Function For Given Data SetData Trend Analysis by Assigning Polynomial Function For Given Data Set
Data Trend Analysis by Assigning Polynomial Function For Given Data Set
 
Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...
Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...
Interpretable Machine Learning Using LIME Framework - Kasia Kulma (PhD), Data...
 
Convolutional neural network
Convolutional neural networkConvolutional neural network
Convolutional neural network
 
Protein Structure Alignment
Protein Structure AlignmentProtein Structure Alignment
Protein Structure Alignment
 
Ann model and its application
Ann model and its applicationAnn model and its application
Ann model and its application
 
Cs221 lecture5-fall11
Cs221 lecture5-fall11Cs221 lecture5-fall11
Cs221 lecture5-fall11
 
Artificial Intelligence and Optimization with Parallelism
Artificial Intelligence and Optimization with ParallelismArtificial Intelligence and Optimization with Parallelism
Artificial Intelligence and Optimization with Parallelism
 

Phoneme Serialisation In Speech Production

  • 2. NOT
  • 3. Serial Order in Speech  Common Task Demands:  System designed to recall a sequence of words  System designed to recall a sequence of sounds or phonemes  Speech  the human brain is required to complete both tasks:  sequencing phonemes in order to form word  sequencing words in order to form coherent sentences  Romani (Dec 2010)  Parsimonious argument  Same mechanisms underlie both operations.
  • 4. Short Term Memory Solutions  Substantial literature investigating our ability to recall word series from short term memory (STM)  Henson 1998  Relatively little is known about mechanisms underlying ability to recall and order phonemes in word production  Aim:  Use solutions to the problem of serial order within the phonological short term memory literature, to assess evidence for the presence of similar mechanisms in the sequencing of phonemes within word production.
  • 5. Error Patterns  Powerful means by which models can be assessed  (Glasspool, 1998; Glasspool & Houghton, 2005)  Intuitive that accurate model should provide error patterns that fit experimental data  However:  control groups do not produce errors when performing familiar word repetition tasks  Romani (2010):  Aphasic patients do provide error patterns in such tasks
  • 6. Candidate Models  Three distinct solutions (Henson, 1998)  Chaining  Ordinal  Positional  Compare the strength of evidence within aphasic phonological error data, in support of underlying chaining and positional based mechanisms  Information Theoretic approach (Burnham & Anderson, 1998)
  • 7. Formalisation  Constraints:  P(X) = probability of the next phoneme (x) in sequence being correct  Covariates limited to values derived from patient data found in Romani et al (2010)  Representative of the assigned theoretical perspective  Parameters are limited to minimally sufficient set
  • 8. Simple Chaining  Chaining  Simple & intuitive solution  Influential in earlier literature  Largely unsuccessful in modelling serial recall (Henson 1998).
  • 9. Simple Chaining ������ ������ = ������1 ������1 + ������2 ������2 . ������3 where: x1 = binary value representing successful production of previous phoneme x2 = binary value indicating initial phoneme x3 = word frequency taken from Barcelona corpus, 1998.
  • 10. Compound Chaining  Cue:  Association between element and preceding elements  Avoids recovery limitations of simple chaining  Limited success in modelling serial recall  (Baddeley, 1996; Henson, 1996).
  • 11. Compound Chaining ������ ������1,������ ������ ������ = ������1 ������2 + ������3 (������3 ) + ������4 ������4 ������=1 1 + ������2,������ where: n = position within word of previously produced phoneme x1 = value representing production of correct phoneme at position i x2 = distance of phoneme i from current position x3 = binary value indicating initial phoneme x4 = word frequency taken from Barcelona corpus, 1998.
  • 12. Positional  Capture key characteristics of serial recall  Positional mechanism + competitive queuing framework  Common to group of models with success in modelling experimental data across multiple domains  (Glasspool et al, 2005).
  • 13. Positional ������1 1 1 ������ ������ = ������1 ������2 + ������3 ������4 + ������5 ������6 + ������7 ������5 1 + ������2 1 + ������3 1 + ������4 where: x1 = phoneme of same type previously displayed within response x2 = distance from current position to previous occurrences of phoneme in response x3 = distance from first phoneme x4 = distance from final phoneme x5 = word frequency taken from Barcelona corpus, 1998.
  • 14. Fitting  Exponent relationships  required to capture defining characteristics of theoretical perspectives  Log Likelihood function Maximised  optimisation algorithm:  variant of a simulated annealing (Belisle, 1992)  Likelihood = response - model output  Model output = probability of the next phoneme in sequence being correct  Response = phoneme in sequence correctly produced
  • 15. Patient Data  Romani et al (2010)  All patients were Italian speaking  Selected for study due to phonological errors made in speech production.  articulatory planning difficulties (apraxia)  difficulty retrieving phonological representations (phonological),  Word repetition task.  Response:  successfully produced (1) = match target  unsuccessfully produced (0).
• 16. Artificial Data
   Models assessed on their ability to identify the underlying mechanism used to produce artificially generated data
   Three artificial data sets produced using a Monte Carlo method:
      Simple Chaining, Compound Chaining & Positional
   Akaike's Information Criterion selects the 'best' model irrespective of the quality of the models within the candidate set
   Can the models produce distinct data sets?
   Can the Information Theoretic approach correctly identify the underlying mechanism?
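The Monte Carlo generation step can be sketched simply: once a model assigns each phoneme slot a probability of correct production, an artificial response sequence is drawn by sampling a Bernoulli outcome per slot. A minimal Python illustration (the study used R; the seeding is an assumption added for reproducibility):

```python
import random

def simulate_responses(probs, seed=0):
    """Monte Carlo generation of an artificial response sequence.

    probs : per-phoneme probabilities of correct production, as
            given by one of the candidate models
    Returns a list of 0/1 responses: each phoneme is 'correctly
    produced' (1) with its model probability.
    """
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for p in probs]

# Example: feed model probabilities straight into the sampler.
responses = simulate_responses([0.9, 0.7, 0.5, 0.3])
```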
• 17. Selection
   Information Theoretic approach (Burnham & Anderson, 1998)
   Allows models to be ranked according to AIC:
      an estimate of the fitted model's expected distance from the unknown true mechanism underlying the observed data
   An effective measure when models are not hierarchically related
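The ranking itself is a small computation: AIC = 2K − 2·log-likelihood, where K is the number of fitted parameters, and candidates are ordered by ascending AIC. A Python sketch (names are illustrative; the values in the test come from Table 2 on slide 20):

```python
def aic(log_lik, k):
    """Akaike's Information Criterion: AIC = 2k - 2*logLik.
    Lower is better."""
    return 2 * k - 2 * log_lik

def rank_models(fits):
    """fits: dict of model name -> (log_lik, k).
    Returns (name, delta_AIC) pairs, best model first, where
    delta_AIC is the distance from the best model's AIC."""
    scored = sorted((aic(ll, k), name) for name, (ll, k) in fits.items())
    best = scored[0][0]
    return [(name, a - best) for a, name in scored]
```

Because only differences in AIC matter for selection, reporting delta-AIC relative to the best candidate is the conventional presentation.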
• 18. Results
   Results of fitting models to patient data are not currently available
• 19. Results

Table 1: Regression coefficients for models fit to artificial data sets

| Data Set          | Model             | β1     | β2     | β3     | β4    | β5    | β6    | β7    |
|-------------------|-------------------|--------|--------|--------|-------|-------|-------|-------|
| Basic Chaining    | True*             | 0.300  | 0.000  | 0.070  |       |       |       |       |
|                   | Basic Chaining    | 0.305  | -0.095 | 0.069  |       |       |       |       |
|                   | Compound Chaining | 0.541  | 5.609  | -0.068 | 0.072 |       |       |       |
|                   | Positional        | -0.109 | 4.288  | -0.065 | 2.733 | 0.024 | 2.600 | 0.097 |
| Compound Chaining | True*             | 0.300  | 2.000  | 0.000  | 0.070 |       |       |       |
|                   | Basic Chaining    | 0.039  | -2.127 | 0.089  |       |       |       |       |
|                   | Compound Chaining | 0.205  | 1.881  | 0.042  | 0.078 |       |       |       |
|                   | Positional        | -0.026 | 5.024  | -0.090 | 0.621 | 0.057 | 4.038 | 0.091 |
| Positional        | True*             | -0.175 | 2.000  | 0.270  | 2.000 | 0.270 | 2.000 | 0.070 |
|                   | Basic Chaining    | 0.045  | 4.469  | 0.080  |       |       |       |       |
|                   | Compound Chaining | -0.070 | 4.463  | -1.103 | 0.090 |       |       |       |
|                   | Positional        | -0.018 | 2.018  | 0.260  | 2.165 | 0.263 | 4.103 | 0.071 |

* 'True' indicates coefficients used in the creation of the artificial data sets; they do not represent the true values for the artificially created data sets, due to the randomising procedures used in the data creation process.
• 20. Results

Table 2: Likelihood and AIC values for models fit to artificial data sets

| Data Set          | Model             | Likelihood | K | AIC     |
|-------------------|-------------------|------------|---|---------|
| Basic Chaining    | True*             | -3025.597  | 3 | 6057.19 |
|                   | Basic Chaining    | -3031.740  | 3 | 6069.48 |
|                   | Compound Chaining | -3029.349  | 4 | 6066.70 |
|                   | Positional (SEM)  | -3267.722  | 7 | 6549.44 |
| Compound Chaining | True*             | -3389.640  | 4 | 6787.28 |
|                   | Basic Chaining    | -3447.349  | 3 | 6900.70 |
|                   | Compound Chaining | -3396.606  | 4 | 6801.21 |
|                   | Positional (SEM)  | -3481.528  | 7 | 6977.06 |
| Positional        | True*             | -3691.895  | 7 | 7397.79 |
|                   | Basic Chaining    | -3849.396  | 3 | 7704.79 |
|                   | Compound Chaining | -3896.782  | 4 | 7801.56 |
|                   | Positional (SEM)  | -3701.070  | 7 | 7416.14 |

* 'True' indicates coefficients used in the creation of the artificial data sets; they do not represent the true values for the artificially created data sets, due to the randomising procedures used in the data creation process.
• 21. Results
   Figure: Accuracy of phoneme production in relative word locations for models fit to artificial Basic Chaining data.
   [Line chart: probability of producing the correct phoneme (y-axis, 0.35–0.60) against location of the phoneme as a percentage of word length (x-axis: 0–20, 21–40, 41–60, 61–80, 81–100); series: Artificial Data, Basic Chain, Comp Chain, Positional.]
• 22. Results
   Figure: Accuracy of phoneme production in relative word locations for models fit to artificial Compound Chaining data.
   [Line chart: probability of producing the correct phoneme (y-axis, 0.40–0.56) against location of the phoneme as a percentage of word length (x-axis: 0–20, 21–40, 41–60, 61–80, 81–100); series: Artificial Data, Basic Chain, Comp Chain, Positional.]
• 23. Results
   Figure: Accuracy of phoneme production in relative word locations for models fit to artificial Positional data.
   [Line chart: probability of producing the correct phoneme (y-axis, 0.40–0.65) against location of the phoneme as a percentage of word length (x-axis: 0–20, 21–40, 41–60, 61–80, 81–100); series: M.C. Data, Basic Chain, Comp Chain, Positional.]
• 24. Discussion
   Fitting models to artificial data:
      Models based on positional and chaining processes are able to produce distinct data sets
      The model fitting and selection process applied provides valid insight into the mechanisms that underlie data production
• 25. Discussion
   'Best' model selected:
      Do the models capture their theoretical perspectives?
   Applying similar models and procedures to analyse STM data may provide an insightful means of assessing the quality of the models defined in this study
• 26. Discussion
   Improvements to the fitting procedure:
      Re-writing the R scripts to exploit the most efficient functions and data structures within the language
      Examining the true function to inform the selection of initial parameters
      Condensing the patient data, e.g. calculating probabilities for accuracy at each position within a word, for each word length
      Simplifying the models
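The condensing step suggested above can be sketched directly: pool the 0/1 responses by (word length, position) and take the proportion correct in each cell. A Python illustration (the study's scripts were in R; names here are illustrative):

```python
from collections import defaultdict

def condense(responses):
    """Condense per-word 0/1 response sequences into positional
    accuracies, as suggested on the slide.

    responses : list of per-word response sequences,
                e.g. [[1, 0, 1], [1, 1]]
    Returns {(word_length, position): proportion correct}.
    """
    totals = defaultdict(lambda: [0, 0])  # (correct count, total count)
    for word in responses:
        n = len(word)
        for pos, r in enumerate(word):
            totals[(n, pos)][0] += r
            totals[(n, pos)][1] += 1
    return {key: c / t for key, (c, t) in totals.items()}
```

Fitting to these pooled proportions rather than every raw trial would shrink the data passed to the optimiser, which is the efficiency gain the slide is after.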
• 27. Discussion
   Extend the analysis to the entire patient data set
   Inform our understanding of variation in the mechanisms underlying patient error, both within and between patient categories
• 28. Skills Developed
   Competence in the formalisation, fitting and evaluation of mathematical models
   Proficiency in the use of the statistics program R to construct and analyse model performance
   Ability to interpret and evaluate model performance in the context of behavioural data, and to assess wider theoretical implications