Current design
21-May-09
SRL System 1.0
Code extended by Hector Franco
UML description
DepTree // name of class
// attributes:
+ vector<string> stanza;
+ vector<unsigned int> pred;
+ vector<vector<pair<unsigned int, string> > > apreds;
+ vector<DepTreeNode *> all_nodes;
+ vector<unsigned int> pos_order_vector;
// methods:
+ DepTree(void);
+ DepTree(vector<string> input);
+ bool build_tree(void);
+ void export_tree(ofstream &f);
+ bool is_pred(unsigned int index);
+ bool get_sem_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ bool set_sem_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ bool get_syn_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ DepTreeNode *get_lub(DepTreeNode *dt1, DepTreeNode *dt2, vector<DepTreeNode *>& lpath, vector<DepTreeNode *>& rpath);
+ bool make_subset_for_pred(unsigned int p_index, vector<DepTreeNode *>& nodes);
+ bool make_fresh_tree_for_pred(unsigned int p_index, DepTree& sub_tree);
+ bool export_a_sub_tree(unsigned int p_index, ofstream& sub_tree_f);
+ bool set_post_order_index();
+ void set_pointers();
+ unsigned int post_order_2_dep_order(unsigned int val);
+ unsigned int sub_tree_order_2_sentence_order(unsigned int val);
// debug:
+ void dot_show(void);
 + void dot_show(ofstream &f);
 + void show_stanza(void);
 + void show_words(void);
+ test1();
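
The intended call sequence through this interface appears to be: construct a DepTree from one stanza, build the tree, then export one sub-tree per predicate. A minimal usage sketch under that reading; the header name, the driver function, and the output naming are illustrative assumptions, not part of the design:

#include <fstream>
#include <string>
#include <vector>
#include "DepTree.h"   // hypothetical header exposing the DepTree class listed above
using namespace std;

// Hypothetical driver: one stanza (one token per line) in, one exported
// sub-tree file per predicate out.
void process_stanza(const vector<string>& stanza_lines, const string& out_prefix) {
    DepTree tree(stanza_lines);            // parse the stanza into nodes
    if (!tree.build_tree())                // link each dependent to its head
        return;
    for (unsigned int i = 0; i < stanza_lines.size(); ++i) {
        if (!tree.is_pred(i))              // only predicate tokens get a sub-tree
            continue;
        ofstream f(out_prefix + "_" + to_string(i) + ".subtree");
        tree.export_a_sub_tree(i, f);      // write the per-predicate sub-tree
    }
}
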
DepTreeNode
+vector<DepTreeNode *> dtrs; // this node's dependents
+ DepTreeNode *parent; // points to this node's 'head'
+ unsigned int index; // which node is it
+ unsigned int post_order_index;
+ DepTreeNode * me;
+ string form;
+ string lemma;
+ string pos;
+ string dep_rel; // what rel. between head and this node
+ string pred;
+ DepTree *cntr; // container; via this we can get back to the whole tree if necessary


+ DepTreeNode(void);
+ set_post_order_index(unsigned int &counter, vector<unsigned int> &post_order_vector);
+ DepTreeNode(DepTree *dt, unsigned int i);
+ void show();
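
set_post_order_index presumably numbers the nodes dependents-first and records, for each post-order position, the sentence index it came from; post_order_2_dep_order would then be a lookup in that table. A self-contained sketch of that numbering, using a stripped-down node type (only the field names are taken from the design, everything else is an assumption):

#include <vector>
using namespace std;

struct Node {                              // stand-in for DepTreeNode
    vector<Node*> dtrs;                    // dependents
    unsigned int index = 0;                // position in the sentence
    unsigned int post_order_index = 0;
};

// Number nodes in post order (dependents before their head) and remember,
// for each post-order position, the sentence index it belongs to.
void set_post_order_index(Node* n, unsigned int& counter,
                          vector<unsigned int>& post_order_vector) {
    for (Node* d : n->dtrs)
        set_post_order_index(d, counter, post_order_vector);
    n->post_order_index = counter++;
    post_order_vector.push_back(n->index); // lookup table for post_order_2_dep_order
}

With that table filled, post_order_2_dep_order(val) can simply return post_order_vector[val].
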
Tree // it is a struct
Postorder_list *postorder_list;
int *keyroots;
none

Note: no changes in the tree-distance implementation.

Postorder_list // it is a struct
int type;
enum wildtype wild;
float weight;
int father;
int sons;
int leftmostleaf;
none

AlignRecord // it is a struct
Start_of_match;
End_of_match;
Align_src_trg;
Align_trg_src;
none

alignOutcome // it is a struct
N; // node index
enum matchtype match_type;
none
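
The postorder_list / keyroots / leftmostleaf fields look like the preprocessing tables of the Zhang-Shasha tree-distance algorithm. Under that assumption, a self-contained sketch of how both tables can be derived from parent links given in post order (the types here are simplified stand-ins, not the real Tree / Postorder_list):

#include <vector>
using namespace std;

// Assumed convention: nodes are listed in post order and father[i] is the
// post-order index of node i's parent, with -1 for the root.
struct TreeDistInput {
    vector<int> leftmostleaf;  // l(i): leftmost leaf of the subtree rooted at i
    vector<int> keyroots;      // nodes whose leftmost leaf no higher node shares
};

TreeDistInput preprocess(const vector<int>& father) {
    int n = (int)father.size();
    TreeDistInput out;
    out.leftmostleaf.assign(n, -1);

    // In post order every child precedes its parent, so one left-to-right pass
    // can propagate leftmost leaves upwards (the first child seen is the leftmost).
    for (int i = 0; i < n; ++i) {
        if (out.leftmostleaf[i] == -1)                  // untouched so far -> i is a leaf
            out.leftmostleaf[i] = i;
        int p = father[i];
        if (p != -1 && out.leftmostleaf[p] == -1)
            out.leftmostleaf[p] = out.leftmostleaf[i];
    }

    // Key roots: for each distinct leftmost leaf keep the highest node carrying it.
    vector<int> highest(n, -1);
    for (int i = 0; i < n; ++i)
        highest[out.leftmostleaf[i]] = i;
    for (int i = 0; i < n; ++i)
        if (highest[out.leftmostleaf[i]] == i)
            out.keyroots.push_back(i);                  // ascending post order
    return out;
}
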
Description:
+ void re_label(Corpus_type training_data);

• Generate sub_dep_trees in training and testing.
• For each sub_dep_tree:
  Get an ordered list of the nearest sub-dependency trees in the training data, without alignment, and generate the alignment only if it is needed.
  For each semantic relation, take the first K and select the most frequent label (or another KNN voting method).
  Using the pointer in Info, copy back the selected semantic relation and update the stanza.
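
A sketch of that selection loop, with placeholder types: SubTree stands for one per-predicate sub-dependency tree, and the two callbacks stand for the tree-distance call and the (alignment-based) label lookup that the real classes would provide. None of these names are the real API.

#include <algorithm>
#include <functional>
#include <map>
#include <string>
#include <vector>
using namespace std;

struct SubTree { int predicate_index = 0; /* nodes omitted in this sketch */ };

string knn_vote(const SubTree& test,
                const vector<SubTree>& training, size_t K,
                const function<double(const SubTree&, const SubTree&)>& distance,
                const function<string(const SubTree&, const SubTree&)>& label_for) {
    // 1. rank the training sub-trees by tree distance (no alignment needed here)
    vector<pair<double, size_t>> ranked;
    for (size_t i = 0; i < training.size(); ++i)
        ranked.push_back({distance(test, training[i]), i});
    sort(ranked.begin(), ranked.end());

    // 2. take the first K and vote; label_for is where the alignment would be
    //    generated, so it only runs for these K candidates ("only if it is needed")
    map<string, int> votes;
    for (size_t i = 0; i < min(K, ranked.size()); ++i)
        ++votes[label_for(training[ranked[i].second], test)];

    // 3. the most frequent label wins; the caller copies it back into the stanza
    string best; int best_votes = -1;
    for (const auto& v : votes)
        if (v.second > best_votes) { best = v.first; best_votes = v.second; }
    return best;
}

The per-relation argument mentioned in the description is folded into label_for here to keep the sketch short.
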
[Pipeline diagram, stages 1-4 (time runs left to right): training data and testing data on the hard disk become Corpus 1 and Corpus 2; each corpus holds one deep (dependency) tree per sentence, and every tree is split into per-predicate sub-trees.]

1. The training and testing data are on the file system.
2. The data is read and a corpus is created from it.
3. Each corpus contains its sentences as dependency tree structures.
4. Each sentence is decomposed into sub-trees, one for each predicate.
[Pipeline diagram, stages 5-7 (time runs left to right): the per-predicate sub-trees become tree-distance structures t1, t2, t3, ... and are compared against the query sub-tree q1.]

5-6. Each sub-tree is translated into the tree structure used by the tree-distance algorithm.
6-7. For each sub-tree, the K nearest sub-trees from the training data are found (KNN) and the possible alignments are saved; an alignment is generated only when one of the nearest trees contains the wanted semantic relation between the wanted two nodes.
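
Stage 5-6 converts each per-predicate sub-tree into the flat post-order form the tree-distance code works on. A sketch of one way to do that flattening, with stand-in types (SubNode is not the real DepTreeNode, and using the lemma as the label is an assumption):

#include <string>
#include <vector>
using namespace std;

struct SubNode {                 // stand-in for a sub-tree node
    string lemma;                // used here as the node label
    vector<SubNode*> dtrs;       // dependents
};

struct FlatTree {
    vector<string> label;        // label of the node at post-order position i
    vector<int> father;          // post-order index of the parent, -1 for the root
};

// Returns the post-order position assigned to n; children are emitted first,
// so their positions already exist when the parent links are fixed up.
int flatten(const SubNode* n, FlatTree& out) {
    vector<int> child_positions;
    for (const SubNode* d : n->dtrs)
        child_positions.push_back(flatten(d, out));
    int my_pos = (int)out.label.size();
    out.label.push_back(n->lemma);
    out.father.push_back(-1);                 // stays -1 only for the root
    for (int c : child_positions)
        out.father[c] = my_pos;
    return my_pos;
}

The father array produced here is exactly the input assumed by the keyroots / leftmost-leaf sketch earlier.
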
[Pipeline diagram, stages 8-12 (time runs left to right): labels flow from the query sub-tree q1 back into the deep tree of each sentence, then into Corpus 2 and the output file.]

8-10. Nodes are mapped from the dependency tree to the tree-distance representation.
10-11. For each relation the most voted label is selected and the result is saved in the sentence.
11-12. The whole corpus is written back to the file system in the same format, with the new labels, ready for the scoring script.
Current design
 first design
   SRL System
 UML description
Current design

DepTree // name of class
// attributes:
+ vector<string> stanza;
+ vector<unsigned int> pred;
+ vector<vector<pair<unsigned int, string> > > apreds;
+ vector<DepTreeNode *> all_nodes;
// methods:
+ DepTree(void);
+ DepTree(vector<string> input);
+ bool build_tree(void);
+ void export_tree(ifstream &f); // not yet implemented
+ bool is_pred(unsigned int index);
+ bool get_sem_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ bool get_syn_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ DepTreeNode *get_lub(DepTreeNode *dt1, DepTreeNode *dt2, vector<DepTreeNode *>& lpath, vector<DepTreeNode *>& rpath);
+ bool make_subset_for_pred(unsigned int p_index, vector<DepTreeNode *>& nodes);
+ bool make_fresh_tree_for_pred(unsigned int p_index, DepTree& sub_tree);
+ bool export_a_sub_tree(unsigned int p_index, ofstream& sub_tree_f);

// debug:
+ void dot_show(void);
 + void dot_show(ofstream &f);
 + void show_stanza(void);
 + void show_words(void);
Current design




DepTreeNode
+vector<DepTreeNode *> dtrs; // this node's dependents
+ DepTreeNode *parent; // points to this node's 'head'
+ unsigned int index; // which node is it
+ string form;
+ string lemma;
+ string pos;
+ string dep_rel; // what rel. between head and this node
+ string pred;
+ DepTree *cntr; // container; via this we can get back to the whole tree if necessary
+ DepTreeNode(void);
+ DepTreeNode(DepTree *dt, unsigned int i);
+ void show();
Current design

Tree // it is a struct
Postorder_list *postorder_list;
int *keyroots;
none

Note: no objects in the tree-distance implementation.

Postorder_list // it is a struct
int type;
enum wildtype wild;
float weight;
int father;
int sons;
int leftmostleaf;
none

AlignRecord // it is a struct
Start_of_match;
End_of_match;
Align_src_trg;
Align_trg_src;
none

alignOutcome // it is a struct
N; // node index
enum matchtype match_type;
none
New design

  proposal
new design

DepTree
- vector<string> stanza;
- vector<unsigned int> pred;
- vector<vector<pair<unsigned int, string> > > apreds;
- vector<DepTreeNode *> all_nodes;
+ DepTree(vector<string> input);
+ bool export_a_sub_tree(unsigned int p_index, ofstream& sub_tree_f);
+ void export_tree(ifstream &f); // not yet implemented
+ bool get_subset_for_preds(vector<DepTree> *);
- DepTree(void);
- bool build_tree(void);
- bool is_pred(unsigned int index);
- DepTreeNode *get_lub(DepTreeNode *dt1, DepTreeNode *dt2, vector<DepTreeNode *>& lpath, vector<DepTreeNode *>& rpath);
- bool make_subset_for_pred(unsigned int p_index, vector<DepTreeNode *>& nodes);
- bool make_fresh_tree_for_pred(unsigned int p_index, DepTree& sub_tree);

// debug:
+ bool get_sem_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ bool get_syn_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ void dot_show(void);
+ void dot_show(ofstream &f);
+ void show_stanza(void);
+ void show_words(void);
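
Reading the abbreviated signature above as get_subset_for_preds(vector<DepTree> *), the new public entry point could simply wrap the old, now private, per-predicate helpers. A sketch of that wrapper (the exact signature and the reliance on make_fresh_tree_for_pred are assumptions):

// Sketch only: uses the member names from the listings above.
bool DepTree::get_subset_for_preds(vector<DepTree> *out) {
    out->clear();
    for (unsigned int i = 0; i < stanza.size(); ++i) {
        if (!is_pred(i))                     // skip non-predicate tokens
            continue;
        DepTree sub;
        if (!make_fresh_tree_for_pred(i, sub))
            return false;                    // propagate failure
        out->push_back(sub);                 // one sub-tree per predicate
    }
    return true;
}
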
new design




DepTreeNode
- vector<DepTreeNode *> dtrs; // this node's dependents
- DepTreeNode *parent; // points to this node's 'head'
- int Info; // works as an index (code) into the symbol table
- DepTree *cntr; // container; via this we can get back to the whole tree if necessary
+ DepTreeNode(string);
+ void update_string(string);
- DepTreeNode(void);
- DepTreeNode(DepTree *dt, unsigned int i);
//Debug:
+ void show();
new design




Corpus_type


- vector<DepTree *> data_set;
+ Corpus_type(string file);
+ bool write(string file);
+ void re_label(Corpus_type training_data);
- get_label(DepTree s, DepTree t, node1, node2);
Description:
+ void re_label(Corpus_type training_data);

• Generate sub_dep_trees in training and testing.
• For each sub_dep_tree:
  Get an ordered list of the nearest sub-dependency trees in the training data, with alignment.
  For each semantic relation, take the first K and select the most frequent label (or another KNN voting method).
  Using the pointer in Info, copy back the selected semantic relation and update the stanza.
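
The private get_label(DepTree s, DepTree t, node1, node2) is where the alignment is consulted. A sketch of that lookup step, reduced to the data it needs: an alignment from test-tree nodes to training-tree nodes and a get_sem_dep-style query on the training tree. Both the map representation and the callback are assumptions, not the real signatures.

#include <functional>
#include <map>
#include <string>
using namespace std;

bool aligned_label_lookup(
        const map<unsigned int, unsigned int>& align_test_to_train,
        const function<bool(unsigned int, unsigned int, string&)>& train_get_sem_dep,
        unsigned int dep_node, unsigned int head_node, string& label) {
    auto d = align_test_to_train.find(dep_node);
    auto h = align_test_to_train.find(head_node);
    if (d == align_test_to_train.end() || h == align_test_to_train.end())
        return false;                               // a node has no counterpart
    return train_get_sem_dep(d->second, h->second, label);  // read the training label
}
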
new design

Info_data //struct type.
unsigned int index; // which node is it
string form;
string lemma;
string pos;
string dep_rel; // what rel. between head and this node
string pred;
string *p_cad; // to update the string that generated it
DepTreeNode *p_depTreeNode; // back to the node in the original tree

 none
new design




Symbol_table_type
- vector<Info_data *> data; // the table entries, one per node
+ Symbol_table_type();
+ int encode(Info_data *d);
+ Info_data *decode(int code);
+ double distance(int codeA, int codeB);
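
A minimal sketch of this symbol table, following the listed interface; the 0/1 lemma-and-POS distance is only an illustration of where a node-to-node cost for the tree-distance algorithm would live, and Info_data is trimmed to the fields that cost uses:

#include <string>
#include <vector>
using namespace std;

struct Info_data {
    unsigned int index = 0;
    string form, lemma, pos, dep_rel, pred;
};

class Symbol_table_type {
    vector<Info_data*> data;                 // the table entries, one per node
public:
    int encode(Info_data* d) {               // store the entry, hand back its code
        data.push_back(d);
        return (int)data.size() - 1;
    }
    Info_data* decode(int code) const {      // code -> full node information
        return data[code];
    }
    double distance(int codeA, int codeB) const {
        const Info_data* a = decode(codeA);
        const Info_data* b = decode(codeB);
        return (a->lemma == b->lemma && a->pos == b->pos) ? 0.0 : 1.0;
    }
};
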
[Pipeline diagram, stages 1-4 (time runs left to right): training data and testing data on the hard disk become Corpus 1 and Corpus 2; each corpus holds one deep (dependency) tree per sentence, and every tree is split into per-predicate sub-trees.]

1. The training and testing data are on the file system.
2. The data is read and a corpus is created from it.
3. Each corpus contains its sentences as dependency tree structures.
4. Each sentence is decomposed into sub-trees, one for each predicate.
[Pipeline diagram, stages 5-7 (time runs left to right): the per-predicate sub-trees become tree-distance structures t1, t2, t3, ... and are compared against the query sub-tree q1.]

5-6. Each sub-tree is translated into the tree structure used by the tree-distance algorithm.
6-7. For each sub-tree, the K nearest sub-trees from the training data are found (KNN) and the possible alignments are saved.
[Pipeline diagram, stages 8-12 (time runs left to right): from the query sub-tree q1, through its Info node in the symbol table, back to the deep tree of the sentence, then into Corpus 2 and the output file.]

8-10. Each node can access its information in the symbol table and, from that information, reach the original tree, node by node.
10-11. For each relation the most voted label is selected and the result is saved in the sentence.
11-12. The whole corpus is written back to the file system in the same format, with the new labels, ready for the scoring script.
Design II
 11/5
 proposal
Design II

DepTree
+ vector<string> stanza;
+ vector<unsigned int> pred;
+ vector<vector<pair<unsigned int, string> > > apreds;
+ vector<DepTreeNode *> all_nodes;
+ DepTree(void);
+ DepTree(vector<string> input);
+ bool build_tree(void);
+ void export_tree(ifstream &f); // TO IMPLEMENT
+ bool is_pred(unsigned int index);
+ bool get_sem_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ bool get_syn_dep(unsigned int dep_index, unsigned int head_index, string &rel);
+ DepTreeNode *get_lub(DepTreeNode *dt1, DepTreeNode *dt2, vector<DepTreeNode *>& lpath, vector<DepTreeNode *>& rpath);
+ bool make_subset_for_pred(unsigned int p_index, vector<DepTreeNode *>& nodes);
+ bool make_fresh_tree_for_pred(unsigned int p_index, DepTree& sub_tree);
+ bool export_a_sub_tree(unsigned int p_index, ofstream& sub_tree_f);
+ getMap_tdep_tdis(AlignRecord &)
// debug:
+ void dot_show(void);
 + void dot_show(ofstream &f);
 + void show_stanza(void);
 + void show_words(void);
Design II




DepTreeNode
+vector<DepTreeNode *> dtrs; // this node's dependents
+ DepTreeNode *parent; // points to this node's 'head'
+ unsigned int index; // which node is it
+ string form;
+ string lemma;
+ string pos;
+ string dep_rel; // what rel. between head and this node
+ string pred;
+ DepTree *cntr; // container; via this we can get back to the whole tree if necessary
+ DepTreeNode(void);
+ DepTreeNode(DepTree *dt, unsigned int i);
+ void show();
Design II




Corpus_type


- vector<DepTree *> data_set;
+ Corpus_type(string file);
+ bool write(string file);
+ void re_label(Corpus_type training_data);
- get_label(DepTree s, DepTree t, node1, node2);
Description:
+ void re_label(Corpus_type training_data);

• Generate sub_dep_trees in training and testing.
• For each sub_dep_tree:
  Get an ordered list of the nearest sub-dependency trees in the training data, without alignment, and generate the alignment only if it is needed.
  For each semantic relation, take the first K and select the most frequent label (or another KNN voting method).
  Using the pointer in Info, copy back the selected semantic relation and update the stanza.
[Pipeline diagram, stages 1-4 (time runs left to right): training data and testing data on the hard disk become Corpus 1 and Corpus 2; each corpus holds one deep (dependency) tree per sentence, and every tree is split into per-predicate sub-trees.]

1. The training and testing data are on the file system.
2. The data is read and a corpus is created from it.
3. Each corpus contains its sentences as dependency tree structures.
4. Each sentence is decomposed into sub-trees, one for each predicate.
[Pipeline diagram, stages 5-7 (time runs left to right): the per-predicate sub-trees become tree-distance structures t1, t2, t3, ... and are compared against the query sub-tree q1.]

5-6. Each sub-tree is translated into the tree structure used by the tree-distance algorithm.
6-7. For each sub-tree, the K nearest sub-trees from the training data are found (KNN) and the possible alignments are saved.
[Pipeline diagram, stages 8-12 (time runs left to right): labels flow from the query sub-tree q1 back into the deep tree of each sentence, then into Corpus 2 and the output file.]

8-10. Nodes are mapped from the dependency tree to the tree-distance representation.
10-11. For each relation the most voted label is selected and the result is saved in the sentence.
11-12. The whole corpus is written back to the file system in the same format, with the new labels, ready for the scoring script.
Description:
+ getMap(Corpus_type training_data);

• New functions:
• On DepTree:
  + getMap_tdep_tdis(AlignRecord &); // gives a post-order alignment
• On Corpus_type:
  + void re_label(Corpus_type training_data);