
Meaningful Interaction Analysis


Guest lecture given at the JiaoTong University, Shanghai, China.



  1. December 7th, 2010, Shanghai, China. Latent Semantics and Social Interaction. Fridolin Wild, KMi, The Open University
  2. Outline
     - Context & Framing Theories
     - Latent Semantic Analysis (LSA)
     - (Social) Network Analysis (S/NA)
     - Meaningful Interaction Analysis (MIA)
     - Outlook: Analysing Practices
  3. Context & Theories
  4. Information: what is meaning? Meaning could be the quality of a certain signal. Meaning could be a logical abstractor, i.e. a release mechanism.
  5. Meaning is social
     - Developing a shared understanding is a natural and necessary process, because language underspecifies meaning: future understanding builds on it.
     - Network effects make a network of shared understandings more valuable with growing size, allowing e.g. 'distributed cognition'.
     - At the same time, linguistic relativity (the Sapir-Whorf hypothesis) suggests that our own language culture restricts our thinking.
  6. Concepts & competence: things we can (not) construct from language
     - Tying shoelaces
     - Douglas Adams' 'The Meaning of Liff':
       - Epping: the futile movements of forefingers and eyebrows used when failing to attract the attention of waiters and barmen.
       - Shoeburyness: the vague uncomfortable feeling you get when sitting on a seat which is still warm from somebody else's bottom.
     - I have been convincingly Sapir-Whorfed by this book.
  7. A "semantic community": LSA contributes the associative closeness between concepts (disambiguated terms); SNA contributes the social relations between persons.
  8. LSA
  9. Latent Semantic Analysis
     - Two-mode factor analysis of the co-occurrences in the terminology
     - Results in a latent-semantic vector space
     "Humans learn word meanings and how to combine them into passage meaning through experience with ~paragraph unitized verbal environments." "They don't remember all the separate words of a passage; they remember its overall gist or meaning." "LSA learns by 'reading' ~paragraph unitized texts that represent the environment." "It doesn't remember all the separate words of a text; it remembers its overall gist or meaning." -- Landauer, 2007
 10. The latent-semantic space: singular values (factors, dimensions, …), term loadings, and document loadings.
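The factorisation behind such a space can be sketched with a truncated singular value decomposition. The toy term-document matrix and the rank k = 2 below are illustrative assumptions, not data from the lecture:

```python
import numpy as np

# Toy term-document matrix: rows = terms, columns = documents (illustrative)
M = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

# Full SVD: M = T @ diag(S) @ D'
T, S, Dt = np.linalg.svd(M, full_matrices=False)

# Keep only k factors -> the latent-semantic space
k = 2
Tk, Sk, Dk = T[:, :k], np.diag(S[:k]), Dt[:k, :].T

# Rank-k reconstruction approximates the original co-occurrences
Mk = Tk @ Sk @ Dk.T
print(np.round(Mk, 2))
```

`Tk` holds the term loadings and `Dk` the document loadings on the retained singular values, matching the three blocks named on the slide.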
 11. (Landauer, 2007)
 12. Associative closeness: terms and documents are compared by the angle between their vectors in the space. You need factor stability? Project new texts into the existing space using fold-ins.
 13. Example: the classic Landauer matrix. {M} = Deerwester, Dumais, Furnas, Landauer & Harshman (1990): Indexing by Latent Semantic Analysis. Journal of the American Society for Information Science, 41(6):391-407. Only the red terms appear in more than one document, so strip the rest. A term is a feature; the vocabulary is an ordered set of features.
 14. Reconstructed, reduced matrix (m4: "Graph minors: a survey")
 15. doc2doc similarities
     - Unreduced = pure vector space model: based on M = T S D', Pearson correlation over document vectors.
     - Reduced: based on M2 = T S2 D' (only the first two singular values retained), Pearson correlation over document vectors.
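The two variants can be sketched side by side. The toy matrix below is illustrative (not the Deerwester data), and the cut-off at two singular values mirrors the slide's S2:

```python
import numpy as np

# Toy term-document matrix (illustrative)
M = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

# Unreduced: Pearson correlations straight over the document columns
unreduced = np.corrcoef(M, rowvar=False)

# Reduced: M2 = T S2 D', keeping only the first two singular values
T, S, Dt = np.linalg.svd(M, full_matrices=False)
S2 = np.diag(np.concatenate([S[:2], np.zeros(len(S) - 2)]))
M2 = T @ S2 @ Dt
reduced = np.corrcoef(M2, rowvar=False)
```

Both results are doc-by-doc correlation matrices; the reduced one reflects similarities after the latent factors have smoothed the raw co-occurrences.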
 16. (S)NA
 17. Social Network Analysis
     - Around for a long time (the term was coined in 1954)
     - Basic idea:
       - Actors and the relationships between them (e.g. interactions)
       - Actors can be people (groups, media, tags, …)
       - Actors and ties form a graph (nodes and edges)
       - Within that graph, certain structures can be investigated:
         - Betweenness, degree centrality, density, cohesion
         - Structural patterns can be identified (e.g. the troll)
 18. Constructing a network from raw data: forum postings yield an incidence matrix IM; the adjacency matrix is AM = IM × IMᵀ.
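The incidence-to-adjacency step can be sketched in a few lines; the actor/thread data below is made up for illustration:

```python
import numpy as np

# Hypothetical incidence matrix: rows = actors, columns = forum threads,
# IM[i, j] = 1 if actor i posted in thread j
IM = np.array([
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 1],
    [0, 0, 1],
])

# AM = IM x IM': AM[i, j] counts the threads actors i and j share
AM = IM @ IM.T
np.fill_diagonal(AM, 0)  # drop self-ties
print(AM)
```

The resulting symmetric matrix is the weighted co-posting graph that the subsequent SNA measures operate on.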
 19. Visualization: sociogram
 20. Measuring techniques (sample)
     - Degree centrality: number of (in/out) connections to others
     - Closeness: how close a node is to all others
     - Betweenness: how often a node acts as an intermediary
     - Components: e.g. k-means clusters (k = 3)
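The simplest of these, degree centrality, can be read directly off an adjacency matrix. A minimal sketch on an illustrative undirected graph:

```python
import numpy as np

# Illustrative binary adjacency matrix of an undirected graph
AM = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
])

# Degree centrality: number of ties per actor
degree = AM.sum(axis=1)

# Normalised variant: share of the n-1 possible ties that are realised
n = AM.shape[0]
centrality = degree / (n - 1)
print(degree, centrality)
```

Closeness and betweenness need shortest-path computations on the same graph, so in practice a library (e.g. igraph) takes over from here.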
 21. Example: joint virtual meeting attendance (FlashMeeting co-attendance in the PROLEARN Network of Excellence)
 22. Example: subscription structures in a blogging network (2nd trial of the iCamp project)
 23. MIA
 24. Meaningful Interaction Analysis (MIA)
     - Combines latent semantics with the means of network analysis
     - Allows investigating associative-closeness structures and social relations at the same time
     - Works in latent-semantic spaces alone or in spaces with additional and different (!) relations
 25. The mathemagics behind Meaningful Interaction Analysis
 26. Network Analysis
 27. MIA of the classic Landauer example
 28. Applications
 29. Capturing traces in text: a medical student's case report
 30. Internal latent-semantic graph structure (MIA output)
 31. Evaluating chats with PolyCAFe
 32. Conclusion
 33. Conclusion
     - Neither LSA nor SNA alone is sufficient for a modern representation theory
     - MIA provides one possible bridge between them
     - It is a powerful technique
     - And it is simple to use (in R)
 34. #eof.
 35. Contextualised doc & term vectors in the latent-semantic space
     - T_k = left-hand matrix = 'term loadings' on the singular values
     - D_k = right-hand matrix = 'document loadings' on the singular values
     - Multiply them into the same space:
       - V_T = T_k S_k
       - V_D = D_k S_k
     - A cosine-closeness matrix over these vectors = an adjacency matrix = a graph
     - More: e.g. add author vectors V_A through cluster centroids or vector addition of their publication vectors
     - Speed: use the existing space and fold in e.g. the author vectors
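These steps can be sketched end to end: scale term and document loadings into one space, take cosine closeness over all vectors, and read the result as an adjacency matrix. The toy matrix, the rank k = 2, and the edge threshold of 0.5 are all illustrative assumptions:

```python
import numpy as np

# Illustrative term-document matrix
M = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

T, S, Dt = np.linalg.svd(M, full_matrices=False)
k = 2
Sk = np.diag(S[:k])
VT = T[:, :k] @ Sk        # contextualised term vectors   (V_T = T_k S_k)
VD = Dt[:k, :].T @ Sk     # contextualised document vectors (V_D = D_k S_k)

# Terms and documents now live in one space
V = np.vstack([VT, VD])

# Cosine-closeness matrix over all vectors = adjacency matrix = a graph
U = V / np.linalg.norm(V, axis=1, keepdims=True)
A = U @ U.T

# Threshold to obtain binary edges (cut-off chosen arbitrarily here)
edges = (A > 0.5).astype(int)
np.fill_diagonal(edges, 0)
```

Author vectors could be stacked into `V` the same way, e.g. as centroids of their publications' document vectors, before the closeness matrix is taken.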
 36. Influencing parameters (LSA): sample correlations such as Pearson(eu, österreich) and Pearson(jahr, wien)