
Using mashup technology to improve findability

  1. USING MASHUP TECHNOLOGY TO IMPROVE FINDABILITY. Sten Govaerts. Promotor: Erik Duval. Co-promotor: Katrien Verbert
  2. OVERVIEW • Research outline • Music • Technology Enhanced Learning • Publication list • Further planning
  3. FINDABILITY. Findability is the ability of users to identify an appropriate website and navigate the pages of the site to discover and retrieve relevant information resources. Peter Morville (2005)
  4-7. MASHUPS • mashups in music: re-mixing multiple existing songs to create a new one • a mashup is an application that combines data from multiple online sources to create a new result which was not the original intent of the data • data is key! tweaking and enriching data is important; interesting data makes an interesting mashup
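
To make the definition concrete: a minimal mashup sketch in Python, assuming two hypothetical JSON endpoints (EVENTS_URL and WEATHER_URL) with invented field names; no real API is referenced.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoints, stand-ins for any two online data sources;
# the URLs and the JSON field names below do not refer to a real API.
EVENTS_URL = "https://example.org/api/concerts?city=Leuven"
WEATHER_URL = "https://example.org/api/forecast?city=Leuven"

def fetch_json(url):
    """Download a URL and parse the response body as JSON."""
    with urlopen(url) as response:
        return json.load(response)

def concert_weather():
    """Combine both sources into a result neither offers on its own:
    concerts annotated with the weather forecast for their date."""
    concerts = fetch_json(EVENTS_URL)    # e.g. [{"artist": ..., "date": ...}]
    forecasts = fetch_json(WEATHER_URL)  # e.g. [{"date": ..., "forecast": ...}]
    by_date = {f["date"]: f["forecast"] for f in forecasts}
    return [dict(c, forecast=by_date.get(c["date"], "unknown"))
            for c in concerts]

if __name__ == "__main__":
    for concert in concert_weather():
        print(concert["artist"], concert["date"], concert["forecast"])
```

The interesting part is the join on a shared key: each feed is mundane on its own, but the combination answers a question neither source was designed for, which is the "new result" in the definition above.
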
  8. THE ROCKANANGO PROJECT
  9-16. SCOPE • roots in HORECA • how does a bartender select his music? • how does an expert select his music? • making the expert data accessible in a usable way for a bartender. A musical context is a musical description for situations based on atmospheres and musical properties.
  17. METADATA SCHEMAS
  18. Corthaut, Nik; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Connecting the dots: music metadata generation, schemas and applications, Bello, Juan Pablo; Chew, Elaine; Turnbull, Douglas (eds.), ISMIR, Philadelphia, USA, 14-18 September 2008, Proceedings of the 9th International Conference on Music Information Retrieval, pages 249-254
  19. Example of a musical context built from two weighted subcontexts: subcontext A (weight 75): songs with subgenre(easy listening OR pop café) + songs with mood(intimate OR tasteful OR stylish) + songs with popularity(5 UNTIL 7); subcontext B (weight 25): songs with genre(pop) + songs with mood(relax OR romantic OR sensual) + songs with popularity(6 UNTIL 7). Govaerts, Sten; Corthaut, Nik; Duval, Erik. Moody tunes: the rockanango project, Lemström, Kjell; Tindale, Adam; Dannenberg, Roger (eds.), International Conference on Music Information Retrieval, ISMIR, Victoria, BC, 8-12 October 2006, pages 308-313, University of Victoria
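
One way to read the slide above in code: a minimal sketch, in Python, of picking songs from a context made of weighted subcontexts. The song record fields and the interpretation of the 75/25 weights as selection probabilities are assumptions for illustration; the Aristo implementation may differ.

```python
import random

# Toy song metadata records; in the Rockanango system these would come
# from the Aristo database (the field names here are illustrative).
SONGS = [
    {"title": "Song A", "genre": "pop", "subgenre": "easy listening",
     "moods": {"intimate", "stylish"}, "popularity": 6},
    {"title": "Song B", "genre": "pop", "subgenre": "pop café",
     "moods": {"relax", "sensual"}, "popularity": 7},
    {"title": "Song C", "genre": "rock", "subgenre": "indie",
     "moods": {"energetic"}, "popularity": 5},
]

def subcontext_a(song):
    """subgenre(easy listening OR pop café) + mood(intimate OR tasteful
    OR stylish) + popularity(5 UNTIL 7)"""
    return (song["subgenre"] in {"easy listening", "pop café"}
            and song["moods"] & {"intimate", "tasteful", "stylish"}
            and 5 <= song["popularity"] <= 7)

def subcontext_b(song):
    """genre(pop) + mood(relax OR romantic OR sensual) + popularity(6 UNTIL 7)"""
    return (song["genre"] == "pop"
            and song["moods"] & {"relax", "romantic", "sensual"}
            and 6 <= song["popularity"] <= 7)

def pick_song(songs):
    """Draw a song from the context: subcontext A with weight 75,
    subcontext B with weight 25 (weights read as probabilities)."""
    pool_a = [s for s in songs if subcontext_a(s)]
    pool_b = [s for s in songs if subcontext_b(s)]
    pool = random.choices([pool_a, pool_b], weights=[75, 25])[0]
    return random.choice(pool) if pool else None

print(pick_song(SONGS))
```
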
  20-21. APPLICATIONS • HORECA (+1000 pubs & restaurants) • Retail (Fun, Prémaman, Carrefour, ...) • Banks (Dexia & Fortis) • Webradio (HUMO, TMF, Mars, Overtoom, ...)
  22. GENERATE THE METADATA • from different sources: the audio signal, web sources, the Aristo database, attention metadata • using our metadata generation framework: SamgI
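
The framework idea can be sketched as a plugin pipeline: every source contributes a partial metadata record and the results are merged. The extractor names and outputs below are invented placeholders, not SamgI's actual API.

```python
# Plugin-style metadata generation in the spirit of SamgI: each source
# contributes partial metadata for a song, and the framework merges the
# results. The extractor internals here are placeholders.
def from_audio_signal(song):
    return {"tempo_bpm": 124}      # e.g. signal-processing features

def from_web_sources(song):
    return {"origin": "Belgium"}   # e.g. scraped artist pages

def from_attention_metadata(song):
    return {"play_count": 532}     # e.g. usage/attention logs

EXTRACTORS = [from_audio_signal, from_web_sources, from_attention_metadata]

def generate_metadata(song):
    """Run every registered extractor and merge the partial records."""
    metadata = {}
    for extract in EXTRACTORS:
        metadata.update(extract(song))
    return metadata

print(generate_metadata({"artist": "dEUS", "title": "Suds & Soda"}))
```
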
  23. ORIGIN OF AN ARTIST
  24-25. METADATA GENERATION: COUNTRY & CONTINENT • why is it useful? subgenres, popularity, recommendations • expensive to annotate • very little existing research • very hard with signal processing
  26. IN THE BACKGROUND... (architecture diagram: the Last on AM/FM website mashes up google maps, freebase, last.fm, google app engine, twitter, yahoo! pipes, dapper and youtube)
  27. Coverage
  28. http://www.cs.kuleuven.be/~sten/lastonamfm
  29. LAST.FM MASHUP http://www.cs.kuleuven.be/~sten/lastonamfm
  30. FUTURE • follow-up research by Markus Schedl • want to test our algorithm on his data set. Govaerts, Sten; Duval, Erik. A Web-based approach to determine the origin of an artist, ISMIR, Kobe, Japan, 26-30 October 2009, Proceedings of ISMIR 2009: 10th International Society for Music Information Retrieval Conference, pages 261-266, ISMIR
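
A deliberately simplified sketch of the web-based idea from the paper: gather text about an artist from web sources and count country mentions. The hard-coded country list and bare substring counting stand in for the source weighting and disambiguation the paper actually describes.

```python
import re
from collections import Counter

# Tiny illustrative gazetteer; the real system pulled evidence from web
# sources such as Freebase and Last.fm rather than a hard-coded list.
COUNTRIES = ["Belgium", "Germany", "United States", "United Kingdom", "Japan"]

def guess_origin(text):
    """Count how often each country name occurs in text gathered about
    the artist and return the most frequent one, plus its count as a
    crude confidence signal."""
    counts = Counter()
    for country in COUNTRIES:
        counts[country] = len(re.findall(re.escape(country), text, re.IGNORECASE))
    country, count = counts.most_common(1)[0]
    return (country, count) if count else (None, 0)

bio = "dEUS are a rock band formed in Antwerp, Belgium, in the early nineties."
print(guess_origin(bio))  # ('Belgium', 1)
```
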
  31. CLASSIFY BY SEARCH ENGINE
  32. ONE APPROACH... • can classify genre and more! • M. Schedl, T. Pohle, P. Knees, G. Widmer, "Assigning and Visualizing Music Genres by Web-based Co-occurrence Analysis", Proceedings of the 7th International Conference on Music Information Retrieval, 2006, pp. 260-265. • G. Geleijnse, J. Korst, "Web-based Artist Categorization", Proceedings of the 7th International Conference on Music Information Retrieval, 2006, pp. 266-271.
  33-35. CLASSIFICATION WITH SEARCH ENGINES using co-occurrence: Artist + Genre + Schema
  36-37. Co-occurrence scores per genre: Rock: 0.013, Jazz: 0.013, Blues: 0.009, Pop: 0.015, Country: 0.009, Metal: 0.005
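
The arithmetic behind such scores can be reproduced in a few lines: take the page count of the combined "Artist + Genre + Schema" query and normalize it, here by the page count of the genre query alone (one common normalization in this line of work). The artist (Madonna) and the canned page counts below are invented purely so that the example yields the scores on the slide; a real run would query a search engine API instead.

```python
GENRES = ["rock", "jazz", "blues", "pop", "country", "metal"]

def page_count(query):
    """Stand-in for a search engine's estimated result count. The canned
    numbers are invented so the scores reproduce the slide above; plug in
    a real search API to run the actual experiment."""
    canned = {
        '"madonna" "rock" music': 13_000,   '"rock" music': 1_000_000,
        '"madonna" "jazz" music': 6_500,    '"jazz" music': 500_000,
        '"madonna" "blues" music': 5_400,   '"blues" music': 600_000,
        '"madonna" "pop" music': 12_000,    '"pop" music': 800_000,
        '"madonna" "country" music': 6_300, '"country" music': 700_000,
        '"madonna" "metal" music': 2_000,   '"metal" music': 400_000,
    }
    return canned.get(query.lower(), 0)

def classify(artist):
    """Score each genre by the co-occurrence page count of the
    'Artist + Genre + Schema' query, normalized by the page count of the
    genre query alone, and pick the genre with the highest score."""
    scores = {}
    for genre in GENRES:
        joint = page_count(f'"{artist}" "{genre}" music')  # schema: ... music
        alone = page_count(f'"{genre}" music')
        scores[genre] = joint / alone if alone else 0.0
    return max(scores, key=scores.get), scores

best, scores = classify("Madonna")
print(best, scores)  # pop wins, with scores matching the slide
```
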
  38. RESULTS • first results were much worse; what happened? • re-run the original experiment • evaluate on the same data set: 1995 artists and 9 genres • different search engines: Google, Yahoo! and Live Search • over time: 8 runs over a period of 36 days
  39. WHAT TO USE? • use Google when it's stable, else rely on Yahoo! • when is it stable? test with a small set: some artists get classified incorrectly on bad days; compare the accuracy achieved on the test set to the average. Govaerts, Sten; Corthaut, Nik; Duval, Erik. Using search engine for classification: does it still work?, AdMIRe: International Workshop on Advances in Music Information Research 2009, San Diego, USA, 14-16 December 2009, pages 483-488, IEEE
  40. STUDENT ACTIVITY METER
  41-45. PROBLEM
  46. OBJECTIVES • self-monitoring for learners • awareness for teachers • time tracking • learning resource recommendation
  47-49. STUDENT ACTIVITY MONITOR
  50. 3 EVALUATIONS • with CS students • with CGIAR courses and teachers • with Learning and Knowledge Analytics course participants
  51. CS STUDENTS CASE STUDY • usability and user satisfaction evaluation • 12 CS students • 2 evaluation sessions: a task-based interview with think-aloud (after 1 week of tracking) and user satisfaction (SUS & MSDT) (after 1 month). Govaerts, Sten; Verbert, Katrien; Klerkx, Joris; Duval, Erik. Visualizing activities for self-reflection and awareness, ICWL10: International Conference on Web-based Learning, Shanghai, China, 7-11 December 2010, Lecture Notes in Computer Science, volume 6483, pages 91-100, Springer
  52. USABILITY & USER SATISFACTION • in general, people understand the visualizations well! • some small issues were uncovered... • average SUS score: 73% (std dev: 9.35)
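
The SUS computation itself is standard and easy to reproduce: ten items rated 1 to 5, odd items positively worded, even items negatively worded, and the sum scaled to 0-100. A small sketch with an invented response pattern:

```python
def sus_score(responses):
    """System Usability Scale: ten items rated 1-5; odd items contribute
    (response - 1), even items contribute (5 - response); the sum is
    scaled by 2.5 onto a 0-100 scale."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # i=0 is item 1 (odd)
    return 2.5 * total

# An invented, fairly positive response pattern scoring 72.5,
# close to the 73% average reported above.
print(sus_score([4, 2, 4, 2, 4, 2, 4, 3, 4, 2]))
```
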
  53-56. USABILITY & USER SATISFACTION
  57-58. CGIAR CASE STUDY
  59. CGIAR CASE STUDY, teacher feedback: search for students • more details on students • detect outliers • good indicator for student effort • understand the workload • more metrics • use for course design optimization • obtain course overview • compare students • increase awareness • want a better 1-to-1 comparison tool • progress evolution
  60-61. LAK CASE STUDY • open course on learning and knowledge analytics • visual analytics enthusiasts + experts (who can also teach)
  62. LAK CASE STUDY, participant feedback: verify the status of the classroom • more metrics • chronological course activity • dwell time • self-reflection to measure progress and increase motivation • find students experiencing problems and low engagement • more data
  63. FEDERATED SEARCH AND SOCIAL RECOMMENDATION WIDGET
  64-67. WHAT'S A WIDGET?!?
  68-70. CONTEXT • Personal Learning Environment: customizable; re-use, creation & mashup of tools and resources • enable users to access content in different contexts
  71. ARCHITECTURE
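
The federated-search half of the architecture can be sketched as fan-out plus merge. The two repository connectors below are hypothetical stand-ins for the real repository APIs the widget queries; the round-robin merge is one simple way to interleave ranked result lists.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical repository connectors; in the real widget each of these
# would wrap the search API of one media or learning-object repository.
def search_repo_a(query):
    return [{"title": f"{query} tutorial", "source": "repo A"}]

def search_repo_b(query):
    return [{"title": f"Introduction to {query}", "source": "repo B"},
            {"title": f"{query} in practice", "source": "repo B"}]

CONNECTORS = [search_repo_a, search_repo_b]

def federated_search(query):
    """Fan the query out to all repositories in parallel, then merge the
    ranked result lists round-robin so every source is represented near
    the top."""
    with ThreadPoolExecutor() as pool:
        result_lists = [list(r) for r in pool.map(lambda s: s(query), CONNECTORS)]
    merged = []
    while any(result_lists):
        for results in result_lists:
            if results:
                merged.append(results.pop(0))
    return merged

print(federated_search("social recommendation"))
```
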
  72-74. WIDGET
  75. EXTENDED PAGERANK (graph diagram: users Sten, Sandy and Erik linked to resources R1-R5 through 'saved/shared', 'liked'/'disliked' and 'friend connection' edges)
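
A minimal weighted PageRank over such a user/resource graph, computed by power iteration. The edge weights, and the choice to model 'disliked' as a weak positive weight instead of a negative one, are illustrative assumptions rather than the exact extension used by the recommender.

```python
# Users (Sten, Sandy, Erik) and resources (R1-R5) from the diagram above;
# the edge weights are illustrative assumptions (stronger for
# saved/shared and liked, weak for disliked), not the paper's values.
EDGES = {
    ("Sten", "R1"): 1.0,     # saved/shared
    ("Sten", "R2"): 0.8,     # liked
    ("Sten", "Sandy"): 0.5,  # friend connection
    ("Sandy", "R3"): 0.8,    # liked
    ("Sandy", "R4"): 0.1,    # disliked: kept weak rather than negative
    ("Sandy", "Erik"): 0.5,  # friend connection
    ("Erik", "R5"): 0.8,     # liked
}

def pagerank(edges, damping=0.85, iterations=50):
    """Weighted PageRank by power iteration over the user/resource graph."""
    nodes = {n for edge in edges for n in edge}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    out_weight = {n: sum(w for (src, _), w in edges.items() if src == n)
                  for n in nodes}
    for _ in range(iterations):
        new = {n: (1.0 - damping) / len(nodes) for n in nodes}
        # Distribute each node's rank along its weighted out-edges.
        for (src, dst), w in edges.items():
            new[dst] += damping * rank[src] * w / out_weight[src]
        # Resources have no out-edges; spread their rank mass uniformly.
        dangling = sum(rank[n] for n in nodes if out_weight[n] == 0)
        for n in nodes:
            new[n] += damping * dangling / len(nodes)
        rank = new
    return rank

for node, score in sorted(pagerank(EDGES).items(), key=lambda kv: -kv[1]):
    print(node, round(score, 3))
```
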
  76. EVALUATION • 15 PhD students at K.U.Leuven and EPFL • what? usability, user satisfaction, usefulness
  77. FIRST PHASE • current media search tools: Google & YouTube • understanding recommendations: 6/15 saw that they come from like/dislike
  78-80. SECOND PHASE • only 14 participants (one less) • open questions • usefulness of recommendations: 11/14 pro • user satisfaction: System Usability Scale (SUS) & MS Desirability Toolkit • SUS score: 66.25%, with 2 groups: K.U.Leuven high (75%), EPFL + one K.U.Leuven low (50%)
  81. WHY THE DIFFERENT SUS? • 1st phase done by 2 interviewers • issues: UI updates of unrelated widgets distract; layout too dense; widget heights too small • a K.U.Leuven student had prior experience with iGoogle • not evaluating the widget but the whole experience...
  82-84. DESIRABILITY...
  85. FURTHER EVALUATION • evaluation at a company and a university
  86. PUBLICATIONS: MUSIC • Govaerts, Sten; Corthaut, Nik; Duval, Erik. Moody tunes: the rockanango project, Lemström, Kjell; Tindale, Adam; Dannenberg, Roger (eds.), International Conference on Music Information Retrieval, ISMIR, Victoria, BC, 8-12 October 2006, pages 308-313, University of Victoria • Govaerts, Sten; Corthaut, Nik; Duval, Erik. Mood-ex-machina: towards automation of moody tunes, Dixon, Simon; Bainbridge, David; Typke, Rainer (eds.), International Conference on Music Information Retrieval, ISMIR, Vienna, Austria, 23-27 September 2007, Proceedings of the 8th International Conference on Music Information Retrieval, ISMIR 2007, pages 347-350, Österreichische Computer Gesellschaft • Corthaut, Nik; Govaerts, Sten; Verbert, Katrien; Duval, Erik. Connecting the dots: music metadata generation, schemas and applications, Bello, Juan Pablo; Chew, Elaine; Turnbull, Douglas (eds.), ISMIR, Philadelphia, USA, 14-18 September 2008, Proceedings of the 9th International Conference on Music Information Retrieval, pages 249-254 • Corthaut, Nik; Lippens, Stefaan; Govaerts, Sten; Duval, Erik; Martens, Jean-Pierre. The integration of a metadata generation framework in a music annotation workflow, ISMIR, Kobe, Japan, 26-30 October 2009, Proceedings of ISMIR 2009: 10th International Society for Music Information Retrieval Conference, ISMIR • Govaerts, Sten; Duval, Erik. A Web-based approach to determine the origin of an artist, ISMIR, Kobe, Japan, 26-30 October 2009, Proceedings of ISMIR 2009: 10th International Society for Music Information Retrieval Conference, pages 261-266, ISMIR • Govaerts, Sten; Corthaut, Nik; Duval, Erik. Using search engine for classification: does it still work?, AdMIRe: International Workshop on Advances in Music Information Research 2009, San Diego, USA, 14-16 December 2009, pages 483-488, IEEE
  87. ISMIR. "I think ISMIR was motivated from two directions: the desire for a focus on music indexing, search, and retrieval, which cuts across many disciplines, and a desire for a focused technical and scientific forum for music research." - Roger Dannenberg. "It is not just statistics and computer science (as Wikipedia explains for 'bioinformatics') but also many other aspects, including social, musicological, perceptual etc. ones." - Michael Fingerhut
  88. PUBLICATIONS: TEL • Parra Chico, Gonzalo; Govaerts, Sten; Duval, Erik. More! a social discovery tool for researchers, DIR 2010: 10th Dutch-Belgian Information Retrieval Workshop, Nijmegen, The Netherlands, 25 January 2010 • Renzel, D.; Hobelt, C.; Dahrendorf, D.; Friedrich, M.; Mödritscher, F.; Verbert, Katrien; Govaerts, Sten; Palmer, M.; Bogdanov, E. Collaborative development of a PLE for language learning, International Journal of Emerging Technologies in Learning, volume 5, 2010 • Govaerts, Sten; Verbert, Katrien; Klerkx, Joris; Duval, Erik. Visualizing activities for self-reflection and awareness, ICWL10: International Conference on Web-based Learning, Shanghai, China, 7-11 December 2010, Lecture Notes in Computer Science, volume 6483, pages 91-100, Springer • Govaerts, Sten; El Helou, Sandy; Duval, Erik; Gillet, Denis. A federated search and social recommendation widget, Proceedings of the 2nd International Workshop on Social Recommender Systems (SRS 2011) in conjunction with the 2011 ACM Conference on Computer Supported Cooperative Work (CSCW 2011), Hangzhou, China, 19-23 March 2011, pages 1-8
  89. PUBLICATIONS IN THE PIPELINE • Felix Mödritscher, Barbara Krumay, Sten Govaerts, Erik Duval, Sandy El Helou, Denis Gillet, Alexander Nussbaumer, Dietrich Albert, Carsten Ullrich. May I suggest? Three PLE recommender strategies in comparison, PLE Conference 2011, Southampton, UK. => ACCEPTED, not a core part of my PhD. • Govaerts, Sten; Verbert, Katrien; Duval, Erik. Visualizing student activities for teachers, IEEE Conference on Visual Analytics Science and Technology (IEEE VAST), Providence, USA => UNDER REVIEW, notification 8 June. • Sten Govaerts, Katrien Verbert, Daniel Dahrendorf, Carsten Ullrich, Manuel Schmidt, Michael Werkle, Arunangsu Chatterjee, Alexander Nussbaumer, Dominik Renzel, Maren Scheffel, Martin Friedrich, Jose Luis Santos, Effie L-C Law, Erik Duval. Towards responsive open learning environments: the ROLE interoperability framework, The 6th European Conference on Technology Enhanced Learning: Towards Ubiquitous Learning, Lecture Notes in Computer Science. => UNDER REVIEW, notification 31 May.
  90. FUTURE PLANNING • Ph.D. on papers • if the 2 papers under review are accepted => FINISH • potential for writing journal article(s): 0 articles => submit end of September; 1 article => submit end of November; 2 articles => submit around Christmas
  91. THANK YOU! QUESTIONS? Slides will appear on http://www.slideshare.net/stengovaerts
