This meta-analysis of 41 evaluation studies of the Europeana Digital Library categorizes them by their constructs, contexts, criteria, and methodologies using Saracevic's digital library evaluation framework. The analysis shows that system-centered evaluations prevail over user-centered ones, while evaluations from a societal or institutional perspective are missing. The study reveals which Europeana components have received focused attention in the last decade (e.g. the metadata) and can serve as a reference for identifying gaps, selecting methodologies, and re-using data in future evaluations.
1. A Decade of Evaluating Europeana: Constructs, Contexts, Methods & Criteria
Vivien Petras & Juliane Stiller
Thessaloniki, 19 September 2017, TPDL 2017
Image: Europeana-5 by Europeana EU, CC BY-SA 2.0
6. Saracevic‘s Framework
Element: Description
Construct: What is evaluated? The aspect that is the focus of the evaluation, for example the metadata or the search functionality.
Context: Which perspective is used for the evaluation? The user-centered, interface, or system-centered perspective.
Criteria: Which objective is evaluated?
Measures: How are the criteria evaluated?
Methodology: Which approach, instrument, or tool is used for data collection and analysis in the evaluation?
Saracevic, T. (2000). Digital library evaluation: Toward an evolution of concepts. Library Trends, 49(2), 350.
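To make the five dimensions concrete, one could code each reviewed study as a small record along Saracevic's elements. This is an illustrative sketch only; the field names and the example values are assumptions, not the coding scheme actually used in the paper:

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationStudy:
    """One coded evaluation study (illustrative fields only)."""
    construct: str                 # what is evaluated, e.g. "metadata"
    context: str                   # perspective, e.g. "system-centered"
    criteria: list = field(default_factory=list)   # evaluated objectives
    measures: list = field(default_factory=list)   # how criteria are measured
    methodology: str = ""          # data collection / analysis approach

# Hypothetical example of one coded study
study = EvaluationStudy(
    construct="metadata",
    context="system-centered",
    criteria=["completeness"],
    measures=["fields filled per record"],
    methodology="criteria-based",
)
```

Coding studies into a uniform structure like this is what makes the counts per construct and context comparable across the 41 publications.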
7. Data and Information Extraction
Sources: existing lists of evaluations, Web of Science, and Google Scholar
These yielded:
38 evaluations with Europeana as the object,
3 evaluations using Europeana data, and
14 meta-studies that named Europeana as a use case.
Information was extracted from the 41 publications:
as close to the source as possible,
grouped into categories designed by Saracevic (e.g. Context),
with new categories developed for the other elements.
11. Constructs & Contexts
[Bar chart: number of studies per construct (Europeana DL, External Service, Europeana component, Europeana in comparison, Algorithm), broken down by context (user-centered, system-centered, interface). *studies can appear in more than one context]
12. Criteria & Measures
Studies often define their own criteria.
Usability and the effectiveness of the interaction design, user behavior, and algorithm performance were used as objectives more often than others.
13. Methods
Method              | Description                                                                                   | #
Criteria-based      | Certain criteria were determined to assess a service or algorithm                             | 16
Gold standard-based | Use of a manually created gold standard to assess performance                                 | 9
Logfile analysis    | Uses an automatically created logfile of user interactions                                    | 8
Usability study     | Several methods to assess the usability of a service, e.g. user studies, interviews, surveys  | 7
Impact study        | Expert assessment of the overall value of a service                                           | 2
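The gold standard-based method compares a system's output against a manually created reference. A minimal sketch of such a comparison, assuming both sides are represented as sets of document identifiers (the ids and numbers are illustrative, not from any Europeana study):

```python
def precision_recall(retrieved, gold):
    """Compare retrieved items against a manually created
    gold standard; both are collections of document ids."""
    retrieved, gold = set(retrieved), set(gold)
    hits = retrieved & gold  # correctly retrieved items
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical run: 4 retrieved documents, 3 in the gold standard
p, r = precision_recall(["d1", "d2", "d3", "d4"], ["d1", "d3", "d5"])
# p = 2/4 = 0.5, r = 2/3
```

Criteria-based studies work similarly in spirit, except that the yardstick is a predefined list of criteria rather than a reference result set.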
15. Conclusions
Studies with a focus on the system perspective prevail.
Future evaluations should take on the institutional and societal perspective.
Studies on the overall impact of Europeana are still rare: what are the success criteria of cultural heritage systems?
The Europeana Impact Framework tries to find answers to this question:
https://pro.europeana.eu/what-we-do/impact
16. Conclusions
Learn from an Evaluation Archive
Track improvements over time
Standardize evaluations
Document the changes