1. EVIDENCE BASED
LIBRARIANSHIP
IN PRACTICE
USING EVIDENCE IN HEALTH SCIENCES LIBRARIES
Lorie Kloda, MLIS, PhD, AHIP
McGill University
Central New York Library Resources Council, Syracuse, March 2014
3. Introductions
1. Your name
2. Your title/position
3. Your city, institution
4. What is your interest in evidence based
practice? Why are you here today?
5. Course objectives
• Identify the steps in evidence based practice
• Formulate answerable questions relevant to your own work setting
• Define what constitutes evidence in your own work setting
• Identify strategies for locating local or external evidence to answer your questions
• Make use of tools for critically appraising published
research
• Provide examples of how evidence can be applied by
health librarians in the real world
8. What is EBLIP?
“an approach to information science that promotes the
collection, interpretation and integration of valid, important
and applicable user-reported, librarian-observed, and
research-derived evidence. The best available evidence,
moderated by user needs and preferences, is applied to
improve the quality of professional judgements.”
(Booth, 2000)
9. Why should you care?
“Wisdom means acting with knowledge while doubting what
you know.”
Jeffrey Pfeffer & Robert I. Sutton
10. A brief history
1997 Jon Eldredge publishes an article in Hypothesis
2000 MLA Research Section creates an Evidence-Based Librarianship Implementation Committee
2000 Eldredge publishes papers that provide the framework for EBL
2001 1st Evidence Based Librarianship conference held in Sheffield, UK
2004 Booth and Brice publish their book on EBIP
2006 EBLIP journal launches
11. The 5 As of EBLIP
1) Formulate a focused question (Ask)
2) Find the best evidence to help answer that question (Acquire)
3) Critically appraise what you have found to ensure the quality of the evidence (Appraise)
4) Apply what you have learned to your practice (Apply)
5) Evaluate your performance (Assess)
13. Is the EBLIP model used?
• The ideal vs reality
• Criticisms of EBLIP
• Barriers to practicing in an evidence based manner
14. Barriers to evidence use
• Organizational dynamics
• Lack of time/competing demands on time
• Personal outlook / lack of confidence
• Education and training gaps
• Information needs not being met
• Financial limits
16. Other considerations
• Individual vs group decision making
• Influences / biases
• Impact of work environment
• Types of evidence
• Enablers
17. Widening the model
A revised process:
Articulate – come to an understanding of the problem and articulate it.
Assemble – assemble evidence from multiple sources that are most appropriate to the problem at hand.
Assess – place the evidence against all components of the wider overarching problem. Assess the evidence for its quantity and quality.
Agree – determine the best way forward and, if working with a group, try to achieve consensus based on the evidence and organisational goals.
Adapt – revisit goals and needs. Reflect on the success of the implementation.
19. Questions to ask yourself
A cycle of questions centered on the practitioner:
• What do I already know?
• What local evidence is available?
• What does the literature say?
• What other information do I need to gather?
• How does the information I have apply to my context?
• Make a decision
• What worked? What didn't? What did I learn?
20. Case examples
Academic librarian wants to know what professors think of information literacy instruction to students
Librarian at a pediatric hospital wonders if residents’ searches are improved with librarian assistance
23. “Questions drive the entire EBL process. […] The wording and content of the questions will determine what kinds of research designs are needed to secure answers.”
(J. Eldredge, 2000)
24. SPICE question structure
Setting – the context (e.g., hospital library, academic health center)
Perspective – the stakeholder(s) (e.g., graduate students, managers, reference librarians)
Intervention – the service being offered (e.g., chat reference, RefWorks workshops)
Comparison – the service to which it is being compared (optional)
Evaluation – the measure used to determine change/success/impact (e.g., usage statistics, course grade)
25. Librarianship domains
Reference/Enquiries – providing service and access to information that meets the needs of library users.
Education – incorporating teaching methods and strategies to educate users about library resources and how to improve research skills.
LIS Education subset – specifically pertaining to the professional education of librarians.
Collections – building a high-quality collection of print and electronic materials that is useful and cost-effective and meets users’ needs.
Management – managing people and resources within an organization, including marketing and promotion as well as human resources.
Information access and retrieval – creating better systems and methods for information retrieval and access.
Professional issues – exploring issues that affect librarians as a profession.
(Koufogiannakis, Crumley, and Slater, 2004)
27. Burning question example 1
What are university faculty members’
perceptions of information literacy?
28. SPICE example 1
Setting – Research university
Perspective – Librarians; professors
Intervention – Survey questionnaire to determine attitudes, perceptions, experiences
Comparison – Not applicable
Evaluation – Ratings of information literacy competencies; inclusion of IL in courses; disciplinary differences
29. Burning question example 2
Are pediatric residents’ search results improved
with help from a librarian?
30. SPICE example 2
Setting – Pediatric teaching hospital
Perspective – Librarians
Intervention – Help from a medical librarian for a literature search
Comparison – Literature search without assistance
Evaluation – Relevance of retrieved results; quality of search strategy
33. Definition of evidence
“the available body of facts or information indicating
whether a belief or proposition is true or valid”
(Oxford English Dictionary, 2011)
34. ACTIVITY 3
What are some possible
evidence sources we use to
make decisions in libraries?
35. Evidence Sources
Hard evidence:
• Published literature
• Statistics
• Local research and evaluation
• Other documents
• Facts

Soft evidence:
• Input from colleagues
• Tacit knowledge
• Feedback from users
• Anecdotal evidence
45. Locating published evidence
Evidence summaries
• Evidence Based Library and Information Practice journal, 2006-
• http://ejournals.library.ualberta.ca/index.php/EBLIP
• >250 evidence summaries
47. Creating evidence
Data and findings
• Usage data
• Transaction data
• Evaluation results
• Survey, interview, focus group findings
48. Creating evidence
Sources for local evidence already available
• Library assessment department
• University planning and institutional analysis
• Annual reports
• Internal reports
• "Stats"
50. Evidence for example 1
Locating evidence
• Databases: LISA
• Systematic Review Wiki
• Journals: Communications in Information Literacy, Journal of Information Literacy, Journal of Academic Librarianship
• Conferences: LILAC, LOEX, WILU
• EBLIP Evidence Summary
Creating evidence
• Survey questionnaire
51. Evidence for example 2
Locating evidence
• Databases: LibValue, LISA
• Systematic Review Wiki
• Journals: JMLA, HILJ, etc.
• Conferences: MLA
• EBLIP Evidence Summary
Creating evidence
• ???
52. ACTIVITY 4
1. Identify 2-3 sources for locating evidence to answer your question
2. Consider 1 potential source of local evidence to look into
54. Critical appraisal
Weigh up the evidence
• Reliable
• Valid
• Applicable
Checklists help with the critical appraisal process
The language differs for interpretive (qualitative) research
55. Reliability
1. Results clearly explained
2. Response rate
3. Useful analysis
4. Appropriate analysis
5. Results address research question(s)
6. Limitations
7. Conclusions based on actual results
56. Validity
1. Focused issue/question
2. Conflict of interest
3. Appropriate and replicable method
4. Population and representative sample
5. Validated instrument
58. ReLIANT
For appraising research on information skills instruction
Focuses on:
• Study design
• Educational context
• Results
• Relevance
Koufogiannakis, D., Booth, A., & Brettle, A. (2006). ReLIANT: Reader's Guide to the Literature on Interventions Addressing the Need for Education and Training. Library & Information Research, 30(94), 44-51.
59. CRiSTAL Checklist
For appraising user studies
Focuses on:
• Study design
• Results
• Relevance
Developed by Andrew Booth and Anne Brice. Available from:
http://nettingtheevidence.pbworks.com/w/page/11403006/Critical%20Appraisal%20Checklists
63. Ways to apply evidence
1) The evidence is directly applicable
2) The evidence needs to be locally validated
3) The evidence improves understanding
Reflection