6. What is a Systematic Review?
Scientific & Unbiased:
● “A systematic review involves the application of scientific strategies, in ways that limit bias, to the assembly, critical appraisal, and synthesis of all relevant studies that address a specific clinical question.”
Summary:
● “A meta-analysis is a type of systematic review that uses statistical methods to combine and summarize the results of several primary studies.”
Clearly Reported:
● “Because the review process itself (like any other type of research) is subject to bias, a useful review requires clear reporting of information obtained using rigorous methods.”
● Cook DJ, Mulrow CD, Haynes RB. Systematic Reviews: Synthesis of Best Evidence for Clinical Decisions. Annals of Internal Medicine 1997;126(5):376-380.
8. Cochrane Review Teams
Clinical expert
● Initiates, defines, selects topic.
Clinical expert
● Partners in the above process, and collaborates in the review to prevent bias.
Statistician
● Provides methodological oversight, ensures process quality for the entire project.
Librarian
● Provides methodological oversight, ensures process quality for the information search process.
Healthcare Consumer
● Provides insight into the priorities for research; serves as an information conduit for relating priorities and findings between consumers and clinicians.
9. What is Evidence-Based Healthcare?
●According to the ADA policy statement on EBD, the term "best evidence" "refers to information obtained from randomized controlled clinical trials, nonrandomized controlled clinical trials, cohort studies, case-control studies, crossover studies, cross-sectional studies, case studies or, in the absence of scientific evidence, the consensus opinion of experts in the appropriate fields of research or clinical practice. The strength of the evidence follows the order of the studies or opinions listed above.”
● Ismail AI, Bader JD. Evidence-based dentistry in clinical practice. JADA 2004;135(1):78-83.
10. What is Evidence-Based Healthcare?
●Short version:
●‘Make your [clinical] decisions based on the
best evidence available, integrated with your
clinical judgment. That’s all it means. The best
evidence, whatever that is.’
● Paraphrased from Dr. Ismail in
conversation, circa 2003.
13. Process & Methodology, Overview
Team meets
● Define topic, overview literature base, suggest inclusion/exclusion criteria, discuss methodology & timeline.
Librarian
● Generates data for the team
● FRIAR/MEMORABLE/SECT
● Topic experts collaborate
Topic experts
● Review data at 3-4 levels (title, abstract, article, [request additional information]), achieve consensus
● Handsearching (librarian generates list, experts implement)
● Determine level of evidence for remaining research
● Generate review tables
Share findings (Publication)
● Strength of evidence available (strong, weak, inadequate); suggest directions for future research to fill gaps in the research base
14. Expectations of the Librarian Role
●Search strategy
● Background research of already-published similar search strategies and systematic reviews on related topics
● Suggest appropriate terms & concepts for review by clinical experts
● Clear communication with the team about the relationship of the search to the question and methodology
● Sensitivity / specificity (see the sketch after this list)
● Validated, revised, adhering to standards / guidelines / best practices
● Publication-ready copy of strategy
● Variant strategies for other databases
●Data set
● In appropriate format
● Data set management support
●Methodology oversight
● Write / revise methodology & results as appropriate
● Assure replicability of methods
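A minimal sketch of the sensitivity / specificity trade-off mentioned above, assuming the team has a gold-standard (sentinel) set of PMIDs to test the search against; all identifiers are invented for illustration:

    # Sketch: estimating recall (sensitivity) and precision of a search
    # against a known set of relevant records. PMIDs are placeholders.
    relevant = {"11111111", "22222222", "33333333", "44444444"}   # gold standard / sentinels
    retrieved = {"11111111", "22222222", "33333333", "55555555", "66666666"}

    true_positives = relevant & retrieved
    sensitivity = len(true_positives) / len(relevant)   # share of relevant records retrieved
    precision = len(true_positives) / len(retrieved)    # share of retrieved records that are relevant

    print(f"Sensitivity (recall): {sensitivity:.2f}")   # 0.75 -> one relevant record missed
    print(f"Precision: {precision:.2f}")                # 0.60 -> two false positives to screen

A highly sensitive strategy trades precision for completeness; the team decides how much screening burden is acceptable.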
16. Special Issues
Authorship
● Justifying co-authorship
● Criteria
● Avoiding plagiarism, self-plagiarism, and other questionable writing practices: A guide to ethical writing: http://ori.dhhs.gov/education/products/plagiarism/index.shtml
● Establishing authorship: http://ori.dhhs.gov/education/products/plagiarism/33.shtml
● Co-authorship versus fee-for-services-rendered (or both)
● When do you NOT want to be a co-author?
18. Frame = Question (PIO/PICO/PICOT)
●P = Patient
●I = Intervention
●C = Control group or comparison
● NOTE: In very small research domains, this portion may not be included. Such a systematic review would not reach clinical significance, but would instead focus on the levels of evidence available and directions for future research.
●O = Outcome desired
●T = Time factor
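As a worked illustration of the frame, a PICO(T) question can be captured as structured data so that each element later drives its own block of search terms. A minimal sketch, with the topic borrowed loosely from the Torabinejad review cited later in this deck and the field names invented here:

    # Sketch: a PICO(T) question as structured data; each element maps to
    # one concept block in the search strategy. Values are illustrative.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class PicoT:
        patient: str
        intervention: str
        comparison: Optional[str]   # may be omitted in very small research domains
        outcome: str
        time: Optional[str] = None

    question = PicoT(
        patient="adults with teeth requiring root canal treatment",
        intervention="root canal treatment and restoration",
        comparison="extraction without replacement",
        outcome="tooth/prosthesis survival and patient satisfaction",
        time="longest available follow-up",
    )
    print(question)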
19. R = Relevance
●Searches usually have 2-4 topics that are relevant and searchable.
●Rank relevant searchable topics by importance.
●TIP: Structure the search to place the greatest attention, focus, and
richness of term generation on the most important topic.
20. I = Irrelevant
●Too broad
●Too specific
●Unsearchable
●NOTE: "Irrelevant" topics are not irrelevant to the question or the clinical decision; they remain important to the final decisions. They are irrelevant only to the search process.
21. A = Alternates/Aliases
● Term generation process might include:
● Alternate terms, spellings (UK), archaic terms
● Acronyms & what they stand for
● Anatomical area, symptoms, diagnostic criteria
● Products, chemicals, microorganisms, registry numbers, etc.
●NOTE: After asking the question, this is the most important part of the process.
●TIP: Have the team brainstorm terms, then search for more, then have the team review the added terms (see the sketch below).
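A minimal sketch of folding brainstormed aliases into one OR block per concept; the terms shown are illustrative placeholders for the team to review and extend:

    # Sketch: assembling brainstormed aliases into one OR block per concept.
    concept_aliases = {
        "temporomandibular joint disorder": [
            "temporomandibular joint disorder",
            "temporomandibular joint disorders",
            "temporomandibular disorder",
            "TMJD",
            "TMD",
            "craniomandibular disorder",   # alternate / older terminology
        ],
    }

    def or_block(terms):
        """Quote multi-word terms and join everything with OR."""
        quoted = [f'"{t}"' if " " in t else t for t in terms]
        return "(" + " OR ".join(quoted) + ")"

    for concept, terms in concept_aliases.items():
        print(concept, "->", or_block(terms))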
22. R = Review, Revise, Repeat
●Did the search retrieve what it should? (True positives)
● If not, why not?
●Did the search retrieve what it shouldn't? (False positives)
● Are there patterns that would allow you to exclude the false positives without losing the ones you want? (See the sketch below.)
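One way to hunt for such patterns is to compare the indexing of known false positives against the indexing of the records you want. A minimal sketch, with invented records and MeSH terms:

    # Sketch: MeSH terms that occur only on false positives are candidates
    # for exclusion filters. All records and terms are placeholders.
    from collections import Counter

    wanted = {
        "111": {"Temporomandibular Joint Disorders", "Treatment Outcome"},
        "222": {"Temporomandibular Joint Disorders", "Splints"},
    }
    false_positives = {
        "888": {"Temporomandibular Joint Disorders", "Animals", "Rats"},
        "999": {"Temporomandibular Joint Disorders", "Animals", "Dogs"},
    }

    terms_in_wanted = set().union(*wanted.values())
    fp_counts = Counter(t for mesh in false_positives.values() for t in mesh)

    exclusion_candidates = {t: n for t, n in fp_counts.items() if t not in terms_in_wanted}
    print(exclusion_candidates)   # e.g. {'Animals': 2, 'Rats': 1, 'Dogs': 1}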
24. Sentinel Articles
●Number of sentinels desired: 3-5. You can have more or fewer, but this range tends to work best. Verify the appropriateness of the selected sentinels.
●Neither very recent (current year) nor old (before 1985)
● Articles old enough to have MeSH assigned, new enough to have complete indexing
● On topic, not broader or narrower
● Well-indexed with appropriate terms
●Representative of citations that would be retrieved by a well-done search
●Remember: the purpose is validating the search, not proving you know the best articles on the topic
●Each sentinel article must represent ALL desired concepts in the search (see the sketch below)
● Articles selected must meet all inclusion and exclusion criteria.
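A minimal sketch of that last check, verifying that each candidate sentinel carries at least one term from every concept block; the concept blocks and sentinel indexing below are invented placeholders:

    # Sketch: confirm every sentinel represents ALL concepts in the search.
    concepts = {
        "anatomy": {"Temporomandibular Joint", "Temporomandibular Joint Disorders"},
        "therapy": {"Occlusal Splints", "Splints"},
        "outcome": {"Treatment Outcome", "Pain Measurement"},
    }
    sentinel_mesh = {
        "11111111": {"Temporomandibular Joint Disorders", "Splints", "Treatment Outcome"},
        "22222222": {"Temporomandibular Joint Disorders", "Occlusal Splints"},  # no outcome term
    }

    for pmid, mesh in sentinel_mesh.items():
        missing = [name for name, terms in concepts.items() if not (terms & mesh)]
        print(pmid, "OK" if not missing else "missing concepts: " + ", ".join(missing))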
25. MeSH Tip
●Earlier term mappings prior to assignment of a MeSH term are often:
● presenting symptom or diagnosis
● anatomical area
●TMJD Example (see the sketch below):
● TMJD = temporomandibular joint disorder
● = (Temporomandibular Joint [anatomical area]) + ("myofascial pain" OR "Bone Diseases") [presenting symptom OR diagnosis]
Image: Frank Gaillard. Normal anatomy of the Temporomandibular joint. 14 Jan 2009. http://commons.wikimedia.org/wiki/File:Temporomandibular_joint.png
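A minimal sketch of composing that pre-MeSH query from the anatomical-area block plus the symptom/diagnosis block; the exact field tags and syntax depend on the database and should be verified:

    # Sketch: the pre-MeSH TMJD construction described above.
    anatomy = '"Temporomandibular Joint"'
    symptoms_or_diagnosis = '("myofascial pain" OR "Bone Diseases")'
    pre_mesh_query = f"{anatomy} AND {symptoms_or_diagnosis}"
    print(pre_mesh_query)
    # "Temporomandibular Joint" AND ("myofascial pain" OR "Bone Diseases")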
26. MeSH & Sentinels 1 & 2
●Verify sentinel citations in MEDLINE
●Save file with full citations (abstract, MeSH headings, everything)
●Make duplicate file to process
Process citations
● strip out everything except MeSH terms
● sort alphabetically
● strip irrelevant MeSH terms
● dedupe
● note frequency of repetition of terms to aid in determining weight of term
● note distribution and frequency of subheadings
27. MeSH & Sentinels 3
Analyse MeSH Terms
● retain topical terms
● retain methodology terms
● retain non-MeSH terms such as publication type and registry numbers (but separate from core concept terms)
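A minimal sketch of the tallying step on slides 26-27, assuming the sentinel citations were saved in MEDLINE format (MeSH headings on lines beginning "MH  - "); the two abbreviated records are invented stand-ins:

    # Sketch: count MeSH term and subheading frequency across sentinel records.
    from collections import Counter

    medline_text = """\
    PMID- 11111111
    MH  - Temporomandibular Joint Disorders/therapy
    MH  - Splints
    PMID- 22222222
    MH  - Temporomandibular Joint Disorders/diagnosis
    MH  - Treatment Outcome
    """

    terms, subheadings = Counter(), Counter()
    for raw in medline_text.splitlines():
        line = raw.strip()
        if line.startswith("MH  - "):
            heading = line[6:].lstrip("*")             # drop the major-topic asterisk if present
            term, _, subheading = heading.partition("/")
            terms[term] += 1
            if subheading:
                subheadings[subheading] += 1

    print(terms.most_common())        # frequency suggests the weight of each term
    print(subheadings.most_common())  # distribution of subheadings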
28. Systematic Review Search Strategy Example
●Torabinejad M, Anderson P, Bader J, Brown LJ, Chen LH, Goodacre CJ, Kattadiyil MT, Kutsenko D, Lozada J, Patel R, Petersen F, Puterman I, White SN. Outcomes of root canal treatment and restoration, implant-supported single crowns, fixed partial dentures, and extraction without replacement: a systematic review. J Prosthet Dent. 2007 Oct;98(4):285-311. PMID: 17936128
29. Sources of Search Strategies
●Lingo: filters vs. hedges
●Search the Methods of existing systematic reviews.
●Warning:
● Many articles published as systematic reviews may have modified
the process.
● Many articles published as systematic reviews may not include a
replicable search methodology.
● Some articles published as systematic reviews may not actually
be systematic reviews.
30. Sources of Search Strategies
●NIH Consensus Development Conference on Dental Caries:
● http://www.lib.umich.edu/health-sciences-libraries/nih-consensus-development-conference-diagnosis-and-management-dental-carie
●EBHC Strategies Wiki:
● http://ebhcstrategies.wetpaint.com/
31. Data Abstraction / Extraction
●Symbolic expression always
arises through abstraction and
simplification, and can only
reflect a small part of reality.
● Cajal, Advice to a Young
Investigator, p. 76
33. Evidence Table Example
●Levels of evidence
●Participant characteristics
●Study characteristics
●Intervention and outcome measurements
●Results
●Study limitations
●Inclusion/Exclusion criteria
http://www.aota.org/DocumentVault/AJOT/Template.aspx?FT=.pdf
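A minimal sketch of one flat layout for such an evidence table, using the column groups listed above; the column names and the sample row are illustrative placeholders, not the linked template itself:

    # Sketch: an evidence-table row written as CSV.
    import csv
    import sys

    columns = [
        "citation", "level_of_evidence", "participant_characteristics",
        "study_characteristics", "intervention_and_outcome_measures",
        "results", "study_limitations", "inclusion_exclusion_notes",
    ]
    rows = [{
        "citation": "Author A, et al. Year.",
        "level_of_evidence": "II",
        "participant_characteristics": "sample size, age, setting",
        "study_characteristics": "design, duration, follow-up",
        "intervention_and_outcome_measures": "intervention vs. comparison; outcome measures used",
        "results": "summary of findings as reported",
        "study_limitations": "e.g., attrition, blinding",
        "inclusion_exclusion_notes": "criteria met / not met",
    }]

    writer = csv.DictWriter(sys.stdout, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)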
34. Data abstraction / extraction samples
Cochrane:
● Forms: http://endoc.cochrane.org/data-extraction-sheets
● Elements: http://www.cochrane.org/handbook/table-73a-checklist-items-consider-data-collection-or-data-extraction
● Cochrane CFGD November 2004 * (DOC): http://cfgd.cochrane.org/sites/cfgd.cochrane.org/files/uploads/Study%20selection%20and%20%20extraction%20form.doc
● Data Extraction Template, 2011 (XLS): http://www.latrobe.edu.au/chcp/assets/downloads/DET_2011.xls
● Overview: http://www.cochrane-pro-mg.org/Documents/Reviewsheet-1-4-08.pdf
Other
● CDC Data Abstraction Form * : http://www.nccmt.ca/uploads/registry/CDC%20Tool.pdf
● Social science example (suicide): http://www.cmaj.ca/content/suppl/2009/01/29/180.3.291.DC2/ssri-barbui-3-at.pdf
35. Clearly Stating the Evidence (PRISMA)
NOTE: PRISMA-P coming later this year
http://www.prisma-statement.org/2.1.4%20-%20PRISMA%20Flow%202009%20Diagram.pdf
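A minimal sketch of how the counts in the PRISMA 2009 flow diagram relate to one another; all numbers are placeholders, not data from any review:

    # Sketch: deriving PRISMA flow-diagram counts from screening tallies.
    identified_databases = 1200    # records identified through database searching
    identified_other = 45          # additional records identified through other sources
    after_dedup = 980              # records after duplicates removed (= records screened)
    excluded_title_abstract = 840  # records excluded at title/abstract screening

    fulltext_assessed = after_dedup - excluded_title_abstract    # full-text articles assessed
    fulltext_excluded = 110        # full-text articles excluded, with reasons
    included = fulltext_assessed - fulltext_excluded             # studies included in synthesis

    print("Identified:", identified_databases + identified_other)
    print("Screened:", after_dedup)
    print("Full-text assessed:", fulltext_assessed)   # 140
    print("Included:", included)                      # 30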
36. Guidelines for Assessing/Reporting Results
●CONSORT (Consolidated Standards of Reporting Trials) 2001, 2010
●ASSERT (A Standard for the Scientific & Ethical Review of Trials) 2001
○ EQUATOR (Enhancing the QUAlity & Transparency Of health Research) 2008
○ SPIRIT (Standard Protocol Items for Randomized Trials) 2008
●QUOROM (Quality Of Reporting Of Meta-analyses) 1999, 2009
●MOOSE (Meta-analysis Of Observational Studies in Epidemiology) 2000
●STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) 2007, STROBE-ME 2011
●Others: CENT, COREQ, GNOSIS, ORION, RECORD, REFLECT, REMARK, REMINDE, SQUIRE, STARD, STREGA, TREND, WIDER, ...
37. Evolution of Guidelines for Assessing/Reporting Results
●CONSORT 2001 —> 2010 —> PRISMA
●QUOROM 1999 —> 2009 —> PRISMA
●ASSERT 2001 —> SPIRIT 2008 —> EQUATOR 2008
●STROBE 2007 —> STROBE-ME 2011
●More on fragmentation & history: Jan P. Vandenbroucke. Journal of Clinical Epidemiology 62 (2009):594-596. http://www.veteditors.org/Publication%20Guidelines/JOURNAL%20VARIATION%20on%20Guidelines%202009.pdf
●CURRENT LIST: http://www.equator-network.org/resource-centre/library-of-health-research-reporting/
38. Standard Team/Process vs Reality: Case Studies
Elements
● Team: size, configuration
● Domain size
● Librarian role, #
● Clerical support
● Software / tech support
● Handsearching / experts
● Outcomes
  ● Most common?
  ● Underestimating time/labor requirements
  ● First time findings: "insufficient evidence"
Projects & Examples
● NIH Consensus Development Conference on Dental Caries
● Torabinejad: Root Canal Treatment
● Cochrane: Herpes Simplex (Oral Health, Skin)
● Delta Dental
● Diabetes
● Articaine vs. Lidocaine
● Healthcare Social Media