Librarians at University of Michigan Taubman Health Sciences Library
Mar 25, 2018 • 1,615 views
Rapid Reviews 101
Health & Medicine
A basic introduction to rapid reviews, created for a graduate student workshop, March 2018, presented by PF Anderson from the University of Michigan. Includes links to more resources, standards and guidelines, tools, software, and more.
4. Rapid Review: Definition
“A formal definition for a rapid review does not exist. As such, we used the
following working definition, ‘a rapid review is a type of knowledge synthesis
in which components of the systematic review process are simplified or
omitted to produce information in a short period of time’.”
Andrea C. Tricco, Jesmin Antony, Wasifa Zarin, Lisa Strifler, Marco Ghassemi, John Ivory, Laure Perrier, Brian
Hutton, David Moher, and Sharon E. Straus. A scoping review of rapid review methods. BMC Med. 2015; 13:
224. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4574114/
7. Rapid Reviews in Context: More Types
● Comparative effectiveness reviews
● Critical review
● Effectiveness reviews
● Health technology assessment
● Literature review
● Mapping review / Systematic map
● Meta-analysis
● Mixed methods / Mixed studies
● Multi-arm systematic review
● Overview
● Qualitative systematic review / qualitative evidence synthesis
● Rapid review
● Review of reviews
● Scoping review / Quick scoping review
● State-of-the-art review
● Systematic review
● Systematic search and review
● Systematized review
● Umbrella review
Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J. 2009 Jun;26(2):91-108. doi: 10.1111/j.1471-1842.2009.00848.x. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1471-1842.2009.00848.x
HLWiki International: Rapid Reviews http://hlwiki.slais.ubc.ca/index.php/Rapid_reviews (Dean Giustini, 9 March 2018)
8. Rapid Reviews in Context: Time
“Traditional literature reviews do not apply additional statistical methods to
the materials found.
Systematic reviews take exponentially more time to do, from the search
strategy creation itself, to going through each retrieved citation in duplicate
or triplicate, to analyzing the data from the included articles.”
Melissa L. Rethlefsen. “I Want To Do a Systematic Review.”
https://liblog.mayo.edu/2013/05/01/i-want-to-do-a-systematic-review/
9. Rapid Reviews in Context: Time
● Systematic review
○ Recommended time = 12 months
○ Average or typical time = 23 months
○ Can range up to several years
● Rapid review
○ Average or typical time = 6 months
○ Can range from 1 month to a year
10. Rapid Reviews Are Different … How?
Elements standard in systematic reviews that may be altered in rapid reviews:
● SCOPE (type/number of questions; number of studies included)
● COMPREHENSIVENESS (search: databases, hand searching, date, setting, languages; study types; text analysis)
● RIGOR (eliminate dual study selection and/or data extraction; peer review)
● SYNTHESIS (limit or eliminate risk of bias testing, quality assessment of studies, quality assessment of evidence; analysis reduced to either quantitative or qualitative)
Roberfroid D, Fairon N, San Miguel L, Paulus D. Method — Rapid Reviews. KCE (Belgian Health Care Knowledge Centre), KCE
Process Notes, 2017. https://kce.fgov.be/sites/default/files/atoms/files/Rapid_Review_0_0.pdf
11. Overview of Rapid Reviews (Tricco et al)
AC Tricco et al.. A scoping review of rapid review methods. BMC Med. 2015; 13: 224. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4574114/
13. Questions to ask yourself before you start
What is your topic?
What is your question?
Why this question? What is the goal?
Do you have the people and resources needed? How will you include stakeholders’ insights?
What data do you need to report? What data are you planning to capture? How?
How do you plan to minimize and assess bias in the question, team, process, data, etc?
When do you call it “done” or “good enough”? How will you decide?
Image credit: https://openclipart.org/detail/252546/prismatic-question-mark-fractal-5-no-background
14. #1 TIP: TRANSPARENCY RULES!!!
“Lesson 1: The notion of a rapid-review is
ill-defined. However, introducing one
methodology isn’t necessarily appropriate. What
is important is transparency behind the process.”
Rapid versus systematic reviews – part 2
https://blog.tripdatabase.com/2012/04/24/rapid-versus-systematic-reviews-part-2/
Image source:
https://openclipart.org/detail/13842/beverage-glass-tumbler
15. #1 TIP: TRANSPARENCY RULES!!!
Keep detailed notes about your process.
Keep records of search strategies, including versions and changes.
Don't get confused by test runs of data analysis from draft versions of the search; wait until the search is final, and clearly label the final data set for analysis.
Use file naming conventions that include metadata such as the date of the file or search.
Image source: https://openclipart.org/detail/182192/papillon-transparent
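The naming-convention tip can be made mechanical. A minimal sketch, assuming an entirely hypothetical convention (project, artifact type, database, search version, ISO date) rather than any standard the deck prescribes:

```python
from datetime import date

# Hypothetical convention: project_search_database_version_date.csv
# ISO dates make exported search results sort chronologically, and the
# version number keeps draft searches distinguishable from the final one.
def search_export_name(project, database, version, when=None):
    when = when or date.today()
    return f"{project}_search_{database}_v{version}_{when.isoformat()}.csv"

print(search_export_name("rapidrev", "medline", 3, date(2018, 3, 25)))
# rapidrev_search_medline_v3_2018-03-25.csv
```

Whatever convention you pick matters less than applying it consistently and recording it in your process notes.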
16. #2 TIP: AVOID BIAS!!!
Cochrane Risk of Bias Tool
http://methods.cochrane.org/bias/assessing-risk-bias-included-studies
● Selection bias
● Performance bias
● Detection bias
● Attrition bias
● Reporting bias
● Other bias
Image source: Higgins JPT et al. The Cochrane Collaboration’s
tool for assessing risk of bias in randomised trials.
BMJ 2011; 343 doi: https://doi.org/10.1136/bmj.d5928
(Published 18 October 2011)
19. Limiting the search may introduce bias
“Systematic reviews may be compromised by selective inclusion and reporting of outcomes and analyses.
Selective inclusion occurs when there are multiple effect estimates in a trial report that could be included in
a particular meta-analysis (e.g. from multiple measurement scales and time points) and the choice of effect
estimate to include in the meta-analysis is based on the results (e.g. statistical significance, magnitude or
direction of effect). Selective reporting occurs when the reporting of a subset of outcomes and analyses in
the systematic review is based on the results (e.g. a protocol-defined outcome is omitted from the published
systematic review).” (Page et al, 2014, https://www.ncbi.nlm.nih.gov/pubmed/25271098)
“Do bodies of evidence that are based on abbreviated literature searches lead to different conclusions
about benefits and harms of interventions compared with bodies of evidence that are based on
comprehensive, systematic literature searches?” (Nussbaumer-Streit et al, 2016,
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5120483/)
20. Limiting the search (report accurately)
● Limiting terms
○ The most potentially serious way to introduce bias. If working from a prior search, terms may be eliminated if the search has been validated for sensitivity & specificity of retrieval. Validating terms can be time consuming in a complex search strategy.
○ A term's use can be made more specific by limiting a MeSH heading to major topic [MAJR], or limiting textwords to title/abstract [TIAB] or title [TI].
● Databases
○ MEDLINE, Cochrane, EMBASE, or … ?
● Handsearching
○ Grey literature sources, tables of contents, number of journals selected for custom review; eliminate document types such as theses, textbooks, etc.
● Dates
○ Be thoughtful. Don't limit dates if this will exclude important developments in the topic being examined.
● Language
○ If your topic was invented, discovered, or is especially common in a non-English-speaking country, you probably need to include that language in the review.
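To illustrate the tightening described above, here is a hypothetical PubMed sketch (the topic and terms are invented, not from the deck): the same concept searched broadly, then restricted with the field tags the slide mentions.

```
Broad:      "telemedicine"[MeSH Terms] OR telemedicine[All Fields]
Tighter:    "telemedicine"[MAJR] OR telemedicine[TIAB]
Title only: telemedicine[TI]
```

Each step trades sensitivity (fewer off-topic hits missed) for specificity (fewer citations to screen), so record which variant the final search used.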
21. Limiting the search (report accurately)
● Setting
● Study types
● Publication types
● Methodologies
● Age limits
● Geographic area
Image credit(s):
https://openclipart.org/detail/194606/zone-search-pattern |
https://openclipart.org/detail/194604/line-search-pattern |
https://openclipart.org/detail/194603/grid-search-pattern |
https://openclipart.org/detail/224996/spiral-search-pattern
23. Example Protocol Elements & Structure
Opening content
● Title
● Team & Institution
● Summary, Background, Purpose, Goals, Why
Body
● Question / Topic / Definitions of Terms
● Inclusion/Exclusion Criteria
● Risk of bias assessment for this study
● Outputs
Methods
● Search
● Data screening
● Quality assessment
● Bias assessment/evaluation in data
● Data extraction
● Synthesis
Closing content
● Stakeholders
● Funding & COI
● Additional or supporting materials
● Timeline
24. Sentinel (NOT Seminal) Articles: Selection
Seminal Articles
● Highly significant
● Influential
● Important
● By a leader in defining the research in the field; often the first on a particular topic, method, or concept
● Key studies. Also called: pivotal research, landmark study, classic
Sentinel Articles
● On topic, not broader or narrower
● Well-indexed with appropriate terms
● Representative of citations that would be retrieved by a well-done search
● Each sentinel article must represent ALL desired concepts in the search
● Articles selected must meet ALL inclusion and exclusion criteria.
25. Sentinel (NOT Seminal) Articles: Uses
Term generation process
● Variant terms & related concepts found in sentinel articles are likely to include terms & concepts that might otherwise have been missed
● Look here:
○ Titles
○ Abstracts
○ Keywords
○ Cataloging terms (i.e., MeSH)
○ Bibliographies
Search validation
● Can help to minimize accidental bias in the search strategy
● Can draw attention to gaps in the search which otherwise might not become apparent until after publication
26. Inclusion / Exclusion Criteria
INCLUSION CRITERIA
● Required; reduces confounding variables
● Provide rationale or justification for these criteria
● Examples:
○ Dates
○ Language
○ Types of participants / methodologies
○ Type of analysis
○ Context or location
○ Outcome measures
EXCLUSION CRITERIA
● Makes a sample or subject ineligible
● Provide rationale or justification for these criteria
● Examples:
○ Publication type / article type
○ Out of scope, lack of diagnosis, ineligible for treatment proposed
○ Unclear or mixed population
○ Reverse of specific inclusion criteria
○ Outcomes/population/methods not reported in sufficient detail
SEE: Meline T. Selecting Studies for Systematic Review: Inclusion and Exclusion Criteria. Contemporary Issues in Communication Science and Disorders 2006 33:21–27. https://www.asha.org/uploadedfiles/asha/publications/cicsd/2006sselectingstudiesforsystematicreview.pdf
Image credit: https://openclipart.org/detail/169757/check-and-cross-marks
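Criteria like these are easiest to report transparently when written down as explicit rules. A minimal sketch, with entirely hypothetical criteria and records (real criteria come from your protocol):

```python
# Hypothetical inclusion/exclusion predicates for screening study records.
# Each record is a dict of study metadata exported from a database.

def include(record):
    return (
        2008 <= record["year"] <= 2018          # date window
        and record["language"] in {"en", "fr"}  # languages the team can read
        and record["design"] in {"RCT", "cohort"}
    )

def exclude(record):
    # Publication types ruled out by the protocol
    return record["pub_type"] in {"editorial", "letter", "thesis"}

def screen(records):
    return [r for r in records if include(r) and not exclude(r)]

records = [
    {"year": 2015, "language": "en", "design": "RCT", "pub_type": "article"},
    {"year": 2001, "language": "en", "design": "RCT", "pub_type": "article"},
    {"year": 2016, "language": "en", "design": "RCT", "pub_type": "letter"},
]
print(len(screen(records)))  # 1
```

Writing the rules this way forces each criterion to be stated exactly once, which is the same discipline a written protocol demands.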
27. Screening & Interrater Reliability (Simplified)
Step 0: Calibration & interrater reliability tuning
● Receive a blinded dataset from the search. The dataset should include only the title, abstract, and unique identifier. The abstract may be truncated as delivered from the database.
● Test an initial sample (~50?) of articles with title/abstract screening, both screeners working independently.
● From the title and/or abstract (as exported from the database), are you able to determine whether an article matches the inclusion/exclusion criteria and should be selected or discarded?
● Both screeners meet to compare decisions, review inclusion/exclusion criteria, determine criteria for reaching consensus, and decide whether to revise methods or move forward.
Step 1: Title & abstract screening
Step 2: Full article screening, applied to articles remaining after Step 1
Step 3: Request clarifying information from the original authors for a final inclusion/exclusion decision on articles remaining from Step 2 for which doubt remains or the screeners lack consensus.
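The slides don't prescribe an agreement statistic for the Step 0 calibration; Cohen's kappa is a common choice for two screeners. A minimal sketch with hypothetical decisions (1 = include, 0 = exclude):

```python
# Cohen's kappa: observed agreement corrected for chance agreement.
# Undefined when expected agreement is 1 (both raters always say the same thing).

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal include/exclude rates
    p_a = sum(rater_a) / n
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

screener1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]  # hypothetical pilot decisions
screener2 = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0]
print(round(cohens_kappa(screener1, screener2), 2))  # 0.6
```

A common rule of thumb treats kappa above roughly 0.6 as substantial agreement, but the threshold for moving past Step 0 is the team's call and belongs in the protocol.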
28. Data Extraction / Abstraction
For final set of included articles, review each article for metadata, quality criteria, relevant methodology
details, as well as specific data regarding the inclusion criteria, and other items of interest.
Data is recorded in a form, template, spreadsheet, or other tool to allow comparison between the included
studies and support synthesis across the final dataset.
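At its simplest, such a form is a spreadsheet with one row per included study. A sketch with hypothetical fields and a hypothetical study; real field lists come from your protocol or from published templates such as the Cochrane data collection forms:

```python
import csv

# Hypothetical extraction fields; real forms are longer and protocol-specific.
FIELDS = ["citation_id", "first_author", "year", "study_design",
          "population", "intervention", "outcome_measure", "key_findings"]

rows = [
    {"citation_id": "S01", "first_author": "Smith", "year": 2016,
     "study_design": "RCT", "population": "adults 18-65",
     "intervention": "telehealth follow-up", "outcome_measure": "readmission rate",
     "key_findings": "no significant difference"},
]

# One row per included study; a fixed header keeps studies comparable.
with open("extraction_form.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

Fixing the column list before extraction begins is what makes synthesis across studies possible; ad hoc columns added mid-extraction force re-reading of articles already processed.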
29. Data Extraction / Abstraction: Simple Example
Appendix 2: Example data extraction form for systematic reviews. FROM: Effects of a demand-led evidence briefing service on the uptake and use of research evidence by commissioners of health services: a controlled before-and-after study. https://www.ncbi.nlm.nih.gov/books/NBK424005/
30. Data Extraction / Abstraction: Templates
Cochrane
● Training:
○ Data collection forms for intervention reviews http://training.cochrane.org/resource/data-collection-forms-intervention-reviews
○ Example file: http://training.cochrane.org/sites/training.cochrane.org/files/public/uploads/resources/downloadable_resources/English/Collecting%20data%20-%20form%20for%20RCTs%20and%20non-RCTs.doc
● Airways: http://airways.cochrane.org/sites/airways.cochrane.org/files/public/uploads/Data%20collection%20form%20for%20intervention%20reviews%20for%20RCTs%20and%20non-RCTs.doc
● Cochrane Consumers & Communication Review Group http://cccrg.cochrane.org/sites/cccrg.cochrane.org/files/public/uploads/det_2015_revised_final_june_20_2016_nov_29_revised.doc
● Cystic Fibrosis and Genetic Disorders: http://cfgd.cochrane.org/sites/cfgd.cochrane.org/files/public/uploads/Study%20selection%20%26%20%20extraction%20form%20RM5.doc
● Public Health Group: https://ph.cochrane.org/sites/ph.cochrane.org/files/public/uploads/CPHG%20Data%20extraction%20template_0.docx
31. Data Extraction / Abstraction: Tools
Specialty Software
● Abstrackr http://abstrackr.cebm.brown.edu/account/login
● Covidence https://www.covidence.org/
● DistillerSR https://www.evidencepartners.com/products/distillersr-systematic-review-software/
● RevMan http://community.cochrane.org/tools/review-production-tools/revman-5
Software for systematic reviewing http://hlwiki.slais.ubc.ca/index.php/Software_for_systematic_reviewing
SR Toolbox: http://systematicreviewtools.com/index.php
38. Standards for Rapid Reviews: WHO
Andrea C. Tricco, Etienne V. Langlois and Sharon E. Straus. Rapid reviews to strengthen health policy and systems: a practical guide. World Health Organization, Alliance for Health Policy and Systems Research. <http://www.who.int/alliance-hpsr/resources/publications/rapid-review-guide/en/>
142 pages: <http://apps.who.int/iris/bitstream/10665/258698/1/9789241512763-eng.pdf?ua=1>
8 page summary: <http://www.who.int/alliance-hpsr/resources/publications/alliancehpsr_rapidreviewchapterbriefs_2018.pdf?ua=1>
2 page flyer: <http://www.who.int/alliance-hpsr/events/HPSR-flyer-practical-guide-20170811.pdf?ua=1>
40. Standards for Rapid Reviews: AMSTAR
● AMSTAR <https://amstar.ca/>
● Checklist <https://amstar.ca/Amstar_Checklist.php>
● Checklist as PDF <https://amstar.ca/docs/AMSTAR-2.pdf>
Mattivi JT, Buchberger B. Using the AMSTAR checklist for rapid reviews: is it feasible? International Journal of Technology Assessment in Health Care 2016 32(4):276-283. <https://www.cambridge.org/core/services/aop-cambridge-core/content/view/F9E1C970CE082DCCB83958C0E32D9747/S0266462316000465a.pdf/using_the_amstar_checklist_for_rapid_reviews_is_it_feasible.pdf>
43. Why a Protocol?
“Without review protocols, how can we be assured that decisions made
during the research process aren’t arbitrary, or that the decision to
include/exclude studies/data in a review aren’t made in light of knowledge
about individual study findings?”
Larissa Shamseer & David Moher. Planning a systematic review? Think protocols. BMC Research in progress
blog 2015.
<http://blogs.biomedcentral.com/bmcblog/2015/01/05/planning-a-systematic-review-think-protocols/>
44. Protocol Examples
*Rapid Review Protocol - Interventions to promote healthy eating choices when dining out: A systematic
review of reviews
<http://www.behaviourworksaustralia.org/wp-content/uploads/2017/10/Rapid-Review-Protocol.pdf>
*Rapid Review protocol for Post Operative Pain Outcome Measures Study (POPOS)
<http://www.comet-initiative.org/studies/details/1070>
*Rashek Kazi, Bryan Carroll, Andrea Ketchum. The quantification of patient pain: a rapid review of four
commonly employed rating scales. PROSPERO 2018 CRD42018091058 Available from:
<http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42018091058> [PROSPERO:
<https://www.crd.york.ac.uk/prospero/>]
*Protocol for a Rapid Evidence Review of Traditional and Complementary Medicine for People with Diabetes Receiving Palliative or End-of-Life Care <https://pdfs.semanticscholar.org/8e2b/f4b9287ab7a2f5a9b6ce3c44d1df71d1acda.pdf>
45. Rapid Review Examples
Armoiry X et al. Digital Clinical Communication for Families and Caregivers of Children or Young People With
Short- or Long-Term Conditions: Rapid Review. J Med Internet Res. 2018 Jan 5;20(1):e5.
http://www.jmir.org/2018/1/e5/
Coster JE et al. Why Do People Choose Emergency and Urgent Care Services? A Rapid Review Utilizing a
Systematic Literature Search and Narrative Synthesis. Acad Emerg Med. 2017 Sep; 24(9): 1137–1149.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5599959/
Manafò E et al. Patient and public engagement in priority setting: A systematic rapid review of the literature.
PLoS One. 2018 Mar 2;13(3):e0193579. doi: 10.1371/journal.pone.0193579. eCollection 2018.
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0193579
Slade SC, Philip K, Morris ME. Frameworks for embedding a research culture in allied health practice: a rapid
review. Health Res Policy Syst. 2018 Mar 21;16(1):29.
https://health-policy-systems.biomedcentral.com/articles/10.1186/s12961-018-0304-2
46. Variants, Methods
The use of rapid review methods in health technology assessments: 3 case studies. BMC Medical Research
Methodology December 2016, 16:108 https://link.springer.com/article/10.1186/s12874-016-0216-1
Expediting citation screening using PICo-based title-only screening for identifying studies in scoping searches
and rapid reviews https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5702220/
A scoping review of rapid review methods
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4574114/pdf/12916_2015_Article_465.pdf
Quality of conduct and reporting in rapid reviews: an exploration of compliance with PRISMA and AMSTAR
guidelines. Syst Rev. 2016; 5: 79. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4862155/
Turner J, Preston L, Booth A, et al. “Chapter 3: Review methods / Rapid review methods.” What evidence is
there for a relationship between organisational features and patient outcomes in congenital heart disease
services? A rapid review. Health Services and Delivery Research, No. 2.43. Southampton (UK): NIHR Journals
Library; 2014 Nov. <https://www.ncbi.nlm.nih.gov/books/NBK263605/>
49. More Information on Rapid Reviews
Jon Brassey <https://rapid-reviews.info/>
Roberfroid D, Fairon N, San Miguel L, Paulus D. Method – Rapid reviews. Methods Brussels: Belgian Health Care Knowledge Centre (KCE).
2016. KCE Process Notes. D/2017/10.273/01. <https://kce.fgov.be/sites/default/files/atoms/files/Rapid_Review_0_0.pdf>
Bianca Kramer: <https://www.slideshare.net/bmkramer/how-rapid-is-a-rapid-review>
TheEvidenceDoc: <http://www.g-i-n.net/conference/past-conferences/10th-conference/tuesday/8-30-am-to-12-00-pm/ireland-52.pdf>
Short Course: <https://www.sheffield.ac.uk/scharr/shortcourseunit/rapidreviews2018>
Tricco et al: Systematic reviews vs. rapid reviews: What’s the difference?
<https://www.cadth.ca/media/events/Andrea-Tricco_RR-vs-Systematic-Reviews_Feb-4-2015.pdf>
Ottawa Hospital: A methodology for conducting rapid evidence reviews. <http://www.g-i-n.net/document-store/regional-communities/g-i-n-north-america/slides-a-methodology-for-conducting-repaid-evidence-reviews>
50. Library Guides
* VCU Libraries: Research Guides: Rapid Review Protocol <https://guides.library.vcu.edu/rapidreview>
* HLWiki International: Rapid Reviews <http://hlwiki.slais.ubc.ca/index.php/Rapid_reviews>
* Temple University: Systematic Reviews & Other Review Types: What is a Rapid Review? <http://guides.temple.edu/c.php?g=78618&p=4156608>
* Becker Medical Library: Systematic Reviews: Rapid Reviews <http://beckerguides.wustl.edu/c.php?g=299565&p=2000687>