Peer reviewing search strategies

Last revised: 2020-03-31

Introduction

Search strategy peer review, within the evidence synthesis context, is a process by which the searches for a Health Technology Assessment (HTA) or systematic review are designed, ideally by an Information Specialist, and then reviewed by another Information Specialist before the searches are run. The goal of peer review of search strategies is to detect errors in a timely fashion (that is, before the searches are run), to improve quality, and to reduce not only the risk of missing relevant studies but also the risk of retrieving unnecessarily large numbers of irrelevant records.

As the search strategy is the cornerstone of a well-conducted HTA or systematic review, its quality can affect the results of the final review. A study published in 2006 by Sampson and McGowan found that errors in search strategies were common, the principal mistakes being spelling errors, missed spelling variants, truncation errors, logical (Boolean) operator errors, use of wrong line numbers, missed or incorrect use of subject headings (e.g. MeSH), and search strategies not being tailored for use in other databases (1). A study by Franco et al (2) published in 2018 assessed a random sample of 70 Cochrane systematic reviews of interventions published in 2015, evaluating the design and reporting of their search strategies against the recommendations of the Cochrane Handbook for Systematic Reviews of Interventions (2011 version) (3), the Methodological Expectations of Cochrane Intervention Reviews (MECIR standards, 2013 version) (4) and the Peer Review of Electronic Search Strategies (PRESS) evidence-based guideline (5, 6). They found problems in the design of the search strategies in 73% of the reviews (95% CI, 60-84%), and in 53% of the reviews (95% CI, 38-69%) the problems could limit both the sensitivity and the precision of the searches. More recently, a study by Salvador-Olivan et al (7) published in 2019 found that 92.7% of their 137 included systematic reviews, all published in January 2018, contained some type of error in the MEDLINE/PubMed search strategy, and that 78.1% of these errors affected recall (sensitivity).
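To illustrate some of the error types reported in these studies, consider the following purely hypothetical Ovid MEDLINE fragment, which contains a missed spelling variant and a wrong line number, followed by a corrected version (the topic and terms are invented for this example):

  Flawed:
  1. exp Neoplasms/
  2. tumor*.ti,ab.
  3. cancer*.ti,ab.
  4. or/1-2                      (wrong line numbers: line 3 is omitted from the OR combination)

  Corrected:
  1. exp Neoplasms/
  2. (tumor* or tumour*).ti,ab.  (missed British spelling variant added)
  3. cancer*.ti,ab.
  4. or/1-3

Errors of this kind are easily overlooked by the strategy's author, which is precisely why a second pair of expert eyes is valuable.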

How is peer review of search strategies performed?

Peer review of search strategies has been performed informally for as long as searches have been conducted for HTAs and systematic reviews. HTA Information Specialists who are part of information teams have always been able to ask colleagues to check their search strategies for mistakes, if project time allowed, and the search strategy peer reviewer and the Information Specialist who designed the strategy have been able to meet face-to-face to discuss errors and revisions. Not all Information Specialists, however, are based in teams, and so some may be unable to call on colleagues to peer review their search strategies. A forum has therefore been established to enable Information Specialists to submit their searches for peer review by a fellow Information Specialist, on a reciprocal basis (http://pressforum.pbworks.com).

In addition to the forum mentioned above, a tool has been developed that enables Information Specialists to check search strategies in a more formal, structured way. The PRESS (Peer Review of Electronic Search Strategies) checklist is an evidence-based checklist that summarizes the main potential mistakes made in search strategies. The checklist can help Information Specialists to improve and assure the quality of their own search strategies and those of their colleagues, and it provides clear guidance for peer reviewers to follow. It can also help non-searchers to understand how search strategies have been constructed and what they have been designed to retrieve. Full details of the original PRESS project can be found in the funder's report (8) and the accompanying journal article (5). Further information, including the original PRESS checklist (now superseded by PRESS 2015 (9, 10)), can be found elsewhere (6). An update of the PRESS processes was published in 2016 (9); this involved an updated systematic review, a web-based survey of experts and a consensus meeting to update the PRESS tools. The 2015 Guideline Explanation and Elaboration (PRESS E&E) incorporates four components (9):

- six 2015 PRESS Recommendations for Librarian Practice;
- seven 2015 PRESS Implementation Strategies;
- one 2015 PRESS Guideline Search Submission & Peer Review Assessment form; and
- one 2015 PRESS Guideline Assessment Form.

The six main domains of the updated PRESS 2015 evidence-based checklist (9, 10) are:

- translation of the research question;
- Boolean and proximity operators;
- subject headings;
- text word searching (free text);
- spelling, syntax and line numbers; and
- limits and filters.
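For illustration only, peer review comments structured against these domains might read as follows (a hypothetical Ovid MEDLINE strategy is assumed; the line numbers and terms are invented for this example):

  Boolean and proximity operators: lines 6 and 7 both capture the intervention concept but are combined with AND; OR appears to be intended.
  Text word searching: consider adding the British spelling variant "anaemia" alongside "anemia", truncated as anaemi*/anemi* to also capture "anaemic"/"anemic".
  Limits and filters: an English-language limit is applied on line 12 but is not mentioned in the protocol; please confirm that it is intended.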

It is recommended that peer review of search strategies be undertaken at the research protocol stage, that is, before the searches are conducted, the results are downloaded and the researchers start the selection of studies. The latest version of the Cochrane Handbook chapter on searching for and selecting studies, published in October 2019, strongly recommends peer review of search strategies at the protocol stage (11), whilst the draft PRISMA-S checklist includes an item explicitly for peer review of search strategies (12). Both also suggest acknowledging search strategy peer reviewers.

Is there any evidence of the value of the peer review of search strategies?

The Agency for Healthcare Research and Quality (AHRQ) has conducted a study assessing use of the PRESS checklist and found that it “seems to cut down the time needed to do the review, increase response, and do a better job of identifying actual errors in search strategies” (13). The time burden of the review process using the PRESS checklist was less than two hours.

We have not been able to identify any evidence on whether this tool affects the final quality of systematic reviews, or on its economic costs. CADTH, however, conducted an internal investigation into whether peer review of search strategies has an effect on the number and quality of articles included in CADTH Rapid Response reports (14, 15, 16) and found that both the number and the quality of relevant articles retrieved were improved. We have also found increased reporting of peer review of search strategies and of use of the PRESS checklist, although without accompanying evidence of effectiveness. Folb and colleagues evaluated workshops they were running for librarians on systematic reviews and found that, pre-class, only 9% of librarians had ever provided peer review of search strategies, but at six-month post-class follow-up this had risen to 17% (17). With respect to seeking peer review of their own searches, they found that, pre-class, only 36% of librarians had ever sought peer review of search strategies, but at six-month post-class follow-up this had risen to 48% (17).

It is worth noting that there is increasing interest, at least within the librarian and information specialist community, in librarians and information specialists peer reviewing search strategies at the stage when a manuscript is submitted for publication. For example, a recent online survey of medical librarians and information specialists conducted by Grossetta Nardini and colleagues found that only 22% (63/291) of respondents had ever been invited to peer review a systematic review or meta-analysis journal manuscript (18). This aspect is beyond the scope of this summary, which focusses on peer review of search strategies before the searches are run, but it indicates increasing awareness of this related topic.

Reference list