Peer reviewing search strategies

Last revised: 


Search strategy peer review is a process in which the searches designed and conducted for a Health Technology Assessment (HTA) or systematic review, ideally by an Information Specialist, are subsequently reviewed by another Information Specialist. The goal of peer review of search strategies is to detect errors, to improve quality, and to reduce not only the risk of missing relevant studies but also the risk of retrieving unnecessarily large numbers of irrelevant records.

As the search strategy is the cornerstone of a well-conducted HTA or systematic review, its quality can affect the results of the final review. One study found that errors in search strategies are common; the principal mistakes are spelling errors, missed spelling variants, truncation errors, logical (Boolean) operator errors, use of wrong line numbers, missed or incorrect use of subject index terms (e.g. MeSH), and failure to tailor the search strategy for use in other databases (1).
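Some of the mechanical errors listed above lend themselves to automated screening before a human peer reviewer is involved. The sketch below is purely illustrative (it is not part of PRESS or any published tool) and assumes an Ovid-style strategy in which each line is numbered implicitly by its position; it flags unbalanced parentheses and combination lines such as "1 or 3" that refer to the current or a later line.

```python
# Illustrative sketch only -- not a published tool. Assumes an Ovid-style
# strategy given as a list of strings, numbered implicitly from 1.

def check_strategy(lines):
    """Return a list of descriptions of simple mechanical errors."""
    problems = []
    for num, text in enumerate(lines, start=1):
        # Unbalanced parentheses change how Boolean operators group terms.
        if text.count("(") != text.count(")"):
            problems.append(f"line {num}: unbalanced parentheses")
        tokens = text.lower().split()
        # Treat a line made up only of numbers and Boolean operators as a
        # combination line, e.g. "1 and 2" or "3 or 4 or 5".
        if tokens and all(t.isdigit() or t in {"and", "or", "not"} for t in tokens):
            for t in tokens:
                if t.isdigit() and int(t) >= num:
                    problems.append(
                        f"line {num}: refers to line {t}, which is not an earlier line"
                    )
    return problems


strategy = [
    "exp Neoplasms/",
    "(cancer* or tumour* or tumor*).ti,ab.",
    "1 or 3",  # mistake: should combine lines 1 and 2
]
print(check_strategy(strategy))
# -> ['line 3: refers to line 3, which is not an earlier line']
```

Checks of this kind can only catch syntactic slips; judging whether the right concepts, index terms and spelling variants were chosen still requires a human peer reviewer, which is the focus of the PRESS checklist described below.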

How is peer review of search strategies performed?

Peer review of search strategies has been performed informally since literature searching for HTA and systematic reviews began. HTA Information Specialists who are part of information teams have always been able to check colleagues' search strategies for mistakes, if project time allowed. The search strategy peer reviewer and the Information Specialist who designed the search strategy have been able to meet face-to-face to discuss errors and revisions. However, not all Information Specialists are based in teams and so may be less able to call on colleagues to peer review their search strategies. A forum has been established to enable Information Specialists to submit their searches for peer review by a fellow Information Specialist, on a reciprocal basis.

In addition to the forum mentioned above, a tool has been developed which enables Information Specialists to check search strategies in a more formal, structured way. The PRESS (Peer Review of Electronic Search Strategies) checklist is an evidence-based checklist that summarizes the main potential mistakes made in search strategies. This checklist can help Information Specialists to improve and assure the quality of their own search strategies and those of their colleagues. It provides clear guidance for peer reviewers to follow. It can also help non-searchers understand how search strategies have been constructed and what they have been designed to retrieve. Full details about the original PRESS project can be found in the original funder's report (2) and the accompanying journal article (3). Further information, including the original PRESS checklist itself, can be found elsewhere (4). An update of the PRESS processes has recently been published (5). This involved an updated systematic review, a web-based survey of experts and a consensus meeting to update the PRESS tools. The 2015 Guideline Explanation and Elaboration (PRESS E&E) incorporates four components:

1. PRESS 2015 Evidence-Based Checklist
2. PRESS 2015 Recommendations for Librarian Practice
3. PRESS 2015 Implementation Strategies
4. PRESS 2015 Guideline Assessment Form

A Guideline Statement has been published as a companion document to the updated CADTH report (6).

The six main domains of the updated PRESS 2015 evidence-based checklist are:

1. Translation of the research question
2. Boolean and proximity operators
3. Subject headings
4. Text word searching (free text)
5. Spelling, syntax and line numbers
6. Limits and filters

It is recommended that peer review of search strategies is undertaken at the research protocol phase. If the search strategy is peer reviewed before the searches are run, the results downloaded, and the selection of studies begun, there will be no need to revisit strategies and rerun searches later.

Is there any evidence of the value of the peer review of search strategies?

The Agency for Healthcare Research and Quality (AHRQ) conducted a study assessing use of the PRESS checklist and found that it "seems to cut down the time needed to do the review, increase response, and do a better job of identifying actual errors in search strategies" (7). The time burden of the review process using the PRESS checklist was less than two hours. There is as yet no evidence on whether the tool affects the final quality of systematic reviews or their cost. However, CADTH conducted an internal investigation into whether peer review of search strategies affects the number and quality of articles included in CADTH Rapid Response reports (8, 9), and found that both the number and quality of relevant articles retrieved were improved.

Reference list