Assessing the performance of methodological search filters to improve the efficiency of evidence information retrieval: five literature reviews and a qualitative study



Lefebvre, Carol, Glanville, Julie, Beale, Sophie ORCID: 0000-0003-0164-103X, Boachie, Charles, Duffy, Steven, Fraser, Cynthia, Harbour, Jenny, McCool, Rachael and Smith, Lynne
(2017) Assessing the performance of methodological search filters to improve the efficiency of evidence information retrieval: five literature reviews and a qualitative study. Health Technology Assessment, 21(69). 1-+.

Full text: Lefebvre_2017_search filters_HTA.pdf (Published version)

Abstract

Background: Effective study identification is essential for conducting health research, developing clinical guidance and health policy, and supporting health-care decision-making. Methodological search filters (combinations of search terms designed to capture a specific study design) can assist searchers in achieving this.

Objectives: This project investigated the methods used to assess the performance of methodological search filters, the information that searchers require when choosing search filters, and how that information could be better provided.

Methods: Five literature reviews were undertaken in 2010/11, covering: search filter development and testing; comparison of search filters; decision-making in choosing search filters; diagnostic test accuracy (DTA) study methods; and decision-making in choosing diagnostic tests. We also conducted interviews with, and a questionnaire survey of, experienced searchers to learn what information assists in the choice of search filters and how filters are used. These investigations informed the development of various approaches to gathering and reporting search filter performance data. We acknowledge that there has been a regrettable delay between carrying out the project, including the searches, and the publication of this report, owing to serious illness of the principal investigator.

Results: The development of filters most frequently involved a reference standard derived from hand-searching journals. Most filters were validated internally only, and reporting of methods was generally poor. Sensitivity, precision and specificity were the most commonly reported performance measures and were presented in tables. Aspects of DTA study methods are applicable to search filters, particularly in the development of the reference standard. There is limited evidence on how clinicians choose between diagnostic tests, and no published literature was found on how searchers select filters. The interviews and questionnaire showed that filters were not appropriate for all tasks but were used predominantly to reduce large numbers of retrieved records and to introduce focus. The Inter Technology Appraisal Support Collaboration (InterTASC) Information Specialists' Sub-Group (ISSG) Search Filters Resource was the resource most frequently mentioned by both groups as that consulted to select a filter. Randomised controlled trial (RCT) and systematic review filters, in particular the Cochrane RCT and the McMaster Hedges filters, were mentioned most often. The majority of respondents indicated that they used different filters depending on the requirement for sensitivity or precision, and over half used the filters available within databases. Interviewees took various approaches when using and adapting search filters. Respondents suggested that the main factors that would make choosing a filter easier were the availability of critical appraisals and more detailed performance information; provenance and having filters available in a central storage location were also important.

Limitations: The questionnaire could have been shorter and could have included more multiple-choice questions, and the reviews of filter performance focused on only four study designs.

Conclusions: Search filter studies should use a representative reference standard and report their methods and results explicitly. Performance measures should be presented systematically and clearly. Searchers find filters useful in certain circumstances but expressed a need for more user-friendly performance information to aid filter choice. We suggest approaches to using, adapting and reporting search filter performance. Future work could include research on search filters and performance measures for study designs not addressed here, exploration of alternative methods of displaying performance results, and numerical synthesis of performance comparison results.

Funding: The National Institute for Health Research (NIHR) Health Technology Assessment programme and the Medical Research Council-NIHR Methodology Research Programme (grant number G0901496).
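The abstract identifies sensitivity, precision and specificity as the most commonly reported performance measures for search filters. As a minimal sketch, not drawn from the report itself, the following Python illustrates how these measures are conventionally calculated for a filter evaluated against a reference standard; the function name, record identifiers and counts are all hypothetical.

    # Illustrative sketch (not from the report): calculating the performance
    # measures named in the abstract for a hypothetical search filter,
    # evaluated against a reference standard of known relevant records.

    def filter_performance(retrieved_ids, relevant_ids, total_records):
        """Return (sensitivity, precision, specificity) for a search filter.

        retrieved_ids : IDs of records returned by the filtered search
        relevant_ids  : IDs in the reference standard (e.g. derived from
                        hand-searching journals, as described in the report)
        total_records : total number of records in the database searched
        """
        retrieved = set(retrieved_ids)
        relevant = set(relevant_ids)

        true_pos = len(retrieved & relevant)    # relevant and retrieved
        false_pos = len(retrieved - relevant)   # retrieved but not relevant
        false_neg = len(relevant - retrieved)   # relevant but missed
        true_neg = total_records - true_pos - false_pos - false_neg

        sensitivity = true_pos / (true_pos + false_neg) if relevant else 0.0
        precision = true_pos / (true_pos + false_pos) if retrieved else 0.0
        specificity = (true_neg / (true_neg + false_pos)
                       if (true_neg + false_pos) else 0.0)
        return sensitivity, precision, specificity

    # Hypothetical example: the filter retrieves 400 of 10,000 records and
    # captures 90 of the 100 records in the reference standard.
    retrieved = {f"rec{i}" for i in range(400)}
    relevant = {f"rec{i}" for i in range(90)} | {f"miss{i}" for i in range(10)}
    sens, prec, spec = filter_performance(retrieved, relevant, total_records=10_000)
    print(f"sensitivity={sens:.2f} precision={prec:.3f} specificity={spec:.3f}")
    # -> sensitivity=0.90 precision=0.225 specificity=0.969

In this hypothetical example the filter achieves high sensitivity but modest precision, the trade-off that respondents described weighing when choosing between sensitivity-maximising and precision-maximising filters.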

Item Type: Article
Uncontrolled Keywords: Humans, Qualitative Research, Databases, Bibliographic, Information Storage and Retrieval, Technology Assessment, Biomedical, Review Literature as Topic, Search Engine, Surveys and Questionnaires
Depositing User: Symplectic Admin
Date Deposited: 12 Dec 2017 13:02
Last Modified: 21 Aug 2023 09:50
DOI: 10.3310/hta21690
Related URLs:
URI: https://livrepository.liverpool.ac.uk/id/eprint/3014023