Analysing synthesis of evidence in a systematic review in health professions education: observations on struggling beyond Kirkpatrick



Maudsley, Gillian and Taylor, David ORCID: 0000-0002-3296-2963
(2020) Analysing synthesis of evidence in a systematic review in health professions education: observations on struggling beyond Kirkpatrick. Medical Education Online, 25 (1), article 1731278.


Abstract

Background: Systematic reviews in health professions education may well under-report struggles to synthesize disparate evidence that defies standard quantitative approaches. This paper reports further process analysis in a previously reported systematic review about mobile devices on clinical placements.

Objective: For a troublesome systematic review: (1) Analyse further the distribution and reliability of classifying the evidence to Maxwell quality dimensions (beyond 'Does it work?') and their overlap with Kirkpatrick K-levels. (2) Analyse how the abstracts represented those dimensions of the evidence-base. (3) Reflect on difficulties in synthesis and merits of Maxwell dimensions.

Design: Following integrative synthesis of 45 K2-K4 primary studies (by combined content-thematic analysis in the pragmatism paradigm): (1) Hierarchical cluster analysis explored overlap between Maxwell dimensions and K-levels. Independent and consensus coding to Maxwell dimensions were compared (using percentages, kappa, and McNemar hypothesis-testing) pre- vs post-discussion and (2) between article abstracts and main bodies. (3) Narrative summary captured process difficulties and merits.

Results: (1) The largest cluster (five-cluster dendrogram) was acceptability-accessibility-K1-appropriateness-K3, with K1 and K4 widely separated. For article main bodies, independent coding agreed most for appropriateness (good; adjusted kappa = 0.78). Evidence coded to acceptability (p = 0.008; 31/45→39/45), accessibility, and equity-ethics-professionalism increased significantly from pre- to post-discussion. (2) Abstracts suggested efficiency significantly less often than main bodies evidenced it (31.1% vs 44.4%, p = 0.031). (3) Challenges and merits emerged for the stages before, during, and after the review.

Conclusions: There should be more systematic reporting of process analysis about difficulties in synthesizing suboptimal evidence-bases. In this example, Maxwell dimensions were a useful framework beyond K-levels for classifying and synthesizing the evidence-base.
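
To illustrate the kind of analysis described under Design (1), the sketch below clusters coding categories with SciPy's hierarchical clustering and cuts the tree into five clusters, as in the reported dendrogram. The coding matrix here is synthetic random data, and the six Maxwell dimension labels, the average linkage, and the Jaccard distance are assumptions for illustration, not the review's actual data or settings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical coding matrix: rows = 45 primary studies, columns = coding
# categories (six Maxwell quality dimensions plus Kirkpatrick levels K1-K4);
# True means the study's evidence was coded to that category.
rng = np.random.default_rng(0)
labels = ["effectiveness", "efficiency", "appropriateness", "acceptability",
          "accessibility", "equity", "K1", "K2", "K3", "K4"]
codes = rng.integers(0, 2, size=(45, len(labels))).astype(bool)

# Cluster the categories (columns) by how often they co-occur across studies,
# here with average linkage on Jaccard distances, then cut the dendrogram
# into five clusters as in the reported five-cluster solution.
Z = linkage(codes.T, method="average", metric="jaccard")
membership = fcluster(Z, t=5, criterion="maxclust")
for label, cluster in zip(labels, membership):
    print(f"{label}: cluster {cluster}")
```

The agreement and change statistics in the Results can be computed from first principles. This second sketch implements a two-sided exact (binomial) McNemar test on discordant pairs and a prevalence-and-bias-adjusted kappa (PABAK) for binary codes, assuming that is the sense of "adjusted kappa" here. The discordant counts (eight and six one-way recodings) and the 40/45 agreement figure are back-calculated guesses consistent with the reported p-values and adjusted kappa, not numbers taken from the paper.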
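```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact (binomial) McNemar p-value from the two discordant
    cell counts b and c of a paired 2x2 table."""
    n, k = b + c, min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) * 0.5 ** n
    return min(p, 1.0)

def pabak(observed_agreement: float) -> float:
    """Prevalence-and-bias-adjusted kappa for a binary code:
    PABAK = 2 * observed agreement - 1."""
    return 2 * observed_agreement - 1

# Acceptability coded 31/45 pre- vs 39/45 post-discussion: eight one-way
# recodings (b=0, c=8) reproduce the reported p = 0.008.
print(round(mcnemar_exact(0, 8), 3))   # 0.008
# Efficiency in 14/45 abstracts (31.1%) vs 20/45 main bodies (44.4%): six
# one-way discordances (b=0, c=6) reproduce the reported p = 0.031.
print(round(mcnemar_exact(0, 6), 3))   # 0.031
# Independent coders agreeing on 40 of 45 articles for appropriateness
# would give the reported adjusted kappa of 0.78.
print(round(pabak(40 / 45), 2))        # 0.78
```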

Item Type: Article
Uncontrolled Keywords: Best evidence, cluster analysis, epistemology, evidence-based education, evidence synthesis, Kirkpatrick levels, Maxwell dimensions of quality, medical education, process analysis, systematic review
Depositing User: Symplectic Admin
Date Deposited: 14 Apr 2020 07:16
Last Modified: 18 Jan 2023 23:55
DOI: 10.1080/10872981.2020.1731278
Open Access URL: https://doi.org/10.1080/10872981.2020.1731278
URI: https://livrepository.liverpool.ac.uk/id/eprint/3083157
