Background: Digital health technologies (DHTs), such as electronic health records and prescribing systems, are transforming health care delivery around the world. The quality of information in DHTs is key to the quality and safety of care. We developed a novel clinical information quality (CLIQ) framework to assess the quality of clinical information in DHTs.

Objective: This study explored clinicians' perspectives on the relevance, definition, and assessment of information quality dimensions in the CLIQ framework.

Methods: We used a systematic and iterative eDelphi approach to engage clinicians who had information governance roles or a personal interest in information governance; the clinicians were recruited through purposive and snowball sampling. Data were collected using semistructured online questionnaires until consensus was reached on the information quality dimensions in the CLIQ framework. Responses on the relevance of the dimensions were summarized to inform decisions on retaining dimensions according to prespecified rules. Thematic analysis of the free-text responses was used to revise the definitions and assessment of the dimensions.

Results: Thirty-five clinicians from 10 countries participated in the study, which concluded after the second round. Consensus was reached on all dimensions and categories in the CLIQ framework: informativeness (accuracy, completeness, interpretability, plausibility, provenance, and relevance), availability (accessibility, portability, security, and timeliness), and usability (conformance, consistency, and maintainability). A new dimension, searchability, was introduced in the availability category to account for the ease of finding needed information in DHTs. Certain dimensions were renamed, and some definitions were rephrased to improve clarity.

Conclusions: The CLIQ framework reached high expert consensus and clarity of language relating to the information quality dimensions. The framework can be used by health care managers and institutions as a pragmatic tool for identifying and forestalling information quality problems that could compromise patient safety and quality of care.
Car, L.T.; Kyaw, B.M.; Panday, R.S.N.; Kleij, R. van der; Chavannes, N.; Majeed, A.; Car, J. 2022
Background: Health professions education has undergone major changes with the advent and adoption of digital technologies worldwide.

Objective: This study aims to map the existing evidence and identify gaps and research priorities to enable robust and relevant research in digital health professions education.

Methods: We searched for systematic reviews on the digital education of practicing and student health care professionals. We searched MEDLINE, Embase, the Cochrane Library, the Education Resources Information Center (ERIC), CINAHL, and gray literature sources from January 2014 to July 2020. Two authors independently screened the studies, extracted the data, and synthesized the findings. We outlined the key characteristics of the included reviews, the quality of the evidence they synthesized, and their recommendations for future research. We mapped the empirical findings and research recommendations against the newly developed conceptual framework.

Results: We identified 77 eligible systematic reviews. All of them included experimental studies and evaluated the effectiveness of digital education interventions in different health care disciplines or different digital education modalities. Most reviews included studies on various digital education modalities (22/77, 29%), virtual reality (19/77, 25%), and online education (10/77, 13%). Most reviews focused on health professions education in general (36/77, 47%), surgery (13/77, 17%), and nursing (11/77, 14%). The reviews mainly assessed participants' skills (51/77, 66%) and knowledge (49/77, 64%) and included data from high-income countries (53/77, 69%). Our novel conceptual framework of digital health professions education comprises 6 key domains (context, infrastructure, education, learners, research, and quality improvement) and 16 subdomains. Finally, we identified 61 unique questions for future research in these reviews; these mapped to the framework domains of education (29/61, 47% of recommendations), context (17/61, 28%), infrastructure (9/61, 15%), learners (3/61, 5%), and research (3/61, 5%).

Conclusions: We identified a large number of research questions regarding digital education, which collectively reflect a diverse and comprehensive research agenda. Our conceptual framework will help educators and researchers plan, develop, and study digital education. More evidence from low- and middle-income countries is needed.
Car, L.T.; Kyaw, B.M.; Panday, R.S.N.; Kleij, R. van der; Chavannes, N.; Majeed, A.; Car, J. 2021
Background: Medical schools worldwide are accelerating the introduction of digital health courses into their curricula. The COVID-19 pandemic has contributed to this swift and widespread transition to digital health and education. However, the need for digital health competencies goes beyond the COVID-19 pandemic because they are becoming essential for the delivery of effective, efficient, and safe care.

Objective: This review aims to collate and analyze studies evaluating digital health education for medical students to inform the development of future courses and identify areas where curricula may need to be strengthened.

Methods: We carried out a scoping review following the guidance of the Joanna Briggs Institute, and the results are reported in accordance with the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) guidelines. We searched 6 major bibliographic databases and gray literature sources for articles published between January 2000 and November 2019. Two authors independently screened the retrieved citations and extracted the data from the included studies. Discrepancies were resolved through consensus discussions between the authors. The findings were analyzed using thematic analysis and presented narratively.

Results: A total of 34 studies focusing on different digital health courses were included in this review. Most of the studies (22/34, 65%) were published between 2010 and 2019 and originated in the United States (20/34, 59%). The reported digital health courses were mostly elective (20/34, 59%), were integrated into the existing curriculum (24/34, 71%), and focused mainly on medical informatics (17/34, 50%). Most of the courses targeted medical students in their first to third years (17/34, 50%), and course duration ranged from 1 hour to 3 academic years. Most of the studies (22/34, 65%) reported the use of blended education, while a few (6/34, 18%) delivered courses entirely digitally using online modules, offline learning, massive open online courses, and virtual patient simulations. The reported courses used various assessment approaches, such as paper-based assessments, in-person observations, and online assessments. Most of the studies (30/34, 88%) evaluated the courses, mostly using an uncontrolled before-and-after design, and generally reported improvements in students' learning outcomes.

Conclusions: Digital health courses reported in the literature are mostly elective, focus on a single area of digital health, and lack robust evaluation. They have diverse delivery, development, and assessment approaches. There is an urgent need for high-quality studies that evaluate digital health education.
Objective: To evaluate the development and implementation of clinical practice guidelines for the management of depression globally.

Methods: We conducted a systematic review of existing guidelines for the management of depression in adults with major depressive or bipolar disorder. For each identified guideline, we assessed compliance with measures of guideline development quality (such as transparency in guideline development processes and funding, multidisciplinary author group composition, and systematic review of comparative efficacy research) and implementation (such as quality indicators). We compared guidelines from low- and middle-income countries with those from high-income countries.

Findings: We identified 82 national and 13 international clinical practice guidelines from 83 countries in 27 languages. Guideline development processes and funding sources were explicitly specified in a smaller proportion of guidelines from low- and middle-income countries (8/29; 28%) than from high-income countries (35/58; 60%). Fewer guidelines from low- and middle-income countries (2/29; 7%) than from high-income countries (22/58; 38%) were authored by a multidisciplinary development group. A systematic review of comparative effectiveness was conducted in 31% (9/29) of low- and middle-income country guidelines versus 71% (41/58) of high-income country guidelines. Only 10% (3/29) of low- and middle-income country guidelines and 19% (11/58) of high-income country guidelines described plans to assess quality indicators or adherence to recommendations.

Conclusion: Globally, guideline implementation is inadequately planned, reported, and measured. Narrowing disparities in the development and implementation of guidelines in low- and middle-income countries is a priority. Future guidelines should present strategies to implement recommendations and measure their feasibility, cost-effectiveness, and impact on health outcomes.
Synthesizing evidence from randomized controlled trials of digital health education poses several challenges. These include the lack of clear categorization of digital health education in the literature; constantly evolving concepts, pedagogies, and theories; and a multitude of methods, features, technologies, and delivery settings. The Digital Health Education Collaboration was established to evaluate the evidence on digital education in the health professions; to inform policymakers, educators, and students; and, ultimately, to change the way these professionals learn and are taught. The aim of this paper is to present the overarching methodology we use to synthesize evidence across our digital health education reviews and to discuss challenges related to the process. We followed Cochrane recommendations for the conduct of systematic reviews, and all reviews are reported according to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidance. This included assembling experts in various digital health education fields; identifying gaps in the evidence base; formulating focused research questions, aims, and outcome measures; choosing appropriate search terms and databases; defining inclusion and exclusion criteria; running the searches jointly with librarians and information specialists; managing abstracts; retrieving full-text versions of papers; extracting and storing large datasets; critically appraising the quality of studies; analyzing data; discussing findings; drawing meaningful conclusions; and drafting research papers. The approach used for synthesizing evidence from digital health education trials is commonly regarded as the most rigorous benchmark for conducting systematic reviews. Although we acknowledge that certain biases are ingrained in the process, we have clearly highlighted and minimized them by strictly adhering to scientific rigor, methodological integrity, and standard operating procedures. This paper will be a valuable asset for researchers and methodologists undertaking systematic reviews in digital health education.