For more than 20 years, research has consistently demonstrated tremendous variation in the quality of medical care.1-6 An important study by RAND Corporation showed problems in the overall quality of care (patients received only 55% of recommended services among 439 quality measures) as well as variability of care (quality of medical services ranging from 10% to 79% depending on medical condition).4 Initiatives to improve the quality of patient care can be strengthened by (1) having a set of evidence-based interventions that are effective in changing clinician behavior and in improving patient outcomes,7 and (2) understanding how to apply these interventions appropriately in the context of healthcare quality-improvement efforts.8 To determine effectiveness and to ensure appropriate application, researchers and leaders in quality improvement and health professions education need precise definitions and descriptions of interventions.9
A common quality-improvement intervention in healthcare is academic detailing, also referred to as educational outreach, educational outreach visits, educational visiting, or university-based educational detailing.10 As a single intervention or as part of a multifaceted strategy, across clinicians in practice (ie, continuing education) and in training (eg, graduate medical education), academic detailing can be effective as a means of changing clinician behavior and improving patient outcomes.10 Described as a “personal visit by a trained person to health professionals in their own settings,”10 academic detailing has had a mixed effect on clinician behavior and patient outcomes,10 largely because it is a complex intervention with many characteristics that can vary among settings.11 A critical starting place for advancing the study and use of academic detailing is the development of a clear description of the intervention, reflecting evidence and expert opinion, which can be used to guide best practices. The conclusion of a recent publication about academic detailing indicates the need for such guidance.12
As an intervention, academic detailing is particularly worthy of additional study because it lends itself well to planning and assessing a prioritized list of important educational outcomes.8 In particular, academic detailing visits help ensure that clinicians are aware of performance gaps at all levels, especially at community and practice levels. For example, a primary care provider may not be aware that only 60% of the practice’s patients with diabetes received an influenza vaccination in the past year or that the benchmark in his or her community is 85%. These visits also provide an opportunity to address clinicians’ needs in a timely fashion and in the contexts in which clinicians are providing care.8,13
Furthermore, academic detailing allows interventionists and educators to promote effectiveness (especially with respect to changing clinician performance and improving patient outcomes) by affording opportunities to discuss the relevance of improvement efforts (including barriers to, and facilitators of, change), adjust the intensiveness of an intervention strategy, apply logic to ensure sequencing and continuity of interventions, engage the thought processes of clinicians and staff involved in change, and request a commitment to change of participants involved in improvement efforts.13 Academic detailing has strong support from the standpoints of theory and evidence,11 but it requires additional study to clarify the optimal approach and thus reach its full potential as an effective quality-improvement intervention.
This article represents the first of a 2-part study. The second part, a summary of expert consensus about academic detailing, will be described in a future issue of American Health & Drug Benefits. Complementing previous research,12 the present study expands the set of well-designed academic detailing studies to include recent studies.
The present study, which includes elements of systematic and narrative reviews,14 was designed as a literature review plus quantitative analysis of the documentation of key characteristics along with a description of findings. Four categories were addressed: content of visits (information and interventions), clinicians being visited, communication process underlying visits, and outreach workers making visits.12 The study was aimed at building on aspects of the most recent (2007) Cochrane review, consisting of 69 studies published before August 2007.10
MEDLINE, Scopus, and the Cumulative Index to Nursing and Allied Health Literature (CINAHL) were searched for randomized controlled trials with publication dates ranging from August 2007 through March 2013. The search was limited to English-language publications. To identify potentially relevant titles, the following key search terms were used: academic detailing, educational outreach, educational outreach visits, educational visiting, and university-based detailing.
The titles and abstracts of the search results were reviewed by 2 authors (L.G.H., M.S.P.), who excluded articles that did not meet the initial search criteria plus the following additional criteria: 1 or more personal visits as part of the intervention, a trained person making the visits, a healthcare provider receiving the visits, visits made at the provider’s site of care, and an explicit goal of improving patient or population care.
Four authors (L.G.H., N.E.M., M.S.P., T.J.V.H.) reviewed the full text of the remaining articles to confirm criteria and, if met, to abstract important information using an existing tool12 that reflects both theory and evidence.11 The authors also obtained and reviewed all but 1 article (which was in Spanish) from the most recent Cochrane review.10 One author (T.J.V.H.) reviewed the full text of all articles, and 3 other authors (L.G.H., N.E.M., M.S.P.) reviewed a subset of the total, yielding 2 independent abstractions of each article. When the authors became aware of other publications about a study, such articles were obtained and abstracted, and their findings combined with those of the initial study. As such, the unit of analysis became the study rather than the article. For studies that included more than 1 intervention arm related to academic detailing, the most intensive intervention arm was used for abstraction purposes.
All authors met to review and compare findings and to discuss and resolve discrepancies, if any. The final data underwent quantitative analysis to yield simple frequencies for each category of interest.
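The quantitative analysis described above reduces to simple proportions: for each characteristic of interest, the percentage of studies whose abstraction documented it. A minimal sketch of this calculation follows; the study labels and characteristic names are hypothetical illustrations, not the actual abstraction data.

```python
# Sketch of the simple-frequency analysis: percentage of studies
# documenting a given characteristic. Data below are hypothetical.
abstractions = {
    "study_A": {"content_of_visits": True, "communication_process": False},
    "study_B": {"content_of_visits": True, "communication_process": True},
    "study_C": {"content_of_visits": True, "communication_process": False},
}

def documentation_frequency(data, characteristic):
    """Return the percentage of studies documenting the characteristic."""
    documented = sum(1 for study in data.values() if study.get(characteristic))
    return 100.0 * documented / len(data)

print(documentation_frequency(abstractions, "content_of_visits"))  # 100.0
```

With 2 independent abstractions per article reconciled in the consensus meeting, each study contributes a single record to a table of this form, and frequencies are tallied per category.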
Using the specified date range, study design, and key terms, the initial search returned 850 results (Figure). Based on the review of titles and abstracts, 752 articles did not meet the study criteria and thus were excluded. (The authors did not maintain a list of the specific reasons for not including these articles.) Additional criteria were applied to the remaining 98 articles, which resulted in 60 articles being excluded, as noted in the Figure. The reasons were nonrandomized controlled design, other educational intervention, study protocol only, visit to persons other than healthcare providers, outside the specified date range, not available in English, and duplicate. Combining the 68 English-language articles from the last Cochrane review10 with the more recent set of 38 articles resulted in a total of 106 articles for review. For 8 studies, 1 or 2 related publications were evident from the references, yielding 115 total articles pertaining to the 106 studies. (The full list of reviewed articles is available by request.)
For the major characteristics examined (content, clinicians, communication, and outreach workers), the presence of at least some documentation (essentially any item in the category) ranged from 36.5% to 100% (Table).
The higher levels of documentation pertained to content of visits (92.5%-100%) and clinicians being visited (100%), and the lower levels of documentation related to the communication process underlying visits (41.0%-98.1%) and the outreach workers making visits (36.5%-89.6%). Across all 4 categories, the best overall documentation of any single study was 68%.
Content of Visits
In the majority (95.3%) of studies, multifaceted approaches (2 or more interventions) were used to change behavior. For example, 1 study involved academic detailing visits, performance feedback, and practice facilitation to improve heart disease prevention services of primary care clinicians.15 Common interventions were clinician education (86.8%) and performance feedback (71.7%), especially the use of provider-specific performance data (54.7%). Recommendations about practice change also were fairly common (63.2%), and patient education was used less often (31.1%). With regard to outcomes, more than half the studies (56.6%) measured only 1 outcome (eg, documentation of smoking cessation counseling) as opposed to 2 or more. Clinician behavior or performance (91.5%) was the most common outcome measure, followed by patient outcomes (43.4%), clinician knowledge or awareness (6.6%), and clinician skill (<1%). Most studies (70.4%) entailed some degree of tailoring interventions to the providers being visited, such as practice-specific opportunities for improvement and the introduction of goal setting and action plans during visits.15
Clinicians Being Visited
This category contains only 1 variable: criteria for selecting providers for outreach visits. Although more than 1 criterion often was used in a given study, the most common selection criterion was the geographic area or organization in which the provider practiced (99.1%). Other criteria included the specialty of the provider (84.0%), the patient population being served (17.0%), and the pattern of care observed (4.7%).
A study by Frijling and colleagues exemplifies the use of multiple selection criteria.16 They recruited general practitioners (specialty) from the southern Netherlands (geography) to improve clinical decision-making (pattern of care) for patients with high cardiovascular risk (patient population) identified through baseline assessments (pattern of care) that supported performance feedback reports and academic detailing visits.16
Communication Process Underlying Visits
The communication process includes information about the in-person visits themselves, the audience for the visit, and the other types of contact involved. The average number of visits to providers was 2.8 (range, 1-50), with 1 visit being the most common. Among studies with multiple visits, the average frequency of visits was once every 3.5 months, with 6-month intervals being most common (range, 1 per 0.03 months [daily] to 1 per 7.0 months). The average visit duration was nearly 90 minutes, but 60-minute visits were most common (range, 7 minutes to 2 consecutive 8-hour days [960 minutes]). For longitudinal studies, the average duration of time involving visits (ie, portion of the study during which academic detailing visits were made) was 7.4 months, but 6.0 months was most common (range, 0.5-18.0 months).
An example of a more intensive study is that of Lemelin and colleagues, who made 33 in-person visits (1.75 hours each) over 18 months to offer an extensive multifaceted approach to improve preventive care of family physicians.17 Although audiences often included multiple stakeholders, the most common recipients of visits were clinicians (99.0%), followed by nonclinical staff (20.2%) and others (9.6%). In addition to the in-person visits, which were part of every study, 41.0% of studies included additional means of contact, such as postal mail (53.5%), telephone (37.2%), and e-mail (7.0%).
Outreach Workers Making Visits
The outreach workers who made personal visits were physicians (40.0%), pharmacists (33.7%), nurses (27.4%), public health practitioners (2.1%), and others (29.5%). Some studies (12.6%) had multiple outreach workers with different backgrounds making visits separately, and others (10.5%) had a team approach; that is, multiple outreach workers with different backgrounds making visits together. Many studies (45.3%) used some type of special training (eg, reading, courses, and role playing) to prepare the outreach workers for visits. When documented, the employer of the outreach workers usually (78.9%) differed from that of the providers being visited. The following quotation is a typical example of the level of detail provided about an outreach worker in the articles examined: “In the baseline year the facilitator was trained, baseline data collected, practices recruited and randomised, and educational materials prepared. In the second year (intervention year) the nurse facilitator trained staff in intervention practices, provided information materials, and reviewed communication between the laboratory and the practices.”18
Our literature review of academic detailing expands the previous base of articles by more than 50% and reveals important information for those who use academic detailing as part of quality-improvement initiatives or research studies.
Identification of Studies and Documentation of Important Characteristics
Although the methodology for identifying the newer studies was not identical to the Cochrane approach10 and did not include a meta-analysis, 38 studies were added, substantially increasing the number of well-designed studies reviewed. As noted in many systematic reviews of interventions, including a recent review of general continuing medical education (CME),9 incomplete documentation limits our ability to better understand interventions and their impact. The present study showed that documentation of the major elements examined ranged from 36.5% to 100%, with the best overall documentation of any single study being only 68%. Although the intervention strategy was not entirely clear from any publication, descriptions of the communication process underlying visits and the outreach workers making visits were usually less detailed than other categories of important characteristics.
A future systematic review and meta-analysis in which more robust characteristics are considered, such as those described herein and elsewhere,9,12 may provide additional insight into the factors that may better explain effectiveness. However, more accurate and complete documentation of academic detailing interventions in the literature is essential. Follow-up by an expert consensus panel is needed to fill many gaps in our knowledge and understanding of the most important intervention characteristics.
Content of Visits
It is encouraging that most studies used academic detailing as a means to bring 2 or more interventions to providers, because multifaceted strategies may increase the likelihood that improvement efforts will be successful.10,19 However, the fact that many studies (nearly one-third) did not involve tailoring of interventions suggests that some multifaceted strategies may not have addressed the specific needs of targeted providers. Given the labor-intensiveness of academic detailing,19 it may be helpful to tailor multifaceted interventions to the needs of providers, especially if more than 1 visit to a provider is feasible. Moreover, this approach would guide other common interventions being offered, such as clinician education, performance measurement and feedback, and practice-change recommendations.
Given the prevalence of multifaceted strategies, which are often aimed at addressing different shortcomings, we were surprised that only 1 outcome was measured in most studies. Although the 2 most often reported outcomes (clinician performance and patient outcomes) may be the most important indicators of improvement,8 other outcomes, such as clinician knowledge, skill, attitude, and competence, are necessary as well, yet these were measured infrequently. Many academic detailing efforts may benefit from an explicit approach to planning and assessment, such as the expanded outcomes framework.8
Clinicians Being Visited
Convenience (geographic or organizational) was the most common mechanism affecting the identification and/or recruitment of providers for academic detailing visits. Although convenience is a practical and relevant consideration when planning improvement efforts, many policy initiatives (eg, Centers for Medicare & Medicaid Services’ Accountable Care Organizations program and the Physician Quality Reporting System)20,21 are creating incentives to focus improvement on (1) the healthcare needs of patient populations, and (2) the patterns of care provided by clinicians. However, these 2 reasons were cited infrequently (17% and 5%, respectively) by study authors. In addition to quality and payment reforms, these reasons are consistent with the highest levels of educational outcomes in the expanded outcomes framework, in which it is recommended that improvement experts and educators determine performance gaps at the levels of community health, patient health, and clinician performance before proceeding to lower outcome levels—and to do so only if higher levels indicate opportunities for improvement.8
Communication Process Underlying Visits
Among studies in which the communication process was documented, the intensity of interventions varied widely, as evidenced by differences in the number of visits, frequency of visits, duration of visits, and duration of the intervention period involving visits. Generally, authors did not explain the level of intensity chosen but often referenced studies with similar intervention approaches. With better documentation may come stronger rationales for such choices. In the most recent Cochrane review, the authors cited the need for additional research to determine the relationship of intensity to effectiveness.10 The authors of the most recent synthesis of systematic reviews of CME concluded that CME activities are effective for improving clinician performance and patient outcomes if, among other things, they are highly interactive and involve multiple exposures.22 Despite the variation in intensity, most studies entailed only 1 visit, which lasted approximately 1 hour. Depending on the needs of providers and the level of outcome(s) being addressed, such a low-intensity approach is not likely to produce the types of changes that are linked to clinician performance and patient outcomes that were commonly reported in these studies.
In many instances, a mismatch appears to exist between the intervention strategy and the study outcomes, but underreporting of other means of contact (eg, telephone and mail) may lead to underestimation of the strength of some interventions. Investigators may wish to consider these communication options and report them, if utilized. Because most high-level outcomes require system changes,13 the tendency to focus visits on clinicians rather than clinicians plus staff may represent an opportunity for improvement given the importance of organizational factors that influence high-level outcomes.22
Outreach Workers Making Visits
The clinical background (MD or RN) of most outreach workers appears appropriate, but only to the extent that it prepares them for the type of care being discussed. Long-standing research in the diffusion of innovations has shown that credible change agents are people with similar backgrounds and education to those of the client but with additional expertise concerning an innovation.23 In addition to content expertise, process expertise (ie, knowing how to make changes in complex systems using quality-improvement principles and practices and knowing how to communicate information in ways that promote evidence-based care) may be important to academic detailing visits.24,25 Fewer than half the study reports in our review described process expertise or other types of special training, which may be critical if outreach workers are primarily making single brief visits yet trying to effect higher-level outcomes. Only a small percentage of studies used a cadre of experts with different backgrounds or a team approach to visits. Additional research is needed to determine whether these approaches would be more effective than a single expert (with appropriate training and support) making the visits.
In most studies, the employer of the outreach worker differed from that of the person(s) being visited. Although some familiarity with systems and culture may come with sharing the same employer, having different employers may yield greater objectivity by the outreach worker, possibly leading to better observations and advice. More research is warranted to understand the impact of this contextual variable.
Our strategy to identify well-designed studies was similar (but not identical) to that used in the most recent Cochrane review of academic detailing. A different methodology, including a review of the “gray” literature and other publication databases (eg, Cochrane EPOC Register, Embase), interviews with authors of manuscripts in preparation or in press, and inclusion of articles in languages other than English, might have resulted in a more comprehensive set of articles. However, it is unlikely that an alternate approach would have yielded substantial differences in the degree of documentation or the findings. Nonetheless, a full systematic review with a meta-analysis, if warranted, would be an important next step. Such a review should capture additional articles that use different terms for academic detailing26 and consider a broader set of variables that may influence effectiveness.
Our literature review of 106 well-designed studies demonstrated highly variable and generally incomplete documentation of potentially important characteristics of academic detailing. The need for improved documentation was most compelling in the categories of “communication process underlying visits” and “types of educational outreach workers making visits.” Although some characteristics were documented and used more consistently than others, what remains uncertain is how often such details are complete and to what extent they influence effectiveness. Meta-analyses and other methods of combining data among studies are useful only if they are based on complete and accurate data. Researchers and leaders in quality improvement and health professions education need to evaluate and report on a wide array of variables to support better use of, and research on, effective interventions such as academic detailing. Researchers and leaders may wish to consider available reporting guidelines and recommendations to support their evaluation and dissemination efforts.
The authors would like to thank E. Carol Polifroni, EdD, NEA-BC, CNE, RN, ANEF, for her guidance on the systematic review.
The Society for Academic Continuing Medical Education (SACME) provided financial support for this study. The views expressed in this article are those of the authors and not necessarily the views of SACME.
Author Disclosure Statement
Dr Fischer is a consultant for the Alosa Foundation, a nonprofit organization that provides academic detailing services. Dr Van Hoof, Ms Harrison, Ms Miller, and Ms Pappas have no conflicts of interest to report.
Dr Van Hoof is Associate Professor, University of Connecticut School of Nursing, Storrs, CT, and Associate Professor, Department of Community Medicine and Health Care, University of Connecticut School of Medicine, Farmington, CT; Ms Harrison is a PhD candidate, University of Connecticut School of Nursing, Storrs, CT; Ms Miller is a Nurse, Hartford Healthcare, Hartford, CT; Ms Pappas is a Nurse Practitioner, Division of Neuroscience, Hartford Hospital, Hartford, CT, and a PhD candidate, University of Connecticut School of Nursing, Storrs, CT; Dr Fischer is Director, National Resource Center for Academic Detailing, Division of Pharmacoepidemiology and Pharmacoeconomics, Brigham and Women’s Hospital, Associate Professor, Harvard Medical School, Boston, MA.
1. Wennberg J, Gittelsohn A. Variations in medical care among small areas. Sci Am. 1982;246:120-134.
2. Chassin MR, Galvin RW; National Roundtable on Health Care Quality. The urgent need to improve health care quality: Institute of Medicine national roundtable on health care quality. JAMA. 1998;280:1000-1005.
3. Jencks SF, Cuerdon T, Burwen DR, et al. Quality of medical care delivered to Medicare beneficiaries: a profile at state and national levels. JAMA. 2000;284:1670-1676.
4. McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635-2645.
5. Asch SM, Kerr EA, Keesey J, et al. Who is at greatest risk for receiving poor-quality health care? N Engl J Med. 2006;354:1147-1156.
6. Agency for Healthcare Research and Quality. 2010 national healthcare disparities report. AHRQ Publication No. 11-0005. March 2011. http://archive.ahrq.gov/research/findings/nhqrdr/nhdr10/nhdr10.pdf. Accessed August 12, 2012.
7. Mazmanian PE, Davis DA. Continuing medical education and the physician as a learner: guide to the evidence. JAMA. 2002;288:1057-1060.
8. Moore DE Jr, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29:1-15.
9. Davis D, Bordage G, Moores LK, et al. The science of continuing medical education: terms, tools, and gaps: effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. Chest. 2009;135:8S-16S.
10. O’Brien MA, Rogers S, Jamtvedt G, et al. Educational outreach visits: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2007;CD000409.
11. Van Hoof TJ, Meehan TP. Using theory and evidence to guide the use of educational outreach to improve patient care. Am J Med Qual. 2012;27:467-471.
12. Van Hoof TJ, Miller NE, Meehan TP. Do published studies of educational outreach provide documentation of potentially important characteristics? Am J Med Qual. 2013;28:480-484.
13. Van Hoof TJ, Meehan TP. Integrating essential components of quality improvement into a new paradigm for continuing education. J Contin Educ Health Prof. 2011;31:207-214.
14. Uman LS. Systematic reviews and meta-analyses. J Can Acad Child Adolesc Psychiatry. 2011;20:57-59.
15. McBride P, Underbakke G, Plane MB, et al. Improving prevention systems in primary care practices: the health education and research trial (HEART). J Fam Pract. 2000;49:115-125.
16. Frijling BD, Lobo CM, Hulscher MEJL, et al. Intensive support to improve clinical decision making in cardiovascular care: a randomised controlled trial in general practice. Qual Saf Health Care. 2003;12:181-187.
17. Lemelin J, Hogg W, Baskerville N. Evidence to action: a tailored multifaceted approach to changing family physician practice patterns and improving preventive care. CMAJ. 2001;164:757-763.
18. Modell M, Wonke B, Anionwu E, et al. A multidisciplinary approach for improving services in primary care: randomised controlled trial of screening for haemoglobin disorders. BMJ. 1998;317:788-791.
19. Marinopoulos SS, Dorman T, Ratanawongsa N, et al. Effectiveness of continuing medical education. Evid Rep Technol Assess (Full Rep). 2007;1-69.
20. Centers for Medicare & Medicaid Services. Accountable care organizations (ACO). www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/ACO/index.html?redirect=/aco. Accessed July 22, 2015.
21. Centers for Medicare & Medicaid Services. Physician quality reporting system. www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/PQRS/. Accessed July 22, 2015.
22. Cervero RM, Gaines JK. The impact of CME on physician performance and patient health outcomes: an updated synthesis of systematic reviews. J Contin Educ Health Prof. 2015;35:131-138.
23. Rogers EM. Diffusion of Innovations. 5th ed. New York, NY: Free Press; 2003.
24. Avorn J, Fischer M. ‘Bench to behavior’: translating comparative effectiveness research into improved clinical practice. Health Aff (Millwood). 2010;29:1891-1900.
25. Fischer MA, Avorn J. Academic detailing can play a key role in assessing and implementing comparative effectiveness research findings. Health Aff (Millwood). 2012;31:2206-2212.
26. Van Hoof TJ, Miller NE. Consequences of lack of standardization of continuing education terminology: the case of practice facilitation and educational outreach. J Contin Educ Health Prof. 2014;34:83-86.