
The effectiveness and feasibility of an online educational program for improving evidence-based practice literacy: an exploratory randomized study of US chiropractors

Abstract

Background

Online education programs are becoming a popular means to disseminate knowledge about evidence-based practice (EBP) among healthcare practitioners. This mode of delivery also offers a viable and potentially sustainable solution for teaching consistent EBP content to learners over time and across multiple geographical locations. We conducted a study with 3 main aims: 1) develop an online distance-learning program about the principles of EBP for chiropractic providers; 2) test the effectiveness of the online program on the attitudes, skills, and use of EBP in a sample of chiropractors; and 3) determine the feasibility of expanding the program for broader-scale implementation. This study was conducted from January 2013 to September 2014.

Methods

This was an exploratory randomized trial in which 293 chiropractors were allocated to either an online EBP education intervention or a waitlist control. The online EBP program consisted of 3 courses and 4 booster lessons, and was developed using educational resources created in previous EBP educational programs at 4 chiropractic institutions. Participants were surveyed using a validated EBP instrument (EBASE) with 3 rescaled (0 to 100) subscores: Attitudes, Skills, and Use of EBP. Multiple regression was used to compare groups, adjusting for personal and practice characteristics. Satisfaction and compliance with the program were evaluated to assess feasibility.

Results

The Training Group showed modest improvement compared to the Waitlist Group in attitudes (Δ = 6.2, p < .001) and skills (Δ = 10.0, p < .001) subscores, but not the use subscore (Δ = −2.3, p = .470). The majority of participants agreed that the educational program was ‘relevant to their profession’ (84 %) and ‘was worthwhile’ (82 %). Overall, engagement in the online program was less than optimal, with 48 % of the Training Group and 42 % of the Waitlist Group completing all 3 of the program courses.

Conclusions

Online EBP training leads to modest improvements in chiropractors’ EBP attitudes and skills, but not their use of EBP. This online program can be delivered to a wide national audience, but requires modification to enable greater individualization and peer-to-peer interaction. Our results indicate that it is feasible to deliver online EBP education on a broad scale, but that this mode of education alone is not sufficient for making large changes in chiropractors’ use of EBP.

Background

Evidence-based practice (EBP) has been widely adopted as a standard by all health disciplines. Defined as a systematic approach for integrating the best research evidence with clinical expertise and patients’ unique values and circumstances [1], it requires life-long self-directed learning on the part of the healthcare practitioner [2].

Interest in EBP among chiropractors, one of the most frequently-used Complementary and Integrative Health (CIH) disciplines in the US [3], has shown tremendous growth in recent years [4]. While there is some evidence to suggest that chiropractors and other integrative healthcare providers might be resistant to EBP [5, 6], a recent study by our group found that US chiropractors had generally positive attitudes toward EBP [7]. However, the same survey revealed that US chiropractors reported low self-perceived use of EBP in their clinical practices.

There are barriers to chiropractors engaging in EBP, including lack of training, skills and resources, as well as time constraints and concerns regarding the relevance of existing research [7]. These challenges are similar to those experienced by other complementary and conventional healthcare professionals [8, 9]. The rapidly growing volume of new research information further complicates the ability of practitioners to stay up-to-date with the best available research. A commonly advocated step in overcoming barriers to EBP uptake is to provide educational programs that serve 2 purposes: 1) to motivate individuals by enhancing EBP-related attitudes and beliefs, and 2) to increase capability by improving EBP-related knowledge and skills [2].

Online education programs have been shown to be as effective as in-person instruction for improving attitudes and knowledge of EBP among healthcare practitioners [2], and this mode of delivery offers a viable and sustainable solution for teaching consistent EBP content to learners over time and across multiple geographical locations [10]. Such programs also offer learners flexibility in terms of pacing and availability, advantages particularly useful for addressing health practitioners’ time constraints. The existing literature examining the effectiveness of EBP online education programs is limited, however, in both quantity and quality [2], especially in the CIH professions, warranting further rigorous study.

Our long-term goal is to develop a comprehensive EBP literacy campaign on a national scale for chiropractors and other CIH providers. To this end, our study had 3 aims: 1) develop an online distance-learning program on the principles of EBP for chiropractic providers; 2) test the effectiveness of the online program on the attitudes, skills, and use of EBP in a random sample of chiropractors; and 3) determine the feasibility of expanding the program for broader-scale implementation on a national basis.

The study adopted an exploratory, multi-phase, multi-method approach. We previously published the results of a national cross-sectional survey of US chiropractors, which comprised the first phase of the study and captured a baseline measure of EBP literacy [7]. The second phase of the study, reported here, consisted of a randomized controlled trial (RCT) comparing the effectiveness of an online education intervention with a waitlist control and assessing program feasibility. The third phase consisted of a descriptive, qualitative investigation of chiropractors’ perceived barriers and facilitators to participation; this will be reported in a forthcoming manuscript. Our hypotheses for this randomized trial were that an online EBP educational program would: 1) be effective compared to a waitlist control in improving the attitudes, skills, and use of EBP in clinical chiropractic practice; and 2) be feasible to implement.

Our study was designed to explore several questions and “unknowns” about the processes and outcomes of online EBP training. First, our baseline survey revealed a paradoxical discordance between favorable attitudes toward research evidence and poor use of it, raising the question of why. Second, although online educational programs have been shown to be generally effective, we wanted to know whether an online educational program focused specifically on research and EBP would improve chiropractors’ level of EBP literacy and use. Lastly, we wanted to explore which attributes of our online educational program private-practice chiropractors would find useful, as well as the barriers and facilitators to completing the online program.

Methods

Research design

The study was an open-label, prospective, parallel-groups randomized controlled trial with a delayed waitlist control. Volunteers came from a random sample of chiropractors who completed the first phase of the study. They were randomized using a computer-generated algorithm; group allocation was concealed from all participants and study personnel until assignment. The study was funded by the National Institutes of Health/National Center for Complementary and Integrative Health (R21 AT007547), with Institutional Review Board approval (exempt status) granted by the University of Pittsburgh (PRO12060417) and all other participating institutions. The study was conducted from January 2013 to September 2014.
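The paper describes the allocation only as computer-generated with concealment. As a concrete illustration, here is a minimal sketch of one way a 1:1 computer-generated allocation could be implemented; the function name, seed, and approach are assumptions, not the study's actual algorithm:

```python
import random

def allocate_1to1(participant_ids, seed=2013):
    """Hypothetical reconstruction of a computer-generated 1:1 allocation.

    The study's actual algorithm is not reported; this sketch simply
    shuffles the consented IDs with a fixed seed and splits them in two.
    """
    rng = random.Random(seed)      # seeded for a reproducible allocation list
    ids = list(participant_ids)
    rng.shuffle(ids)
    cut = (len(ids) + 1) // 2      # 293 ids -> 147 training, 146 waitlist
    return {pid: ("training" if i < cut else "waitlist")
            for i, pid in enumerate(ids)}
```

Note that concealment comes from withholding the generated sequence until each participant is enrolled; the code itself only produces the assignments.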

Participants

To participate in this RCT, individuals had to hold a doctor of chiropractic degree, reside in the US, have Internet access, have a valid email address, and have participated in the cross-sectional national survey. There were no participant exclusion criteria.

Recruitment

Participants were recruited from an original pool of 1,314 chiropractors who completed a cross-sectional national survey [7]. We sent e-mail recruitment notices to a random sample of 700 of the 1,314 survey participants. The e-mail message included a brief explanation about the trial and an invitation to participate. A total of 293 chiropractors (42 %) responded to the email invitation and gave informed consent to be randomized.

Interventions

The online educational program was developed and adapted from competencies and resources created through previous National Center for Complementary and Integrative Health funded EBP educational grants awarded to four chiropractic institutions [11–15]. Participants were randomly assigned to either immediate access to an online educational program (Training Group) or a waitlist control group. Following randomization, participants in the Training Group were sent an e-mail with a link to the host website to register for the program and to create a user account; this provided access to the EBP program at no cost. Participants in the Waitlist Group were sent a different e-mail explaining that they would be receiving instructions about registration within 9 months. Both groups were given 30 days to register, after which the registration process was closed. The program was delivered on a Moodle learning management system at the host website.

The EBP educational program was delivered online over a 7-month period for each group. The overall goal was to provide foundational skills for practitioners to become ‘informed consumers’ of research [16]. Program competencies addressed those considered foundational for enabling EBP [17], such as basic statistics, introduction to types of research, and an emphasis on ‘information mastery’ [16], as well as competencies advocated by the Sicily statement [18]. These include: 1) formulating an answerable question to a clinical problem; 2) finding the best available evidence; 3) critically appraising that evidence; 4) interpreting and applying the results; and 5) evaluating or auditing the outcome [13].

To incentivize participation, up to 10 h of continuing education credits were available to participants from qualifying states. E-mail reminders were sent 2 to 4 times per month to encourage participation, and phone calls to stimulate interest were made to the subset of participants who registered for the online program but did not complete any courses. The EBP educational program consisted of 2 parts: 1) a series of online educational courses and 2) 4 monthly online booster lessons.

Online courses

The first part of the educational intervention consisted of a series of 15- to 40-min modules, divided into 3 general courses. The entire program contained a total of 18 modules and required an estimated total of 10 h to complete. While participants were encouraged to complete the courses at their own pace over 2 months, they were provided program access for the entire 7-month intervention period. The targeted learning objectives for the program focused on the EBP topics summarized in Table 1. The online modules were developed as part of an earlier project (R25AT003582) [4, 10] using a design-based research approach [19] focused on four adult learning theories: 1) events of instruction, 2) cognitive load, 3) dual processing, and 4) the ARCS theory of motivation [20–24]. The modules were designed using the reusable learning object (RLO) model, where RLOs are small, self-contained units of instruction that cover a limited set of related learning objectives. Short quizzes were provided at the end of each module to foster further learning.

Table 1 Detailed descriptions of the individual online educational modules and booster lessons

Booster lessons

Four online booster lessons, consisting of 30-min interactive presentations, provided opportunities for participants to review and practice their EBP skills (Table 1). Three months after the start of the intervention, participants were sent a monthly e-mail with a link to sign in to one of four online ‘booster lessons’ hosted on the Moodle platform. Each lesson consisted of a narrated PowerPoint presentation, which presented a case, posed critical-thinking questions, and offered exercises to complete using a worksheet supplied as a PDF attachment. The design of these lessons adhered to learning and motivational theories (with particular focus on relevance and confidence), and applied social learning theory, which posits that individuals learn by observing and imitating others [24]. Experts from the field (peer opinion leaders) were recruited to narrate and model desired behaviors in the booster lessons.

Data collection and measures

We collected demographic and baseline EBP information using online self-report questionnaires during our national cross-sectional survey [7]. Our effectiveness measures were the three subscales of the Evidence-Based Practice Attitude and Utilization SurvEy (EBASE), a self-report instrument to assess providers’ attitudes, skills and use of EBP [25]. The instrument has demonstrated good internal consistency, content validity, and acceptable test-retest reliability [25]. Three sections of this instrument generate subscores: Parts A (Attitudes), B (Skills) and D (Use). These subscores were used as dependent variables in statistical models to explore the effectiveness of the online education program in changing these outcomes. All participants were asked to complete 3 EBASE surveys over the course of the trial: at baseline, 9 months (Time 2), and 16 months (Time 3).

Feasibility data collection included course and booster session completion rates. A program evaluation survey was also administered to gain insight into participants’ satisfaction with the program. The survey included 23 items addressing the educational program overall, and the online course modules and boosters specifically, using a 5-point Likert scale (strongly disagree, disagree, neutral, agree, strongly agree).

Statistical analysis

Power analysis showed that a sample size of n = 125 per group would yield 88 % power to detect a between-groups, standardized effect size of 0.4 on EBASE subscore changes. We achieved this projected enrollment goal by randomizing a total of N = 293 participants: n = 147 in the Training Group and n = 146 in the Waitlist Group.
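As a cross-check, the reported power figure is reproducible with a standard two-sample calculation. A sketch using statsmodels (an assumed tool; the authors' power software is not named):

```python
from statsmodels.stats.power import TTestIndPower

# Two-sided, two-sample comparison: standardized effect size d = 0.4,
# n = 125 per group, alpha = 0.05 -> power of roughly 0.88,
# matching the 88 % reported in the paper.
power = TTestIndPower().power(effect_size=0.4, nobs1=125, alpha=0.05, ratio=1.0)
print(f"power = {power:.2f}")
```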

Descriptive statistics, including measures of central tendency and variability, were generated to examine the demographic characteristics and EBASE scores for the Training and Waitlist Groups. We examined the effectiveness of the educational program by comparing the differences in the between-group changes in EBASE attitudes, skills, and use subscores from baseline to Time 2 (9 months). We used general linear models with robust standard errors and controlled for the baseline EBASE subscore value, gender, practice focus, education, number of patients seen daily, and years in practice. These clinically relevant covariates were each correlated with the outcomes (p < 0.05). Prior to these analyses, we rescaled each EBASE subscore to a 100-point scale (0 = minimum; 100 = maximum) by min-max normalizing the data and multiplying by 100. This linear transformation put all 3 subscores on the same scale to facilitate interpretation of the results. All statistical analyses were performed using SPSS 22 (IBM Corp, Armonk, NY, USA) with statistical significance set to .05.
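To make the analytic model concrete, here is a minimal sketch of the rescaling and the adjusted between-group comparison, using pandas/statsmodels in place of SPSS; the file and column names are hypothetical, and HC3 is an assumed choice of robust covariance estimator:

```python
import pandas as pd
import statsmodels.formula.api as smf

def rescale_0_100(raw, scale_min, scale_max):
    """Min-max rescale a raw EBASE subscore onto a 0-100 scale."""
    return (raw - scale_min) / (scale_max - scale_min) * 100.0

# One row per participant; all names below are hypothetical placeholders.
df = pd.read_csv("deliver_ebase.csv")

# General linear model for the Time 2 skills subscore, adjusted for the
# baseline value and the personal/practice covariates named in the text.
model = smf.ols(
    "skills_t2 ~ group + skills_baseline + gender + practice_focus"
    " + education + patients_per_day + years_in_practice",
    data=df,
).fit(cov_type="HC3")  # heteroskedasticity-robust standard errors
print(model.summary())
```

The coefficient on `group` estimates the adjusted mean difference between arms, analogous to the Δ values reported in the Results.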

Although the primary endpoint for the between-group analysis was Time 2, a third EBASE (Time 3) was administered at 16 months, for two reasons. First, we wanted to explore descriptively the within-group changes in EBASE subscores following the program for the Waitlist Group (Time 3 − Time 2), to see if they were comparable to the change following the program in the Training Group (Time 2 − Time 1). Second, we wanted to explore the sustainability of the within-group changes in the Training Group, 7 months after they had completed the educational intervention (Time 3 − Time 2).

Feasibility was assessed by calculating frequencies and percentages of the number of courses and booster sessions completed for each group. These descriptive feasibility data were collected at Time 2 for the Training Group, and at Time 3 for the Waitlist Group. We tracked the total number of people in each group who completed each online course and booster lesson, which was reported as the frequency. We then divided the number of people completing the courses and booster lessons by the total number of people in that group to arrive at the completion rates.
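The completion-rate arithmetic is a simple proportion over everyone randomized to a group (not just registrants); a sketch, with input counts taken from the Results section below:

```python
def completion_rate(n_completed_all, n_randomized):
    """Percent of a randomized group completing all courses (or boosters)."""
    return 100.0 * n_completed_all / n_randomized

# Course figures reported in the Results:
print(round(completion_rate(71, 147)))  # Training Group: 48 (%)
print(round(completion_rate(61, 146)))  # Waitlist Group: 42 (%)
```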

To ease interpretation of the program evaluation surveys, we created dichotomous variables by collapsing the response categories ‘strongly agree’ and ‘agree’ into one ‘agree’ variable, and combining ‘neutral’, ‘disagree’, and ‘strongly disagree’ responses into one ‘disagree’ variable. Frequencies and percentages were then calculated using these dichotomous variables for each group.
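A sketch of this collapsing step, assuming pandas and that responses are stored as the text labels listed above (the column and frame names are hypothetical):

```python
import pandas as pd

AGREE = {"strongly agree", "agree"}

def dichotomize(responses: pd.Series) -> pd.Series:
    """Collapse a 5-point Likert item into the binary agree/disagree variable."""
    return responses.str.lower().map(
        lambda r: "agree" if r in AGREE else "disagree"
    )

# Hypothetical usage on one survey item:
# evaluation["relevant_binary"] = dichotomize(evaluation["relevant_to_profession"])
```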

Results

Participant characteristics

Table 2 summarizes the demographic characteristics of study participants, and shows that the groups were generally balanced at baseline. Participants had a mean age of 45.6 (±11.8) years, and tended to be white (96.6 %) and male (79.7 %). On average, participants had been in practice for 16.2 (±10.3) years and consulted 20.2 (±13.5) patients daily. Over 70 % had a practice with a musculoskeletal focus. Participants were also generally balanced with respect to their mean and median baseline EBASE subscores (Table 3), although the Waitlist Group showed slightly higher baseline values.

Table 2 Demographic characteristics by group
Table 3 Subscores from EBASE rescaled to a range of 0–100

Figure 1 details the flow of participants throughout the study. A total of 293 persons were randomized during the month of May 2013: 147 to the Training Group and 146 to the Waitlist Group. Two individuals immediately withdrew from the Waitlist Group after randomization, and an additional 22 were lost to follow-up at Time 2, prior to the invitation to register. A number of participants in both groups did not register for the program after receiving the registration invitation (Training: n = 21, Waitlist: n = 15); this resulted in registration rates for the online education program of 86 % for those randomized to the Training Group and 73 % for those randomized to the Waitlist Group.

Fig. 1 Participant flow through the DELIVER Study

In terms of data-collection follow-up, at Time 2 (the primary endpoint), 59 % of the Training Group completed the second EBASE and the Program Evaluation Survey, compared to 84 % of the Waitlist Group. At Time 3, 61 % of the Training Group and 60 % of the Waitlist Group completed the third EBASE questionnaire.

Preliminary effectiveness outcomes: EBP attitude, skills and use

Table 3 presents the three EBASE subscores for each group at each time point. Means with standard deviations and medians with interquartile ranges are reported, as well as between-group comparisons with 95 % confidence intervals. Medians were reported in addition to means due to the highly skewed distributions of the Attitudes and Use subscores. Attitudes subscores favored the Training Group, with an adjusted mean difference of 6.2 (p = .008) in the multi-covariate model. Skills subscores showed the largest advantage for the Training Group (Δ = 10.0, p < .001). There was no statistically significant difference between groups for the Use subscores (Δ = −2.3, p = .470). Table 3 also shows modest within-group improvements in the Attitudes and Skills EBASE subscores (but not the Use subscore) for the Waitlist Group from Time 2 to Time 3, comparable to those between baseline and Time 2 in the Training Group.

Feasibility outcomes

Feasibility was measured by rates of online EBP course completion and monthly booster session completion. Generally, completion rates among those randomized were low: only 71 (48 %) of those randomized to the Training Group completed all three courses, compared to 61 (42 %) in the Waitlist Group. For the boosters, 79 (54 %) of participants randomized to the Training Group, compared to 59 (41 %) in the Waitlist Group, failed to participate in any booster lessons. Only 23 (16 %) in the Training Group and 35 (24 %) in the Waitlist Group completed all four booster lessons.

Table 4 summarizes participants’ responses to the program evaluation survey. The majority of survey respondents agreed that the overall educational program was ‘relevant to their profession’ (84 %) and ‘was worthwhile’ (82 %). Participant views regarding specific aspects of the online course modules and the booster lessons are also shown in Table 4. In general, responses were more favorable toward the educational modules than the booster lessons.

Table 4 Responses to program evaluation survey of the EBP online educational program

Discussion

This is one of the first studies to systematically explore the effectiveness and feasibility of delivering an online EBP educational program for chiropractors, one of the most widely utilized CIH disciplines in the United States [26].

Preliminary effectiveness

Significant differences between the Training and Waitlist Groups were observed in self-reported EBP attitudes and skills, but there was no meaningful change in EBP use following exposure to the educational program. However, the magnitude of these significant group differences was only modest, and given that no established standard exists for what constitutes a meaningful difference in EBP outcomes, the potential impact of these findings is unknown.

The modest improvement in EBP attitudes and skills and the absence of any change in EBP use were unexpected findings, particularly in light of literature showing improvements in EBP knowledge, skills, attitudes, and behaviors following EBP training in other health disciplines, including allied health [27], nursing [28] and medicine [29]. There are several possible explanations for why our results differed from those of other studies.

First, it is possible that the small change in attitudes was due to self-selection bias and an associated ceiling effect. That is, participants were already favorably inclined towards EBP and motivated to take part in the study, resulting in high pre-intervention attitude subscores with little room for improvement. These same factors might also explain the modest change in EBP skills. Another explanation for the observed outcomes is that the previously developed online courses were designed as a ‘companion’ intervention to be used with other, more interactive educational venues such as classes, workshops, and other formats [30]. While the booster lessons were intended to complement the courses, they were designed to be completed asynchronously and independently of an instructor, with the intent of making them more amenable to broad-scale implementation. This limited the ability to provide individualized feedback, collaborative learning, and instructor-student interaction, which are considered important educational strategies from an adult learning perspective [17, 20, 31, 32].

Further, if our EBP educational program is considered within the context of relevant behavioral theories, social support may be a critical missing component. While the program did provide opportunities to enhance the capabilities critical for enacting and enabling EBP behaviors [33, 34], the format failed to provide in-depth social support to participants; such support, in the form of interaction with peers, peer-coaches and/or opinion leaders, may be important for advancing EBP in health professionals [35].

Our choice of outcome measure may also have influenced the observed results. Ideally, outcome measures should be both psychometrically sound and mapped specifically to what an intervention aims to achieve [36]. While the EBASE had been previously validated for CAM professions, some EBASE questions did not map well to a number of our EBP learning objectives. For example, the EBASE items related to conducting clinical research and systematic reviews contrasted with our program’s focus on information mastery. Conversely, some learning objectives addressed in our online modules and booster sessions were not measured by the EBASE, such as the sections of our modules that covered statistical concepts and the types of research questions answered by different research designs.

Given that the educational program was foundational in nature, we did not anticipate large changes in the EBASE ‘use’ domain. In fact, it is unlikely that any educational program will result in important changes in research use, due to the inherent complexity of EBP-related behaviors [2, 8]. Instead, EBP educational programs should be viewed as only one necessary component of behavioral change: the opportunity to address practitioners’ capabilities, specifically attitudes, knowledge and skills. Multi-factorial strategies will be required to address other individual and system-related issues in an ongoing and sustainable manner [2, 36].

Feasibility

Regarding our feasibility aims, we successfully developed and hosted a series of online educational modules and delivered monthly booster lessons to almost 300 chiropractors in different geographic regions of the US. However, we found generally poor compliance with completion of the online courses and booster lessons. The engagement data showed that a notable proportion of individuals (14–26 %) failed to register for the educational program after randomization. In this study, participants were required to take a number of online steps (clicks) to proceed from registration to commencement of the educational program, in order to accommodate the processing of continuing education credits. This may have proved demotivating to some, especially those unaccustomed to online education formats [37].

The engagement rates for those who did register for the program were also somewhat disappointing, especially given the efforts to encourage participation via e-mail and telephone reminders. Of those who registered for the program, 55–63 % completed at least one course, with 42–48 % completing all three courses. Participation in the booster lessons was far lower, with 32–33 % completing at least one lesson and only 16–24 % completing all four. Indeed, motivating individuals to complete online courses is a well-recognized challenge in education. Others have noted a 10–20 % greater drop-out rate for online courses compared to traditional classroom environments [38], and completion rates of massive open online courses (MOOCs) are a dismal 2–14 % [39].

Noteworthy is that, of those in both groups who completed at least one course in our program (n = 171), the majority went on to complete all three (n = 132). This observation, particularly in light of other reported online participation rates, suggests our program was successful once individuals overcame the initial obstacles to commencement.

Strengths and limitations

A strength of this study is the randomized, waitlist-controlled design with careful attention paid to the feasibility of implementation. Further, given that there has been little research investigating the effectiveness of online learning for improving EBP attitudes, knowledge, and behaviors [2], our study makes an important contribution to the EBP education literature by providing preliminary information regarding effectiveness. Additionally, we have provided a description of the intervention [10] in accordance with the Guideline for Reporting Evidence-based practice Educational interventions and Teaching (GREET), the purpose of which is to guide the design and interpretation of EBP educational interventions [2, 28, 40, 41]. This will aid others in interpreting the results and in optimizing and testing future EBP education programs. Limitations of the study include the generally low data-collection follow-up and intervention engagement rates. This study was designed to be pragmatic in nature, reflecting how an EBP online education program could be implemented in real-world settings. Future studies aiming to establish the efficacy and effectiveness of such programs should take additional steps to bolster participation.

Implications

The results of this study have several implications for those embarking on future EBP educational initiatives, particularly those aimed at broader-scale implementation, regardless of healthcare discipline. Future programs should consider including moderated online discussions and other interactive methods as ways to provide individualized feedback, collaborative learning and instructor-student interaction, all of which are known to facilitate adult learning [17, 20, 42, 31, 32]. Further, methods targeting interaction with peers, peer-coaches and/or opinion leaders should be considered [29]. Importantly, those aiming to rigorously evaluate the effectiveness of EBP programs should carefully weigh the strengths and limitations of the range of existing EBP outcome measures, in order to ensure alignment with their program competencies [17, 18]. Also, greater attention should be paid to streamlining registration processes, especially for online programs, to facilitate ease of use. Finally, to optimize participation in future EBP programs, a better understanding of what motivates practitioners to engage in online educational activities would be advantageous. A manuscript addressing barriers and facilitators to participation in this study will be reported separately.

Conclusion

This exploratory study found that an EBP educational program can be delivered in an online format; however, relatively poor engagement suggests several barriers exist that need to be considered in future research and implementation efforts. Our online educational program resulted in small positive improvements in chiropractors’ attitudes and skills in EBP, but not their level of EBP uptake (use). This suggests that such online programs can provide an opportunity to enhance practitioners’ motivation and capacity for EBP, but that online education alone is not sufficient to make large changes in EBP use behaviors. The feasibility and effectiveness results from this study can be used to design and refine future, more robust EBP training and implementation initiatives that can have a larger impact on a broader scale.

The online courses described in this manuscript are freely available at http://www.csh.umn.edu/research/foundations-evidence-informed-practice-modules.

References

1. Sackett DL, Straus SE, Richardson WS, Rosenberg W, Haynes RB. Evidence-based medicine. New York: Churchill Livingstone; 2000.
2. Young T, Rohwer A, Volmink J, et al. What are the effects of teaching evidence-based health care (EBHC)? Overview of systematic reviews. PLoS One. 2014;9(1):e86706.
3. Clarke TC, Black LI, Stussman BJ, et al. Trends in the use of complementary health approaches among adults: United States, 2002–2012. Natl Health Stat Rep. 2015;79:1–16.
4. Evans R, Maiers M, Delagran L, et al. Evidence informed practice as the catalyst for culture change in CAM. Explore (NY). 2012;8(1):68–72.
5. Hall G. Attitudes of chiropractors to evidence-based practice and how this compares to other healthcare professionals: a qualitative study. Clin Chiropr. 2011;14:106–11.
6. Suter E, Vanderheyden LC, Trojan LS, Verhoef MJ, Armitage GD. How important is research-based practice to chiropractors and massage therapists? J Manipulative Physiol Ther. 2007;30:109–15.
7. Schneider MJ, Evans R, Haas M, et al. US chiropractors’ attitudes, skills and use of evidence-based practice: a cross-sectional national survey. Chiropr Man Therap. 2015;23:16.
8. Ubbink DT, Guyatt GH, Vermeulen H. Framework of policy recommendations for implementation of evidence-based practice: a systematic scoping review. BMJ Open. 2013;3(1). doi:10.1136/bmjopen-2012-001881.
9. Sadeghi-Bazargani H, Tabrizi JS, Azami-Aghdash S. Barriers to evidence-based medicine: a systematic review. J Eval Clin Pract. 2014;20(6):793–802.
10. Delagran L, Vihstadt C, Evans R. Aligning theory and design: the development of an online learning intervention to teach evidence-based practice for maximal reach. Glob Adv Health Med. 2015;4(5):40–9.
11. Zwickey H, Schiffke H, Fleishman S, Haas M, Cruser DA, Lefebvre R, Sullivan B, Taylor B, Gaster B. Teaching evidence-based medicine at complementary and alternative medicine institutions: strategies, competencies, and evaluation. J Altern Complement Med. 2014;20(12):925–31.
12. Sullivan BM, Furner SE, Cramer GD. Development of a student-mentored research program between a complementary and alternative medicine university and a traditional, research-intensive university. Acad Med. 2014;89(9):1220–6.
13. Haas M, Leo M, Peterson D, Lefebvre R, Vavrek D. Evaluation of the effects of an evidence-based practice curriculum on knowledge, attitudes, and self-assessed skills and behaviors in chiropractic students. J Manipulative Physiol Ther. 2012;35(9):701–9.
14. Lefebvre RP, Peterson DH, Haas M, Gillette RG, Novak CW, Tapper J, Muench JP. Training the evidence-based practitioner: University of Western States document on standards and competencies. J Chiropr Educ. 2011;25(1):30–7.
15. Long CR, Ackerman DL, Hammerschlag R, Delagran L, Peterson DH, Berlin M, Evans RL. Faculty development initiatives to advance research literacy and evidence-based practice at CAM academic institutions. J Altern Complement Med. 2014;20(7):563–70.
16. Slawson DC, Shaughnessy A. Teaching evidence-based medicine: should we be teaching information management instead? Acad Med. 2005;80:685–9.
17. Mezirow J. Learning to think like an adult: core concepts of transformation theory. In: Mezirow J and Associates, editors. Learning as transformation. San Francisco: Wiley; 2000.
18. Dawes M, Summerskill W, Glasziou P, et al. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5(1):1.
19. Dolmans DH, Tigelaar D. Building bridges between theory and practice in medical education using a design-based research approach: AMEE Guide No. 60. Med Teach. 2012;34(1):1–10.
20. Gagne RM. The conditions of learning. 3rd ed. New York: Holt, Rinehart and Winston; 1977.
21. Sweller J. The psychology of learning and motivation: cognition in education. Vol. 55. San Diego: Elsevier; 2011.
22. Keller JM. Motivational design for learning and performance: the ARCS model. New York: Springer; 2009.
23. Keller JM, Suzuki K. Learner motivation and e-learning design: a multi-nationally validated process. J Educ Media. 2004;29(3):229–39.
24. Bandura A. Self-efficacy: the exercise of control. New York: W.H. Freeman; 1997.
25. Leach MJ, Gillham D. Evaluation of the evidence-based practice attitude and utilization survey for complementary and alternative medicine practitioners. J Eval Clin Pract. 2008;14:792–8.
26. Weeks WB, Goertz C, Meeker W, Marchiori D. Public perceptions of doctors of chiropractic: results of a national survey and examination of variation according to respondents’ likelihood to use chiropractic, experience with chiropractic, and chiropractic supply in local health care markets. J Manipulative Physiol Ther. 2015;38(8):533–44.
27. Dizon JMR, Grimmer-Somers KA, Kumar S. Current evidence on evidence-based practice training in allied health: a systematic review of the literature. Int J Evid Based Healthc. 2012;10(4):347–60.
28. Mooney S. The effect of education on evidence-based practice and nurses’ beliefs/attitudes toward and intent to use evidence-based practice. Doctoral dissertation. Boiling Springs, NC: Gardner-Webb University; 2012.
29. Sprague S, Pozdniakova P, Kaempffer E, Saccone M, Schemitsch EH, Bhandari M. Principles and Practice of Clinical Research course for surgeons: an evaluation of knowledge transfer and perceptions. Can J Surg. 2012;55:46–52.
30. Evans R, Delagran L, Maiers M, et al. Advancing evidence informed practice through faculty development: the Northwestern Health Sciences University model. Explore (NY). 2011;7(4):265–8.
31. Akyol Z, Garrison DR. Educational communities of inquiry: theoretical framework, research and practice. Hershey: IGI Global; 2013.
32. Johnson DW, Johnson RT, Smith KA. Cooperative learning returns to college: what evidence is there that it works? Change. 1998;30(4):26–35.
33. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42.
34. Cialdini RB. Influence: the psychology of persuasion. New York: HarperCollins; 2007. p. 59.
35. Frantsve-Hawley J, Meyer DM. The evidence-based dentistry champions: a grassroots approach to the implementation of EBD. J Evid Based Dent Pract. 2008;8(2):64–9.
36. Craig P, Dieppe P, Macintyre S, et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. Int J Nurs Stud. 2013;50(5):587–92.
37. Greenhalgh T. Computer assisted learning in undergraduate medical education. BMJ. 2001;322(7277):40–4.
38. Herbert M. Staying the course: a study in online student satisfaction and retention. Online J Distance Learn Adm. 2006;9(4).
39. Perna L, Ruby A, Boruch R, et al. The life cycle of a million MOOC users. Presented at the MOOC Research Initiative Conference, 5 December 2013. Accessed 17 Jan 2016 at: http://k12accountability.org/resources/Online-Education/perna_ruby_boruch_moocs_dec2013.pdf
40. Phillips AC, Lewis LK, McEvoy MP, et al. A Delphi survey to determine how educational interventions for evidence-based practice should be reported: stage 2 of the development of a reporting guideline. BMC Med Educ. 2014;14:159.
41. Phillips AC, Lewis LK, McEvoy MP, et al. A systematic review of how studies describe educational interventions for evidence-based practice: stage 1 of the development of a reporting guideline. BMC Med Educ. 2014;14:152.
42. Newman I, Benz CR. Qualitative-quantitative research methodology: exploring the interactive continuum. Carbondale, IL: Southern Illinois University Press; 1998. p. 158–65.


Acknowledgements

This research was made possible by Grant Number R21AT007547 from the National Center for Complementary and Integrative Health (NCCIH; formerly NCCAM) at the National Institutes of Health (NIH). The views expressed in this article are solely those of the authors and do not necessarily represent the official views of the NCCIH, NCCAM or NIH.

We would like to acknowledge the assistance of our research coordinator Christine McFarland and senior research associate Kris Gongaware with the management of the survey website and data collection. We would also like to acknowledge several faculty members from the participating chiropractic institutions for their assistance with developing the booster lessons: Barry Taylor, John Stites, Thomas Grieve, David Peterson, and Ron LeFebvre.

Authors’ contributions

MS was the principal investigator of the DELIVER study and was responsible for securing the funding and administration of the research grant. He was responsible for data collection, analysis, a majority of the manuscript preparation, and contributed substantial revisions toward the final manuscript resubmission. RE contributed to the conceptualization, design and funding acquisition of this work; she participated in data collection implementation and monitoring as well as decisions regarding data analysis and interpretation of results. She worked with the primary author to prepare the manuscript for publication, contributed content to the background and discussion sections, and contributed substantial revisions toward the final manuscript resubmission. MH was responsible for assisting in designing the study, and interpretation of findings, as well as drafting and editing the manuscript. He also contributed substantial revisions toward the final manuscript resubmission. LD served as the instructional design consultant to this study and was involved with conceptualizing and designing the booster lessons. She also contributed to the editing and final manuscript preparation. GC contributed to the initial concept and design of the study, was involved in meetings assessing progress, and critically reviewed the drafts of the manuscript, including data analyses. CH contributed to the initial concept and design of the study, interpretation of findings, as well as editing the manuscript. ML was involved in the design and funding of the study, development of the outcome measure, drafting of the methods, and editing of the draft and final manuscript. CL contributed to the initial concept and design of the study, was involved in meetings assessing progress, and critically reviewed the drafts of the manuscript, including data analyses. CV worked with the primary author and co-authors to prepare the manuscript. She also participated in the implementation of Phase II of the DELIVER study. OW made significant contributions to the conception and design, acquisition and analysis of data; he was involved in drafting/revising the manuscript and gave final approval of the version to be published. LT performed the statistical analyses of the data and assisted with the interpretations of the results. She prepared the tables and worked with the primary author to write the methods, statistical analyses, and results sections of the manuscript. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.


Corresponding author

Correspondence to Michael Schneider.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
