
A failed review of CCE site inspection standards and processes



Background

Accreditation of educational programs involves an independent agency assessing quality against a set of defined standards. Site inspection teams are appointed by an accrediting agency and compile a report with the intention of identifying deficiencies and making recommendations for their rectification and continued improvement. For chiropractic programs, accreditation is carried out by Councils on Chiropractic Education (CCEs). However, the reliability of their site inspection teams remains unknown. Recent research has suggested that variability in chiropractic practice may be partially traced back to the education provider. This raises the possibility of deficient accreditation procedures, which may include unsatisfactory site inspection standards or processes, or the accreditation standards to which they work.

We sought to compare the various CCEs' documented standards and processes for site inspection teams for similarities and differences, with the intent of making recommendations to create uniform, high-quality standards. Further, we sought to compare a sample of CCEs' site inspection team surveys/reports for commonly identified recommendations and quality improvements, and to determine whether these are adequately described in their accreditation standards.


Methods

In December 2018, invitation emails were sent to four CCEs through their website portals outlining a proposed study investigating site inspection teams' standards and processes. Access was requested to all appropriately redacted documentation relating to site inspection teams and their chiropractic program reports. Follow-up emails were sent several weeks later.


Results

Only one of the four CCEs responded by providing the requested information.

Conclusion and recommendations

Three CCEs did not cooperate with this educational research. The possible reasons for this non-engagement are discussed.


Background

The number of chiropractic education programs has increased considerably over the past decades. Some are run as private initiatives; some are included in university structures, sometimes even within a medical school setting. Concomitantly, perhaps both as a result and a driver, evidence-based practice approaches have become important and the traditional chiropractic vitalistic approaches have been downgraded, at least ‘officially’.

Accreditation standards and inspection procedures have been established worldwide by Councils on Chiropractic Education (CCEs) to safeguard standards and to ensure harmonisation between schools and geographical regions. CCEs are variously mandated and exist for the purposes of assuring educational quality and institutional integrity to governments, regulatory bodies, chiropractic programs, professional organizations, students and the public at large. Members of the CCEs may be elected or appointed. Medical education has adopted science and evidence-based practice as a basis for training [1]. This is not always the case for chiropractic education. Some CCE-accredited chiropractic programs have aligned with an evidence-based mainstream healthcare approach and declared that, since there is no evidence for vitalist or subluxation beliefs, these have no place in chiropractic training except as historical context [2]. Other CCE-accredited colleges have continued to hold to a vitalist philosophy. Some even openly advertise vitalistic statements, such as the New Zealand College of Chiropractic's “To offer well-resourced, integrated, relevant, evidence based curriculum with the correction of vertebral subluxation as the primary chiropractic aim” [3] and “The purpose of chiropractic is to help people reach and maintain excellent health and wellbeing through the care of the spine and nerve system” [4].

This has implications for patient safety and quality of care. Vitalist or subluxation trained chiropractors have been characterised by excessive X-ray usage, anti-vaccination beliefs, and poor levels of inter-professional / disciplinary communication [5, 6].

This raises two questions:

  i. How can these vitalistic educational practices occur?

  ii. Why are these issues not discovered and dealt with during the CCE inspections?

In our opinion, this suggests two possibilities: either chiropractic accreditation standards are not addressing such issues adequately, or monitoring and site inspection processes are inadequate. Site inspection teams, appointed by the accrediting agency, are the mechanism by which program performance is assessed against prescribed standards. Such teams compile a report with the intention of identifying deficiencies and making recommendations for their rectification and continued improvement.

We have previously conducted a systematic review of these accreditation standards, and made a series of recommendations with the intention of improving their uniformity and quality [7]. The intent of this study was to conduct a similar systematic review of the site inspection standards and processes to the same end.

Medical education is frequently ‘overseen’ by accreditation agencies, whether governmental or private. This activity rests on (among other things) the relevance of inspection, the skills and knowledge of the inspection teams, and the subsequent use made of any recommendations for improvement [8]. Obviously, accreditation and re-accreditation surveys depend on the expertise of the people involved and are therefore at risk of becoming subjective and even invalid. Historically the reliability of inspection team surveys has been unknown and difficult to study [9]. Nonetheless it is recognised as an important area for further attention as it is under-investigated [10,11,12].

Research to date has identified several factors that are thought to increase the likelihood of improved outcomes of this process. These include having the processes for inspection surveys clearly outlined, standardized and consistently applied to the accreditation standards [9]; strong communication skills within an experienced team, whose members should be temporary or replaceable so as to promote allegiance to the accrediting organisation [11]; and detailed training and mentoring for the teams [13]. Yet another paper suggested that team members should have extensive experience in the profession, a minimum period in high managerial positions (ranging from 2 to 5 years), and profession-specific certification [8].

Intrinsic to the accreditation process of chiropractic, as performed by the various CCEs, are the site inspection team's appointment, training, co-ordination, quality control and review, and the implementation of the survey team's final report. However, we could not find any previous work in the PubMed, Scopus or ChiroIndex databases on the monitoring aspect of the tasks of the CCEs, nor with respect to site visitations for accreditation or re-accreditation purposes of chiropractic programs.

Clearly, regardless of how good the accreditation standards are, unless the monitoring process is relevant, consistent and effective, they will not be reinforced in teaching institutions that prefer to deviate in other directions.

In view of these problems, we wanted to see if there is an appropriate and comprehensive approach to inspecting chiropractic programs for re-accreditation by site inspection / surveys by CCEs.


Aim

The aim of this systematic audit was to investigate similarities and differences between the various CCEs' site inspection team documentation and processes and to compare these to known quality standards and the available evidence.


Objectives

The objectives were to:

  1. Review and compare the available site team inspection documentation from each CCE to look for similarities and differences.

  2. Review and compare a sample of CCEs' site inspection team surveys/reports for commonly identified recommendations and quality improvements and determine if they are described in their respective accreditation standards.

  3. Make recommendations that would create a high-quality set of site inspection team standards and processes that is consistent with known best standards and evidence.

Methods and analysis

Ethics approval was obtained from Murdoch University Human Research and Ethics Committee (2018/238) for this study.

We intended to conduct a systematic audit to investigate the three objectives.

Eligibility criteria

The World Health Organization recommends the CCE-International as the source of information regarding evaluation of chiropractic education [14]. Consequently, we included all those CCEs who were members in good standing. The CCE-USA was a member in good standing of the CCE-I from its inception until 2015 and is home to the largest number of chiropractic programs [7]. Also, the CCE-USA Accreditation Standards were released in 2013 and remain current [15]. Consequently, the CCE-USA was included in the analysis.

Thus four CCEs were included in total: the Council on Chiropractic Education Australasia (CCE-Australasia) [16], the Council on Chiropractic Education Canada (CCE-Canada) [17], the Council on Chiropractic Education (CCE-USA) [15], and the European Council on Chiropractic Education (ECCE) [18]. A fifth CCE was identified, the Councils on Chiropractic Education International (CCE-I), which is a federation of the other four CCEs. The CCE-I does not conduct site inspections and was not included.


An email invitation with the study information was sent to the four CCEs (CCE-A, CCE-C, CCE-USA, ECCE) via their website portals in mid-December 2018 (Additional file 1). A follow-up email was sent to the non-responders in mid-January 2019.

Data extraction process and synthesis of results

Objective 1. A systematic audit of CCE site inspection team documentation and processes

Method: All CCEs were approached and asked for copies of their documentation related to site inspection of chiropractic programs. CCEs were asked to de-identify all data using redaction. Once obtained, these data would be further scrutinized to ensure no identifying information remained. The data would then be recorded and tabulated for a comparative analysis. This method had been used in three previous systematic reviews investigating CCEs' accreditation standards [19,20,21]. For those studies the documentation used in the analysis was publicly accessible from CCE websites. The comparison table was to be structured to identify similarities and differences with respect to the following elements:

  i. Team composition

  ii. Team selection criteria for chiropractors and consumer members

  iii. Team training/instruction

  iv. Report construction

The findings were to be compared to inspection team survey documentation from another widely known and recognised medical accreditation organisation, such as the Australian Medical Council [22] or the Accreditation Council for Graduate Medical Education [23].

This was to allow for the identification of similarities and differences in CCEs site inspection standards and processes.

Objective 2. A thematic analysis of CCEs site inspection team final reports

Method: We sought to obtain copies of CCEs' site inspection team final reports on chiropractic programs for the last 5 years from CCE-Australasia, CCE-Canada, CCE-USA and the ECCE. The reports were to be coded to identify themes and de-identified to ensure confidentiality.

Two researchers would independently review the reports and place each recommendation made by the site inspection team under the appropriate Accreditation Standard.

The two researchers would then compare their decisions for report recommendation placement and then discuss any differences. A third reviewer was available to resolve any instances when a recommendation could not be placed.
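The two-reviewer placement-and-reconciliation step described above can be summarised quantitatively with a simple inter-rater agreement check. The sketch below is purely illustrative: the reviewer placements, the standard names, and the use of Cohen's kappa are our assumptions, not part of the study protocol.

```python
from collections import Counter

def cohen_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category assignments."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Expected agreement if both coders assigned categories independently
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical placements of six report recommendations under accreditation standards
reviewer_1 = ["Std2", "Std5", "Std2", "Std7", "Std1", "Std5"]
reviewer_2 = ["Std2", "Std5", "Std3", "Std7", "Std1", "Std5"]

raw = sum(a == b for a, b in zip(reviewer_1, reviewer_2)) / len(reviewer_1)
kappa = cohen_kappa(reviewer_1, reviewer_2)

# Placements the reviewers disagree on would go to the third reviewer
disputed = [i for i, (a, b) in enumerate(zip(reviewer_1, reviewer_2)) if a != b]

print(f"raw agreement = {raw:.2f}, kappa = {kappa:.2f}, disputed items = {disputed}")
```

Raw agreement alone overstates consistency when a few standards absorb most recommendations; kappa corrects for that chance agreement, which is why the sketch reports both.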

The process allows an understanding of whether the inspection team reports are comprehensive and consistent when compared to the accreditation standards. Identified ‘gaps’ could inform future iterations of accreditation standards as well as CCEs training of site inspection teams.

Objective 3. Recommendations for creation of a high-quality set of site inspection team standards and processes

By conducting this comparative process and considering the evidence, similarities and differences between the CCEs' standards and processes and other quality non-chiropractic inspection standards and processes could be identified, enabling recommendations for a uniform, high-quality international set of site inspection standards, report construction and processes.


Results

Responses were received from three of the four CCE organisations.

The CCE-USA referred us to their website and declined to forward any further material. Information on their website related to site inspection standards and processes but did not include training or recruitment data. This CCE also declined to provide site inspection team reports, as they deemed these reports to be disclosed at the discretion of the chiropractic program (CP) only. It was also claimed that this was to protect the confidentiality of CCE members, CCE office staff and CPs, and to comply with the overarching bodies’ policy of not making accreditation activities open to the public for release.

The CCE-Canada did not respond to our request despite further correspondence.

The CCE-Australasia informed us that this material was confidential in nature and could not be reproduced or commented on in any public domain or format. This CCE also took the view that site inspection reports were confidential and released only at the discretion of the CP, and that this related to the integrity of the accreditation process, which depends, in part, on the Council and the Committee maintaining confidentiality with all aspects of the process other than the reporting of procedure and decisions. We recontacted this CCE and asked them to reconsider their decision on the grounds that their overarching government organisation had a policy on research in which this area of investigation was identified as a high priority, and that this study would meet all confidentiality and privacy requirements, with material redacted and released not to the public but to researchers. The CCE-Australasia responded: “The CCE has considered your response and we re-confirm our original position as advised previously”.

The ECCE readily complied and forwarded all site inspection documentation, including their training information. This also included site inspection team reports that were, in fact, also available on their website.

This resulted in a complete and usable set of the desired documents from only one of the four CCEs.


Discussion

Summary of findings

We contacted four CCEs with a letter explaining a study to investigate CCEs' site inspection team standards, processes and documentation, as well as seeking copies of reports from chiropractic program inspections. This letter presented the reasons and rationale for the study and the ethics approval, and addressed the sensitive area of confidentiality and anonymity. Only one of the four CCEs responded positively and agreed to participate.

Potential explanations

The unwillingness to participate is puzzling, as the four CCEs are undertaking the same task, with the same goals, using supposedly uniform standards. So, while one of the CCEs was completely transparent, the other three were not. Clearly, this makes it difficult for the CCEs to compare and align activities between themselves. This compares poorly with a similar study of dental programs in Australia in 2007, where such inspection team reports were readily made available using the same research methodology [24].

We considered several possible explanations for the non-participation of the CCEs. The first is that these organizations do not differentiate between the public and researchers. However, when we made this differentiation in a repeat request, one of the non-participating CCEs did not change its mind. Obviously, each CCE had control over ensuring that confidentiality requirements were met, as they were invited to remove any sensitive information before forwarding the required documentation. Perhaps there was a distrust of the authors by the CCEs, or an overcautiousness that resulted in a decision to be wary despite the authors having university ethics approval.

It is tempting to speculate that these responses of ‘silence’ and ‘public confidentiality’ were stimulated by a lack of confidence in their own standards and processes for site inspection of the various chiropractic programs. This may have resulted in an approach to camouflage this possibility by refusing to participate. Alternatively, these organizations may be confident in their established standards and methods and have no desire to change them.

It is argued by some that transparency of accreditation findings motivates educational or healthcare programs to hide any shortcomings to avoid ‘brand damage’ that would result in a reduced capacity to compete in a competitive marketplace [25]. This could be less important in regions where student enrolments outnumber the available places in CPs.

Proponents of confidentiality hold to the view that an environment without negative public consequences improves the likelihood of disclosure by CPs to CCEs [26, 27]. The resultant openness between agency and educator produced by this ‘safe environment’ is thought to be conducive to a collaborative working relationship that increases the chances of quality improvements [28]. However, the evidence for this is elusive.

The question for us, as bona fide educational researchers, is why one of the four CCEs trusted the authors and agreed to participate fully while the other three did not. We suspect that cultural differences may be one factor, as the positively responding CCE already transparently publishes inspection team reports on its website. It is also possible that there are differences in the composition of the CCEs' executives that influence the willingness to subject themselves to scrutiny, with the possibility of having to change and improve. Those who argue in favour of such transparency believe the public and students have a right in a democratic society to know of educational practice standards [29]. They also argue that the possibility of positive reports motivates educational programs to strive to improve themselves and increase their marketability [26, 28].


Conclusions

We could not compare the CCE site inspection standards and processes because only one CCE provided information. We conclude that there is a wide discrepancy in transparency between CCEs on the topic of site inspections.

Future research to develop a clearer understanding of the internal machinations and thinking of CCEs with respect to such matters may be best obtained through interviews with key personnel.

Availability of data and materials

Not applicable



Abbreviations

CCE: Council on Chiropractic Education

CCE-I: Council on Chiropractic Education – International

CCEs: Councils on Chiropractic Education

CP: Chiropractic Program

ECCE: European Council on Chiropractic Education


References

  1. WFME Global standards for quality improvement, 2015 revision.

  2. Clinical and professional education: a position statement.

  3. Strategic Vision.

  4. Introducing Chiropractic.

  5. McGregor M, Puhl AA, Reinhart C, Injeyan HS, Soave D. Differentiating intraprofessional attitudes toward paradigms in health care delivery among chiropractic factions: results from a randomly sampled survey. BMC Complement Altern Med. 2014;14:51.

  6. Puhl AA, Reinhart CJ, Doan JB, McGregor M, Injeyan HS. Relationship between chiropractic teaching institutions and practice characteristics among Canadian doctors of chiropractic: a random sample survey. J Manip Physiol Ther. 2014;37(9):709–18.

  7. Innes SI, Leboeuf-Yde C, Walker BF. Comparing the old to the new: a comparison of similarities and differences of the accreditation standards of the chiropractic council on education-international from 2010 to 2016. Chiropr Man Therap. 2018;26:25.

  8. Plebani M. Role of inspectors in external review mechanisms: criteria for selection, training and appraisal. Clin Chim Acta. 2001;309(2):147–54.

  9. Greenfield D, Pawsey M, Naylor J, Braithwaite J. Researching the reliability of accreditation survey teams: lessons learnt when things went awry. HIM J. 2013;42(1):4–10.

  10. Shaw CD. Evaluating accreditation. Int J Qual Health Care. 2003;15(6):455–6.

  11. Greenfield D, Pawsey M, Naylor J, Braithwaite J. Are accreditation surveys reliable? Int J Health Care Qual Assur. 2009;22(2):105–16.

  12. Pomey MP, Contandriopoulos AP, Francois P, Bertrand D. Accreditation: a tool for organizational change in hospitals? Int J Health Care Qual Assur Inc Leadersh Health Serv. 2004;17(2–3):113–24.

  13. Frisino J. COA's accreditation system: checks, balances, and firewalls. Behav Health Accred Account Alert. 2002;7(4):1–4.

  14. WHO. WHO guidelines on basic training and safety in chiropractic. In: WHO Library Cataloguing-in-Publication Data. 1st ed. Geneva: World Health Organization; 2005. p. 51.

  15. CCE Accreditation Standards.

  16. Council on Chiropractic Education Australasia (CCEA). Accreditation Standards for Chiropractic Programs. Canberra: Council on Chiropractic Education Australasia; 2017.

  17. Accreditation of Doctor of Chiropractic Programmes.

  18. Accreditation Procedures and Standards in First Qualification Chiropractic Education and Training.

  19. Innes SI, Leboeuf-Yde C, Walker BF. Similarities and differences of graduate entry-level competencies of chiropractic councils on education: a systematic review. Chiropr Man Therap. 2016;24:1.

  20. Innes SI, Leboeuf-Yde C, Walker BF. Similarities and differences of a selection of key accreditation standards between chiropractic councils on education: a systematic review. Chiropr Man Therap. 2016;24:46.

  21. Innes SI, Leboeuf-Yde C, Walker BF. How comprehensively is evidence-based practice represented in councils on chiropractic education (CCE) educational standards: a systematic audit. Chiropr Man Therap. 2016;24(1):30.

  22. Australian Medical Council Ltd.

  23. Willoughby J, Nguyen V, Bockenek WL. Assessing competency in physical medicine and rehabilitation residency: the ACGME milestones initiative. AMA J Ethics. 2015;17(6):515–20.

  24. Arena G, Kruger E, Tennant M. Accreditation of dental programs in Australia: a thematic analysis of recommendations, 1996-2004. J Dent Educ. 2007;71(9):1210–6.

  25. Hinchcliff R, Greenfield D, Hogden A, Sarrami-Foroushani P, Travaglia J, Braithwaite J. Levers for change: an investigation of how accreditation programmes can promote consumer engagement in healthcare. Int J Qual Health Care. 2016;28(5):561–5.

  26. Levay C, Waks C. Professions and the pursuit of transparency in healthcare: two cases of soft autonomy. Organ Stud. 2009;30(5):509–27.

  27. Narins CR, Dozier AM, Ling FS, Zareba W. The influence of public reporting of outcome data on medical decision making by physicians. Arch Intern Med. 2005;165(1):83–7.

  28. Eindhoven DC, Wierda E, de Bruijne MC, Amoroso G, de Mol BAJM, Umans VAWM, Schalij MJ, Borleffs CJW. The year of transparency: measuring quality of cardiac care. Neth Heart J. 2015;23(10):457–65.

  29. Greenfield D, Braithwaite J. Developing the evidence base for accreditation of healthcare organisations: a call for transparency and innovation. BMJ Qual Saf. 2009;18(3):162.





Acknowledgements

Not applicable.

Author information

Authors and Affiliations



Contributions

SI, BW and CLY were responsible for the study design. SI and CLY undertook the data analysis and interpretation. SI developed the initial and iterative draft. BW and CLY were responsible for reviewing and redrafting the final manuscript. All contributed to the final version. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Stanley I. Innes.

Ethics declarations

Ethics approval and consent to participate

Ethics approval was granted from Murdoch University Human Research and Ethics Committee (Project No 2018/238).

Consent for publication

Not applicable

Competing interests

Bruce Walker (BFW) is Editor-in-Chief and Charlotte Leboeuf-Yde (CLY) is Senior Editorial Adviser of the journal Chiropractic & Manual Therapies. Neither played any part in the assignment of this manuscript to Associate Editors or peer reviewers and are separated and blinded from the editorial system from submission inception to decision.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional File 1:

Study Invitation (DOCX 186 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Innes, S.I., Leboeuf-Yde, C. & Walker, B.F. A failed review of CCE site inspection standards and processes. Chiropr Man Therap 27, 49 (2019).
