
Database coverage and their use in systematic reviews regarding spinal manipulative therapy: an exploratory study

Abstract
Systematic reviews (SRs) of randomized controlled trials (RCTs) are considered one of the most reliable study types. Through a systematic and thorough literature search, researchers aim to collect all research relevant to their purpose. The selection of databases can be challenging and depends on the topic of interest. The Cochrane Handbook suggests searching at least the following three databases: Cochrane Library, MEDLINE, and EMBASE. However, this is not always sufficient for reviews in the musculoskeletal field in general.

This study aimed to examine the frequency and choice of databases used by researchers in SRs of spinal manipulative therapy (SMT) and, secondly, to analyze the RCTs included in those SRs to determine the optimal combination of databases for conducting efficient literature searches for SRs of SMT.

Methods
SRs investigating the effect of SMT on any patient-reported outcome measure were identified through searches in PubMed and Epistemonikos (all entries up to the date of search, February 25, 2022). For each SR, the databases searched and the included RCTs were collected. Each RCT was searched for individually in nine databases (Cochrane Library, MEDLINE/PubMed, EMBASE, Google Scholar, CINAHL, Web of Science, Index to Chiropractic Literature, PEDro, and AMED). Coverage rates were calculated as the number of RCTs retrieved by a database, or combination of databases, divided by the total number of RCTs.

Results
Eighty-five published SRs met the inclusion criteria, and 442 unique RCTs were retrieved. The most frequently searched database was MEDLINE/PubMed. Cochrane Library had the highest overall coverage rate and contained the third-most unique RCTs. While 100% retrieval was not possible, as 18 RCTs could not be retrieved from any of the nine databases, the combination of Cochrane Library, Google Scholar, and PEDro retrieved all retrievable RCTs, with a combined coverage rate of 95.9%.

Conclusion
For SRs on SMT, we recommend using the combination suggested by the Cochrane Handbook (Cochrane Library, MEDLINE/PubMed, and EMBASE) with the addition of PEDro and the Index to Chiropractic Literature. Google Scholar may additionally be used as a tool for searching gray literature and for quality assurance.

Background
Systematic reviews (SRs) of randomized controlled trials (RCTs) are widely accepted to sit at the top of the evidence hierarchy [1, 2]. They are cornerstones of evidence-based healthcare [3] and evidence-based research [4], condensing all relevant and available evidence on a topic and, by combining sample sizes, drawing more general conclusions about a broader population while reducing bias [5]. To collect all relevant studies, a comprehensive literature search must be conducted, and researchers are generally advised to search multiple databases and use additional methods such as citation tracking, contacting experts in the field, and searching gray literature [6,7,8,9,10,11,12]. As the Cochrane Handbook for Systematic Reviews of Interventions highlights, leaving out relevant evidence can lead to selection bias. Cochrane therefore recommends searching at least the following three databases: The Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, and EMBASE [11]. However, these recommendations may not sufficiently cover all relevant aspects of the research question. Some types of research or research topics may only be found in specialty journals that are not indexed in all databases [13]. An example is literature related to chiropractic and, more specifically, spinal manipulative therapy (SMT) [14]. SMT is a guideline-recommended conservative therapy used worldwide by various practitioners, including chiropractors, osteopaths, and physiotherapists, typically to treat low back pain, neck pain, and headache [15, 16]. Furthermore, its procedures and theoretical frameworks have developed substantially over the last century [17]. It is not unlikely that specific papers are published only in journals related to those professions and are thereby found only in the corresponding databases.

In contrast, searching too many databases has clear disadvantages, as the search strategy must be translated to fit different databases using different interfaces and search syntaxes, and the time spent screening more, likely irrelevant, titles and abstracts is not insignificant [18]. Which and how many databases need to be searched, and the added value of particular databases, has been the topic of many previous studies, and the main takeaway seems to be that it, as expected, heavily depends on the topic of interest [13, 14, 19,20,21,22,23,24,25,26,27,28,29,30]. No research has looked systematically at retrieving relevant SMT papers. However, in the broader field of musculoskeletal disorders, Aagaard et al. [31] found MEDLINE, EMBASE, and CENTRAL to be insufficient for identifying all effect studies, based on a combined coverage rate of 88.9%. In an attempt to make a more generalized recommendation across all biomedical fields, Bramer et al. [32] found that searches should include EMBASE, MEDLINE, Web of Science, and Google Scholar as minimum requirements.

The Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) tool was developed in 2009 to standardize reporting in SRs, ensuring transparency and minimizing biases [33]. PRISMA and the use of an information specialist have become imperative when conducting a high-quality SR [34,35,36].

Hence, conducting an SR on a specific intervention such as SMT is not without challenges, and the selection of databases has not yet been sufficiently explored. This study examines the frequency and choice of databases used by researchers in SRs of SMT. Secondly, it analyzes the RCTs included in the SRs to determine the optimal combination of databases for conducting efficient literature searches for SRs of SMT. Finally, it examines whether the year of publication or the use of an information specialist influenced the number of databases investigated, and how the use of PRISMA has changed over time.

Methods
The research protocol for this study was registered at the Open Science Framework (protocol:

Changes made to the protocol

To ensure feasibility of completion, we had to limit our approach to SRs in English, Danish, Norwegian, and Swedish, and to exclude SRs focusing on more general conservative approaches and SRs not focusing on patient-reported outcome measures (PROMs). SRs focusing on adverse events, cost-effectiveness, and age groups below 18 years were also excluded. Additionally, we searched all databases that were used in more than 20% of the SRs instead of the five most common.

Eligibility criteria

We included SRs investigating the effect of spinal manipulations on any spinal region (i.e., the cervical, thoracic, or lumbar spine, or the sacroiliac (SI) joint). The SRs had to include RCTs evaluating any PROM. Exclusion criteria were (a) not an SR, (b) SRs focusing on more general conservative approaches, (c) SRs not evaluating PROMs, (d) SRs of age groups below 18 years, (e) SRs focusing on cost-effectiveness, (f) SRs focusing on adverse events, and (g) lack of a full list of databases searched. Title and abstract screening was performed independently by two researchers (MNE and SDM). Conflicts in the screening of the SRs were resolved through discussion between CGN and MNE.

All references included in the SRs were collected and manually evaluated. References investigating the effect of spinal manipulations on any spinal region using any PROM were included. Other study types, RCTs including age groups below 18 years, and unpublished papers were excluded.

Search strategy

SRs investigating the effect of SMT were retrieved from PubMed and Epistemonikos [37] for all entries (date of search February 25, 2022). For PubMed, the search term “Musculoskeletal manipulations” [MeSH] and the filter “systematic reviews” were applied. For Epistemonikos, a search by title or abstract using the terms (musculoskeletal OR spinal*) AND (manipulation* OR adjust* OR chiropract*), combined with Boolean operators and filtered for systematic reviews, was performed. No restriction on the date of publication was applied.

Data collection

All variables collected are shown in Table 1. The body part related to the treated disorder was categorized into “cervical + headache”, “thoracic”, “lumbar + SI-joint + coccyx”, “extremities”, “multiple sites”, and “not defined”. We extracted information on which databases and search platforms were used in the SRs; for simplicity, we label these “databases” from here on. All included RCTs were manually searched for in the following databases: MEDLINE/PubMed (via PubMed), CENTRAL (via Cochrane Library), EMBASE (via Ovid), the Web of Science Core Collection indexes searched jointly (Science Citation Index Expanded, Social Sciences Citation Index, Arts and Humanities Citation Index, Conference Proceedings Citation Index (Science + Social Sciences and Humanities), and Emerging Sources Citation Index), henceforth listed as Web of Science, and Google Scholar. When searching Google Scholar, we searched the titles in quotation marks and unchecked the inclusion of citations. These databases were chosen because they allowed us to investigate the databases recommended by the Cochrane Handbook for Systematic Reviews of Interventions and those previously suggested by Bramer et al. [32]. Furthermore, we also searched all other databases used by more than 20% of the SRs included in our study. As PubMed includes all MEDLINE references [38], we treated them as one database to avoid misleading results.

Table 1 Variables collected in this study

The RCTs were initially searched by title; if that yielded no result, further searches using author, year of publication, and digital object identifier (DOI) were performed. MNE performed all searches, and SDM independently searched a sample of 50 random RCTs. We calculated the intraclass correlation coefficient (ICC) using the two-way mixed-effects model to ensure consistency in our search approach [39]. An ICC < 0.9 would lead to further training and collaboration between the two data curators.
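As a rough illustration of this consistency check, the two-way mixed-effects, single-rater ICC (often denoted ICC(3,1)) can be computed from its ANOVA components. The sketch below uses invented toy ratings, not the study's actual search results:

```python
# Two-way mixed-effects, consistency, single-rater ICC (ICC(3,1)),
# computed from scratch. Toy data: rows = RCTs in the checked sample,
# columns = the two data curators (ratings are invented).

def icc3_1(ratings):
    """ICC(3,1) = (MSR - MSE) / (MSR + (k - 1) * MSE)."""
    n = len(ratings)          # number of subjects (RCTs)
    k = len(ratings[0])       # number of raters (curators)
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]

    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ss_err = ss_total - ss_rows - ss_cols                    # residual

    msr = ss_rows / (n - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)

# Perfectly consistent curators give an ICC of 1.0.
identical = [[1, 1], [3, 3], [5, 5], [2, 2]]
print(round(icc3_1(identical), 2))  # -> 1.0
```

In practice, a validated implementation (e.g., from an R or Python statistics package) would be used; this sketch only makes the formula behind the reported ICC of 0.97 concrete.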

Statistical analysis

The number and frequency of databases searched were described in absolute numbers, mean, median, and interquartile range (IQR). The correlation between the number of databases searched and the year of publication of the included SRs was assessed using Spearman’s rank correlation coefficient. Use of an information specialist was reported as a number and frequency, and its correlation with the number of databases searched was calculated using Pearson’s correlation coefficient. The correlation between the year of publication and the use of PRISMA was also assessed using Pearson’s correlation coefficient. In this analysis, we only included SRs published after 2009, the year PRISMA was published [33].
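The two correlation measures used here differ only in that Spearman's coefficient is Pearson's coefficient applied to ranks. A minimal, dependency-free sketch (with average ranks for ties; the year/database numbers at the end are invented for illustration):

```python
def pearson(x, y):
    """Pearson's correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(v):
    """Ranks starting at 1, with ties sharing their average rank."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
            j += 1                      # extend the tie group
        avg = (i + j) / 2 + 1           # average rank of the tie group
        for idx in order[i:j + 1]:
            r[idx] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rank correlation: Pearson on the ranked data."""
    return pearson(ranks(x), ranks(y))

# A monotone relationship between publication year and databases searched
# (toy numbers) yields a high rank correlation.
years = [2010, 2012, 2015, 2019, 2021]
n_dbs = [3, 4, 4, 8, 16]
print(round(spearman(years, n_dbs), 2))  # -> 0.97
```

The study itself used R; this is only meant to show the mechanics behind the reported coefficients.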

The contribution of RCTs from each database, and from the various combinations of databases, was described as absolute numbers, overall coverage, mean coverage per SR, median coverage per SR, and 100% coverage per SR. Coverage rates were calculated as the number of RCTs retrieved by the database(s) divided by the total number of included RCTs, presented as percentages. Although Google Scholar has a high recall rate, previous reports have highlighted its low precision in structured literature searches; hence, calculations were performed both with and without Google Scholar [40, 41]. We tabulated the three best combinations across two, three, and four databases, both including and excluding Google Scholar. All statistical analyses were performed in R (v. 4.1.3) and RStudio (v. 1.4) for Windows 10 using the Tidyverse packages [42].
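The combination analysis amounts to an exhaustive search over database subsets, scored by combined coverage. A sketch of that procedure, using a made-up miniature of the real 442-RCT retrieval data (database names as in the study):

```python
from itertools import combinations

# Toy retrieval matrix: for each RCT, the set of databases that indexed it.
# (Invented miniature of the study's 442-RCT dataset.)
retrieval = {
    "rct1": {"Cochrane Library", "MEDLINE/PubMed", "EMBASE"},
    "rct2": {"PEDro"},
    "rct3": {"Google Scholar", "Cochrane Library"},
    "rct4": {"MEDLINE/PubMed", "CINAHL"},
    "rct5": set(),  # analogue of the 18 RCTs found in no database
}

def coverage(dbs, retrieval):
    """Overall coverage: share of RCTs found in at least one of dbs."""
    hits = sum(1 for found_in in retrieval.values() if found_in & set(dbs))
    return hits / len(retrieval)

def best_combination(databases, retrieval, k):
    """Exhaustively rank all k-database combinations by combined coverage."""
    return max(combinations(databases, k), key=lambda c: coverage(c, retrieval))

dbs = ["Cochrane Library", "MEDLINE/PubMed", "EMBASE",
       "Google Scholar", "CINAHL", "PEDro"]
best = best_combination(dbs, retrieval, 3)
print(best, coverage(best, retrieval))
```

Note that, as in the study, an RCT found in no database caps the achievable coverage below 100% regardless of the combination chosen.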

Results
The initial searches in PubMed and Epistemonikos yielded 1,256 results, of which 128 were duplicates. After title and abstract screening, 314 SRs were eligible for full-text review, of which 85 SRs were ultimately included. The 36 SRs for which we could not access the full text were excluded, as this was an exploratory study with limited resources for acquiring additional materials. The full process can be seen in Fig. 1.

Fig. 1

Flowchart of the selection process of the included and excluded SRs and RCTs

From the 85 included SRs, 1227 RCTs were collected; after removing 785 duplicates, 442 (36%) unique titles were manually searched in the nine databases: the five previously stated plus CINAHL (via EBSCOhost), Index to Chiropractic Literature (ICL), PEDro, and AMED – The Allied and Complementary Medicine Database (via EBSCOhost). MANTIS was also searched by more than 20% of the SRs, but despite multiple attempts, we could not gain access to it. In a sample of 50 random RCTs, an ICC of 0.97 (95% confidence interval = 0.96–0.97) showed excellent agreement between the two assessors, without need for further training.

Characteristics of the included SRs and RCTs

All the included SRs were published between 1985 and 2021. Figure 2 shows the distribution of the included SRs over time and the accumulation of RCTs. A significant but weak correlation of 0.25 was found between the year of publication and the number of databases searched, indicating that newer SRs search slightly more databases.

Fig. 2

A Distribution of the included systematic reviews and B accumulation of included randomized controlled trials over time

Thirty-four (40%) of the SRs investigated the effect of SMT as a treatment for disorders in the lumbar spine, SI-joint, or coccyx. The second most investigated region was the cervical spine and different types of headaches, which 25 (29%) of the SRs focused on.

Sixteen (19%) of the SRs included used an information specialist. No correlation was found between the use of an information specialist/research librarian and the number of databases searched.

The mean and median numbers of databases searched by the SRs were 5.8 and 6, respectively, with an IQR of 3; the distribution is shown in Fig. 3. All 85 SRs searched MEDLINE/PubMed (100%). Cochrane Library (78%), EMBASE (72%), and CINAHL (71%) were the second- to fourth-most searched, with a considerable drop to the fifth-most searched database, the Index to Chiropractic Literature (33%). Collectively, the 85 SRs searched 52 different databases, shown in Fig. 4. The mean, median, and IQR for RCTs per SR were 14.4, 8, and 15, respectively. No correlation was found between the number of RCTs per SR and the number of databases searched (correlation coefficient = − 0.06).

Fig. 3

Number of databases searched by the systematic reviews

Fig. 4

Frequency of use of individual databases by the included systematic reviews

Fifty-eight SRs were published after 2009 and included in the correlation calculation between the use of PRISMA and the year of publication. Twenty-nine of the 58 SRs (50%) reported using PRISMA. A significant moderate correlation of 0.68 between the use of PRISMA and the year of publication was found, indicating that more recent SRs more often apply PRISMA.

Appendix 1 contains a full list of the included SRs and their characteristics. Appendix 2 provides a complete list of all included randomized controlled trials, their characteristics, and in which databases they were found.

Unique RCTs per database

Eighteen of the 442 RCTs (4.1%) were not found in any of the nine databases. Thirteen (2.9%) RCTs were unique to a single database: Google Scholar (n = 6), PEDro (n = 4), and CENTRAL (n = 3). When excluding Google Scholar from the analysis, 24 of the 442 RCTs (5.4%) were not found in any of the eight databases. Ten (2.3%) RCTs were unique to a single database: PEDro (n = 5), Cochrane Library (n = 4), and Index to Chiropractic Literature (n = 1), as listed in Table 2. The 18 RCTs not found in any of the nine databases were primarily in Chinese; further details are listed in Table 3.

Table 2 Number of unique RCTs per database
Table 3 The 18 RCTs not found in any of the nine searched databases

Coverage rates

For each of the databases, their overall coverage rate was calculated, and Cochrane Library obtained the highest individual coverage rate of 91.6%, followed by Google Scholar (88.2%) and EMBASE (85.5%).

Combined coverage rates of three databases performed better, with the highest rate of 95.9% obtained by CENTRAL, Google Scholar, and PEDro. This combination retrieved all 424 retrievable RCTs. The best combinations of four databases performed similarly, although the best-performing combination of four databases excluding Google Scholar retrieved one more RCT (n = 418) than the best combination of three. The minimum coverage per SR was zero for all nine databases, due to one SR whose four included RCTs were not found in any database. Tables 4, 5, 6 and 7 show overall coverage rates and the mean, median, and 100% coverage per SR for all individual databases and the three best-performing combinations with and without Google Scholar. A complete list of all combinations of two, three, and four databases and their coverage rates can be found in Appendix 3.

Table 4 Coverage rates of individual databases
Table 5 Combined coverage rates of two databases
Table 6 Combined coverage rates of three databases
Table 7 Combined coverage rates of four databases

Discussion
On average, the SRs searched 5.8 databases, most commonly those suggested by the Cochrane Handbook for Systematic Reviews of Interventions (i.e., MEDLINE/PubMed, Cochrane Library, and EMBASE) [11]. The SRs contained 14.4 RCTs on average, with an IQR of 15, indicating large variation in the research available depending on the topic within SMT. The large proportion of duplicate RCTs (64%) across the included SRs reflects considerable overlap, with many similar SRs on SMT in general.

The single database with the highest overall coverage rate was Cochrane Library (91.6%). It also outperformed the other databases on mean, median, and 100% coverage per SR, retrieving all RCTs in 75.3% of the included SRs. Adding Google Scholar increased the coverage rate to 94.3%, only seven RCTs short of the 424 possible. Excluding Google Scholar, the combination of Cochrane Library and PEDro retrieved 93.7% of all RCTs. The best combination of three databases, Cochrane Library, Google Scholar, and PEDro, retrieved all possible RCTs with a coverage rate of 95.9%. When excluding Google Scholar, the best combination was Cochrane Library, PEDro, and either ICL or EMBASE, with a coverage rate of 94.3%, retrieving eight more RCTs than the combination of Cochrane Library, MEDLINE/PubMed, and EMBASE recommended by the Cochrane Handbook. Although CINAHL was used more frequently than PEDro and performed better on its own than ICL, we suggest using PEDro or ICL over CINAHL when searching multiple databases. This is mainly because PEDro and ICL performed better than CINAHL when combined with Cochrane Library, or with both Cochrane Library and MEDLINE/PubMed. Furthermore, CINAHL did not retrieve any unique RCTs, while PEDro and ICL retrieved five and one unique RCTs, respectively, when excluding Google Scholar from the analysis.

Bramer et al. [32] suggested that an acceptable literature search for an SR should cover at least 95% of all possible studies. This was achievable using any combination of Cochrane Library and Google Scholar with PEDro, EMBASE, or ICL. However, 18 RCTs were not found in any of the nine databases investigated in this study, making the highest possible coverage rate 95.9% (94.6% when excluding Google Scholar). We still consider our results representative for conducting a thorough search, as the same 18 RCTs limited all our findings. Two of the included SRs contained 12 of the 18 RCTs not found; both searched either Chinese databases or databases explicitly related to osteopathy. The remaining six RCTs came from six different SRs. The major challenge was Chinese literature (n = 11), most likely because it is indexed only in databases other than the ones we searched, although issues relating to translation cannot be ruled out. The large diversity in databases searched by the SRs, especially Asian databases, and the number of Chinese studies not found, suggest that a wide diversity of electronic databases is required to find all relevant materials. Our findings underline this, as PEDro found the most unique references when disregarding Google Scholar. Further research should aim to determine the role of Asian databases when performing SRs of SMT. Moreover, authors should remember the importance of combining a wide diversity of electronic databases with methods beyond electronic databases when searching for literature. These methods include hand-searching journals and conference proceedings, searching reference lists of previously conducted systematic and narrative reviews, contacting experts in the field, and searching databases of theses and dissertations [11, 12]. Searching for ongoing and unpublished studies (often referred to as gray literature) also makes up an important part of a systematic literature search, but since unpublished literature was excluded from this study, we cannot provide any specific considerations.

Overall, our results suggest that, in theory, using Cochrane's recommended databases along with PEDro and ICL appears sufficient to capture more than 95% of all SMT RCTs. This supports the results of Aagaard et al. [31], who concluded that searching MEDLINE, EMBASE, and CENTRAL was insufficient when searching for studies of musculoskeletal disorders. In their study, adding PEDro or ICL did not improve the search; however, their scope was much broader than ours. It is not unlikely that our findings can be extrapolated to manual therapy in general, as different types of interventions typically share (1) journals, (2) keywords, and (3) the professions that administer them. However, this is entirely speculative. Given our findings, we suggest that when performing reviews related to specific professions (e.g., chiropractic), selecting profession-specific databases (e.g., PEDro or ICL) in addition to Cochrane's recommended databases may provide more unique RCTs. Likewise, a review with a profession-oriented approach other than chiropractic or physiotherapy could arguably exchange ICL/PEDro for another profession-related database (e.g., Osteopathic Research Web for osteopaths).

Ranking databases based on coverage rates presents some challenges. The presence of a relevant study in a database does not automatically mean that the study will be found by the search strategy used (e.g., the selected keywords). This limitation becomes evident in the case of Google Scholar. Google Scholar achieved the second-highest single-database coverage rate (88.2%) and was part of the combinations with the highest coverage rates. It also identified the most unique RCTs (six) of all databases. Despite these impressive results, the precision of Google Scholar has previously been reported to be low [40, 41]. Because of this and other limitations in its search functions, it has been assessed to be inadequate as a standalone database and should rather be used in addition to traditional databases [43], for example to search for gray literature and for quality assurance.

Only 16 (19%) of the SRs reported the use of an information specialist, contradicting general guidelines [11] and suggestions from previous studies [34, 35]. However, this may be the result of under-reporting [35]. We strongly suggest using an information specialist when conducting an SR, since doing so enhances the quality of the SR [35, 44]. We would also remind researchers to report the use of information specialists and to acknowledge their work, either as an author, if they qualify according to the Vancouver guidelines, or in the acknowledgments [45].

The increased number of databases searched and the increased use of PRISMA in recent years may reflect a tendency towards greater emphasis on thorough methodology and transparency. This may also explain the large number of duplicates found: earlier SRs may be of such low quality that the information presented is inapplicable to clinical practice, whereas newer SRs provide a more thorough and detailed account. While this is speculative, some evidence suggests that the manual therapy professions have produced higher-quality research in recent years [46, 47].

Limitations
The assumption that the 442 included RCTs make up all relevant effect studies in the field of SMT is idealistic, as we performed an exploratory study rather than a thorough systematic search and data extraction. First, other SRs may have been indexed in databases other than PubMed and Epistemonikos, or retrievable only without the “systematic review” filter. Second, the included SRs may have excluded RCTs considered irrelevant for their purpose that could have been relevant in the context of database contribution. Third, the included SRs were published from 1985 to 2021 and may not include the most recent RCTs. Fourth, older SRs may not have had access to the same databases as today; for example, Google Scholar was first released in 2004 [48], so earlier SRs could not have used it. Despite all of this, we consider our sample size sufficient to provide thorough recommendations for future SMT reviews.

As mentioned above, evaluating databases solely on their coverage rate and ability to find unique RCTs is not adequate. The fact that a database contains a reference does not mean that the reference will be found using a given search string, or that a link to the full text is available. Our findings cannot be directly generalized to other fields, as the performance of the databases greatly depends on the topic. Another limitation concerns the selection of investigated databases: our findings might look different if additional profession-specific databases were included (e.g., those related to osteopathy).

Conclusions
Cochrane Library had the single highest overall coverage rate and contained the third-most unique RCTs of the nine databases investigated. The best-performing combination excluding Google Scholar was Cochrane Library, PEDro, the Index to Chiropractic Literature, and either EMBASE, MEDLINE/PubMed, or CINAHL, with a coverage rate of 94.6%.

For studies related to SMT, we suggest following the recommendations of the Cochrane Handbook by searching Cochrane Library, MEDLINE, and EMBASE, and adding PEDro and the Index to Chiropractic Literature. In addition, Google Scholar may be used to search gray literature and for quality assurance, or it can be included in the search strategy depending on the authors’ available research time and ambition.

Researchers can apply these results to select the most relevant databases for future SMT reviews. Furthermore, whether our findings translate to other areas of manual therapy should be examined.

Availability of data and materials

All data are available in the supplementary material. For details on the coding procedures please contact


Abbreviations

SR: Systematic review

RCT: Randomized controlled trial

CENTRAL: The Cochrane Central Register of Controlled Trials

SMT: Spinal manipulative therapy

PRISMA: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses

PROM: Patient-reported outcome measure

ICL: Index to Chiropractic Literature

DOI: Digital object identifier

IQR: Interquartile range

References

1. Guyatt GH, Sackett DL, Sinclair JC, Hayward R, Cook DJ, Cook RJ, et al. Users’ guides to the medical literature: IX. A method for grading health care recommendations. JAMA. 1995;274(22):1800–4.

2. Greenhalgh T. How to read a paper: getting your bearings (deciding what the paper is about). BMJ. 1997;315(7102):243–6.

3. Gray JAM, Shepperd S, Ison E, Lees R, Pearce-Smith N. Evidence-based healthcare and public health: how to make decisions about health services and public health. New York: Churchill Livingstone; 2009.

4. Lund H, Brunnhuber K, Juhl C, Robinson K, Leenaars M, Dorch BF, et al. Towards evidence based research. BMJ. 2016;355:i5440.

5. Guyatt GH, Mills EJ, Elbourne D. In the era of systematic reviews, does the size of an individual trial still matter? PLoS Med. 2008;5(1):e4.

6. Levay P, Raynor M, Tuvey D. The contributions of MEDLINE, other bibliographic databases and various search techniques to NICE public health guidance. Evid Based Libr Inf Pract. 2015;10(1):50–68.

7. Stevinson C, Lawlor DA. Searching multiple databases for systematic reviews: added value or diminishing returns? Complement Ther Med. 2004;12(4):228–32.

8. Lawrence DW. What is lost when searching only one literature database for articles relevant to injury prevention and safety promotion? Inj Prev. 2008;14(6):401–4.

9. Lemeshow AR, Blum RE, Berlin JA, Stoto MA, Colditz GA. Searching one or two databases was insufficient for meta-analysis of observational studies. J Clin Epidemiol. 2005;58(9):867–73.

10. Silverman J. For literature searches, is MEDLINE enough? Lab Anim (NY). 2004;33(2):15.

11. Higgins JP, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, et al. Cochrane handbook for systematic reviews of interventions. Version 6.2 (updated February 2021). Cochrane; 2021.

12. O’Connor AM, Anderson KM, Goodell CK, Sargeant JM. Conducting systematic reviews of intervention questions I: writing the review protocol, formulating the question and searching the literature. Zoonoses Public Health. 2014;61(Suppl 1):28–38.

13. Cogo E, Sampson M, Ajiferuke I, Manheimer E, Campbell K, Daniel R, et al. Searching for controlled trials of complementary and alternative medicine: a comparison of 15 databases. Evid Based Complement Altern Med. 2011;2011:858246.

14. Aker PD, McDermaid C, Opitz BG, White MW. Searching chiropractic literature: a comparison of three computerized databases. J Manip Physiol Ther. 1996;19(8):518–24.

15. Corp N, Mansell G, Stynes S, Wynne-Jones G, Morsø L, Hill JC, et al. Evidence-based treatment recommendations for neck and low back pain across Europe: a systematic review of guidelines. Eur J Pain. 2021;25(2):275–95.

16. Rubinstein SM, de Zoete A, van Middelkoop M, Assendelft WJJ, de Boer MR, van Tulder MW. Benefits and harms of spinal manipulative therapy for the treatment of chronic low back pain: systematic review and meta-analysis of randomised controlled trials. BMJ. 2019;364:l689.

17. Langevin HM, Shurtleff D, Mudd L, Sabri M. Spinal manipulation: what you need to know. National Center for Complementary and Integrative Health (NCCIH); 2019 (updated July 2019).

18. Bramer WM, de Jonge GB, Rethlefsen ML, Mast F, Kleijnen J. A systematic approach to searching: an efficient and complete method to develop literature searches. J Med Libr Assoc. 2018;106(4):531–41.

19. Beyer FR, Wright K. Can we prioritise which databases to search? A case study using a systematic review of frozen shoulder management. Health Inf Libr J. 2013;30(1):49–58.

20. Wright K, Golder S, Lewis-Light K. What value is the CINAHL database when searching for systematic reviews of qualitative studies? Syst Rev. 2015;4:104.

21. Brand-de Heer DL. A comparison of the coverage of clinical medicine provided by PASCAL BIOMED and MEDLINE. Health Inf Libr J. 2001;18(2):110–6.

22. Wilkins T, Gillies RA, Davies K. EMBASE versus MEDLINE for family medicine searches: can MEDLINE searches find the forest or a tree? Can Fam Physician. 2005;51(6):848–9.

23. Halladay CW, Trikalinos TA, Schmid IT, Schmid CH, Dahabreh IJ. Using data sources beyond PubMed has a modest impact on the results of systematic reviews of therapeutic interventions. J Clin Epidemiol. 2015;68(9):1076–84.

24. Ahmadi M, Ershad-Sarabi R, Jamshidiorak R. Comparison of bibliographic databases in retrieving information on telemedicine. J Kerman Univ Med Sci. 2014;21(4):343–53.

25. Lorenzetti DL, Topfer LA, Dennett L, Clement F. Value of databases other than MEDLINE for rapid health technology assessments. Int J Technol Assess Health Care. 2014;30(2):173–8.

26. Beckles Z, Glover S, Ashe J, Stockton S, Boynton J, Lai R, et al. Searching CINAHL did not add value to clinical questions posed in NICE guidelines. J Clin Epidemiol. 2013;66(9):1051–7.

27. Hartling L, Featherstone R, Nuspl M, Shave K, Dryden DM, Vandermeer B. The contribution of databases to the results of systematic reviews: a cross-sectional study. BMC Med Res Methodol. 2016;16(1):127.

28. Justesen T, Freyberg J, Schultz A. Database selection and data gathering methods in systematic reviews of qualitative research regarding diabetes mellitus – an explorative study. BMC Med Res Methodol. 2021;21(1):94.

29. Bramer WM, Giustini D, Kramer BM, Anderson P. The comparative recall of Google Scholar versus PubMed in identical searches for biomedical systematic reviews: a review of searches used in systematic reviews. Syst Rev. 2013;2:115.

30. Bramer WM, Giustini D, Kramer BM. Comparing the coverage, recall, and precision of searches for 120 systematic reviews in Embase, MEDLINE, and Google Scholar: a prospective study. Syst Rev. 2016;5:39.

31. Aagaard T, Lund H, Juhl C. Optimizing literature search in systematic reviews – are MEDLINE, EMBASE and CENTRAL enough for identifying effect studies within the area of musculoskeletal disorders? BMC Med Res Methodol. 2016;16(1):161.

    Article  Google Scholar 

  32. Bramer WM, Rethlefsen ML, Kleijnen J, Franco OH. Optimal database combinations for literature searches in systematic reviews: a prospective exploratory study. Syst Rev. 2017;6(1):245.

    Article  Google Scholar 

  33. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al.. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.

    Article  Google Scholar 

  34. Dudden RF, Protzko SL. The systematic review team: contributions of the health sciences librarian. Med Ref Serv Q. 2011;30(3):301–15.

    Article  Google Scholar 

  35. Koffel JB. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors. PLoS ONE. 2015;10(5):e0125931.

    Article  Google Scholar 

  36. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JP, et al.. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.

    Article  Google Scholar 

  37. Rada G, Pérez D, Araya-Quintanilla F, Ávila C, Bravo-Soto G, Bravo-Jeria R, et al.. Epistemonikos: a comprehensive database of systematic reviews for health decision-making. BMC Med Res Methodol. 2020;20(1):286.

    Article  Google Scholar 

  38. National Library of Medicine. MEDLINE, PubMed, and PMC (PubMed Central): How are they different? 2021. Updated 13 Oct 2021.

  39. Koo TK, Li MY. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J Chiropr Med. 2016;15(2):155–63.

    Article  Google Scholar 

  40. Boeker M, Vach W, Motschall E. Google Scholar as replacement for systematic literature searches: good relative recall and precision are not enough. BMC Med Res Methodol. 2013;13:131.

    Article  Google Scholar 

  41. Shultz M. Comparing test searches in PubMed and Google Scholar. J Med Libr Assoc. 2007;95(4):442–5.

    Article  Google Scholar 

  42. Wickham H, Averick M, Bryan J, Chang W, McGowan L, François R, et al. Welcome to the tidyverse. J Open Source Softw. 2019;4(43):1686.

    Article  Google Scholar 

  43. Haddaway NR, Collins AM, Coughlin D, Kirk S. The role of Google Scholar in evidence reviews and its applicability to grey literature searching. PLoS ONE. 2015;10(9):e0138237.

    Article  Google Scholar 

  44. Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, Brigham TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68(6):617–26.

    Article  Google Scholar 

  45. International Committee of Medical Journal Editors (ICMJE). Recommendations for the conduct, reporting, editing, and publication of scholarly work in medical journals. 2022.

  46. Karpouzis F, Bonello R, Pribicevic M, Kalamir A, Brown BT. Quality of reporting of randomised controlled trials in chiropractic using the CONSORT checklist. Chiropr Man Ther. 2016;24(1):19.

    Article  Google Scholar 

  47. McCambridge AB, Nasser AM, Mehta P, Stubbs PW, Verhagen AP. Has reporting on physical therapy interventions improved in 2 decades? An analysis of 140 trials reporting on 225 interventions. J Orthop Sports Phys Ther. 2021;51(10):503–9.

    Article  Google Scholar 

  48. Giles J. Science in the web age: start your engines. Nature. 2005;438(7068):554–5.

    Article  CAS  Google Scholar 



Acknowledgements

We thank the reviewers, whose detailed comments significantly improved the manuscript.


Funding

No funding was received.

Author information

Authors and Affiliations



Contributions

MNE: Methodology, Investigation, Data curation, Formal Analysis, Visualization, Writing—Original Draft Preparation. SDM: Investigation, Data curation, Writing—Review and Editing. ANS: Conceptualization, Methodology, Supervision, Writing—Review and Editing. CGN: Conceptualization, Methodology, Supervision, Writing—Review and Editing. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Casper Glissmann Nim.

Ethics declarations

Ethics approval and consent to participate

No ethical approval was needed for this work.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Appendix 1

See Table 8.

Table 8 Overview of the included SRs

Appendix 2

See Table 9.

Table 9 Overview of all unique randomized controlled trials included

Appendix 3

See Table 10.

Table 10 Complete list of recall rates for all combinations of two, three and four databases

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Eybye, M.N., Madsen, S.D., Schultz, A.N.Ø. et al. Database coverage and their use in systematic reviews regarding spinal manipulative therapy: an exploratory study. Chiropr Man Therap 30, 57 (2022).
