
Improving the quality of reports of meta-analyses of randomised controlled trials: the QUOROM statement

David Moher, Deborah J Cook, Susan Eastwood, Ingram Olkin, Drummond Rennie, Donna F Stroup, for the QUOROM Group*

Summary

Background The Quality of Reporting of Meta-analyses (QUOROM) conference was convened to address standards for improving the quality of reporting of meta-analyses of clinical randomised controlled trials (RCTs).

Methods The QUOROM group consisted of 30 clinical epidemiologists, clinicians, statisticians, editors, and researchers. In conference, the group was asked to identify items they thought should be included in a checklist of standards. Whenever possible, checklist items were guided by research evidence suggesting that failure to adhere to the item proposed could lead to biased results. A modified Delphi technique was used in assessing candidate items.

Findings The conference resulted in the QUOROM statement, a checklist, and a flow diagram. The checklist describes our preferred way to present the abstract, introduction, methods, results, and discussion sections of a report of a meta-analysis. It is organised into 21 headings and subheadings regarding searches, selection, validity assessment, data abstraction, study characteristics, and quantitative data synthesis, and in the results with "trial flow", study characteristics, and quantitative data synthesis; research documentation was identified for eight of the 18 items. The flow diagram provides information about both the numbers of RCTs identified, included, and excluded and the reasons for exclusion.

Interpretation We hope this report will generate further thought about ways to improve the quality of reports of meta-analyses of RCTs and that interested readers, reviewers, researchers, and editors will use the QUOROM statement and generate ideas for its improvement.

Introduction

Health-care providers and other decision-makers now have, among their information resources, a form of clinical report called the meta-analysis,1-4 a review in which bias has been reduced by the systematic identification, appraisal, synthesis, and, if relevant, statistical aggregation of all relevant studies on a specific topic according to a predetermined and explicit method.3 The number of published meta-analyses has increased substantially in the past decade.5 These integrative articles can be helpful for clinical decisions, and they may also serve as the basis for policy decisions, economic evaluations, and future research agendas. The value of meta-analysis is evident in the work of the international Cochrane Collaboration,6,7 the purpose of which is to generate and disseminate high-quality systematic reviews of health-care interventions.

Like any research enterprise, particularly one that is observational, the meta-analysis of evidence can be flawed. Accordingly, the process by which meta-analyses are carried out has undergone scrutiny. A 1987 survey of 86 English-language meta-analyses8 assessed each publication on 23 items from six content areas judged important in the conduct and reporting of a meta-analysis of randomised trials: study design, combinability, control of bias, statistical analysis, sensitivity analysis, and problems of applicability. The survey results showed that only 24 (28%) of the 86 meta-analyses reported that all six content areas had been addressed. The updated survey, which included more recently published meta-analyses, showed little improvement in the rigour with which they were reported.9

Several publications have described the science of reviewing research,1 differences among narrative reviews, systematic reviews, and meta-analyses,2 and how to carry out,3,4,10 critically appraise,11-15 and apply16 meta-analyses in practice. The increase in the number of meta-analyses published has highlighted such issues as discordant meta-analyses on the same topic17 and discordant meta-analyses and randomised-trial results on the same question.18

An important consideration in interpretation and use of meta-analyses is to ascertain that the investigators who did the meta-analysis not only report explicitly the methods they used to analyse the articles they reviewed, but also report the methods used in the research articles they analysed. The meta-analytical review methods used may not be provided when a paper is initially submitted: even when they are, other factors such as page limitations, peer review, and editorial decisions may change the content and format of the report before publication.

Several investigators have suggested guidelines for reporting of meta-analyses.3,19 However, a consensus across disciplines has not developed. After the initiative to improve the quality of reporting of randomised controlled trials (RCTs),20-22 we organised the Quality of Reporting of Meta-analyses (QUOROM) conference to address these issues as they relate to meta-analyses of RCTs. This report summarises the proceedings of that conference. The issues discussed might also be useful for reporting of systematic reviews (ie, meta-analysis, as defined above, without statistical aggregation), particularly of RCTs.

University of Ottawa, Thomas C Chalmers Centre for Systematic Reviews, Ottawa (D Moher MSc); McMaster University, Hamilton, Ontario, Canada (D J Cook MD); University of California, San Francisco (S Eastwood ELS(D)); Stanford University, Stanford, CA (I Olkin PhD); JAMA, Chicago, IL (D Rennie PhD); and Centers for Disease Control and Prevention, Atlanta, GA, USA (D F Stroup PhD)

Correspondence to: Dr David Moher, Thomas C Chalmers Centre for Systematic Reviews, Children's Hospital of Eastern Ontario Research Institute, 401 Smyth Road, Ottawa, Ontario K1H 8L1, Canada

Methods

The QUOROM steering committee began with a comprehensive review of publications on the conduct and reporting of meta-analyses. The databases searched included MEDLINE and the Cochrane Library,23 which consists of the Cochrane Database of Systematic Reviews, the Cochrane Controlled Trials Register, the York Database of Abstracts of Reviews of Effectiveness, and the Cochrane Review Methodology Database. We examined reference lists of the retrieved articles and individual personal files. Articles of potential relevance were retrieved and critically appraised by the QUOROM steering committee. The committee generated a draft agenda for the conference, which included six domains requiring discussion and debate. The content areas were slightly modified during preliminary discussions at the conference and are reported as: the search for the evidence; decision-making on which evidence to include; description of the characteristics of primary studies; quantitative data synthesis; reliability and issues related to internal validity (or quality); and clinical implications related to external validity.

In planning the QUOROM conference, the steering committee identified clinical epidemiologists, clinicians, statisticians, and researchers who conduct meta-analysis, as well as editors from the UK and North America who are interested in meta-analysis. These 30 individuals were invited to a conference in Chicago on Oct 2-3, 1996. Participants were surveyed before the meeting to elicit their views on current reporting standards of meta-analyses and whether these needed improvement. In addition, they were sent relevant citations for review and were asked to indicate in which of the six groups they wished to participate.
The conference included small-group and plenary sessions.
Each small group had a facilitator who was a member of the steering committee and was responsible for ensuring discussion of as many as possible of the issues relevant to their specific remit. Each small group also had a recorder, who was responsible for documenting the main points and the consensus on each issue discussed during that session; the recorder presented the group's consensus during the plenary sessions. During the plenary sessions, an elected scribe from each small group was responsible for recording the principal points relevant to that group's charge that arose during the plenary discussion.

The participants in each small group were asked to identify items that they thought should be included in a checklist of standards that would be useful for investigators, editors, and peer reviewers. We asked that, whenever possible, items included in the checklist be guided by research evidence that suggested that a failure to adhere to the particular checklist item proposed could lead to biased results. For example, a substantial lack of sensitivity and specificity of MEDLINE searches is evident.24 Therefore, the checklist suggests that investigators explicitly describe all search strategies used to locate articles for inclusion in a meta-analysis. In considering whether candidate items were essential, each subgroup used a modified Delphi technique25 that was replicated in the plenary sessions.

Table: The QUOROM statement checklist (heading and subheading, followed by the descriptor)

Title: Identify the report as a meta-analysis [or systematic review] of RCTs26
Abstract: Use a structured format27
Abstract, data sources: The databases (ie, list) and other information sources
Abstract, review methods: The selection criteria (ie, population, intervention, outcome, and study design); methods for validity assessment, data abstraction, and study characteristics, and quantitative data synthesis in sufficient detail to permit replication
Abstract, results: Characteristics of the RCTs included and excluded; qualitative and quantitative findings (ie, point estimates and confidence intervals); and subgroup analyses
Introduction: The explicit clinical problem, biological rationale for the intervention, and rationale for review
Methods, searching: The information sources, in detail28 (eg, databases, registers, personal files, expert informants, agencies, hand-searching), and any restrictions (years considered, publication status,29 language of publication30,31)
Methods, selection: The inclusion and exclusion criteria (defining population, intervention, principal outcomes, and study design)32
Methods, validity assessment: The criteria and process used (eg, masked conditions, quality assessment, and their findings)33-36
Methods, data abstraction: The process or processes used (eg, completed independently, in duplicate)35,36
Methods, study characteristics: The type of study design, participants' characteristics, details of intervention, outcome definitions, etc,37 and how clinical heterogeneity was assessed
Methods, quantitative data synthesis: The principal measures of effect (eg, relative risk), method of combining results (statistical testing and confidence intervals), handling of missing data; how statistical heterogeneity was assessed;38 a rationale for any a-priori sensitivity and subgroup analyses; and any assessment of publication bias39
Results, trial flow: Provide a meta-analysis profile summarising trial flow (see figure)
Results, study characteristics: Present descriptive data for each trial (eg, age, sample size, intervention, dose, duration, follow-up period)
Results, quantitative data synthesis: Report agreement on the selection and validity assessment; present simple summary results (for each treatment group in each trial, for each primary outcome); present data needed to calculate effect sizes and confidence intervals in intention-to-treat analyses (eg, 2×2 tables of counts, means and SDs, proportions)
Discussion: Summarise key findings; discuss clinical inferences based on internal and external validity; interpret the results in light of the totality of available evidence; describe potential biases in the review process (eg, publication bias); and suggest a future research agenda
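The quantitative data synthesis items above ask authors to report the raw counts for each trial (eg, 2×2 tables), the principal measure of effect, and the method used to combine results. As an illustration only (this sketch is not part of the QUOROM statement, and the trial counts are hypothetical), the following Python fragment shows how a relative risk and its 95% CI can be derived from such counts and pooled across trials with a fixed-effect, inverse-variance model, together with Cochran's Q as one simple summary of statistical heterogeneity.

import math

def rr_and_ci(events_t, n_t, events_c, n_c, z=1.96):
    # Relative risk and 95% CI for one trial from its 2x2 counts,
    # using the usual log-normal approximation for the standard error.
    rr = (events_t / n_t) / (events_c / n_c)
    se = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
    ci = (math.exp(math.log(rr) - z*se), math.exp(math.log(rr) + z*se))
    return rr, se, ci

def fixed_effect_pool(trials, z=1.96):
    # Inverse-variance pooling of log relative risks (fixed-effect model),
    # plus Cochran's Q across trials.
    logs, weights = [], []
    for events_t, n_t, events_c, n_c in trials:
        rr, se, _ = rr_and_ci(events_t, n_t, events_c, n_c)
        logs.append(math.log(rr))
        weights.append(1 / se**2)
    pooled_log = sum(w*l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    ci = (math.exp(pooled_log - z*se_pooled), math.exp(pooled_log + z*se_pooled))
    q = sum(w*(l - pooled_log)**2 for w, l in zip(weights, logs))
    return math.exp(pooled_log), ci, q

# Hypothetical counts: (events, total) in treatment group, then in control group
trials = [(12, 100, 20, 100), (8, 60, 15, 62), (30, 250, 45, 248)]
print(fixed_effect_pool(trials))

Reporting the raw counts themselves, as the checklist asks, is what allows readers to reproduce such calculations or to apply alternative models (eg, random-effects) to the same data.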
Results

The conference resulted in the QUOROM statement: a checklist (table) and a flow diagram (figure). The checklist of standards for reporting of meta-analyses describes our preferred way to present the abstract, introduction, methods, results, and discussion sections of a report of a meta-analysis. The checklist is organised into 21 headings and subheadings to encourage authors to provide readers with information on searches, selection, validity assessment, data abstraction, study characteristics, quantitative data synthesis, and trial flow. Authors are asked to provide a flow diagram (figure) providing information about the number of RCTs identified, included, and excluded and the reasons for excluding them.10

Figure: Progress through the stages of a meta-analysis for RCTs

Pretesting

After development of the checklist and flow diagram, two members of the steering committee (DM, DJC) undertook pretesting with epidemiology graduate students studying meta-analysis, residents in general internal medicine, participants at a Canadian Cochrane Centre workshop, and faculty members of departments of medicine and of epidemiology and biostatistics. One group of candidates for a master's degree in epidemiology used the checklist and flow diagram to report their meta-analyses as if their work were being submitted for publication. Feedback from these four groups was positive, most users stating that the checklist and flow diagram would be likely to improve reporting standards. Modifications of the checklist (eg, inclusion of a statement about major findings) and changes to the flow diagram (eg, more detail) were incorporated.
In developing the checklist, we identified supporting scientific evidence for only eight of 18 items to guide the reporting of meta-analyses of RCTs.26-39 Some of this evidence is indirect. For example, we ask authors to use a structured abstract format. The supporting evidence for this item was collected by examining abstracts of original reports of individual studies27 and may not pertain specifically to the reporting of meta-analyses. However, the QUOROM group judged this a reasonable approach by analogy with other types of research reports and pending further evidence about the merits of structured abstracts.

We have asked authors to be explicit in reporting the criteria used when assessing the "quality" of trials included in meta-analyses and the outcome of the quality assessment. There is direct and compelling evidence to support recommendations about reporting on the quality of RCTs included in a meta-analysis. A meta-analytic database of 255 obstetric RCTs provided evidence that trials with inadequate reporting of allocation concealment (ie, keeping the intervention assignments hidden from all participants in the trial until the point of allocation) overestimated the intervention effect by 30% compared with trials in which this information was adequately reported.33 Similar results for several disease categories and methods of quality assessment have been reported.34 These findings suggest that inclusion of reports of low-quality RCTs in meta-analyses is likely to alter the summary measures of the intervention effect.

We also ask authors to be explicit in reporting assessment of publication bias, and we recommend that the discussion should include comments about whether the results obtained may have been influenced by such bias. Publication bias derives from the selective publishing of studies with statistically significant or directionally positive results,40-42 and it can lead to inflated estimates of efficacy in meta-analyses. For example, trials of single alkylating agents versus multiple-agent cytotoxic chemotherapy in the treatment of ovarian cancer have been analysed.39 Published trials yielded significant results in favour of the multiple-agent therapy, but that finding was not supported when the results of all trials, both those published and those registered but not published, were analysed.

The statement asks authors to be explicit about the publication status of reports included in a meta-analysis. Only about a third of published meta-analyses report the inclusion of unpublished data.29,43 Although one study found that there were no substantial differences in the dimensions of study quality between published and unpublished clinical research,42 another suggested that intervention effects reported in journals were 33% greater than those reported in doctoral dissertations.44 The role of the "grey literature" (difficult to locate or retrieve) was examined in 39 meta-analyses that included 467 RCTs, 102 of which were grey literature.29 Meta-analyses limited to published trials, compared with those that included both published and grey literature, overestimated the treatment effect by an average of 12%. There is still debate between editors and investigators about the importance of including unpublished data in meta-analyses.

We have asked authors to be explicit in reporting whether they have used any restrictions on language of publication. Roughly a third of published meta-analyses have some language restrictions as part of the eligibility criteria for including individual trials.30 The reason for such restrictions is not clear, since there is no evidence to support differences in study quality, and there is evidence that language restrictions may result in a biased summary. The reports of 127 RCTs written in English, compared with those reported in four other languages, showed little or no difference in several important methodological features.45 Similar results have been reported elsewhere.31 The role of language restrictions has been studied in 211 RCTs included in 18 meta-analyses in which trials published in languages other than English were included in the quantitative summary.30 Language-restricted meta-analyses overestimated the treatment effect by only 2% on average compared with language-inclusive meta-analyses; however, the language-inclusive meta-analyses were more precise. Reports of RCTs with statistically positive results are more likely than those with negative results to be published in English.31 Likewise, there is emerging evidence to suggest that reports of RCTs from certain countries mostly report positive results.46

We used several methods to generate the checklist and flow diagram: a systematic review of the reporting of meta-analyses; focus groups of the steering committee; and a modified Delphi approach during the conference. Although we did not involve certain users of meta-analyses (policy-makers or patients), we formally pretested this document with representatives of several constituencies who would use the recommendations and made modifications in response to their feedback.

The QUOROM group also discussed the format of a meta-analysis report, how best to assess the impact of the QUOROM statement, and how best to disseminate it. The format we recommend includes 15 subheadings that reflect the sequential stages in the conduct of the meta-analysis within the text of the report of a meta-analysis. The checklist included in the statement can also be used during the planning, performing, and reporting of a meta-analysis and during peer review of the report after its submission to a journal.

We delayed publication of the QUOROM statement until its impact on the editorial process had been assessed. We organised an RCT involving eight medical journals to assess the impact of use of QUOROM criteria on journal peer review. Accrual is now complete and we will report the results separately.

After about 5 weeks of electronic posting we had received five comments from investigators, whom we thank for their thoughtful consideration of the statement. Several issues, in particular in relation to terminology, cannot be addressed in the statement at present. The QUOROM group is agreed on the importance of making changes to the checklist in the light of documented evidence and must resist changes based on opinion or anecdotal evidence unless there is a compelling rationale for doing otherwise. Nonetheless, the issues raised have been noted for consideration and discussion in future.

Several queries addressed the distinction between the meta-analysis and the systematic review. As we indicate in the introduction, the QUOROM group agreed to observe the distinction as defined by the Potsdam consultation on meta-analysis.3

We were also asked to clarify the checklist item asking investigators to interpret their results in light of the totality of evidence. Increasingly, several meta-analyses on the same topic are reported.47-49 If other similar reports are available, authors should discuss their results as they relate to those reports.

For the QUOROM statement to continue to be useful, it must remain evidence based and up to date. Members of the QUOROM group need to survey the literature continually to help inform themselves about emerging evidence on reporting of meta-analyses. This information needs to be collated and presented annually for two purposes. The first is decisions on which checklist items to keep, delete, or add; these decisions can be made similarly to the selection of the original items. The second purpose is so that an up-to-date summary on the reporting of meta-analyses can be prepared. These efforts are being coordinated through a website. This approach is similar to that taken by the CONSORT group.

In summary, our choice of items to include in a meta-analysis report was based on evidence whenever possible, which implies the need to include items that can systematically influence estimates of treatment effects. Currently, we lack a detailed understanding of all the factors leading to bias in the result of a meta-analysis. Clearly, research is required to help improve the quality of reporting of meta-analyses. Such evidence may also act as a catalyst for improving the methods by which meta-analyses are conducted.

The QUOROM checklist and flow diagram are available on The Lancet's website [www.thelancet.com]. We hope that this document will generate further interest in the field of meta-analysis and that, like the CONSORT initiative, the QUOROM statement will become available in different languages and locations as it is disseminated. We invite interested readers, reviewers, researchers, and editors to use the QUOROM statement and generate ideas for its improvement.

Contributors
David Moher, Deborah Cook, Susan Eastwood, Ingram Olkin, Drummond Rennie, and Donna Stroup developed the QUOROM statement. They all planned the meeting, participated in regular conference calls, identified and secured funding, identified and invited participants, and planned the meeting agenda. All of them helped write the paper.

*QUOROM Group
D G Altman (ICRF/NHS Centre for Statistics in Medicine, Oxford, UK); J A Berlin (University of Pennsylvania, Philadelphia, PA, USA); L Bero (University of California, San Francisco, CA, USA); W DuMouchel (AT&T Laboratories, New York, NY, USA); K Dickersin (Brown University, Providence, RI, USA); J J Deeks (ICRF/NHS Centre for Statistics in Medicine, Oxford, UK); P Fontanarosa (JAMA, Chicago, IL, USA); N Geller (National Heart, Lung, and Blood Institute, Bethesda, MD, USA); F Godlee (BMJ, London, UK); S Goodman (Annals of Internal Medicine, Philadelphia, PA, USA); R Horton (The Lancet, London, UK); P Huston (University of Ottawa, Ottawa, Canada); A R Jadad (McMaster University, Hamilton, Canada); K Kafadar (University of Colorado, Denver, CO, USA); T Klassen (University of Alberta, Edmonton, Canada); S Morton (RAND, Santa Monica, CA, USA); C Mulrow (University of Texas, San Antonio, TX, USA); S Pyke (GlaxoWellcome, London, UK); H S Sacks (Mount Sinai School of Medicine, New York, NY, USA); K F Schulz (Family Health International, Research Triangle Park, NC, USA); S G Thompson (Imperial College School of Medicine, London, UK); M Winker (JAMA, Chicago, IL, USA); S Yusuf (McMaster University, Hamilton, Canada).

Acknowledgments
We thank Iain Chalmers, Ted Colton, Sander Greenland, Brian Haynes, Edward J Huth, Alessandro Liberati, Tom Louis, Roy Pitkin, David Sackett, Trevor Sheldon, and Chris Silagy for reviewing earlier drafts of this paper, and Jacqueline Page for helping with revisions. Financial support was provided by Abbott Laboratories, Agency for Health Care Policy & Research, GlaxoWellcome, and Merck & Co.
References

1 Mulrow CD. The medical review article: state of the science. Ann Intern Med 1987; 106: 485–88.
2 Cook DJ, Mulrow C, Haynes RB. Systematic reviews: synthesis of best evidence for clinical decisions. Ann Intern Med 1997; 126: 376–80.
3 Cook DJ, Sackett DL, Spitzer W. Methodologic guidelines for systematic reviews of randomized controlled trials in health care from the Potsdam consultation on meta-analysis. J Clin Epidemiol 1995; 48:
4 Deeks J, Glanville J, Sheldon T. Undertaking systematic reviews of research on effectiveness: CRD guidelines for those carrying out or commissioning reviews. CRD report no 4. York: NHS Centre for Reviews and Dissemination, University of York, 1996.
5 Chalmers I, Haynes RB. Reporting, updating, and correcting systematic reviews of the effects of health care. In: Chalmers I, Altman DG, eds. Systematic reviews. London: BMJ Publishing Group.
6 Bero L, Rennie D. The Cochrane Collaboration: preparing, maintaining, and disseminating systematic reviews of the effects of health care. JAMA 1995; 274: 1935–38.
7 Huston P. The Cochrane Collaboration helping unravel tangled web woven by international research. Can Med Assoc J 1996; 154: 1389–92.
8 Sacks HS, Berrier J, Reitman D, Ancona-Berk VA, Chalmers TC. Meta-analyses of randomized controlled trials. N Engl J Med 1987; 316: 450–55.
9 Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mt Sinai J Med 1996; 63: 216–24.
10 Mulrow CD, Oxman AD, eds. Cochrane Collaboration Handbook. In: The Cochrane Library [database on disk and CDROM]. Oxford: Cochrane Collaboration. Update Software, 1994, issue 4.
11 Oxman AD, Cook DJ, Guyatt GH, and the Evidence-Based Medicine Working Group. Users' guides to the medical literature: VI, how to use an overview. JAMA 1994; 272: 1367–71.
12 Klassen TP, Jadad AR, Moher D. Guides for reading and interpreting systematic reviews: 1, getting started. Arch Pediatr Adolesc Med 1998; 152: 700–04.
13 L'Abbé KA, Detsky AS, O'Rourke K. Meta-analysis in clinical research. Ann Intern Med 1987; 107: 224–33.
14 Olkin I. A critical look at some popular meta-analytic methods. Am J Epidemiol 1984; 140: 287–88.
15 Olkin I. Statistical and theoretical considerations in meta-analysis. J Clin Epidemiol 1995; 48: 133–46.
16 Guyatt GH, Sackett DL, Sinclair J, Hayward R, Cook DJ, Cook RJ. Users' guides to the medical literature: IX, a method for grading health care recommendations. JAMA 1995; 274: 1800–04.
17 Jadad AR, Cook DJ, Browman G. A guide to interpreting discordant systematic reviews. Can Med Assoc J 1997; 156: 1411–16.
18 LeLorier J, Gregoire G, Benhaddad A, Lapierre J, Derderian F. Discrepancies between meta-analyses and subsequent large randomized, controlled trials. N Engl J Med 1997; 337: 536–42.
19 Shea B, Dubé C, Moher D. Assessing the quality of reports of meta-analyses: a systematic review of scales and checklists. In: Egger M, Davey Smith G, Altman DG, eds. Systematic reviews, 2nd edn. London: BMJ Publishing Group (in press).
20 The Standards of Reporting Trials Group. A proposal for structured reporting of randomized controlled trials. JAMA 1994; 272: 1926–31.
21 The Asilomar Working Group on Recommendations for Reporting of Clinical Trials in the Biomedical Literature. Checklist of information for inclusion in reports of clinical trials. Ann Intern Med 1996; 124:
22 Begg C, Cho M, Eastwood S, et al. Improving the quality of reporting of randomized controlled trials: the CONSORT statement. JAMA 1996; 276: 637–39.
23 The Cochrane Library [database on disk and CDROM]. Oxford: Cochrane Collaboration. Update Software, 1996, issue 3.
24 Dickersin K, Scherer R, Lefebvre C. Identifying relevant studies for systematic reviews. BMJ 1994; 309: 1286–91.
25 Whitman N. The Delphi technique as an alternative for committee meetings. J Nurs Educ 1990; 29: 377–79.
26 Dickersin K, Higgins K, Meinert CL. Identification of meta-analyses: the need for standard terminology. Control Clin Trials 1990; 11: 52–66.
27 Taddio A, Pain T, Fassos FF, Boon H, Illersich AL, Einarson TR. Quality of nonstructured and structured abstracts of original research articles in the British Medical Journal, the Canadian Medical Association Journal and the Journal of the American Medical Association. Can Med Assoc J 1994; 150: 1611–15.
28 Tramèr M, Reynolds DJM, Moore RA, McQuay HJ. Impact of covert duplicate publication on meta-analysis: a case study. BMJ 1997; 315:
29 McAuley L, Moher D, Tugwell P. The influence of grey literature on meta-analysis. MSc thesis, University of Ottawa, 1999.
30 Moher D, Pham B, Klassen TP, et al. Does the language of publication of reports of randomized trials influence the estimates of intervention effectiveness reported in meta-analyses? 6th Cochrane Colloquium.
31 Egger M, Zellweger-Zahner T, Schneider M, Junker C, Lengeler C, Antes G. Language bias in randomised controlled trials published in English and German. Lancet 1997; 350: 326–29.
32 Khan KS, Daya S, Collins JA, Walter S. Empirical evidence of bias in infertility research: overestimation of treatment effect in crossover trials using pregnancy as the outcome measure. Fertil Steril 1996; 65:
33 Schulz KF, Chalmers I, Hayes RJ, Altman DG. Empirical evidence of bias: dimensions of methodological quality associated with estimates of treatment effects in controlled trials. JAMA 1995; 273:
34 Moher D, Pham B, Jones A, et al. Does the quality of reports of randomised trials affect estimates of intervention efficacy reported in meta-analyses? Lancet 1998; 352: 609–13.
35 Jadad AR, Moore RA, Carroll D, et al. Assessing the quality of reports of randomized clinical trials: is blinding necessary? Control Clin Trials 1996; 17: 1–12.
36 Berlin JA, on behalf of the University of Pennsylvania meta-analysis blinding study group. Does blinding of readers affect the results of meta-analyses? Lancet 1997; 350: 185–86.
37 Barnes DE, Bero LA. Why review articles on the health effects of passive smoking reach different conclusions. JAMA 1998; 279:
38 Thompson SG. Why sources of heterogeneity in meta-analysis should be investigated. BMJ 1994; 309: 1351–55.
39 Simes RJ. Publication bias: the case for an international registry of clinical trials. J Clin Oncol 1986; 4: 1529–41.
40 Sterling TD, Rosenbaum WL, Weinkam JJ. Publication decisions revisited: the effect of the outcome of statistical tests on the decision to publish and vice versa. Am Statist 1995; 49: 108–12.
41 Dickersin K, Min YI. NIH clinical trials and publication bias. Online J Curr Clin Trials 1993: April 28; doc no 50.
42 Easterbrook PJ, Berlin JA, Gopalan R, Matthews DR. Publication bias in clinical research. Lancet 1991; 337: 867–72.
43 Cook DJ, Guyatt GH, Ryan G, et al. Should unpublished data be included in meta-analyses? Current convictions and controversies. JAMA 1993; 269: 2749–53.
44 Smith ML. Publication bias and meta-analysis. Eval Educ 1980; 4:
45 Moher D, Fortin P, Jadad AR, et al. Completeness of reporting of trials published in languages other than English: implications for conduct and reporting of systematic reviews. Lancet 1996; 347:
46 Vickers A, Goyal N, Harland R, Rees R. Do certain countries produce only positive results? A systematic review of controlled trials. Control Clin Trials 1998; 19: 159–66.
47 Kennedy E, Song F, Hunter R, Clark A, Gilbody S. Risperidone versus typical antipsychotic medication for schizophrenia (Cochrane Review). In: Cochrane Library, issue 3. Oxford: Update Software.
48 Davies A, Adena MA, Keks NA, Catts SV, Lambert T, Schweitzer I. Risperidone versus haloperidol: I, meta-analysis of efficacy and safety. Clin Ther 1998; 20: 58–71.
49 Leucht S, Pitschel-Walz G, Abraham D, Kissling W. Efficacy and extrapyramidal side-effects of the new antipsychotics olanzapine, quetiapine, risperidone, and sertindole compared to conventional antipsychotics and placebo: a meta-analysis of randomized controlled trials. Schizophrenia Res 1999; 35: 51–68.