• Control Clin Trials · Oct 1997

    Meta-Analysis

    The relationship between study design, results, and reporting of randomized clinical trials of HIV infection.

    • J P Ioannidis, J C Cappelleri, H S Sacks, and J Lau.
    • Division of Geographic Medicine and Infectious Diseases, New England Medical Center Hospitals, Tufts University School of Medicine, Boston, Massachusetts, USA.
    • Control Clin Trials. 1997 Oct 1; 18 (5): 431-44.

    Abstract

    We examined whether the study design of randomized clinical trials for medications against human immunodeficiency virus (HIV) may affect the results and whether the outcomes of these trials affect reporting and publication. We used a database of 71 published randomized HIV-related drug efficacy trials and considered the following study design factors: endpoint definition and method of analysis, masked design, sample size, and duration of follow-up. Large variation was noted in the methods of analysis for surrogate endpoints. Often statistical significance for a surrogate endpoint was not associated with statistical significance for the clinical endpoint or for survival in the same trial, although disagreements in the direction of the treatment effect for surrogate endpoints and survival within individual trials were uncommon. Open-label design seemed to affect the magnitude of the treatment effect for two treatments. The magnitude of the treatment effect in trials of zidovudine monotherapy was inversely related to their sample size, but this probably reflected the confounding effect of longer duration of follow-up in large trials (with a resulting loss of efficacy) rather than publication bias. There was, however, evidence for potential bias in reporting and publication of HIV-related trials. Meta-analyses of published trials for specific treatments demonstrated a sizable treatment benefit for all the examined medications regardless of whether these medications were officially approved, controversial, or abandoned, raising concerns about either publication bias or unjustifiable rejection of potentially useful medications. Compared with trials published in specialized journals, trials published in journals of wide readership were larger (p = 0.001) and 4.4 times more likely to report "positive" results (p = 0.01). We identified several examples of trials with "negative" results that have remained unpublished for a long time. In conclusion, study design factors may have an impact on the magnitude and significance of the treatment effect in HIV-related trials. Bias in reporting can further affect the information that these studies provide.
