You really should read this debate – Steven Goodman, a statistician for whom I have the utmost regard, and colleagues have written a brilliant paper showing the importance of ‘design thinking’ in observational research. The essence of their argument is that, in designing and interpreting observational studies, one should think about how the corresponding RCT would look. This way one can spot survivorship bias, which arises when the intervention group has been depleted of the most susceptible cases. This way of thinking encourages a comparison of new users of an intervention with new users of the comparator. Of course, it is not always possible to identify ‘new users’, but at least thinking in such a ‘design way’ can alert the reader to the danger of false inference.
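The depletion-of-susceptibles mechanism is easy to demonstrate with a toy simulation. The sketch below (Python; all parameters are illustrative assumptions of mine, not figures from the papers under discussion) gives the treatment no true effect at all, yet comparing prevalent users (those who have already survived a period on treatment) with new users produces a spurious ‘protective’ effect:

```python
import random

random.seed(0)

def simulate(n=100_000):
    """Null world: the treatment has NO effect on event risk.

    Compare event risk among 'prevalent users', who already survived a
    run-in period on treatment, with 'new users' who enter at baseline.
    """
    prevalent_events = prevalent_n = 0
    new_events = new_n = 0
    for _ in range(n):
        # Latent frailty: this person's per-period event probability.
        frailty = random.uniform(0.0, 0.4)
        # Prevalent-user arm: one run-in period on treatment. Anyone with
        # an event during run-in is depleted before baseline.
        if random.random() >= frailty:          # survived the run-in
            prevalent_n += 1
            if random.random() < frailty:       # event during follow-up
                prevalent_events += 1
        # New-user arm: no run-in, so no depletion of susceptibles.
        new_n += 1
        if random.random() < frailty:
            new_events += 1
    return prevalent_events / prevalent_n, new_events / new_n

prevalent_risk, new_risk = simulate()
print(f"prevalent-user risk: {prevalent_risk:.3f}")  # lower, despite a null effect
print(f"new-user risk:       {new_risk:.3f}")
```

Because the frailest individuals are disproportionately removed during the run-in, the prevalent-user group is healthier at baseline; the new-user design avoids this by aligning the start of follow-up with the start of treatment.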
One of the examples mentioned concerns hormone replacement therapy (HRT), where the largest RCT (Women’s Health Initiative trial) gave a very different result to the largest observational study (Nurses’ Health Study). The latter suggests a protective effect for HRT, while the former suggests the opposite. It looks as though this might not have been a very good example because, as Bhupathiraju and colleagues point out, there is a much simpler and more convincing explanation for the difference in the observed effects of HRT across the two studies: the hormone replacement was given to much younger women in the observational study than in the trial. Subsequent meta-analysis of subgroups across all RCTs confirms that HRT is only protective in younger women (who do not have established coronary artery disease). Thus, HRT is probably effective if started sufficiently early after the menopause.
This does not mean, of course, that Goodman and colleagues are wrong in principle; they may simply have selected a bad example. This is an extremely interesting exchange, conducted politely between scholars, and it is illuminating from both the methodological and the substantive points of view.
— Richard Lilford, CLAHRC WM Director
- Goodman SN, Schneeweiss S, Baiocchi M. Using design thinking to differentiate useful from misleading evidence in observational research. JAMA. 2017; 317(7): 705-7.
- Bhupathiraju SN, Stampfer MJ, Manson JE. Posing causal questions when analyzing observational data. JAMA. 2017; 318(2): 201.