Complexity in Service Delivery Research

In our previous blog we identified two objections to the use of comparative studies and causal models in the evaluation of service improvement initiatives. The first objection relates to the quest for objectivity and the second to complexity. Objectivity was covered in the previous post; now we must turn to the trickier area of complexity.

One of the problems with the idea of complexity is that it can mean different things to different people. Things may be complex in all sorts of ways:

  1. Interventions may be made up of many different components that may act synergistically or antagonistically. Such interventions are perhaps better given the name ‘compound interventions’.
  2. Systems may exhibit emergent behaviour after the fashion of a flock of birds (or their in silico equivalents – ‘Boids’).[1]
  3. The interaction between elements of a system may be non-linear. A non-linear system is one where the output is not directly proportional to the input. Of course, it does not follow that such a system cannot be modelled (see the sketch immediately below).
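
To make the third point concrete: a saturating relationship between an input and an output is non-linear, yet entirely amenable to modelling. The snippet below is a purely illustrative sketch – the variables, the functional form, and the simulated data are our own assumptions rather than anything reported here.

```python
# Illustrative sketch only: fitting a simple non-linear (saturating) model.
# All data are simulated and the variable names are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def saturating(x, vmax, k):
    """Output rises with input but flattens off - non-linear, yet easy to model."""
    return vmax * x / (k + x)

x = np.linspace(0.5, 20, 60)                                  # 'input' to the system
y = saturating(x, vmax=100, k=5) + rng.normal(0, 3, x.size)   # noisy 'output'

params, _ = curve_fit(saturating, x, y, p0=[80, 1])
print(f"Estimated vmax = {params[0]:.1f}, k = {params[1]:.1f}")
```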

And then one has to deal with a special class of non-linear systems – so-called ‘chaos’. This is a system whose outputs are sensitive to initial conditions and which evolves in a dynamic way. The way chaotic systems evolve is deeply mathematical, involving ideas such as ‘attractors’, ‘fractals’, the ‘Mandelbrot set’, and other concepts only dimly understood by the CLAHRC WM Director, who recommends “Chaos: A Very Short Introduction,” by Leonard Smith.[2] This book helps dispel the many misconceptions about the subject when it is applied to ordinary affairs, such as the organisation of health services. Of importance to our point today is that it is often difficult to tell by observing a system whether variable outputs result from ‘chaos’ or from stochastic variation (the play of chance). For this reason, chaotic and random variation have been called “observationally equivalent.”[3]
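
To illustrate how deterministic chaos and randomness can look alike, compare a series generated by a fully deterministic chaotic rule (the logistic map with r = 4) with a series of genuinely random draws. The example below is a toy of our own devising – the choice of map, the series length, and the summary statistics are illustrative assumptions, not taken from the references.

```python
# Toy illustration: a deterministic chaotic series and a random one can look alike.
# The logistic map at r = 4 and the summaries below are illustrative choices only.
import numpy as np

rng = np.random.default_rng(1)

def logistic_map(x0, n, r=4.0):
    """Fully deterministic, yet sensitive to initial conditions (chaotic at r = 4)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return xs

chaotic = logistic_map(x0=0.2, n=1000)
random_ = rng.beta(0.5, 0.5, size=1000)  # the map's long-run distribution is Beta(1/2, 1/2)

for name, series in [("chaotic", chaotic), ("random", random_)]:
    lag1 = np.corrcoef(series[:-1], series[1:])[0, 1]
    print(f"{name:8s} mean={series.mean():.3f}  sd={series.std():.3f}  lag-1 corr={lag1:+.3f}")
```

On simple summaries such as the mean, spread, and lag-one correlation, the deterministic and the random series are essentially indistinguishable – which is the sense in which they are ‘observationally equivalent’.[3]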

To make matters still more confusing, it is common to talk about complex interventions when what is really meant is that the system with which the intervention interacts is complex, while the intervention itself may or may not be complex.

So what does all of this mean for people like those in CLAHRCs whose job it is to evaluate health services? The CLAHRC WM Director proposes the following:

  1. Do not eschew models. The fact that something may be complex, in any of the senses above, does not mean that models are not useful. A hospital manager who installs a computerised prescribing system with computerised decision support has a model something like this in mind: implement the IT system → reduce medication errors → reduce preventable adverse events (AEs) → lower mortality, improve quality of life, and reduce the costs of care for AEs. Is that a complete model? No. Could there be reverse causality – say because adverse events may prompt separate actions to reduce errors? Yes. Does this mean the model is without value? No. All models are representations of the world – they are useful abstractions that help decision-makers. Some things will always be left out, but that would be the same if a decision were made sans model. It is funny how people are sometimes happy with decisions that are made unaided by a model, yet suddenly become hypercritical when a little light is thrown on a subject by means of a model; the model is extirpated on the basis of perceived imperfections, as though such imperfections could be eliminated by abandoning modelling!
  2. Populate models with the best data possible and do not let the ideal be the enemy of the practical good. In some cases it is possible to populate the individual links in a model and also to measure the correlation between the input and the output directly. Consider the chain: decision support →(θ1)→ prescribing errors →(θ2)→ AEs, with θ3 the direct effect of decision support on AEs. We constructed a simple Bayesian model to calculate θ3 using values for θ1 and θ2 obtained from the literature. We then validated the model against values for θ3, also obtained from a systematic literature review. Since the calculated and observed values matched, more complex models were not necessary (a toy sketch of this kind of calculation appears after this list). However, there may be other organisational causal chains that require more complex (e.g. non-linear) models.[4] [5]
  3. Complexity is not an argument against making scientifically valid observations. Rather, it is an argument in favour of scientific scrutiny. The fact that complexities of various sorts make it difficult to predict what will happen in any one instance does not mean that it is impossible to discern a general tendency by means of models at the aggregate level, populated with scientific data. Take weather and climate, for example. The complexity of the system means that medium-term weather predictions are rather vague, but this does not prevent us from discerning long-term climatic trends, for example those relating to El Niño and greenhouse gases. In service delivery evaluation, complexity results in heterogeneity and wide variances that must be taken into account in study design. Hopefully, by drawing attention to the distinction between a single instance and a general tendency, we can gradually dispel simplistic ideas, such as the notion that RCTs are incompatible with complex systems. This fallacious idea is ‘back to front’ – only if everything were completely straightforward would we not need formal comparative studies. Moreover, finding out which type of model best describes the relationship between the inputs and outputs of a system requires that the parameters relating the presumed components of the system to its outputs are accurately calibrated. Such studies should adhere to scientific rules to reduce the risk of error, as discussed in last fortnight’s blog.
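
By way of illustration of point 2 above, the sketch below shows the general idea of combining estimates of the individual links (θ1 and θ2) to obtain an implied value for θ3, which can then be compared with a directly observed estimate. It is schematic only: the chain model, the distributions, and every number in it are invented for the example – the actual model and parameter values are not reported in this post.

```python
# Schematic illustration only: combining uncertain estimates of the individual links
# (theta1, theta2) into an implied theta3, then comparing it with a directly
# 'observed' theta3. Every number below is invented for the example.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# theta1: relative risk of a prescribing error with decision support vs. without
# (log-normal uncertainty around an assumed point estimate of 0.6).
theta1 = rng.lognormal(mean=np.log(0.6), sigma=0.10, size=n)

# theta2: proportion of preventable AEs attributable to prescribing errors
# (Beta uncertainty around an assumed value of 0.5).
theta2 = rng.beta(10, 10, size=n)

# Implied theta3 under a simple (assumed) chain model: AEs caused by errors fall in
# proportion to the fall in errors; AEs from other causes are unchanged.
theta3_implied = 1 - theta2 * (1 - theta1)

lo, mid, hi = np.percentile(theta3_implied, [2.5, 50, 97.5])
print(f"Implied theta3 (RR of AEs): {mid:.2f} (95% interval {lo:.2f} to {hi:.2f})")

# A made-up 'directly observed' theta3 from head-to-head studies, for comparison.
theta3_observed = 0.82
print("Consistent with the direct estimate" if lo <= theta3_observed <= hi
      else "Inconsistent with the direct estimate")
```

Agreement between the implied and the directly observed values, as reported above, is reassurance that a simple chain model suffices for the decision at hand; disagreement would be a signal to consider more complex (e.g. non-linear) structures.[4] [5]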

— Richard Lilford, CLAHRC WM Director
— Sam Watson, CLAHRC Research Fellow

References:

  1. Reynolds CW. Flocks, herds and schools: A distributed behavioral model. Computer Graphics. 1987; 21(4): 25–34.
  2. Smith L. Chaos: A Very Short Introduction. Oxford: Oxford University Press, 2007.
  3. Werndl C. Are Deterministic Descriptions and Indeterministic Descriptions Observationally Equivalent? Stud Hist Philos Mod Phys. 2009; 40(3): 232-42.
  4. Dal Forno A, Merlone U. Chaotic Dynamics in Organization Theory. In: Bischi GI, Chiarella C, Sushko I, eds. Global Analysis of Dynamic Models in Economics and Finance. Berlin: Springer; 2013. pp. 185–204.
  5. Juárez F. Applying the theory of chaos and a complex model of health to establish relations among financial indicators. Procedia Computer Science. 2011; 3: 982-6.