
Health promotion evaluation, realist synthesis and participation

DISCUSSANTS

Marcia Hills; Simon Carroll

Centre for Community Health Promotion Research, University of Victoria, BC, Canada

There are many ways to enter a debate, more or less polemical, critical or supportive. We will address some very important issues raised by the initiating paper in this debate (Carvalho, Bodstein, Hartz & Matida, 2004), but first we should thank the authors for an opening that is clear and forthright, innovative and important. They have managed to present what we feel are many of the key issues in the debate over how to evaluate the effectiveness of health promotion, without in any way closing off alternative avenues and approaches.

We are grateful for this opportunity, as one of the main planks of the paper we are responding to asks us to consider "realist synthesis" as a promising alternative to the dominant mode of systematic review in health promotion. Along with other colleagues from the Canadian Consortium for Health Promotion Research, we have recently completed the initial phase of a multi-year project with Health Canada that attempts to develop a framework for assessing the effectiveness of community initiatives to promote health, based largely on the theoretical and methodological insights of "realist synthesis" (Hills, O'Neill, Carroll and McDonald, 2004; Hills, Carroll and O'Neill, in press; Pawson and Tilley, 1997; Pawson, 2001, 2003, 2004).

There are three parts to this friendly response: 1) a rationale for our agreement with the fundamental position outlined by Carvalho et al. that the "realist" approach is the "most radical and innovative perspective in evaluation," and that effectiveness research should be focused on "mechanisms" that are shared across initiatives, making these the theoretical units which form the basis for systematic comparison and review of evaluation data; 2) a brief description of our initial attempt to apply this approach to assessing the effectiveness of federally funded community initiatives in Canada, and a discussion of some of the opportunities it presented, along with some of the challenges it posed; this discussion will raise some of the internal difficulties and questions for the realist synthesis approach to health promotion; 3) a very short discussion of a possible external tension between the realist approach and the principled emphasis in health promotion (HP) on the importance of participation and empowerment in all its aspects, including evaluation.

To begin, it is clear that the demand for "evidence-based policy" is not going to go away, because at its heart, even if it metamorphoses into something with a new label, it speaks to the need for policy-makers to account for and justify their expenditures. This is part of a long-term trend in changing state-societal relations, where "results-based management" and "performance indicators" are becoming indispensable tools for managers under increasing pressure to rationalize an ever-decreasing pot of social investment funds. Fiscal retrenchment and the neo-liberal "hollowing out" of the state (Jessop, 1994, 2002) have meant that public health in general, and health promotion in particular, are swimming against the current to avoid drowning in the massive cost-squeeze between the neo-liberal state project and the endlessly inflationary, acute-care-obsessed health system. The question then becomes (assuming a revolution doesn't happen tomorrow!): what type of evidence will convince external funding agencies of the effectiveness of health promotion interventions, especially complex, community-based work?

The second aspect of our agreement with the authors is to join the loud chorus of HP researchers who have questioned the wisdom of relying exclusively on randomized controlled trial evidence as the panacea for demonstrating effectiveness (Potvin, 1994; McQueen, 2001; Potvin et al., 2001; Potvin & Richard, 2001). The distinction we would emphasize, however, is that part of what makes the "realist" approach radical and innovative is its unique critique of both the orthodox meta-analytical and the alternative narrative approaches to systematic review (Pawson and Tilley, 1997; Pawson, 2002a). While we do not have the space here to go into detail, at the core of this critique is a radically different understanding of how to conceptualize causality and explanation in scientific work. It is key to an understanding of the realist approach that it goes very deep, representing a critique of the positivist understanding of science that is entirely distinct from the better-known phenomenological and constructivist critiques (Bhaskar, 1975; 1979; Keat & Urry, 1982; Harré, 1983; Outhwaite, 1987; Sayer, 1992; Archer, 1995). These deeper, more philosophical ruminations should be of direct concern to HP researchers trying to assess effectiveness, as they have large implications for how to begin to properly theorize complex social interaction and change: the ground upon which HP lives and breathes, succeeds or fails. There are a variety of social theories that HP could use as meta-theoretical foundations, none of which are likely to be entirely compatible with each other. For example, Habermas, Bourdieu, Latour, and Bhaskar all share the anti-positivist label, yet each respective theoretical and philosophical perspective has deep implications for how even middle-range theory (of the type HP must construct) is developed.

We agree that the realist approach offers great potential for finally providing a systematic and rigorous theoretical basis for assessing the effectiveness of health promotion. Most profoundly, as the authors intimate, it allows for the integration of the real-world complexity of HP initiatives, including the basic elements of socio-economic context discussed in the paper.

The work on the Health Canada framework grew out of a long period of frustration with both the RCT-based gold standard and the inability of more qualitative, phenomenological and constructivist alternatives to grasp the nettle of demonstrating effectiveness. We were left with many of the basic insights that seem to be shared by the vast majority of HP researchers about what types of things are important to successful initiatives (e.g. participation, inter-sectoral action, empowerment, critical dialogue, shared leadership), yet with no methodology that could generate a sufficient level of abstraction to compare these "things" or "processes" across initiatives. Pawson's idea that "mechanisms" might be the basis for comparison was the catalyst for an attempt to do something radically different. The most important gain in adopting this approach seemed to be that it offered a way of identifying what it was that made a process like collaborative planning work. What were the key mechanisms that caused positive change in community-based health promotion initiatives? Furthermore, because the orthodox epidemiological approach to causality was no longer the standard, it was possible to start to think about how these mechanisms were related to enabling and constraining contexts. This meant that even if the outcomes didn't show positive change, you could still investigate the effectiveness of the mechanisms themselves, and you could start to build a picture of those contexts that allowed "participation," for example, to enable successful collaborative planning and translate into positive outcomes, and those contexts that hindered its positive effects.
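To make the logic of comparison concrete, the minimal sketch below is purely illustrative and ours alone; it is not part of the Health Canada framework, and the class, field, initiative and mechanism names are invented. It simply shows how evaluation findings might be recorded as context-mechanism-outcome (CMO) configurations and then grouped by mechanism, so that a mechanism such as "participation" can be examined across enabling and constraining contexts rather than judged only by its final outcomes.

from dataclasses import dataclass
from collections import defaultdict


@dataclass
class CMOConfiguration:
    """One hypothesized context-mechanism-outcome (CMO) configuration
    drawn from a single initiative's evaluation data (hypothetical fields)."""
    initiative: str  # which community initiative the evidence comes from
    context: str     # enabling or constraining conditions
    mechanism: str   # the hypothesized mechanism, e.g. "participation"
    outcome: str     # the observed result, positive or otherwise


def group_by_mechanism(configs):
    """Group CMO configurations by mechanism so the same mechanism can be
    compared across initiatives and across enabling/constraining contexts."""
    grouped = defaultdict(list)
    for c in configs:
        grouped[c.mechanism].append((c.initiative, c.context, c.outcome))
    return grouped


# Invented, purely illustrative records: one mechanism observed in two contexts.
records = [
    CMOConfiguration("Initiative A", "strong inter-sectoral ties",
                     "participation in collaborative planning",
                     "plan adopted and funded"),
    CMOConfiguration("Initiative B", "short funding cycle, high staff turnover",
                     "participation in collaborative planning",
                     "planning stalled"),
]

for mechanism, evidence in group_by_mechanism(records).items():
    print(mechanism)
    for initiative, context, outcome in evidence:
        print(f"  {initiative}: context={context!r} -> outcome={outcome!r}")

Of course, the hard work lies in the theorization that produces credible mechanism and context descriptions in the first place, which is precisely where the challenges discussed next arise.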

However, this exciting project has also thrown up some equally persistent challenges. First, the conceptualization of mechanisms demands some very intense theoretical reflection and collaboration with other experts in the field, to ensure that the abstract concepts fairly reflect the complexity of HP initiatives. Secondly, how these mechanisms interact with each other, and what weight each mechanism carries in given contexts, adds another layer of complexity to the theorization. Thirdly, theorizing the "context" itself means delving into the world of social theory to a depth that many in the HP field do not see as relevant, although some leading thinkers argue that the development of a stronger theoretical base for HP is increasingly necessary (Potvin, 2001; Nutbeam, 2000; Best et al., 2003; Green, 2000; Brickmayer and Weiss, 2000; Judge & Bauld, 2001). Fourthly, the move from rigorous conceptualization to appropriate indicators for measurement is just as complicated and may, in some cases, have no solution at all. Finally, in relation to the necessary empirical grounding of the theoretical development, we have found that the lack of good systematic evaluation data (focused on the areas that all HP practitioners talk about when they have time off from evaluation!) will be an impediment to developing the evidence base for effectiveness, no matter how good the theoretical intentions are. Even if evaluations become more systematic, unless evaluations of community initiatives are encouraged to collect detailed data on such things as "level of participation," it is hard to develop the required middle-range theoretical hypotheses that are the sine qua non of realist synthesis. In sum, the Health Canada project is a very exciting and promising opportunity, although the difficulties ahead are legion.

The final point raises a serious issue for anyone enthused, as we are, by the realist alternative. There is a potential paradox within the realist approach in that, while it allows for the theorization and integration of participation and empowerment into an effectiveness framework, and does not pre-judge whether they are immediately available for measurement, there is an inherent danger within realism that it systematically valorises the "scientist's" epistemological perspective above the lay perspective. In its powerful and necessary critique of the solipsism and self-contradiction of extreme constructivism, it often equivocates between the obvious position that lay members of the community can be wrong and the strong implication that this shared fallibility somehow makes a "scientific" methodology superior in itself to other forms of knowledge. In other words, in realism's strong and incisive demolition of relativism, it often wears, unintentionally, the defensive armour of a "scientism" that we would do better to leave behind. There should be no special epistemological status attained through spurious methodological privilege. We do not hold that this is a necessary outcome of using a realist approach; we only point out that it would be a cruel irony if, in grasping at a theoretical resource it profoundly lacks, HP were to let go of one of the crucial theoretical insights it has managed to gain: that knowledge for change must be created and used through a participatory frame which values all types of knowledge as in principle equally valid, though any particular knowledge claim remains open to critical questioning.

References

Archer M 1995. Realist social theory: the morphogenetic approach. Cambridge University Press, Cambridge.

Best A et al. 2003. An integrative framework for community partnering to translate theory into effective health promotion strategy. American Journal of Health Promotion 18(2):168-176.

Bhaskar R 1975. A Realist Theory of Science. Leeds Books, Leeds.

Bhaskar R 1997. On the ontological status of ideas. Journal for the Theory of Social Behaviour 27(2/3):139-147.

Bhaskar R 1979. The possibility of naturalism: a philosophical critique of the contemporary human sciences. Harvester, Sussex.

Brickmayer J & C H Weiss 2000. Theory-based evaluation: Investigating the how and why of wellness promotion, pp. 132-137. In MS Jammer & D Stokols. Promoting human wellness: New frontiers for research, practice and policy. University of California Press, Berkeley.

Carvalho A, RC Bodstein, Z Hartz, & A Matida 2004. Concepts and approaches in the evaluation of health promotion. Ciência e Saúde Coletiva 9(3):521-529.

Green J 2000. The role of theory in evidence-based health promotion practice. Health Education Research 15(2):125-129.

Harré R 1983. Social being. Blackwell, Oxford.

Hills M et al. 2004. Effectiveness of community initiatives to promote health: An assessment tool. Canadian Consortium for Health Promotion Research: 1-223. (Mimeo).

Jessop B 1994. Post-Fordism and the state, pp. 251-279. In A Amin (ed.). Post-Fordism: a reader. Blackwell, Oxford.

Jessop B 2002. The future of the capitalist state. Polity, Cambridge.

Judge K & L Bauld 2001. Strong theory, flexible methods: Evaluating complex community-based initiatives. Critical Public Health 11(1):19-38.

Keat R & J Urry 1982. Social theory as science. Routledge, London.

McQueen DV 2001. Strengthening the evidence base for health promotion. Health Promotion International 16(3):261-268.

Nutbeam D 1994. Inter-sectoral action for health: making it work. Health Promotion International 9(3):143-144.

Nutbeam D 1998. Evaluating health promotion – Progress, problems and solutions. Health Promotion International 13(1):27-44.

Nutbeam D 1999. The challenge to provide "evidence" in health promotion. Health Promotion International 14(2):99-101.

Nutbeam D 2000. Health promotion effectiveness – the questions to be answered (Part Two, Evidence Book), pp. 1-11. In The evidence of health promotion effectiveness: Shaping public health in a new Europe. International Union for Health Promotion and Education, Brussels.

Outhwaite W 1987. New philosophies of social science: realism, hermeneutics and critical theory. MacMillan, London.

Pawson R 2001. Evidence and policy and naming and shaming. ESRC UK Centre for Evidence Based Policy and Practice.

Pawson R 2002a. Evidence-based policy: In search of a method. Evaluation 8(2):157-181.

Pawson R 2002b. Evidence-based policy: The promise of "realist synthesis". Evaluation 8(3):340-358.

Pawson R 2003a. Nothing as practical as a good theory. Evaluation 9(4):471-490.

Pawson R 2003b. Social care knowledge: Seeing the wood for the trees. ESRC UK Centre for Evidence Based Policy and Practice, London: 1-23.

Pawson R et al. 2003. Types and quality of social care knowledge – stage one: A classification of types of social care knowledge. ESRC UK Centre for Evidence Based Policy and Practice, London: 1-16.

Pawson R & N Tilley 2001. Realistic evaluation bloodlines. American Journal of Evaluation 22(3):317-324.

Potvin L et al. 2003. Implementing participatory intervention and research in communities: lessons from the Kahnawake Schools Diabetes Prevention Project in Canada. Social Science & Medicine 56:1295-1305.

Potvin L et al. 2001. Beyond process and outcome evaluation: A comprehensive approach for evaluating health promotion programmes, pp. 45-62. In I Rootman et al. Evaluation in health promotion: Principles and perspectives. WHO Regional Office for Europe, Copenhagen.

Potvin L & L Richard 2001. Evaluating community health promotion programmes, pp. 142-165. In I Rootman et al. Evaluation in health promotion: Principles and perspectives. WHO Regional Office for Europe, Copenhagen.

Sayer A 1992. Method in social science: A realist approach. Routledge, London-New York.

Publication Dates

  • Publication in this collection
    20 Oct 2004
  • Date of issue
    Sept 2004