
Chapter 10 Communications, information, education

Evaluating impact

Photo: Vicki Francis/Department for International Development

It is difficult to attribute and measure the direct impact of communications initiatives on promoting DRR and reducing risk. This is particularly true of evaluating behavioural change. Shifts in knowledge and attitude can take place quite quickly and are relatively straightforward to assess; changes in behaviour are slower to reveal themselves and harder to identify. There is also the problem of attribution: it is hard to tell how much people learned from a specific public information programme or project intervention, and how much from other sources. Other social, cultural or economic factors may have a strong influence on behavioural change – just as a range of external factors may prevent behavioural change despite the best efforts of a project. The ultimate test of success may be how people behave when a real disaster threatens or strikes.

Chapter 18 discusses monitoring and evaluation (M&E) methods in DRR in more detail; many of them can also be used to assess the impact of communications work. For example:

  • Well-established ‘audience research’ methods can be used to find out how many people received particular information and what impact it had on their thinking and action. These include questionnaires, structured interviews and more qualitative in-depth interviews.
  • Valuable information can be collected from informal and relaxed conversations with people receiving messages, or through more participatory initiatives. Direct observation of how (or if) people adopt risk mitigation measures can also be highly informative.
  • Participatory communications approaches can be applied to evaluation. Folk drama or other community-based methods can be used to give people an opportunity to present their own views on an issue or on how well a project is doing. Focus groups are also commonly used. In the broadcast media, listeners’ letters and responses to quizzes and competitions provide useful qualitative indicators.
  • Rather than carrying out large-scale surveys, it may be easier to work with less direct indicators, relying more on triangulation (cross-checking) of a number of simpler evaluation techniques. This is likely to be cheaper as well as faster, and indicators can be based on verbal or other evidence of change.
  • The value of impact evaluations is limited if baseline data about attitudes and behaviour have not been collected; where such data do exist, even a simple before/after comparison of key indicators can be revealing (see the sketch after this list).
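
Where baseline and follow-up survey data are available, a very simple before/after tabulation of a few indicators can show whether awareness and practice appear to have shifted. The sketch below is purely illustrative: the indicators, field names and figures are hypothetical, and in practice any such comparison would need to be triangulated with the qualitative evidence described above.

```python
# Illustrative sketch only: hypothetical baseline and follow-up survey counts
# for two indicators, tabulated as proportions of respondents. All names and
# numbers are invented for the example.

baseline = {"heard_warning_message": 120, "adopted_safe_storage": 45, "respondents": 400}
followup = {"heard_warning_message": 310, "adopted_safe_storage": 140, "respondents": 380}

def proportion(survey, indicator):
    """Share of respondents reporting the given indicator."""
    return survey[indicator] / survey["respondents"]

for indicator in ("heard_warning_message", "adopted_safe_storage"):
    before = proportion(baseline, indicator)
    after = proportion(followup, indicator)
    print(f"{indicator}: {before:.0%} at baseline, {after:.0%} at follow-up "
          f"(change of {after - before:+.0%} points)")
```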

Knowledge, attitudes and practice (KAP) surveys or studies, which are widely used in health and other programmes, could also be applied to hazard, risk and DRR communication, although it is not clear how widely they have been used in this field. There are many ways of carrying out KAP surveys. Their general aims are to generate information on current knowledge, attitudes and practice; to improve understanding of the key cultural and other socio-economic factors influencing behaviour; to identify appropriate communications methods and networks for stimulating change; and to design awareness-raising projects on the basis of this knowledge.
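
As a minimal and purely illustrative sketch, KAP-style questionnaire results might be summarised along the lines below; the three domains, the 0–1 scoring scale and the records are hypothetical and would need to be adapted to whatever questionnaire is actually used.

```python
# A minimal sketch, assuming each respondent has been given a score between
# 0 and 1 for the knowledge, attitude and practice domains. The records below
# are hypothetical examples only.

from statistics import mean

responses = [
    {"id": 1, "knowledge": 0.8, "attitude": 0.6, "practice": 0.3},
    {"id": 2, "knowledge": 0.5, "attitude": 0.7, "practice": 0.4},
    {"id": 3, "knowledge": 0.9, "attitude": 0.5, "practice": 0.2},
]

# Average score per domain: a gap between knowledge and practice scores is one
# signal that messages are being heard but not yet acted upon.
for domain in ("knowledge", "attitude", "practice"):
    avg = mean(r[domain] for r in responses)
    print(f"Average {domain} score: {avg:.2f}")
```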

Box 10.6 Checklist of good practice in risk communication

  1. Think strategically.
  2. Plan and prepare carefully, with communities.
  3. Devise a series of actions to build up awareness and mobilise communities in the long term.
  4. Ensure that you understand how people process and evaluate information about hazards and risks.
  5. Focus risk communication on changing behaviour rather than merely improving understanding.
  6. Use methods of communication that are most acceptable to the communities concerned. Be prepared to spend time and effort to find out which methods are most suitable.
  7. Adapt the information and communications method to the needs and tastes of each target group, and set priorities where you do not have the capacity to communicate with everyone effectively.
  8. Ensure that technical information is presented in accessible formats.
  9. Check that the materials or advice being given are comprehensible, credible and consistent.
  10. Ensure that the actions suggested are feasible and that people will be motivated to act (and not panic).
  11. Pre-test materials and methods to make sure they are effective.
  12. Acknowledge the likelihood that apathy and information overload will affect people’s response to messages.
  13. Acknowledge that people’s attitudes to hazard risks are influenced by other factors, such as cultural traditions or the need to maintain insecure livelihoods.
  14. Provide interactive communication and pathways for dialogue, questions and requests for further information.
  15. Reinforce the message over time, and add new information and ideas as part of an overall strategy.
  16. Evaluate your work and share the findings with others.
Based on R. Steen, A Guide to Information Preparedness (Oslo: Directorate for Civil Defence and Emergency Planning, 2000); B. Rohrmann, ‘Effective Risk Communication for Fire Preparedness: A Conceptual Framework’, Australian Journal of Emergency Management, 10(3), 1995, pp. 42–46; and S. Nathe et al., Public Education for Earthquake Hazards (Boulder, CO: University of Colorado, 1999), http://www.colorado.edu/hazards/publications/informer/infrmr2/informer2.pdf.