Interpreting, sharing, reporting and responding to results

Introduction

After an evaluation has been conducted decisions need to be made about:

  • how results will be interpreted
  • how information will be reported and shared
  • what planning and decision-making processes the results will inform.

Ideally, these decisions should be considered as part of the development of an evaluation framework and should reflect the purpose and audience for the evaluation (Step 2). However, issues that arise during the conduct of the evaluation research may require changes to be made and any initial plans should be flexible to adapt to these changing circumstances.

Results interpretation

Interpretation refers to the process by which the meaning and significance of results is determined. This is not always a straightforward process. Often, different people will interpret the same information in different ways.

The most useful way of ensuring that the results of evaluations are interpreted in a meaningful and unbiased way is to bring together a reference group to discuss the findings and their meaning. This group should include key people who will need to respond to the information and people who participated in the evaluation. This not only provides a more accurate way of interpreting evaluation findings, it also provides an excellent learning opportunity for those involved.

The process of interpretation should also be informed by the information about the political, institutional, social and policy context of the community engagement activity that was collected in Step 1. This information is important for identifying external factors which may have influenced the results and for determining the significance of observed outcomes.

Reporting and sharing results

How the results from an evaluation need to be reported and shared will be determined by the purpose and audience for the evaluation (Step 1) and the project management framework within which the evaluation will occur, including:

  • the current systems for reporting
  • the current systems for on-going review of community engagement activities.

Evaluation findings can be reported and shared through a number of different mechanisms. These include:

  • internal or external government reporting mechanisms
  • internal or external research reports
  • academic or professional publications
  • presenting case studies through showcasing events
  • contributing case studies to the Get involved website
  • the media
  • presentations, workshops or seminars.

Examples of different information reporting and sharing options that might be suitable for different purposes and audiences are presented in Table 11 (over the page).

Reporting requirements in the Queensland Government

The Queensland Government’s commitment to improving community engagement means there is an obligation for agencies to report on community engagement achievements. Agencies give an account of engagement activities in their annual report, highlighting ways in which improved community engagement has delivered better outcomes for communities aligned with the State Government’s priorities.

Agencies also contribute to a community engagement annual report prepared by the Department of Communities.

The State Government’s commitment to delivering high quality services to the community has been consolidated through the Charter of Social and Fiscal Responsibility. The charter sets out how the State Government will report on the outcomes of its activities. The State Government also reports against the key priorities in its Priorities in Progress Report which provides an account of performance in line with social, fiscal, economic and environmental objectives. Increasing emphasis is being placed on highlighting the results of community engagement through these reporting processes.

Table 11. Examples of different information reporting and sharing options which might be suitable to different audiences

Purpose: Summative evaluation
Audience: People external or internal to the activity with an interest in monitoring the activity to ensure it is effective, efficient and worthwhile
Audience needs:
  • Evidence of performance that is objective, valid, reliable and quantifiable
  • Stories of success that illustrate the value of community engagement to government and communities
Potential ways of sharing information:
  • Reports
  • Media

Purpose: Formative evaluation
Audience: People with a direct interest in the activity and/or some control over its future, including program managers and participants
Audience needs:
  • Evidence of what is happening and why
  • Identification of opportunities for improvement
Potential ways of sharing information:
  • Reports
  • Workshops

Purpose: Evaluation research
Audience: People with a direct interest in the activity and other community engagement practitioners, experts and participants
Audience needs:
  • Lessons from the evaluation about what works, for whom, in what circumstances
Potential ways of sharing information:
  • Showcases
  • Seminars and presentations
  • Websites
  • Professional and academic publications

Principles to consider in sharing results

Whichever mechanisms for sharing results are chosen, a number of principles need to be considered.

Identify the needs and capabilities of the different audiences

  • It is important to identify what information each audience cares about most and balance that with what you think is important for each audience to know. Also consider different information formats that might be appropriate for different audience types.

Identify opportunities to discuss the results with key stakeholders

  • Many evaluations fail because they are used only to produce written reports that are not read by the right people. Consider opportunities to share information through a two-way mechanism in which results can be discussed and, in the case of formative evaluation, desirable and feasible changes identified.

Make sure results are reported in an accurate and unbiased manner

  • Most data are not neutral. Take care in the presentation of data so that all assumptions and value judgements are made explicit and data are presented in a comprehensive, rather than selective, way.
  • Present quantitative results with a clear indication of the reliability of the data.
  • Avoid over-generalising results. Ensure that reports specify to whom the results apply and the likely timeframe for which they hold true.
  • Avoid making value comparisons between situations. For example, avoid making a judgement that one activity is outperforming another when there may be intervening factors affecting the outcomes.
  • Avoid mistaking correlation for causality when there is not enough evidence to draw that conclusion.

Make reports user-friendly

  • Write reports which are easily accessible to those who need to implement changes. Be concise and use plain English with little jargon.
  • Present quantitative results with appropriate contextual statements to aid interpretation. Break up graphs and tables of numerical data with qualitative feedback in the form of stories and anecdotes that illustrate the points that the data are indicating.

Make sure results are reported in a timely manner

  • Provide information within a timeframe that is useful to decision-makers.

Make sure results are shared as widely as possible

  • Develop mechanisms to distil and share the lessons from evaluations of individual community engagement activities to guide the planning and implementation of future community engagement processes.

Consider the ethical and political sensitivities and risks attached to evaluation

  • Write reports that are sensitive to both the community and government parties involved. Take care before drawing conclusions about data that might be critical of a community, an individual’s actions or a program’s outcomes without thoroughly exploring the context.
  • Evaluations undertaken by government agencies can be politically sensitive. It is important to realise that evaluation results are vulnerable to misuse and misinterpretation. Make sure any reports provide clear guidance on the reliability and applicability (scope) of results and how they should be interpreted. Also remember that evaluations can raise expectations in the community or amongst program staff that change may happen. 

Responding to results

In general, evaluations should result in a list of findings and recommendations. These should be developed into improvement strategies and a response plan for implementing those strategies, which should detail:

  • the issue or problem to be addressed
  • the desirable changes
  • who is responsible for implementing the changes
  • the timeframe within which changes should be implemented.

Often this will involve a process of negotiation between key stakeholders and decision-makers.

If the improvement strategy is part of an on-going community engagement program (formative evaluation), a reasonable timeframe for implementation should be included. If the strategy is addressing changes to be implemented in future activities, these should be included in guidance materials and/or shared through training or showcasing events.

An example of a response plan format is a table with the following columns:

  • Issue
  • Desired changes
  • Responsibility
  • Time frame
Last reviewed
27 May 2011
Last updated
22 June 2011