A strong evidence base for an anti-bullying approach gives schools confidence that the time and effort required to implement an approach will result in positive and sustainable outcomes.

Why is evidence important?
Does it work and how do we know?
What makes research 'well-designed'?
Is it strong evidence?
Measurement of change
Information about the evidence base for the approach should be readily available
Prompt questions

Why is evidence important?

Schools want to know an approach they select will work. Evidence that an approach makes a difference reassures schools about the value of investing time and money, that the approach will not cause unintended negative outcomes, and that it will lead to a real and lasting positive impact for students.

Unfortunately, many approaches have limited or no evidence of their effectiveness. Of the research that has been conducted, some is not well-designed. Without evidence that an approach is effective, schools risk wasting time, money and energy.

Does it work and how do we know?

The first question often asked by educators about an approach is: 'Is there evidence this works?' An equally important second question is: 'How was that evidence generated?'

Approaches and programs designed to be implemented in schools should be researched in schools, and generate real-world measures to show the results.

Well-designed research relevant to school contexts is necessary so that you can be confident the evidence is strong and the claim that an approach is 'evidence-based' is valid. Only well-designed research studies can provide strong evidence that an approach really 'works'.

What makes research 'well-designed'?

'Well-designed' describes research that uses rigorous methods to ensure the findings are valid and not influenced by other factors.

Researchers have established a number of ways to provide the highest confidence that the findings are valid. Without the necessary rigour in the method of the research, it is possible for studies to suggest benefits and results that are not actually present.

Well-designed research provides protection from misleading results.

Interpreting research can be challenging for non-researchers. Read more about the features of high-quality research and how to gauge if the research evidence is strong enough to warrant your school's investment of time and money.

Is it strong evidence?

The term 'evidence-based' is sometimes used when not warranted. Not all information provided in support of an approach is strong evidence of real change.

The following types of information do provide strong evidence that an approach is effective:

  • well-designed research, published in peer-reviewed journals, that shows measured change in behaviour
  • information from well-designed independent evaluation/s that includes specific measurement
  • measurement of specific behaviour change taken at suitable times in appropriate ways.

The following types of information do not provide strong evidence that an approach is effective:

  • poorly-designed research published without review by academic peers
  • reports written by the developer published on their own website
  • satisfaction ratings and opinion surveys conducted in schools
  • stakeholder reports and expert testimonials
  • anecdotes from other schools (but see Practice about useful information from other schools)
  • advertising material.

As well as not actually providing evidence that an approach 'works', this latter type of information does not alert schools to any possible harmful or negative consequences.

If the only information is the developer's advertising, schools should be cautious. Although an approach may seem compelling and have a convincing 'feel good' factor, it is not possible to determine if it will have a sustained and positive impact without well-designed research and high-quality evidence.

Measurement of change

Positive opinion of staff, students or parents and carers is not sufficient information to be confident about the effectiveness of an approach. Evidence that an approach has made a difference requires measurement, taken on multiple occasions and over a sufficient period.

Schools need to be alert when the 'evidence' for an approach is limited to satisfaction ratings or opinions gathered through surveys or interviews. Believing that things are getting better and feeling positive about an approach are notoriously poor indicators that real change has occurred.

To be considered evidence-based, an approach needs to show specific measures of its effectiveness, not just strong theory or related research that suggests it will work (see Theory). Approaches therefore need to be described in specific detail so researchers can test them in the way they are intended to be used; this is known as 'implementation fidelity'.

Approaches that can be implemented with great variability are difficult to research, because what can be measured also varies. This makes it difficult to generate strong evidence, and any statements about effectiveness could be questionable.

Information about the evidence base for the approach should be readily available

The question for schools to ask developers when examining an approach is: 'What is the evidence and how strong is it?'

The evidence should be readily available, along with information about how it was gathered. If the developer does not readily provide this information in a manual or other guidelines, schools should proceed with caution.

To assist schools, some sites (see Where is the evidence?) provide reviews of the various approaches and programs which describe the content and use, as well as the state of evidence for the approach. These sites indicate if the research is well-designed and the strength of the evidence.

Prompt questions for Evidence

Key question: Does the approach have evidence from well-designed research which shows measurable change in behaviour?

To ask about the approach

  • What is the evidence for the approach and how has it been gathered?
  • Does the evidence consist of real change involving measurement of specific behaviours and outcomes?
  • How did the researchers ensure confidence that the research supports valid claims of effectiveness?
  • If the research evidence is limited, what plans exist for well-designed research into this approach?

To ask about your school

  • Will the school be able to implement the approach in a way similar to how it was conducted in research studies?
  • Are we alert to the use of persuasion or emotion-driven arguments rather than evidence to support this approach (e.g. is the information provided as 'evidence' actually opinion, promotion or advertising)?
  • If there is only limited research evidence, have we carefully considered the questions under Definition and Theory?

Use the STEPS form for schools (PDF, 651KB) to record your answers.