In practice: New approaches to evaluation for dynamically managed delivery in Manchester

Fresh evaluation approaches identify the importance of specific innovations

One of the most successful programmes to tackle long-term rough sleeping in the UK took place in Greater Manchester (see chapter 8). This programme implemented many of the lessons featured in this report, such as peer mentors with lived experience (see chapter 2), a personalised, asset-based approach (see chapter 5), and a genuinely different relationship between “commissioner” and “supplier” (see chapter 4). The combination of these elements has led to a substantial reduction in this seemingly intractable form of homelessness.

This project was clearly a significant breakthrough for Greater Manchester. However, it would have an even greater impact if we could use its lessons to deliver similar results on a much larger scale. To that end, it was important to tease out the individual components of operational management and delivery that really mattered, so they could be shared and implemented elsewhere. To achieve this, a different approach to evaluation was needed.

“It was important to tease out the individual components of operational management and delivery that really mattered, so they could be shared and implemented elsewhere”

What really worked in Manchester?

The first priority was to understand the key design features and the delivery pilots that were trialled. This was the focus of the excellent evaluation produced by the Greater Manchester Combined Authority, which listed the elements found to be important, and made recommendations for future homelessness policy in the region.

But in order to broaden the learning, it was then necessary to dig further into the relative impact of each change. Which design features and delivery pilots were the game-changers that reduced rough sleeping?

Was it, for example, the commitment of housing agencies to avoid evicting someone if a tenancy went wrong?

Or did directly employing a qualified mental health professional make the real difference?

New evaluation techniques tease out causality

Chris Fox, Professor of Evaluation and Policy Analysis at Manchester Metropolitan University, is examining the Greater Manchester homelessness programme. Among his evaluation tools are “Qualitative Comparative Analysis” (QCA) and “Process Tracing”.

“QCA looks in depth at a number of cases to find causal patterns,” explains Professor Fox. “It compares different combinations of interventions and the outcomes. This helps us to work out which combinations lead to what outcomes. It helps to show which conditions are essential to produce certain outcomes.

“This approach recognises that different interventions can produce the same outcome. So, for example, providing a ‘managed move’, instead of eviction, can keep someone off the streets. Likewise, preventing someone being jailed for a minor offence might also avoid a return to rough sleeping.

“Additionally, this approach recognises that a single intervention can produce different outcomes, depending on the circumstances. So, for example, receiving personalised mental health support in your own home might turn around one person’s life. But it might not be a game-changer for someone else.”

“QCA helps us to work out which combinations of interventions lead to what outcomes, [and] which conditions are essential to produce certain outcomes”
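The core logic of crisp-set QCA can be illustrated with a small sketch. The data below are entirely invented for illustration (the intervention names merely echo examples from the text, and the case values are made up, not findings from the Greater Manchester evaluation); the point is the mechanic QCA rests on: checking which conditions are present in every successful case (necessity) and which combinations of conditions always coincide with success (sufficiency).

```python
from itertools import combinations

# Invented toy data: each case records which interventions were present (1)
# or absent (0), and whether the person stayed off the streets (outcome).
cases = [
    {"managed_move": 1, "mh_support": 1, "peer_mentor": 1, "outcome": 1},
    {"managed_move": 1, "mh_support": 0, "peer_mentor": 1, "outcome": 1},
    {"managed_move": 0, "mh_support": 1, "peer_mentor": 1, "outcome": 1},
    {"managed_move": 0, "mh_support": 0, "peer_mentor": 1, "outcome": 0},
    {"managed_move": 0, "mh_support": 0, "peer_mentor": 0, "outcome": 0},
]

conditions = ["managed_move", "mh_support", "peer_mentor"]

def necessary(cond):
    """A condition is necessary if it appears in every positive-outcome case."""
    return all(c[cond] == 1 for c in cases if c["outcome"] == 1)

def sufficient(combo):
    """A combination is sufficient if every case exhibiting it has the outcome."""
    matching = [c for c in cases if all(c[k] == 1 for k in combo)]
    return bool(matching) and all(c["outcome"] == 1 for c in matching)

print("Necessary conditions:",
      [c for c in conditions if necessary(c)])
print("Sufficient combinations:",
      [combo for r in range(1, len(conditions) + 1)
       for combo in combinations(conditions, r) if sufficient(combo)])
```

On this toy data, more than one combination comes out as sufficient, which is exactly the “different interventions can produce the same outcome” point Professor Fox makes: QCA is built to detect multiple distinct routes to the same result rather than a single average effect.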

Process tracing

A second form of analysis tries to uncover any links between possible causes and outcomes. It does this by studying how causal mechanisms might work. This is called “process tracing”, explains Professor Fox. “You start by suggesting some possible causes. Then you identify what you would expect to see if these causes really do exist. Next, you identify pieces of evidence which you would expect to find at each step of a causal chain, if a hypothetical explanation is true. Practical evidence is then gathered to overturn or support rival hypothetical explanations.

“Process tracing helps to establish whether the actual mechanisms at work fit with those that you might have predicted. Assessing each hypothesis alongside the available evidence can help you to understand what causes a given set of outcomes in any given case.”
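The evidence-weighing step described above, gathering evidence to support or overturn rival explanations, is sometimes formalised as Bayesian updating. The sketch below uses invented hypotheses, priors, and likelihoods (none of these numbers come from the evaluation); it shows only the mechanic: each piece of evidence reweights the rival hypotheses according to how strongly each one predicted that evidence.

```python
# Invented illustration: two rival hypotheses about why rough sleeping fell,
# with made-up prior beliefs and made-up likelihoods P(evidence | hypothesis).
priors = {"H1_managed_moves": 0.5, "H2_mh_support": 0.5}

likelihoods = {
    "case_files_show_averted_evictions": {"H1_managed_moves": 0.8,
                                          "H2_mh_support": 0.3},
    "tenancies_fail_without_mh_worker":  {"H1_managed_moves": 0.2,
                                          "H2_mh_support": 0.7},
}

def update(beliefs, evidence):
    """One step of Bayes' rule: reweight each hypothesis by how well it
    predicted the observed evidence, then renormalise to sum to 1."""
    unnorm = {h: p * likelihoods[evidence][h] for h, p in beliefs.items()}
    total = sum(unnorm.values())
    return {h: v / total for h, v in unnorm.items()}

beliefs = priors
for found in ["case_files_show_averted_evictions",
              "tenancies_fail_without_mh_worker"]:
    beliefs = update(beliefs, found)

print(beliefs)
```

Each observation shifts confidence between the rival explanations rather than proving one outright, which mirrors the qualitative judgement process tracing demands: evidence that a hypothesis strongly predicted counts heavily in its favour, while evidence it predicted poorly counts against it.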

Sharing the learning

The use of these sophisticated evaluation tools (by Professor Fox and others) should help us to better understand the key factors behind the programme’s success.

“Our hope is that this approach to evaluation will make Greater Manchester’s success more useful to other parts of the country, because we can be more confident about what really matters,” says Professor Fox. “It should also encourage greater application of these innovative evaluation tools to other programmes, and therefore greater devolution of design authority to where it is most effective. When we have a concrete understanding of the key components that are at work, leadership teams in other projects will be able to benefit from this, and central government may become more confident in trusting these teams to work out how best to help people locally.”