Measuring impact of involvement

We asked researchers what they thought about trying to measure the impact of involvement. This covered three main areas: what people knew about the current evidence base for involving people; whether they felt more measurement of impact was necessary; and, if so, how this might be done.

The current evidence base

Generally researchers agreed the evidence base for involvement was not very strong and needed improving. Some said they did not really know enough about the evidence base to comment.

Narinder is not aware of any 'hard evidence' to support involvement but thinks it's essential.

Age at interview 64

Gender Male

It’s understandable that people are sceptical about PPI when the evidence is so weak. But Rebecca argues it’s only by doing it and reflecting on it that we can build the evidence.

Age at interview 31

Gender Female

Our sample included some researchers with a particular research interest in patient and public involvement who had more detailed knowledge of the available evidence, or lack of it. Some mentioned recent studies which are attempting to improve the reporting and measurement of impact, though it is still too early to see how this will work out in practice.

There isn’t enough evidence for involvement, and mostly it’s rose-tinted case studies. We need better reporting of involvement, but also more clarity on what ‘impact’ means.

Age at interview 32

Gender Female

Evaluation is often done poorly and delegated to a junior person. Sabi describes the Public Involvement Impact Assessment Framework (PiiAF) for improving evaluation.

Age at interview 50

Gender Female

Vanessa describes a study which showed data collected by a mental health user researcher was no different to data collected by other researchers.

Age at interview 42

Gender Female

Do we need to measure impact?

Views on the need to find ways to measure impact were mixed. One view was that this is vital to convince funders and colleagues it is worth doing, and to make sure we understand how to do it better. Bernadette felt she herself needed better evidence. Sarah, Andy, Jo and Pam all suggested that PPI partners themselves might also want to know they are making a difference. Hayley also pointed out that people will understand not everything they say can be used.

Many researchers will not take involvement seriously unless they can see some convincing evidence for it.

Age at interview 52

Gender Male

Hayley describes how young people and researchers assess the impact of involvement. Young people understand not every idea can be used but appreciate it if researchers are honest about this.

Age at interview 30

Gender Female

Bernadette is unconvinced that involvement makes a difference and wants more data which she is happy to help generate.

Age at interview 39

Gender Female

But it could also be argued that measuring impact is inappropriate or unnecessary. Clinical and quantitative researchers such as Carl, Sergio and John, who recognised people might assume they wanted randomised controlled trial (RCT) evidence, actually felt involvement was just a matter of common sense and something you would do anyway, regardless of evidence. Sergio commented: ‘Sometimes we do not really need data to figure out whether something is worth its while or not, I think that there are enough logical arguments there to allow us to sustain that it’s a positive thing.’ Jim and Ann wondered why we single out patients or members of the public when we don’t evaluate the impact of any other member of the research team.

Carl argues some things don’t need trial evidence. Involving patients in research is just good sense.

Age at interview 46

Gender Male

It would not be difficult to show involvement makes a difference, but it seems unfair to measure the impact of patients and not other members of the research team.

Age at interview 52

Gender Male

While people were conscious that there was a lot of pressure to produce evidence of impact, they also felt a contrasting pressure not to question the value of PPI, and a sense that any critique or evidence that it did not ‘work’ or even caused harm would be unwelcome. Felix insisted it was important to be honest about negative impacts as well as positive. Ann commented that criticism is rare and that ‘There are people desperately trying to prove the impact because they want to prove that it has a positive impact.’ Alison added: ‘I think there is a touch of Emperor’s new clothes going on, that this is so valuable, important and useful… People don’t really question it now.’

Alison says she still has to constantly remind herself about involvement. It’s high on funders’ agenda but she is not always sure it’s as valuable as everyone says.

Age at interview 47

Gender Female

We need better evidence, but researchers don’t feel able to voice any negative views or bad experiences of involvement.

Age at interview 32

Gender Female

There was also a fear that some people would remain sceptical no matter what evidence was produced, unless they personally experienced the impact of involvement. As one person commented, ‘I work a lot with surgeons and surgeons are very firm in their opinion as to what is right… I’m not convinced that any evidence presented to them would make a huge difference. It would take a research project that they’d tried to do and failed to do to be then inputted by PPI to succeed to make a difference to them, I think, and that is a big challenge. I can be as enthusiastic as anything, but I’m not going to convince a true sceptic.’ Others echoed the idea that a positive and enriching experience of involvement at first-hand was an effective route to help people see its value.

How to measure impact

Regardless of whether researchers felt there was a need for better evidence of impact, there was recognition that actually getting such evidence is not easy. The lack of agreement about what we mean by either ‘involvement’ or ‘impact’ remains a problem for trying to come up with suitable measures. Felix and Andy suggested it was important to clarify what impacts you were expecting at the start of the project, and Jo suggested involving PPI advisers themselves in defining what impacts might be reasonable. Chris recommended keeping track of possible impacts during the study rather than trying to do it retrospectively. As Sarah A commented, ‘no one really measures or reports PPI as it goes along, it just kind of happens. So if you’re not reporting it, how can you ever demonstrate the impact it has?’

Some PPI researchers are working on clearer standards for reporting involvement. But Sabi suggests randomised trials are impractical and we will never get clear quantitative evidence of impact.

Age at interview 50

Gender Female

Some new evidence is coming out but Hayley feels the focus is still too much on whether people are involved rather than how they are involved and what difference it makes.

Age at interview 30

Gender Female

Chris wants more evaluation of which methods of involvement are best. Keeping a record as you go will help.

Age at interview 48

Gender Male

Andy argues that unless you are clear what you expect from PPI, you won’t do it well or be able to identify impacts, so it will appear to have failed.

Age at interview 49

Gender Male

Felix suggests that the most important impacts are on people and relationships. Making changes to a specific piece of research is secondary.

Age at interview 36

Gender Male

There was disagreement about whether it was better to stick to measuring processes of involvement (such as the number and diversity of people involved) rather than trying to show a difference to the outcome or progress of the research itself. Pam suggested formative evaluation to help improve current PPI practice might be more important than trying to prove whether it ‘works’ or not. Further difficulties were deciding the time point at which to measure, and tracing impact through what might be many years of an evolving research idea. Sometimes involvement may contribute to a change in culture that assists and enables other changes which are never directly attributed, or attributable, to that initial involvement.

Involvement might not show an immediate difference but its effect could resurface later. Deciding when to measure is a problem.

Age at interview 31

Gender Female

John says it’s important to have a good PPI process but you could waste a huge amount of money on proving any effect. Like research, PPI is a long-term and complex process.

Age at interview 59

Gender Male

Tina supports the need for better evidence of impact, but suggests replacing the word ‘measure’ with ‘demonstrate’.

Age at interview 56

Gender Female

There was little support for the idea that conducting RCTs to establish the effectiveness of PPI was realistic or desirable, although a few people felt it was possible, and there are a growing number of examples. Pam argued that ‘PPI is not a thing, it’s a set of complex relationships’ and that developing a defined and measurable intervention was always going to be challenging. Both Jim and Suzanne suggested it has to be thought of as a ‘complex intervention’ with lots of interconnecting components.

You could compare two studies, one with patient involvement and one without, and measure differences in outcome.

Age at interview 64

Gender Male

Alice’s instinct is always to want to measure, but she feels you can’t really do a parallel project with no PPI. She is unsure we should even try.

Age at interview 26

Gender Female

Pam is sceptical about impact measures and how to disentangle cause and effect. Where PPI advisers agree with researchers it may look as if they made no difference.

Age at interview 54

Gender Female

Like Pam, most people thought qualitative approaches to trying to describe impact were likely to be more practical, but this could take a lot of time, reflection and resources. Retrospective analysis of funding and ethics applications or published studies was suggested, to compare levels of PPI and various outcomes (such as funding success, speed of approval or recruitment rates). But again this relies on a clear agreement on what ‘PPI’ means and good reporting of PPI activity.

Pam also drew attention to the fact that the impact debate tends to assume that, to have value, involvement must change things, whereas sometimes patients may agree with researchers or validate the research design, and this is also useful. John made a similar point: ‘I think that endorsement is a good thing to have. You say, “I got this idea for some research” and the patient says, “That’s a great idea”.’

Skills needed for involvement

We discussed with researchers what skills they felt they needed to work effectively with patients and members of the public. Ceri and Eric summarised many...