Realising there are people behind the data…

Another important impact of involvement?

I just read ‘How to Survive a Plague’, an incredible book about the AIDS activists in America – which I’d highly recommend. There was one story that really stood out for me…

After three years’ campaigning and enormous personal investment in developing their understanding of the science and the condition, the AIDS activists had finally won a seat at the table and were working closely with a big pharmaceutical company running the first successful clinical trial. The new drug was really expensive and difficult to manufacture, so it was only available to the small number of trial participants. One of the activists involved in the trial had a partner who was dying of AIDS. He also happened to be quite rich, so he hired a chemist to synthesise the new drug in a backroom laboratory. This chemist went to all the conferences, read all the scientific papers, and managed to work out seven of the eight steps involved in the drug’s manufacture – but he was stumped by the very last step in the process.

At around this time, the AIDS activist went to one of his regular meetings with the lead pharma researcher working on the trial. The researcher announced there was a problem: the company had worked out they had a spy and were worried about their competitors. The activist fessed up – he was the one doing the spying! He just wanted to make enough of the drug to keep his partner alive until the treatment became more widely available. He burst into tears as he explained how his chemist was stuck.

What happened next was amazing. The pharma guy gave the activist all the information he needed, so his hired chemist could finish the job! In later interviews, the pharma researcher explained it was the first time he had really understood ‘there were people behind the data’ and the significance of his research for their lives.

This phrase ‘realising there are people behind the data’ is one I keep hearing from researchers when I talk to them about their experiences of involvement. It seems to be a profound and deeply affecting impact of involvement. Thankfully it doesn’t always have to be as dramatic an experience as the story above. But based on some recent work at Parkinson’s UK, it does seem important that researchers get to meet patients and carers in person.

This makes me think of examples of involvement where panels of patients and carers are sent research protocols and patient information sheets for comment, but don’t get to actually meet the researchers – it’s just an exchange of documents. While this can no doubt achieve a great deal, it seems to me to limit the impact of involvement to practical outputs – such as better recruitment. By way of contrast, when researchers meet patients and carers face-to-face (especially for the first time), it seems they connect with the real-life significance of their research, which can have a profound impact on their attitudes and values. Although harder to describe, these softer kinds of impacts are extremely important – they might be exactly what’s needed to bring about culture change and a shift in researchers’ thinking.

What then are the lessons for practice? To me it suggests that involvement is as much about the human interaction between researchers and patients/carers as it is about the task in front of them. It’s about the people as much as the data. So how can we better support the kind of involvement that leads to this more profound and far-reaching change?

There’s no purpose to involvement…

… only principles and practice.

Last year I was asked to help a big research organisation to develop their payment policy for involvement. So I did what I usually do and asked other big research organisations to share their existing policies on involvement, to learn from what they do and avoid reinventing the wheel.

I looked at six existing policies. What was fascinating was they were all pretty much the same. You could insert any [name of organisation] in all the relevant places and it wouldn’t really have changed much. They all started with a statement of their commitment to the principles of involvement – the rights of people to be involved and how this would improve the relevance of their work. Then they all went on to describe excellent practice in terms of how their recruitment processes were fair and transparent, how they trained and supported the people they involve, and the policy details of whether and how they reimbursed expenses and paid for people’s time.

But these organisations were so different! I won’t name names – but some were funders of research, some were concerned with the ethics of research, some were charities providing services, some were charities focused on funding research and campaigning, and some were organisations using the results of research to make decisions about health policy. So these organisations make very different types of decisions. This means the precise way that patient or public involvement adds value to their work is likely to be different in every case.

The nature of any decision being made determines why you need to involve people, what you need them to do, as well as precisely who you need to involve¹. It’s important to bring in people with the necessary perspective and/or experiential knowledge to usefully inform and influence the outcome.

So I think paying attention to the purpose of involvement, and being clear how this relates to the specific decisions being made, could help organisations to be clearer about what they need to do and why. This won’t be about doing what everyone else has done, but focusing on what’s different.

  1. Fredriksson & Tritter (2016) Disentangling patient and public involvement in healthcare decisions: Why the difference matters. Sociology of Health and Illness (in press).

Facilitation, facilitation…

… facilitation’s what you need.

Earlier this year, I evaluated a pilot of patient and public involvement (PPI) in research at Parkinson’s UK. The final report is out today.

Parkinson’s UK had been finding patients and carers to get involved in research and passing their names to researchers – but they wanted to make sure that any involvement that took place was good quality. So they decided to invest in providing support to researchers and Parkinson’s UK volunteers, to help them develop effective working partnerships – to facilitate the involvement.

Involvement in essence means bringing together two groups of people, who don’t speak the same language, who aren’t always clear about what they’re meant to be doing, who may have different expectations about outcomes, and are learning new ways of working – all at the same time. No easy task. Facilitation helps smooth this process.

So what does this facilitation need to look like? If you think about involvement as a conversation – then an essential first step is preparing people to have the conversation, helping them to understand what the conversation needs to be about. Parkinson’s UK staff did this by providing training to patients and carers, and advice to the researchers, before the two parties met. This meant the patients and carers were clear about their role and the researchers were clear what questions they wanted to ask. The staff also provided a translation service, translating science to plain English and back again.

During the subsequent meeting between patients, carers and the researchers, Parkinson’s UK staff helped to keep the conversation flowing. They kept everyone focused on the task, made sure everyone had the same understanding of the discussions and ensured all participants had their say. Because this first meeting worked so well for everyone and they all learnt so much from each other, the researchers, patients and carers were keen to continue working together (where opportunity allowed). It proved to be the start of beautiful working relationships!

The big lesson, then, is that facilitation is one of the key ingredients of high-quality involvement. Parkinson’s UK staff did all this work with great skill – but they hadn’t really recognised this until after the evaluation. The role of the facilitator is all too easily overlooked. Perhaps it’s time to look at facilitation in more depth: what makes a good facilitator, and what can they do that’s most helpful?

Insight, not data

What researchers gain from patients’ experiential knowledge

Researchers often learn something new from involvement. From hearing patients’ stories, researchers gain insights* that influence their thinking and therefore change their research. Researchers experience involvement in a very different way to ‘analysing data’ or ‘making sense of evidence’. This is why involvement isn’t about researching the patients’ experience.

Insight can be described as ‘an instance of understanding the true nature of a thing’ or the ‘power of seeing into or understanding a situation’. I came across this quote about insight from an online entrepreneur, which seemed to describe what researchers often report as their experience of working with patients.

‘Every now and then, I’ll come across a golden nugget of wisdom and it instantly opens my eyes to a very simple, but powerful new point of view.’

Making use of another person’s experiential knowledge or wisdom is an act of learning. It affects each person in different ways – depending on what he/she still has to learn.

What does this mean for involvement? To me it suggests there’s much to be done around preparation. We need to support patients to identify and share the most relevant aspects of their experience, i.e. to help them identify the golden nuggets in their wisdom – and to prepare researchers to experience new insights, rather than thinking they can get there by analysing patient data.

*Thanks to Derek Stewart for sharing his insight on insight!

It matters who you involve

Whose experience is most valuable?

I’d like to suggest that to get the most out of involvement, we need to involve people with direct experience of whatever condition is being studied. If you’re studying substance misuse amongst homeless people, you need to talk to people who’ve had that exact experience – not just any mental health service user will do…

I’m basing this conclusion on a recent evaluation of the work of the FAST-R panel – a panel of mental health service users and carers who review patient information sheets, protocols and questionnaires for mental health researchers. (A journal article about the evaluation has just been published.) The evaluation involved reading every single comment the Panel had made on 85 studies over a period of three years.

Their comments fell into three different categories. The first category included comments about making the information clear and easy to understand. These were all about rewriting the documents in plain English and changing the format. Any lay person might have done this – some might even argue that a science journalist could have done it!

The second category included comments that I’m going to call general patient/carer comments. These related to issues that many patients would know about – for example, checking that researchers had enough money in their travel budget for participants to take taxis when needed. In this particular example, Panel members also reminded researchers of the need to be sensitive to service users’ concerns about stigma and discrimination – issues that your average member of the public might not have picked up, but ones that people with a range of mental health problems would know about.

The third and final category included what I’m going to call patient/carer expert comments. These were comments based on the unique insights of people with experience of a specific health problem. One example came from a review of a study of brain activity in people with schizophrenia. The information sheet explained that music would be played while people were in the MRI scanner. One of the Panel members with schizophrenia commented that if he were experiencing paranoia, he’d need to know exactly which piece of music was going to be played ahead of time.

So I’m concluding from this evaluation that if you want comments at all three levels you need to involve people with the right kind of experiential knowledge. Is this happening? I’m not sure. I’ve noticed that lots of panels and groups are being set up to support involvement across a wide range of research studies, and I’m wondering if sufficient attention is paid to matching people’s experience to the projects they’re asked to comment on.

Some people might say this doesn’t matter. If such panels are making the information clearer and participation in research easier, then that’s already a great improvement on what’s gone before. But I think those patient/carer expert insights are like the icing on the cake – the detail that might make all the difference.

Understanding the differences between these contributions is, I suggest, crucial to understanding the purpose of involvement. It’s not only about making research lay-friendly – it’s about making research relevant and acceptable to specific groups of patients. We may need to think more carefully about whose experiential knowledge is going to be most valuable in any particular study – so we can be sure to involve the people with the most relevant experience.

The power of the anecdote

Why patients’ stories work

I just googled ‘the power of the anecdote’ and two sites came up that exactly illustrate the problems we have with this in public involvement in research.

The first site, Ben Goldacre’s Bad Science, talks about how anecdotal reports of the effects of treatments can be potentially misleading, while clinical trials provide the best estimate of the true benefits of a drug. Of course this is right – it’s the reason we do research and why we support the development of evidence-based medicine.

However, when the patient perspective is brought into the research world, some researchers want to apply the same rules. They may dismiss patients’ stories because they’re not good ‘evidence’. It seems researchers are not quite sure how to use the experiential knowledge that patients provide.

This is where the lessons from the second site ‘Presentation Pointers’ come in. This site encourages the use of anecdotes in presentations because they say anecdotes are one of ‘the most powerful communications tools ever discovered’. I think this describes the true value of patients’ stories. They have the power to communicate and therefore the power to challenge researchers’ assumptions.

I can tell you a great story to illustrate this point. I recently interviewed a researcher who told me how a patient’s anecdote had had a dramatic impact on a NICE committee evaluating new treatments. This committee was reviewing two forms of insulin for the treatment of diabetes. On paper, the clinical data suggested that both forms were equally good at reducing blood sugar, but the newer version cost more money – suggesting it wasn’t any more cost-effective. However, a diabetes patient who was at the meeting alerted the committee to an important difference between the two forms – a difference they weren’t aware of. He explained that the older, cheaper version was more likely to result in hypoglycaemic attacks, and he said: ‘Sometimes I don’t take my insulin at night, because I’m afraid I might not wake up in the morning.’ This statement challenged the committee’s assumptions about the benefits. It sparked a ‘lightbulb moment’ in a way that a report describing patients’ ‘non-adherence to treatment’ might not have done.

I think patients’ stories work precisely because they’re anecdotal. If we try to turn them into evidence – by researching patients’ views and producing technical reports – we are in danger of losing their impact and value. We need stories in the patients’ own words, and they are probably best spoken by patients.

Anything else is disempowerment.

We’re just evidence-base junkies…

…looking to score!

I keep reading reports saying we need to improve the evidence base for involvement – but what does that mean… is it even possible?

It’s almost become a bit of a habit. We’re so used to the culture of evidence-based medicine that it seems we feel the need to develop an evidence base for everything! And the only evidence that counts, of course, is statistical data…

I understand how such an evidence base helps with making decisions about healthcare. People at all levels – from the individual patient through to NHS organisations and policymakers – want reliable statistical data to support them in making ‘the best decision’. But even then, other factors will come into play.

So I think we’re mistaken if we expect an evidence base for public involvement to give us the same kind of predictive information. I think people hope that such ‘robust data’ would tell us with certainty which projects will benefit from involvement, which approach will be most useful, and how best to do it.

But involvement doesn’t work like a health intervention. It is possible to quantify and measure its impact, but for all the reasons I describe in a recent journal article, the findings from such scientific approaches may not be generalisable. Context is everything with involvement – so what you learn in one context through a carefully constructed randomised controlled trial might not then apply to other settings.

I’m suggesting it’s time to go cold turkey. Let’s stop worrying about the evidence base. Let’s stop thinking we have to do an RCT to prove every aspect of how involvement works. I think there are other, more useful ways to learn about involvement – ways based on gaining wisdom and insight through experience. Does that make it an art rather than a science?