What gets lost in the process?
If I were to tell the real story of how my best research results came about when I was a molecular biologist, it would go something like this…
One day I made a big mistake in setting up my experiment and got some totally unexpected results. These were more interesting than what I was looking for in the first place. So I did the flawed experiment again, just to check the results were real, and got the same interesting findings. Then I went to talk to my boss to tell him what had happened. This wasn’t so shameful, as I could talk about the cool results as well as owning up to my mistake. Together we came up with ideas for different experiments to look more closely at the new findings. My teammates also gave me helpful feedback. When I presented these findings at a conference, other experts in the field gave me ideas for even more experiments, and one of the leaders in the field even offered me a job. Yet when it came to writing a journal article reporting this work, I didn’t mention any of this! Instead I summarised the previous literature on the topic, and made out there was a completely logical and well-thought-out rationale for everything I’d done….
I’m not saying all research is done this way! But now I recognise there was a whole host of people I talked to along the way – people who acted as a sounding board to check my thinking, gave me new ideas and helped me find solutions when things went wrong. I couldn’t have done it without them. Yet these people didn’t get a single mention in any of my publications. I wasn’t expected to write about them, and there wasn’t a standard way to do so. There still isn’t.
Current guidance on how to report involvement and its impact seems to fit this same pattern. In aiming to make the ‘findings objective and robust’, these reports leave out the researchers’ and the public’s experience and learning. I think this means vital information about how involvement works gets lost – it’s like banging a square peg into a round hole! Perhaps we need a new way to report the impact of involvement in research – not one that tries to describe it as a rational and objective process, but one that recognises the subjective and surprising experience of learning. We need to report the content of the conversations – what was said, what was learnt and by whom, and what difference this made to those people’s thinking and actions…. so it becomes a story of what happened to the people, rather than objective data.
Blog post #1: PPI. Learning it is. What can Yoda teach us about involvement in research?
Blog post #2: Researchers and the public as ‘thinking partners’. Why there’s no ‘method’ for involvement.
Blog post #3: Conflict as thinking. The challenge in working with different kinds of ‘thinking partners’.
Blog post #4: What is the purpose of involvement? To avoid bias in researchers’ thinking…