Tomorrow, I'm presenting to a room that will remind me why I started.

There is a particular kind of preparation that happens before you present to a patient audience.

It is not the same as preparing for pharma. It is not the same as preparing for investors, or regulators, or a CRO operations team. I know those rooms well. I know how to read them, how to pitch into them, how to land a point that shifts thinking without triggering defensiveness.

A room full of people with lived experience of a condition requires something different. Not less preparation. Different preparation.

On Tuesday I'm giving a keynote at the Make 2nds Count Secondary Breast Cancer Patient Summit in Liverpool, alongside Pfizer UK's Medical Affairs Oncology Team. Secondary breast cancer, also known as metastatic or stage four breast cancer, is incurable, and it affects tens of thousands of people in the UK alone. Many of them will be in that room.

When you're presenting to industry, your job is to shift thinking. When you're presenting to people who are living the thing you're talking about, your job is not to educate. It is to empower.

That distinction sounds simple. It is not. It changes the structure of what you say, the order in which you say it, and — more than anything — how much space you leave for what comes back.

I'll tell you what the room teaches me. Next week.

I'll be in Boston later this month… shall we meet?

From March 19th to 26th I'll be in Boston for Patients as Partners, one of the most important gatherings in the patient engagement and clinical research space.

If you're attending and want to connect, I'd genuinely like to meet. Reply to this newsletter or drop me a message on LinkedIn. I'll be there all week and I'm keeping the diary open for conversations that matter.

Most clinical trials don't fail at patient recruitment. They fail at protocol design.

80% of clinical trials miss their enrolment timelines. In most cases, the cause is traceable. And preventable.

When a protocol's eligibility criteria, visit schedule, or site geography silently excludes underserved populations, clinical teams don't find out until recruitment stalls. By then, the protocol is fixed, the sites are contracted, and the budget is committed.

I wrote How to See Patient Recruitment Failure Before It Happens for VP and Director-level clinical research and operations leaders who want to identify health equity risk before it becomes a delay — not after.

It covers the structural blind spot in EHR, claims, and historic trial data that no amount of data quality work fixes; what lived experience data actually is, and how it closes that gap before protocol sign-off; and two real-world trial scenarios that show the difference it makes.

If you're working in clinical development or trial operations, this is for you. Download it below, and if it's useful, send it to one person who's currently designing a protocol.

This week's deep dive: community engagement and patient advisory boards are not the same thing

Earlier this week I posted something blunt on LinkedIn:

A patient advisory board is eight people, selected because they're articulate and comfortable in a boardroom, meeting quarterly to review materials that have already been written. That's governance theatre.

The response told me this landed on something real. Twenty-four comments. A thread that ran for two days. People who clearly had things stored up to say.

I want to use this week's dispatch to go deeper. Because the comments surfaced a nuance I don't think I landed clearly enough in the original post.

I am not saying patient advisory boards are useless.

Some of the most rigorous patient input I have witnessed has come from well-run advisory boards. There are examples, some of which I have seen and some of which I have run myself, where board members made substantive changes to study designs, challenged assumptions that would otherwise have gone untested, and held organisations to account in ways that genuinely mattered.

So the format is not the problem.

The problem is that advisory boards and community engagement are being used interchangeably. And they are not the same thing.

A patient advisory board is a governance mechanism. It works when the people on it have the standing, the information, and the timing to influence decisions that haven't yet been made. When those conditions are met, it can be powerful.

Community engagement is something else entirely. It is not a meeting. It is a relationship built over time in spaces where people have every reason not to trust you. It requires showing up before you need anything. It requires taking scrutiny, coming back, and demonstrating, not asserting, that the input will move something.

The failure mode I was describing in the post is when organisations use the first to claim they've done the second. When a quarterly boardroom review of pre-written materials gets presented, internally and externally, as evidence of community voice.

Communities are not fooled by this. They have spent decades learning to distinguish between consultation that changes outcomes and consultation that produces a report that gets filed.

And here is where the infrastructure problem re-enters.

The reason genuine community engagement is so rarely operationalised isn't primarily a question of intent. Most of the people running these programmes genuinely care. The reason is that there is no structured mechanism for turning what communities say into something that changes a protocol, a site selection decision, or a market access submission.

The input gets summarised qualitatively. Filed. The decisions get made the same way they would have been made without it.

That is not a people problem. It is a data infrastructure problem.

You cannot move power by holding more meetings. You move power by building systems where the data those meetings generate can sit alongside clinical and operational data and actually influence decisions.

That is what Unwritten Health is built to do. Not to replace advisory boards. Not to critique the people running them. But to build the infrastructure that means participation is worth the time of the people being asked to give it.

I started writing this because I couldn't find the newsletter I needed. If you're finding it useful, tell someone. That's how this grows.

- Ashish

This week in data

Only 26% of European vaccine trials conducted between 2010 and 2020 reported the ethnicity of their participants. In the UK, the figure was 17%. You cannot fix what you do not measure. And you cannot measure what you never designed a system to capture.

Puttick NF, Vanderslott S, Tanner R. Representation of ethnic and racial minority groups in European vaccine trials: a quantitative analysis of clinical trials registries. BMJ Public Health. 2023;1:e000042.
