The Most Dangerous Words in a Research Brief: “We Just Need Direction”

After about 25 years in this field — the journey from brown hair to gray — I’ve learned that the most dangerous words in a research brief aren’t technical. They’re not about sample size, methodology, or timelines.

They’re simple. And they’re almost always said with the best of intentions:

“We just need direction.”

On the surface, it sounds reasonable. Teams are under pressure. Timelines are tight. There are multiple stakeholders involved, each hoping the research will help move things forward. But over time, I’ve learned that when those words show up, it’s a signal to slow down, not speed up.

Not because the project is in trouble — but because “direction” almost always means different things to different people in the room.

Most research projects don’t start with a single decision-maker. They start with a group. Brand, marketing, communications, leadership — all smart, capable people, each bringing slightly different priorities and expectations to the table.

Everyone wants the research to help. Everyone also tends to want it to answer their version of the question.

When those differences stay unspoken, they don’t disappear. They wait.

One of the most common places this shows up is during the final presentation. We’re walking through results, connecting insights to recommendations, and someone asks:

“Why didn’t we ask about X?”

What makes that moment tricky is that X usually isn’t a surprise topic. The client was involved throughout the process — from goal setting to questionnaire review to final approval. But when objectives stay high-level, important angles can remain implicit. They feel understood, but never fully articulated, which means no one realizes they’re missing until the results are already on the table.

That’s not a memory problem. It’s an objective-setting problem.

Another version of this shows up when we deliver recommendations that are thoughtful, supported, and genuinely useful — only to learn that something we tested was never actually on the table.

Maybe it was politically sensitive. Maybe leadership had already ruled it out. Maybe it simply wasn’t realistic from the start.

None of that is unusual. What is problematic is discovering those constraints after the research is complete.

I’ve also seen this play out in more concrete ways — like a study we conducted last year where nearly half the data turned out to be irrelevant. Not because the research was flawed, but because the target audience had never been checked against real-world constraints. The client simply wasn’t allowed to operate in a particular geography, a detail that never surfaced during goal setting.

When objectives don’t fully account for who you can — and can’t — reach, even good data loses its value.

In my experience, this isn’t about carelessness. It’s about complexity.

Multiple stakeholders mean multiple definitions of what “direction” should look like. One person wants validation. Another wants permission to change course. Someone else is focused on internal alignment or how the findings will land with leadership.

Those are all legitimate needs — but they’re not the same question.

When no one forces those differences into the open early, research quietly becomes a compromise. The brief broadens. The questions get safer. The findings become easier to interpret in multiple ways — and that’s when clarity starts to slip.

Ironically, teams often stay vague because they want flexibility. They don’t want to narrow too early or bias the outcome. That instinct is understandable. But flexibility without clarity doesn’t reduce risk. It just moves it later, when the stakes are higher and the options are fewer.

And this isn’t limited to junior teams or rushed projects. I’ve seen even the most seasoned executives struggle with objective setting, especially when the audience is large and the decisions are consequential. When leadership isn’t aligned, that ambiguity trickles down — and the research reflects it.

This is fixable. And it doesn’t require more process or longer briefs — just more intention up front.

A few things that consistently make a difference:

  • Direction should be tied to a decision. If it’s unclear what will change after the research, the objective probably needs tightening.
  • Constraints aren’t limitations — they’re context. If something is off the table, politically sensitive, or unrealistic, saying so early helps focus the work.
  • The target audience matters as much as the question. Clear goal setting should include a shared understanding of who the research is for — and just as importantly, who it isn’t.
  • Alignment matters more than coverage. Answering one hard question well is almost always more valuable than answering five safe ones halfway.
  • A little discomfort early is cheaper than confusion later. Slight tension during briefing often prevents much bigger frustration at the end.
  • Clear objectives don’t predict outcomes, and they don’t box teams in. They create a shared definition of success before the data starts talking.

After a couple of decades of watching research succeed — and fail — for the same reasons, I’ve come to believe this:

Clear direction is one of the most underappreciated acts of leadership in the research process.

Not because it makes projects easier, but because it makes the insights braver, the decisions sharper, and the results far more useful when it matters most.