Be the second kind.
The other day I was meeting with an advisory client[1], talking with the CMO and a handful of go-to-market team members. We started to discuss marketing: topics like product positioning and the website. So I asked for some feedback.
I displayed the company’s website alongside a competitor’s and asked, “Which do you like better? And why?”
I asked for feedback on product positioning, too. Since the company works in a somewhat ill-defined category, it can credibly position either as an XYZ or a PDQ. I noted that the competitor chose PDQ while we had chosen XYZ, and again asked for feedback.
I knew darn well that the positioning had been extensively debated at the e-team and board level. I also knew the marketing team was strongly quantitative and did a lot of testing and measurement. But I just wanted to hear what the people had to say.
Because there were potential power distance issues[2], I wanted to make everyone feel more comfortable. So I said, “I’m just looking for your opinions; there are no wrong answers.”
Turns out there were.
The CMO jumped in, explaining why, despite the team’s initial feedback, our website was better: everything had been tested, and opinions didn’t matter, only conversion rates did.
The CMO continued, explaining why XYZ was superior to PDQ, that we’d A/B tested both, and that XYZ outperformed PDQ on conversions. Opinions didn’t matter, only conversion rates did.
Just in case a dying ember of life still burned in the conversation, the CMO snuffed it out by explaining that the homepage itself didn’t matter — and therefore really wasn’t worth talking about — because most of our traffic didn’t arrive on the homepage, but on scores of landing pages customized to specific paid or organic search terms.
While I certainly flubbed the pre-meeting sync-up[3], this is an example of how some marketers use data to end conversations — when I think they should use data to start them.
Ending Conversations with Data
Killing conversations with data is easy. Use the data you have (ideally, that the audience has never seen) to tell them they’re wrong. We’ve tested this. We have the data. Trust the science. You are wrong. Case closed.
Once in a while, you do need to end conversations with data. For example, at the end of a long decision-making process where you have reviewed the data, had numerous conversations about it, and need to make a final decision. That’s fine. I’m not saying to never use data to end conversations.
What I’m saying is don’t use data to stifle a conversation. To cut one short. Or to avoid one entirely. Why do some marketers do this? It certainly varies by case, but I think some of the key reasons are:
- They forget that sales is the customer. If you view someone as your customer, you should want to listen to them any chance you get. Any time. About anything.[4]
- They want to keep control. While boxing out people is a great short-term strategy to maintain control, it’s a great long-term strategy to find yourself needing new employment.
- They don’t want their apple cart upset. Particularly towards the end of a major project, marketers often close their ears to feedback because they get more focused on project completion than on project success. They become unveilers.
- They get offended. Don’t you think we tested this? Don’t you think we looked at the competitor’s positioning? Basically, don’t you think we know how to do our job? I get it. But marketing is not a sport for the thin-skinned.
- You hit a nerve. Maybe there’s baggage attached to the issue, they’re having a bad day, or they’re just tired of debate. These aren’t valid reasons to shut down conversations, but we’re all human. Marketers need to learn to manage these feelings. See note [5] for how I learned this.
I follow two principles that help me avoid these problems.
- Always be curious. My curiosity about their opinions must trump any potential sting in their response. If forced to choose between ignorance and hurt feelings, I’ll take the hurt feelings every time.
- Defensiveness kills communication. I know of no better way to stop all communication than to interrupt someone providing feedback with a defensive explanation. When you’re talking, you’re not listening.
Starting Conversations with Data
I like to start conversations with data. For example, on the XYZ vs. PDQ positioning question, you can run a few focus groups to discuss it[6]. You can do some market research, such as surveys[7]. You can add some keyword research and a summary of how industry analysts and competitors position the space. Then you package that up into a short summary presentation[8] and run a series of internal meetings where you tee up discussions with the data — with both the groups you must meet (e.g., the exec staff) and with anyone willing to make the time and effort (e.g., town halls).
You’re not keeping the data under your cloak and using it as a secret weapon to silence opposition. You’re gathering the information you can afford to gather, packaging it up nicely, and having a series of open discussions about it.
That’s the way to start conversations with data. And people will love it when you do.
# # #
[1] While my desire to tell a given story on Kellblog is sometimes triggered by a single event, in order to preserve anonymity, be able to speak more generally, and spice things up, I quickly adapt and meld such stories with dozens of others I’ve experienced over the years. Readers sometimes tell me, “I think I was in that meeting” — and they might well have been — but please don’t be surprised if the tale I tell is not a precise recounting. My goal is not to precisely describe a single experience, but instead to take lessons from the sum total of them.

[2] A senior advisor and a CMO have quite a bit more power in an organization than a CSM or a seller. Thus, communication transparency can suffer. How much it suffers is a function of both organizational and national culture.

[3] And I’ll own that. My bad. Improvisation, as they say, only looks easy. Good improvisation usually happens among people who play together often and have a shared understanding of the underlying context and structure. Here, I tried to improvise a feedback exercise with someone with whom I rarely play, and we ended up stepping all over each other.

[4] I say this because some people only want feedback when they’re ready for it. Think: this is not a good time in our product lifecycle or campaign development cycle. Or, I can only accept feedback right now through this channel. When it comes to customers and feedback, it should be: at any time, in any place. How you action it may vary based on where you are in a lifecycle, but listen first and explain those constraints later.

[5] I once had the good fortune of starting a marketing job on the day of a QBR, where I got to watch my predecessor present the marketing update to the sales leadership. The whole thing got defensive very fast, the marketer bobbing and weaving, ducking blows, and having a few deer-in-headlights moments. I still remember that meeting and thinking one thing to myself: I never want to be that person. (Or more specifically, since they were a fine person, I never want to be in that place, in that situation.)

[6] While you can spend a lot of money on this, you can also spend a little or even none. For example, you can call a couple of Zoom meetings to discuss things with trusted customers and prospects.

[7] Again, you can spend $25K to $50K on a study, or you can make your own survey and mail it out. I’m not saying the data you get will be scientifically valid, but that’s not the point here. We’re not trying to prove anything with the data or make a decision on it alone. We’re trying to bring data to the conversation so we can have a better one.

[8] Spend 10 to 15 minutes of a 60- to 90-minute session teeing up the discussion, and the rest of it asking a few well-crafted questions and listening to the answers.