Category Archives: Decision Making

How to be Passionate without Being Self-Righteous

I can easily think of a dozen cringe-worthy times in my career when I look back and say, “wow, I must have appeared to be a self-righteous idiot when I said thing X.”  Let me thank anyone on the receiving end of those statements for their patience.  I now get it; I understand.

Look, I’m all for passion in business.  I’m all for speaking up.  One day I’ll write a book called “Management by 1970s Bumper Stickers” and the first chapter will be on this sticker:

[Bumper sticker: "Question Authority"]

I enjoy questioning authority — ask any of my old bosses. As CEO, I like being questioned.  Good CEOs don’t fear questioning because you typically end up in one of two cases:

  • The point raised has already been considered in making the decision, and explaining the rationale behind that helps the organization understand the decision and increase buy-in to it.
  • The point raised has not already been considered in making the decision and results in either changing or keeping the decision.  Either way, the decision is better: we either find a better decision or another reason to support the existing one.  (As long as you beware confirmation bias.)

As CEO, your job is to get the right answer and make the best decisions, not to think everything up yourself.  Pride of authorship should have no place in CEO decision making.

Most people get questioning right.  They don’t assume things.  They’re not accusatory.  They simply ask the question that’s on their mind without a whole lot of overtone.  Every once in a while, however, I find someone who gets it all wrong and appears, as I did back in the day, to be self-righteous and dumb.

Let’s start with an example from one of my favorite old sci-fi movies, Soylent Green.

[Video clip: the “Soylent Green is people!” scene]

What a great scene.  But imagine if everyone knew that already.  Imagine how stupid you’d sound if you were delivering all those messages with all that same drama.

Imagine Hatcher saying, “Yes, yes, Detective Thorn, everybody knows that.  And boy are they tasty.”

You’d look pretty self-righteous.  And you’d look pretty dumb.  What rock did he crawl from under?  Everybody knows that Soylent Green is made out of people.

So what’s the best way to question authority?  Here are some tips.

  • Assume there is information that you can’t be told.  “We need to buy company X, why can’t anyone else see how critical that is, why won’t anyone listen to me?”  Now imagine the company tried to buy company X last quarter and they wanted 3x more than we could pay.  But no one can tell you that because the whole thing is under a non-disclosure agreement.  Should you raise the point?  Sure.  Ask the question respectfully and lose the assumption (and overtone) that you are being ignored.


  • Assume there is personnel information that you don’t know.  “The HR department is failing and nobody over there can get the job done — why isn’t anyone doing anything about this, and why won’t anyone listen to me?”  Now imagine that the HR manager is already on a performance plan and 30 days from being terminated.  No one could ever tell you that.


  • Assume there is a bigger picture conversation that you’re not privy to.  “Why are we giving the new head of Engineering control over Product Management and making the job EVP of Products instead of SVP of Engineering?  Product Management was working fine, we don’t need to make this change, why won’t anyone listen to me?”  Now imagine the company has been struggling to hire a new head of engineering, the CEO is under big pressure to do so, and an extremely well qualified Engineering candidate won’t join unless he also gets Product Management.  No one’s going to tell you that in a Q&A forum.


  • Don’t ignore constraints.   Some of the most self-righteous rants I’ve heard completely ignore practical constraints on the business like a lack of talent, a lack of money, the need to keep paying customers happy, or product constraints related to compatibility.  Now, yes, sometimes great breakthroughs happen when people challenge constraints — but never pretend they don’t exist.  It’s not a great strategy for our company if we can’t execute it.  Maybe it’s a great strategy for some other company, maybe not.


  • Don’t trivialize execution.  Ideas are easy.  Execution is hard.  So when asking questions about ideas, don’t act as if they are free or as if we could just get started with two people.  Yes, sometimes, great things start out with tiny investments — a $60K outsourced Twitter connector ended up serving as the genesis of Salesforce’s huge Social Enterprise strategy.  But often projects just end up dead because they were never properly resourced in the first place.  Execution is hard.


  • Don’t forget biases introduced by your personality type.  I stumbled into this great post the other day, What Everyone Desperately Wishes You’d Stop Doing, Based On Your Myers-Briggs Personality Type, and I just love the entry for ENFP — “Expecting everyone to be as excited as you are about today’s new BIG EXCITING PLAN when we all know you’ll have forgotten all about it by this time tomorrow.”  Look, some people are natural executors and others are natural idea generators.  Know which you are in assessing if you’re being ignored.  Is authority refusing to be questioned, or do you just have 10 ideas a day in a startup environment when the company needs to focus on one or two?


  • Don’t be naive.  Bob Waterman, co-author with Tom Peters of the legendary In Search of Excellence, was on our board at ASK, and one day he came down to hang out with the troops at the Friday beer bash.  I remember asking him (before I got my MBA) something akin to, “do you really believe all these green Harvard and Stanford MBAs should run companies, or would businesses be better if everyone worked their way up?”  The man had an MBA from Stanford and had worked at McKinsey.  He must have thought I was the biggest idiot on Earth.  My spider-sense told me I’d done something wrong.  It was right.  He muttered something and walked away.  An opportunity wasted due to naivete.

I’m a big believer that the more someone knows about how a decision got made, the more they will agree with it.  That’s why, as part of my management style, I spend a lot of time explaining decisions to people.

That dumb corporate decision to prioritize X over Y might make more sense to you if you knew all the circumstances about how it got made.  Sometimes there’s a missing piece to the puzzle that makes everything make sense.  Sometimes you can be told about that missing piece.  Other times, you cannot.  But don’t assume it doesn’t exist, nor trivialize matters of focus and execution.


In-Memory Analytics: The Other Kind – A Key Success Factor for Your Career

I’m not going to talk about columnar databases, compression, horizontal partitioning, SAP Hana, or real-time vs. pre-aggregated summarization in this post on in-memory analytics.  I’m going to talk about the other kind of in-memory analytics.  The kind that can make or break your career.

What do you mean, the other kind of in-memory analytics?  Quite simply, the kind you keep in your head (i.e., in human memory).  Or, better put, the kind you should be expected to keep in your head and be able to recite on demand in any business meeting.

I remember when I worked at Salesforce, I covered for my boss a few times at the executive staff meeting when he was traveling or such.  He told me:  “Marc expects everyone to know the numbers, so before you go in there, make sure you know them.”  And I did.  On the few times I attended in his place, I made a cheat sheet and studied it for an hour to ensure that I knew every possible number that could reasonably be asked.  I’d sit in the meeting, saying little, and listening to discussion not directly related to our area.  Then, boom, out of left field, Marc asked:  “what is the Service Cloud pipeline coverage ratio for this quarter in Europe?”

“3.4,” I replied succinctly.  If I hadn’t known the number, I’m sure it would have been an exercise in plucking the wings off a butterfly.  But I did, so the conversation quickly shifted to another topic, and I lived to fight another day.

Frankly, I was happy to work in an organization where executives were expected to know — in their heads, in an instant — the values of the key metrics that drive their business.  In weak organizations you constantly hear “can I get back to you on that” or “I’m going to need to look that one up.”

If you want to run a business, or a piece of one,  and you want to be a credible leader — especially in a metrics-driven organization — you need to have “in-memory” the key metrics that your higher-ups and peers would expect you to know.

This is as true of a CEO pitching a venture capitalist and being asked about CAC ratios and churn rates as it is of a marketing VP being asked about keywords, costs, and conversions in an online advertising program.  Or a sales manager being asked about their forecast.

In fact, as I’ve told my sales directors a time or two:  “I should be able to wake you up at 3:00 AM and ask your forecast, upside, and pipeline and you should be able to answer, right then, instantly.”

That’s an in-memory metric.  No “let me check on that.”  No “I’ll get back to you.”  No “I don’t know, let me ask my ops guy,” which always makes me think: who runs the department, you or the ops guy — and if you need to ask the ops guy all the numbers maybe he/she should be running the department and not you?

I have emphasized the word “expect” above because this issue is indeed about expectations, and expectations are not a precise science.  So, how can you figure out the expectations for which analytics you should hold in-memory?

  • Look at your department’s strategic goals and determine which metrics best measure progress on them.
  • Ask peers inside the company what key metrics they keep in-memory and design your set by analogy.
  • Ask peers who perform the same job at different companies what key metrics they track.
  • When in doubt, ask the boss or the higher-ups what metrics they expect you to know.

Finally, I should note that I’m not a big believer in the whole “cheat sheet” approach I described above.  Because that was a special situation (covering for the boss), I think the cheat sheet was smart, but the real way to burn these metrics into your memory is to track them every week at your staff meeting, watching how they change week by week and constantly comparing them to prior periods and to a plan/model if you have one.

The point here is not “fake it until you make it” by running your business in a non-metrics-focused way and memorizing figures before a big meeting, but instead to burn the metrics review into your own weekly team meeting and then, naturally, over time you will know these metrics so instinctively that someone can wake you up at 3:00 AM and you can recite them.
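To make this concrete, here is a minimal sketch in Python of the kind of weekly snapshot I’m describing.  All metric names and numbers are invented for illustration, and the coverage calculation assumes the common pipeline-divided-by-forecast definition; your company’s definition may differ.

    # Illustrative only: hypothetical metrics and made-up numbers.
    # The habit is the point: a small, fixed set of metrics reviewed every week,
    # compared to last week and to plan, until you know them cold.

    this_week = {"pipeline": 10.2, "forecast": 3.0, "upside": 0.6}   # $M
    last_week = {"pipeline": 9.8,  "forecast": 2.9, "upside": 0.8}
    plan      = {"pipeline": 12.0, "forecast": 3.2, "upside": 0.5}

    # Pipeline coverage ratio, assumed here to be pipeline / forecast.
    print(f"Pipeline coverage: {this_week['pipeline'] / this_week['forecast']:.1f}x")

    for metric, value in this_week.items():
        wow = value - last_week[metric]          # week-over-week change
        vs_plan = value / plan[metric] - 1       # percent versus plan
        print(f"{metric:>9}: {value:5.1f}M  WoW {wow:+.1f}M  vs plan {vs_plan:+.0%}")

Reviewing a little table like that every week is how the numbers end up in memory rather than on a cheat sheet.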

That’s the other kind of in-memory analytics.  And, much as I love technology, the more important kind for your career.

Kellblog’s 2017 Predictions  

New Year’s means three things in my world:  (1) time to thank our customers and team at Host Analytics for another great year, (2) time to finish up all the 2017 planning items and approvals that we need to get done before the sales kickoff (including the one most important thing to do before kickoff), and (3) time to make some predictions for the coming year.

Before looking at 2017, let’s see how I did with my 2016 predictions.

2016 Predictions Review

  1. The great reckoning begins. Correct/nailed.  As predicted, since most of the bubble was tied up in private companies owned by private funds, the unwind would happen in slow motion.  But it’s happening.
  2. Silicon Valley cools off a bit. Partial.  While IPOs were down, you couldn’t see the cooling in anecdotal data, like my favorite metric, traffic on Highway 101.
  3. Porter’s five forces analysis makes a comeback. Partial.  So-called “momentum investing” did cool off, implying more rational situation analysis, but you didn’t hear people talking about Porter per se.
  4. Cyber-cash makes a rise. Correct.  Bitcoin more than doubled on the year (and Ethereum was up 8x), which perversely reinforced my view that these crypto-currencies are too volatile — people want the anonymity of cash without a highly variable exchange rate.  The underlying technology for Bitcoin, blockchain, took off big time.
  5. Internet of Things goes into trough of disillusionment. Partial.  I think I may have been a little early on this one.  Seems like it’s still hovering at the peak of inflated expectations.
  6. Data science rises as profession. Correct/easy.  This continues inexorably.
  7. SAP realizes they are a complex enterprise application company. Incorrect.  They’re still “running simple” and talking too much about enabling technology.  The stock was up 9% on the year in line with revenues up around 8% thus far.
  8. Oracle’s cloud strategy gets revealed – “we’ll sell you any deployment model you want as long as your annual bill goes up.”  Partial.  I should have said “we’ll sell you any deployment model you want as long as we can call it cloud to Wall St.”
  9. Accounting irregularities discovered at one or more unicorns. Correct/nailed.  During these bubbles the pattern always repeats itself – some people always start breaking the rules in order to stand out, get famous, or get rich.  Fortune just ran an amazing story that talks about the “fake it till you make it” culture of some diseased startups.
  10. Startup workers get disappointed on exits. Partial.  I’m not aware of any lawsuits here but workers at many high flyers have been disappointed and there is a new awareness that the “unicorn party” may be a good thing for founders and VCs, but maybe not such a good thing for rank-and-file employees (and executive management).
  11. The first cloud EPM S-1 gets filed. Incorrect.  Not yet, at least.  While it’s always possible someone did the private filing process with the SEC, I’m guessing that didn’t happen either.
  12. 2016 will be a great year for Host Analytics. Correct.  We had a strong finish to the year and emerged stronger than we started with over 600 great customers, great partners, and a great team.

Now, let’s move on to my predictions for 2017 which – as a sign of the times – will include more macro and political content than usual.

  1. The United States will see a level of divisiveness and social discord not seen since the 1960s. Social media echo chambers will reinforce divisions.  To combat this, I encourage everyone to sign up for two publications/blogs they agree with and two they don’t lest they never again hear both sides of an issue.  (See map below, courtesy of Ninja Economics, for help in choosing.)  On an optimistic note, per UCSD professor Lane Kenworthy, people aren’t getting more polarized, political parties are.

[Map of news publications, courtesy of Ninja Economics]

  2. Social media companies finally step up and do something about fake news. While per a former Facebook designer, “it turns out that bullshit is highly engaging,” these sites will need to do something to filter, rate, or classify fake news (let alone stop recommending it).  Otherwise they will both lose credibility and readership – as well as fail to act in a responsible way commensurate with their information dissemination power.
  3. Gut feel makes a comeback. After a decade of Google-inspired heavily data-driven and A/B-tested management, the new US administration will increasingly be less data-driven and more gut-feel-driven in making decisions.  Riding against both common sense and the big data / analytics / data science trends, people will be increasingly skeptical of purely data-driven decisions, and anti-data people will publicize data-driven failures to popularize their arguments.  This “war on data” will build during the year, fueled by Trump, and some of it will spill over into business.  Morale in the Intelligence Community will plummet.
  4. Under a volatile leader, who seems to exhibit all nine of the symptoms of narcissistic personality disorder, we can expect sharp reactions and knee-jerk decisions that rattle markets, drive a high rate of staff turnover in the Executive branch, and fuel an ongoing war with the media.  Whether you like his policies or not, Trump will bring a high level of volatility to the country, to business, and to the markets.
  5. With the new administration’s promises of $1T in infrastructure spending, you can expect interest rates to rise and inflation to accelerate. Providing such a stimulus to an already strong economy might well overheat it.  One smart move could be buying a house to lock in historically low interest rates for the next 30 years.  (See my FAQ for disclaimers, including that I am not a financial advisor.)
  6. Huge emphasis on security and privacy. Election-related hacking, including the spear-phishing attack on John Podesta’s email, will serve as a major wake-up call to both government and the private sector to get their security act together.  Leaks will fuel major concerns about privacy.  Two-factor authentication using verification codes (e.g., Google Authenticator) will continue to take off, as will encrypted communications.  Fear of leaks will also change how people use email and other written electronic communications; more people will follow the sage advice in this quip:

Dance like no one’s watching; E-mail like it will be read in a deposition

  7. In 2015, if you were flirting on Ashley Madison you were more likely talking to a fembot than a person.  In 2016, the same could be said of troll bots.  Bots are now capable of passing the Turing Test.  In 2017, we will see more bots for both good uses (e.g., customer service) and bad (e.g., trolling social media).  Left unchecked by the social media powerhouses, bots could damage social media usage.
  8. Artificial intelligence hits the peak of inflated expectations. If you view Salesforce as the bellwether for hyped enterprise technology (e.g., cloud, social), then the next few years are going to be dominated by artificial intelligence.  I’ve always believed that advanced analytics is not a standalone category, but instead fodder that vendors will build into smart applications.  The key is typically not the technology, but the problem to which to apply it.  As Infer founder Vik Singh said of Jim Gray, “he was really good at finding great problems”; the key is figuring out the best problems to solve with a given technology or modeling engine.  Application by application we will see people searching for the best problems to solve using AI technology.
  9. The IPO market comes back. After a year in which we saw only 13 VC-backed technology IPOs, I believe the window will open and 2017 will be a strong year for technology IPOs.  The usual big-name suspects include firms like Snap, Uber, AirBnB, and Spotify.  CB Insights has identified 369 companies as strong 2017 IPO prospects.
  10. Megavendors mix up EPM and ERP or BI. Workday, which has had a confused history when it comes to planning, acquired struggling big data analytics vendor Platfora in July 2016, and seems to have combined analytics and EPM/planning into a single unit.  This is a mistake for several reasons:  (1) EPM and BI are sold to different buyers with different value propositions, (2) EPM is an applications sale while BI is a platform sale, and (3) Platfora’s technology stack, while appropriate for big data applications, is not ideal for EPM/planning (ask Tidemark).  Combining the two puts planning at risk.  Oracle combined their EPM and ERP go-to-market organizations and lost focus on EPM as a result.  While they will argue that they now have more EPM feet on the street, those feet know much less about EPM, leaving them exposed to specialist vendors who maintain a focus on EPM.  ERP is sold to the backward-looking part of finance; EPM is sold to the forward-looking part.  EPM is about 1/10th the market size of ERP.  ERP and EPM have different buyers and use different technologies.  In combining them, expect EPM to lose out.

And, as usual, I must add the bonus prediction that 2017 proves to be a strong year for Host Analytics.  We are entering the year with positive momentum, the category is strong, cloud adoption in finance continues to increase, and the megavendors generally lack sufficient focus on the category.  We continue to be the most customer-focused vendor in EPM, our new Modeling product gained strong momentum in 2016, and our strategy has worked very well for both our company and the customers who have chosen to put their faith in us.

I thank our customers, our partners, and our team and wish everyone a great 2017.

# # #


Managing Change: The Sailboat Tack Principle

Change is hard in business.  A few things routinely get messed up:

  • Pulling the trigger.  Think:  “wait, are we still discussing this change or did we just decide to do it?”  I can’t tell you the number of times I’ve heard that quote in meetings.  I think continuous partial attention is part of the problem.  Sometimes, it’s just straight-up confusion as the enthusiasm for a new idea ebbs and flows in a group conversation.  It can be hard to tell if we’ve decided to change or if everyone’s just excited about the idea.
  • Next-level engagement.  Think:  “wait, I know we all like this idea on the exec staff, but this decision affects a lot of people at the next level.  I need some time to bounce this off my leadership team and get their input before we go ready/fire/aim on this.”
  • Communications.  Think:  “wait, this change is a big deal and I know we just spent every minute of the three-hour meeting deciding to do it, but we need to find another hour to discuss key messaging (5W+2H) for both the internal and external audience.”
  • Anticipatory execution.  Think:  “While we had not yet finally approved the proposal for the new logo, it was doing very well in feedback and I just loathed the idea of making 5000 bags with the old logo on them, so I used the new one even though it wasn’t approved yet.”

When you screw up change, a lot of bad things happen.

  • Employees get confused about the company’s strategy.  “First they said we were doing X, and then the execs did an about-face.  I don’t understand.”
  • The external market, including your customers, gets confused about what you are doing.  This is even worse.
  • You can end up with 5,000 bags that have neither your old logo nor your new logo on them.
  • You can make your management team look like the Keystone Cops in one of many ways through screwing up sequencing:  like dropping off boxes before the big move is announced, or employees finding out they’ve been laid off because their keycards stop working.

In order to avoid confusion about change and the mistakes that come with it, I’ve adopted a principle I call the “sailboat tack principle” which I use whenever we are contemplating major change.  (We can define major as any change that if poorly executed will make the management team look like clowns to employees, customers, or other stakeholders.)

If you’ve ever gone sailing you may have noticed there is a strict protocol involved in a tack.  When the skipper wants to execute a tack, he or she runs the following protocol.

Skipper:  “Ready about”

Each crew member:  “Ready”

Last crew member:  “Ready”

Skipper:  “Helm’s a-lee.”

That is, the skipper does not actually begin the maneuver until every involved crew member has indicated they are ready.  This prevents partial execution, people getting hit in the head with booms, and people getting knocked off the boat.  It also implicitly distinguishes discussing a possible course change (e.g., “I think we should set course in that direction”) from actually executing one (e.g., “Ready about”).

For those with CS degrees, the sailboat tack principle is a two-phase commit protocol, used commonly in distributed transaction processing systems.
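For the CS-degree crowd, here is a minimal sketch of that protocol in Python.  The participants and their readiness checks are hypothetical, purely to illustrate the two phases.

    # Phase 1 ("Ready about"): every participant must vote ready.
    # Phase 2 ("Helm's a-lee"): execute only if every single vote was yes.

    def two_phase_commit(participants, change):
        votes = {name: is_ready(change) for name, is_ready in participants.items()}
        if not all(votes.values()):
            not_ready = [name for name, ready in votes.items() if not ready]
            print(f"Abort {change!r}: waiting on {not_ready}")
            return False
        print(f"All ready.  Executing {change!r}.")
        return True

    # Hypothetical crew: each check returns True only when that group is truly ready.
    crew = {
        "sales":       lambda change: True,    # comp impact reviewed
        "marketing":   lambda change: True,    # messaging (5W+2H) drafted
        "engineering": lambda change: False,   # still needs to brief the team
    }
    two_phase_commit(crew, "announce the new logo")

The value is less in the code than in the discipline: nothing executes until every crew member has explicitly said ready.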

I like the sailboat tack protocol because the extra discipline causes a few things to happen automatically.

  • People know implicitly when we’re just talking about course changes.  (Because no one is saying “OK, so do we want to tack here?”)
  • People know explicitly when we are actually making the decision whether to execute change.
  • That extra warning (“hey, we are about to do this”) triggers numerous very healthy “wait a minute” reactions.  Wait a minute:  I need to ask my team, I need to make a communications plan, I need to examine the compensation impact, I need to think about what order we roll this out in, etc.

Why, as CEO, I Love Driver-Based Planning

While driver-based planning is a bit of an old buzzword (the first two Google hits date to 2009 and 2011 respectively), I am nevertheless a huge fan of driver-based planning not because the concept was sexy back in the day, but because it’s incredibly useful.  In this post, I’ll explain why.

When I talk to finance people, I tend to see two different definitions of driver-based planning:

  • Heavy in detail, one where you build a pretty complete bottom-up budget for an organization and play around with certain drivers, typically with a strong bias towards what they have historically been.  I would call this driver-based budgeting.
  • Light in detail, one where you struggle to find the minimum set of key drivers around which you can pretty accurately model the business, and where drivers tend to be figures you can benchmark in the industry.  I call this driver-based modeling.

While driver-based budgeting can be an important step in building an operating plan, I am actually a bigger fan of driver-based modeling.  Budgets are very important, no doubt.  We need them to plan our business, align our team, hold ourselves accountable for spending, drive compensation, and make our targets for the year.  Yes, a good CEO cares about that as a sine qua non.

But a great CEO is really all about two things:

  • Financial outcomes (and how they create shareholder value)
  • The future (and not just next year, but the next few)

The ultimate purpose of driver-based models is to be able to answer questions like:  what happens to key financial outcomes (revenue growth, operating margins, cashflow) given a set of driver values?

I believe some CEOs are disappointed with driver-based planning because their finance teams have been showing them driver-based budgets when they should have been showing them driver-based models.

The fun part of driver-based modeling is trying to figure out the minimum set of drivers you need to successfully build a complete P&L for a business.  As a concrete example, I can build a complete, useful model of a SaaS software company off the following minimum set of drivers:

  • Number and type of salesreps
  • Quota/productivity for each type
  • Hiring plans for each type
  • Deal bookings mix for each (e.g., duration, prepayments, services)
  • Intra-quarter bookings linearity
  • Services margins
  • Subscription margins
  • Sales employee types and ratios (e.g., 1 SE per 2 salesreps)
  • Marketing as % of sales or via a set of funnel conversion assumptions (e.g., responses, MQLs, oppties, win rate, ASP)
  • R&D as % of sales
  • G&A as % of sales
  • Renewal rate
  • AR and AP terms

With just those drivers, I believe I can model almost any SaaS company.  In fact, even without the more detailed assumptions (rep types, marketing funnel), I can pretty accurately model most.
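As an illustration of just how light such a model can be, here is a rough Python sketch built from a subset of the drivers above.  Every value is invented and the revenue approximation is deliberately crude; it is a sketch of the approach, not anyone’s actual model.

    # Deliberately simplified SaaS driver model.  All driver values are invented;
    # the point is how few drivers a useful first-cut model needs, not accuracy.

    drivers = {
        "reps": 10,                      # ramped salesreps
        "quota": 0.8,                    # new ARR bookings per rep per year ($M)
        "starting_arr": 5.0,             # ARR entering the year ($M)
        "renewal_rate": 0.90,            # gross renewal rate on the base
        "subscription_margin": 0.75,     # blended gross margin
        "cost_per_rep": 0.3,             # fully loaded sales cost per rep ($M), incl. SE ratio
        "marketing_pct": 0.15,           # marketing as % of revenue
        "rd_pct": 0.20,                  # R&D as % of revenue
        "ga_pct": 0.10,                  # G&A as % of revenue
    }

    def model_year(d):
        new_arr = d["reps"] * d["quota"]
        ending_arr = d["starting_arr"] * d["renewal_rate"] + new_arr
        revenue = (d["starting_arr"] + ending_arr) / 2      # crude mid-year approximation
        gross_profit = revenue * d["subscription_margin"]
        opex = (d["reps"] * d["cost_per_rep"]
                + revenue * (d["marketing_pct"] + d["rd_pct"] + d["ga_pct"]))
        return {
            "arr_growth": ending_arr / d["starting_arr"] - 1,
            "revenue": revenue,
            "operating_margin": (gross_profit - opex) / revenue,
        }

    results = model_year(drivers)
    print(f"ARR growth {results['arr_growth']:.0%}, "
          f"revenue ${results['revenue']:.1f}M, "
          f"operating margin {results['operating_margin']:.0%}")

Notice what is missing: no bookings linearity, no funnel detail, no AR/AP terms.  Per the argument that follows, each of those only earns its way in if it changes the answers you care about.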

Finance types sometimes forget that the point of driver-based modeling is not to build a budget, so it doesn’t have to be perfect.  In fact, the more perfect you make it, the heavier and more complex it gets.  For example, intra-quarter bookings linearity (i.e., % of quarterly bookings by month) makes a model more accurate in terms of cash collections and monthly cash balances, but it also makes it heavier and more complex.

Like each link in Marley’s chains, each driver adds to the weight of the model, making it less suited to its ultimate purpose.  Thus, with the addition of each driver, you need to ask yourself — for the purposes of this model, does it add value?  If not, throw it out.

One of the most useful models I ever built assumed that all orders came in on the last day of the quarter.  That made building the model much simpler, and any sales before the last day of the quarter — of which we hope there are many — become upside to the conservative model.

Often you don’t know in advance how much impact a given driver will make.  For example, sticking with intra-quarter bookings linearity, it doesn’t actually change much when you’re looking at quarter granularity a few years out.  However, if your company has a low cash balance and you need to model months, then you should probably keep it in.  If not, throw it out.

This process makes model-building highly iterative.  Because the quest is not to build the most accurate model but the simplest, you should start out with a broad set of drivers, build the model, and then play with it.  If the financial outcomes with which you’re concerned (and it’s always a good idea to check with the CEO on which these are — you can be surprised) are relatively insensitive to a given driver, throw it out.

Finance people often hate this both because they tend to have “precision DNA” which runs counter to simplicity, and because they have to first write and then discard pieces of their model, which feels wasteful.  But if you remember the point — to find the minimum set of drivers that matter and to build the simplest possible model to show how those key drivers affect financial outcomes — then you should discard pieces of the model with joy, not regret.
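One simple way to apply that test is a quick sensitivity check: perturb each driver, re-run the model, and see whether the outcome you care about actually moves.  A rough sketch, reusing the hypothetical model_year and drivers from the example above:

    # Bump each driver by 10% and measure the change in operating margin.
    # Drivers that barely move the outcomes you care about are candidates to discard.

    def sensitivity(d, outcome="operating_margin", bump=0.10):
        base = model_year(d)[outcome]
        impacts = {name: model_year({**d, name: value * (1 + bump)})[outcome] - base
                   for name, value in d.items()}
        return sorted(impacts.items(), key=lambda kv: abs(kv[1]), reverse=True)

    for name, delta in sensitivity(drivers):
        print(f"{name:>19}: {delta:+.3f} operating-margin impact")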

The best driver-based models end up with drivers that are easily benchmarked in the industry.  Thus, the exercise becomes:  if we can converge to a value of X on industry benchmark Y over the next 3 years, what will it do to growth and margins?  And then you need to think about how realistic converging to X is — what about your specific business means you should converge to a value above or below the benchmark?

At Host Analytics we do a lot of driver-based modeling and planning internally.  I can say it helps me enormously as CEO to think about industry benchmarks, future scenarios, and how we create value for the shareholders.  In fact, my models don’t stop at the P&L; they go on to implied valuation given growth/profit and ultimately calculate a range of share prices on the bottom line.

The other reason I love driver-based planning is more subtle.  Much as number theory helps you understand the guts of numbers in mathematics, so does driver-based modeling help you understand the guts of your business — which levers really matter, and how much.

And that knowledge is invaluable.

If Marc Benioff Carried a Rabbit’s Foot, Would You?

In business we have a sad tendency to copy success blindly.

I remember the first time I read about this I didn’t even understand what I was reading:

“Nothing in business is so remarkable as the conflicting variety of success formulas offered by its numerous practitioners and professors.  And if, in the case of practitioners they’re not exactly “formulas,” they are explanations of “how we did it” implying with firm control over any fleeting tendencies toward modesty that “that’s how you ought to do it.”  Practitioners filled with pride and money turn themselves into prescriptive philosophers, filled mostly with hot air.”

Through blind luck, I’d had the good fortune that Theodore Levitt’s The Marketing Imagination (1983) was the very first book I read on marketing.  That paragraph — the opening paragraph of the book — stuck with me in some odd way, but it would be years before I truly appreciated what it said.

I was business-educated in the In Search of Excellence (1982) era and, while I suppose the same approach had been happening for years, In Search of Excellence was about as unscientific as they come.  The authors, Tom Peters and Bob Waterman, started out with a list of 62 companies identified by asking their McKinsey partners and friends “who’s doing cool work,” cut the list rather arbitrarily to 43 (excluding, for example, GE — but retaining Wang, Atari, and Xerox), and then “derived” eight themes which they thought were responsible for their success.

That was the mentality of the time.  Arbitrarily identify a set of companies you deem “cool” and then arbitrarily come up with things they have in common.  (And that’s not to mention the allegations of “faked data.”)

So I was happy when Jim Collins came along in 2001 arguing that he was bringing a more scientific approach in Good to Great.  Arguing that seeking only common traits could lead you to discoveries such as “all great companies have buildings,” Collins strove to differentiate good companies from great ones.  Starting with 1,435 companies and examining their performance over 40 years, Collins’ team identified 11 companies that became great along with 11 comparison companies in the same markets that did not.

While Collins’ thinking may have been clearer than Peters’, his luck was no better.  Seven years after the book was published, several “great” companies like Circuit City were in deep trouble, Fannie Mae required a Federal bailout, and only one of the eleven companies, Nucor, had dramatically outperformed the stock market.  Amazingly, despite the poor to lackluster performance of the “great” companies, it remains a best-seller to this day, ranking #5 on Amazon in management at last check.

Even when trying to avoid it, fake science and, in particular, survivor bias had struck again.  Thank goodness Phil Rosenzweig came along in 2009 with The Halo Effect, describing it and eight other business delusions from which managers suffer.  Here’s a nice excerpt:

On the way up to a stock market value of half a trillion dollars, everything about Cisco seemed perfect. It had a perfect CEO. It could close its books in a day and make perfect financial forecasts. It was an acquisition machine, ingesting companies and their technologies with great aplomb. It was the leader of the new economy, selling gear to new-world telecom companies that would use it to supplant old-world carriers and make their old-world suppliers irrelevant. Over the past year, every one of those characterizations has proved to be false.

As I often said about running analyst relations at Business Objects: “when the stock was going up everything I said was genius, when we missed a quarter, everything I said was suspect.”  This is, in my estimation, the real reason why some bad-egg companies such as bubble-era MicroStrategy, Fast Search & Transfer, or Autonomy (not yet settled) are tempted to inflate results.  I think it’s less about inflating valuation, and more about inflating the company’s perception of success in order to “validate” their strategy going forward.

But, to Levitt’s point at the start of this post, we are swimming in advice from successful practitioners.  

We have advice from Sequoia billionaire Mike Moritz who says the best advice he ever received was to “follow his instincts” which, as it turns out, works swimmingly well if you happen to have his instincts.  (And perhaps less so well, if you don’t.)

We have advice from billionaire Peter Thiel, who sounds vaguely like Timothy Leary with the drop-out part of turn on, tune in, drop out.

We have advice from Steve Blank, one of the more reasonable and thoughtful sources out there, and someone, in my opinion, to be admired for his commitment to giving back intellectually to Silicon Valley.

We have a plethora of advice from Marc Benioff, for example, the 111 “plays” in Behind the Cloud, including “make your own metaphors” and “cultivate select journalists.”

Who knows, maybe “beware of billionaires bearing business advice” may become the new “beware of Greeks bearing gifts.”

Finally, we also have advice from, dare I say, Kellblog who, while not a billionaire (yet), has opinions as tempered by experience and as firmly held as any of the above — and often as unscientific.

Given this sea of advice, how do I recommend processing it?  In the end, as Rosenzweig reminds us, in the absence of real silver bullets and magic formulae, we need to think for ourselves.  So every time I hear a successful businessperson bearing business advice I remind myself of one key fact — the plural of anecdote is not data — and ask myself two key questions:

  • Do I believe that he/she was successful because of, in spite of, or completely independent of this advice?
  • If Marc Benioff carried a rabbit’s foot, would I?

Twelve Questions Executives Can Ask To Improve Decision Making

I first became interested in decision making more than a decade ago, back when I was running marketing at Business Objects.  My interest was prompted by the evolution of taglines among BI vendors.  In the early days, taglines were descriptive like First in Enterprise Decision Support or The Enterprise Data Mart Company.

Over time, pressure mounted on marketing to pitch benefits — the message shouldn’t just be about getting people information, but the benefit of having it.  Slogans evolved accordingly:  Now You Know, The Power To Know, and Business Intelligence:  If You Have It, You Know.

But was knowing enough of a benefit?  You could certainly take it up a level, and Cognos did:  Better Decisions Every Day.  For a marketing slogan it was good enough, but was it true?   Did providing better access to corporate information  invariably improve decision making?  It seemed like a leap so I decided to research it.

I’ll never forget when Cornell professor Jay Russo told me, “the primary use of new information is selective filtering to justify previously established conclusions.”  So, despite the commonsense appeal of the Cognos tagline, you most certainly could not draw a straight line from “more information” to “better decisions.”

I studied how individuals and groups  made decisions.  I read interesting books like Russo’s Decision Traps (later positively reframed into Winning Decisions) and Smart Choices.  Years later I became interested in mass decision making  in The Wisdom of Crowds and behavioral economics in Predictably Irrational and Why Smart People Make Big Money Mistakes.

I remember asking Russo why decision making wasn’t more of a focus in business schools.  His answer came down to two things:

  • If you can’t measure it, you can’t manage it.  Until corporations want to start measuring decision making, you can’t focus on improving it.  (I remember once suggesting a BI product that tracked votes on strategic decisions, evaluated their success years later, and calculated batting averages for team members.  The idea was shot down as my colleagues imagined executives fleeing like cockroaches when the lights come on.)
  • Executives perceive their jobs as decision-making and themselves as experts.  Think:  Why would I need a class in decision making?  I make decisions for a living and my success in rising up this organization is proof that I am good at it.

But if quenching thirst is the ultimate benefit of Coke, improved decision making really is the ultimate benefit sought by BI consumers.  The problem was  — and is — that BI software can’t deliver it.

So if you want to improve your decision making, then you’re going to have to read up a bit, either through the books I’ve referenced above or via a recent article in Harvard Business Review entitled Before You Make That Big Decision, which provides 12 questions that senior executives can ask about decisions and decision-making processes to avoid the most common errors.

Here are those 12 questions and the biases that they are trying to detect:

  1. Is there any reason to suspect motivated errors, or errors driven by the self-interest of the recommending team?  (self-interest bias)
  2. Have the people making the recommendation fallen in love with it?  (affect heuristic)
  3. Were there dissenting opinions within the recommending team?  (groupthink)
  4. Could the diagnosis of the situation be overly influenced by salient analogies?  (saliency bias)
  5. Have credible alternatives been considered?  (confirmation bias)
  6. If you had to make this decision again in a year, what information would you want and can you get more of it now?  (availability bias)
  7. Do you know where the numbers came from?  (anchoring bias)
  8. Can you see a halo effect? (halo effect)
  9. Are the people making the recommendation overly attached to past decisions?  (sunk-cost fallacy, endowment effect)
  10. Is the base case overly optimistic?  (overconfidence)
  11. Is the worst case bad enough?  (disaster neglect)
  12. Is the recommending team overly cautious?  (loss aversion)
The full article is here.
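And for teams that want to bake the checklist into their decision process, here is a trivial sketch of the twelve questions as a data structure a review meeting can walk through.  The questions and bias labels are paraphrased from the list above; the review workflow itself is just an illustration.

    # The twelve questions as a simple checklist a decision review can walk through.
    # Questions and bias labels are paraphrased from the list above.

    CHECKLIST = [
        ("Any reason to suspect errors driven by the recommending team's self-interest?", "self-interest bias"),
        ("Have the people making the recommendation fallen in love with it?", "affect heuristic"),
        ("Were there dissenting opinions within the recommending team?", "groupthink"),
        ("Is the diagnosis overly influenced by salient analogies?", "saliency bias"),
        ("Have credible alternatives been considered?", "confirmation bias"),
        ("If you made this decision again in a year, what information would you want, and can you get it now?", "availability bias"),
        ("Do you know where the numbers came from?", "anchoring bias"),
        ("Can you see a halo effect?", "halo effect"),
        ("Is the team overly attached to past decisions?", "sunk-cost fallacy, endowment effect"),
        ("Is the base case overly optimistic?", "overconfidence"),
        ("Is the worst case bad enough?", "disaster neglect"),
        ("Is the recommending team overly cautious?", "loss aversion"),
    ]

    def review(decision, answers):
        """Print each question, the bias it probes, and any recorded answer."""
        print(f"Decision under review: {decision}")
        for i, (question, bias) in enumerate(CHECKLIST, start=1):
            print(f"{i:2}. ({bias}) {question}")
            print(f"      -> {answers.get(i, 'UNANSWERED')}")

    review("Acquire company X", {5: "Two build-vs-buy alternatives compared",
                                 10: "Base case stress-tested against prior years"})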