Why You Shouldn’t Be Fooled by Your Own Expertise

Mark Suster, entrepreneur

I am wired to discount people who have total assuredness in their point of view, hold dogmatic positions, or use data as a crutch or a substitute for logic. I appreciate people who have strong opinions or conviction, but I expect them to constantly test those opinions and refine their approaches as they encounter new people, facts or logic.

I have long believed that humans (myself included) err on the side of over-confidence in their own abilities and thoughts. Thus one of my favorite sayings is

Strong opinions, weakly held

Years ago, when I worked with my dear friend Carlo Gagliardi, he used to say that “it took three bullets to kill you, Mark.” He told me I was stubborn and opinionated: if he brought me a weak argument without enough evidence to support his position, I would push back. He would then bring some basic facts to show that I wasn’t seeing things correctly, but I still wouldn’t budge without compelling logic. He knew, though, that if he had real conviction, if the facts were on his side, and if he could build enough logic on top of his data, he could eventually win. Three bullets.

Weak-sauce arguments fall on deaf ears with me, and sloppy logic annoys me to the point of frustration. But present a well-mounted case and I drop my defenses and embrace what I hadn’t seen before. Coming back with an empty chamber for the third battle is much worse than not coming back at all.

I was influenced heavily before my career even began: in my undergraduate work I took a ton of statistics classes that showed how easily our human brains fall prey to biases and sleights of data, drawing conclusions that don’t actually exist.

Thus when I read “A Random Walk Down Wall Street” in my early 20s, it reinforced my belief about how people are fooled by data. Given enough trials over time, some number of random people will be proven “right” about a market, and looking back you can convince yourself they had a method behind their outcomes. If you take hundreds of millions of people and have each one call 20 coin flips in a row, some people will be right every time, and we are quick to label those people geniuses rather than simply lucky.
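The coin-flip arithmetic above is easy to check directly. A minimal sketch (the population figure is illustrative, standing in for “hundreds of millions of people”):

```python
# Probability that one person calls 20 fair coin flips correctly in a row.
p_all_right = 0.5 ** 20  # about 1 in 1,048,576

# Expected number of "perfect" predictors in a large population.
population = 300_000_000  # illustrative assumption
expected_geniuses = population * p_all_right

print(round(expected_geniuses))  # 286
```

In other words, pure chance alone produces a few hundred people with flawless 20-for-20 track records, and none of them has a method.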

Even worse, when I was a strategy consultant I learned how easily data could be manipulated to prove just about any reasonable thesis and how a well-structured argument lined with data or pseudoscience could persuade large groups of people of dubious conclusions.

I wrote about these experiences 7 years ago in a post I titled

73.6% of all Statistics are Made Up

My summary is an example I use often. When I consumed Goldman Sachs reports predicting future mobile phone penetration for a report I was working on for a client, I would call the analyst to ask how she calculated her data, because I wanted to be sure of my conclusions. She told me she got it from Gartner Group. I called that analyst, and he told me he had been in a rush to calculate future estimates; his boss told him that projecting the next three years at the same growth rate as the past three wouldn’t get headlines, so he was encouraged to show 8% CAGRs rather than 5%. Ultimately the Goldman Sachs report gets picked up by a journalist who writes a hyperbolic headline with emphatic conclusions, because nobody wants to read that tomorrow will be just like today.
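It’s worth seeing how much distortion that small nudge introduces once the rates compound. A quick sketch (the base value of 100 is illustrative, just an index):

```python
# Compounding 5% vs. 8% annual growth over three years from the same base.
base = 100.0  # illustrative starting value, indexed to 100

after_3y_5pct = base * 1.05 ** 3
after_3y_8pct = base * 1.08 ** 3

print(round(after_3y_5pct, 1))  # 115.8
print(round(after_3y_8pct, 1))  # 126.0
```

Three years at the inflated rate overstates the market by roughly ten points on the index, and every downstream report inherits that error.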

So a degree of skepticism, critical thinking and an understanding of data biases is always in order.

In my 30s I devoured “Fooled by Randomness” and fell in love with the writing style of Nassim Nicholas Taleb, as it reinforced my belief system as a disbeliever. He used a self-referential term I had never heard before: he calls himself a “skeptical empiricist,” and that has always stuck with me.

But it was his subsequent book, “The Black Swan,” that really solidified my beliefs about the limitations of human analysis. I read it just after I entered the venture capital industry more than a decade ago. He introduced a term in that book that I invoke often — “the narrative fallacy” — in which we construct narratives from our past that give us assuredness that we “know” what will happen in the future. The narrative fallacy is insidious, and I encourage all of you, especially investors, to think about your own narrative biases when you make decisions. The Black Swan should be required reading for any VC or investor.

In fact, I think the older you get the more hardened you become in your narratives and beliefs about markets, and you become “jaded” about what can’t possibly happen in the future. Anybody my age will surely remember Glum from “Gulliver’s Travels,” the character who would always say, “It will never work,” or “We’ll never make it,” or “We’re doomed.”

Given the nature of “the narrative fallacy” and how one becomes jaded over time, my mantra has become:

“I look for naive optimism”

I like backing founders who sometimes “don’t know what they don’t know,” and when everybody else says “it’s not possible,” they carry on and do it anyway. I remember when I spoke with Aaron Levie early in the days of Box. Since I had built two document management companies, I knew all the reasons why building a document management company would “never work.” Aaron was very impressive, and had I not had so much “experience” in documents perhaps I would have backed him, but my narrative fallacy got in the way. Of course he’s gone on to become one of the most influential thinkers in our industry, and Box is worth $2.3 billion!

To get a better sense of your own fallibility, I would encourage every entrepreneur to watch Brain Games, a Nat Geo series that is available on Netflix (and on YouTube for a fee). It is not only a critical series for understanding the basics of human cognition but will also help you better understand customer decisions on pricing, packaging, etc. A short set of free YouTube videos is in the playlist below if you want a taste of what it’s like.

I watch this show religiously with my boys to introduce them to the notion that your brain plays tricks on you, and that the things your mind thinks it knows are often constructs that don’t objectively exist.

And of course the grandfather of many theories of the fallibility of “professionals” is Daniel Kahneman, the Nobel Prize-winning behavioral economist who wrote the absolute MUST READ, “Thinking, Fast and Slow.” In fact, I don’t know how you can be an investor without reading this book.

Kahneman walks through many experiments that he and his colleagues ran over the years to show just how biased we are as decision makers.

For example, he shows data suggesting that judges tend to hand out much harsher sentences just before lunch, when they are at their hungriest. He shows that before the Apgar score, many more newborns died because doctors believed they could judge whether a baby was healthy but lacked a simple system (a checklist!) for making that judgment in a less biased way.

And importantly, he showed that most humans weigh avoiding losses more heavily than achieving equivalent gains, a bias known as “loss aversion.” I see this often in investors. Venture capital is an industry where you have to accept losses, because they will come more often than you would like, and if you spend all of your effort avoiding the downside (which is human nature) you end up playing defense, and defense is a losing formula in venture capital.

If you know that humans are pre-wired to be loss-avoiders, you have to create a work environment that encourages your colleagues to take risks and not sweat the deals that didn’t work out, provided the original investments were made on strong foundations. There has to be some underlying technology advantage to the team you’re backing, but beyond that I would encourage more VCs to go after big ideas, accept being wrong more often than right, and understand that our industry is driven by huge upside outcomes on a small number of deals whose success initially seemed impossible.

So I leave you with the master: Nobel Prize-winning Daniel Kahneman on Charlie Rose.

Source: Both Sides of the Table
