There is No Message in That Latte

What happens when you base your strategy on a pattern that isn’t actually there?

Jason Goldberg and Bradford Shellhammer founded Fabulis, a social network for gay men. In 2009 they transitioned it into an e-commerce site called Fab.com. Fab catered to its clientele with short-notice flash sales of specialty items like rhinestone-bejeweled motorcycle helmets. Customers flocked in to spend!

Its social networking roots lived on, and customers would link their various social profiles to their Fab accounts, for which they received coupons, credits and other perks.

Plus, it was exclusive! Only those with an invitation from an existing customer were allowed to spend money at Fab.com! Between the social networking effects and the rewards, people invited and invited! It grew fast. Between 2009 and late 2011 it went from 275,000 to over 10 million members.

With investments from Silicon Valley A-listers it seemed certain that Fab.com was headed for unicorn status.

Scaling or Pivoting?

With such high growth, the leadership drew an unusual inference. Instead of doubling down on their differentiator—unique products, unique market, and a unique go-to-market process—they saw harbingers of mass market success.

To accelerate that trend, they acquired three European competitors. The newly acquired customer base was hungry. To feed that ravenous consumerism, Fab invested in a supply chain to provide more inventory and in the real estate to store it. All of that added costs.

The huge growth in demand meant less distinctive inventory. Customers complained that they saw the same stuff on Amazon and eBay. And as Fab streamlined to accommodate the growth, even that last hallmark of the original Fab.com had to go: the flash sales ceased.

You can probably see the writing on the wall. Without any of the elements that had made Fab successful, it lost its shine.

There were a lot of bad decisions along the way to failure. But they all started with one misjudgment: The founder and board’s belief that the pattern they saw in their early growth indicated mass market appeal. That small inference about what was in the data and what it implied would kill the company. By 2015 Fab was all but gone.

Jesus is on This Slice of Toast

We are all wired to identify patterns. It’s essential for learning, decision-making, and survival. Without it, we would never have discovered how typhoid was transmitted, or the relationship between thunder and lightning.

But not everything that seems like a pattern is one. Sometimes we see patterns that aren't there. For example, if you get a car in a rare color, say, British racing green, it will suddenly seem like there are many more green cars on the road than there used to be. There aren't. But it feels so real. "Look at me. I'm a trendsetter," you think as you pass another green vehicle.

Seeing non-existent patterns is called apophenia. The term was first used to describe a symptom of psychosis. For example, you probably remember the movie A Beautiful Mind, in which mathematician John Nash believes that men in Manhattan are signaling him by wearing red ties.

Today, with the world awash in conspiracy theories, false pattern recognition seems epidemic. Of course, there is a marked difference between noticing more green cars and believing that your Netflix recommendations are advising you to buy crypto. But the difference is one of degree, not kind.

Helpful Pattern Recognition

Patterns are sometimes real. In 2009, Airbnb was not thriving, bringing in an unimpressive $200 a week in revenue. Founders Brian Chesky and Joe Gebbia scrutinized their site for answers and noticed something: all 40 of their worst-performing Manhattan listings had awful photos. Perhaps, they thought, that accounted for the lack of guests.

To learn more, they tested their idea by going to New York, taking high-quality photos of the same properties, and updating the listings. It worked. Better pictures doubled weekly revenue.

Notice the difference: instead of staking everything on an instant observation, they posed a hypothesis. It could have been wrong. If, after adding better photos, the results had not changed, they would have sought different solutions.

Strategic decisions demand actual information, not speculation about unproven patterns and inferences. Whether the data are sales, geography, or demographics, without serious analysis it’s impossible to distinguish between what’s real and what’s coincidental.

Bananas and Liars

If your organization is anything like my clients', you are awash in data. My clients collect masses of it; two of them are even in the data business.

But data alone are NOT actionable information. In their raw form, they are largely noise. We must navigate to the signal, and it isn't always a straight line. When we see patterns instantly, they are usually shallow observations that ignore randomness.

In those moments, "System 1" (Daniel Kahneman's term for our fast, intuitive mode of thinking) reacts and usually identifies a pattern that confirms what we already believe. We bring our confirmation bias with us even to our data analysis.

Big data is noisy, and that noise makes it possible to glean myriad different patterns and conclusions from any large dataset.
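
To see how easily noise masquerades as signal, here is a minimal sketch in Python (my own illustration; the library, seed, and table size are arbitrary, and this is not anything Fab or my clients actually ran). It generates a table of completely random, unrelated metrics and then ranks every pairwise correlation.

```python
import numpy as np

# 50 observations of 200 completely unrelated, random "metrics".
rng = np.random.default_rng(seed=1)
data = rng.normal(size=(50, 200))

# Compute every pairwise correlation (nearly 20,000 pairs) and rank them.
corr = np.corrcoef(data, rowvar=False)
n = corr.shape[0]
pairs = sorted(
    ((abs(corr[i, j]), i, j) for i in range(n) for j in range(i + 1, n)),
    reverse=True,
)

# Every column is pure noise, yet the top pairs look like real relationships.
for strength, i, j in pairs[:5]:
    print(f"metric {i} vs metric {j}: |r| = {strength:.2f}")
```

Search enough of any large dataset and you will always find a "pattern"; the question is whether it survives a test you designed before you saw it.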

Given the right instructions, an AI will do it for you, just as fallaciously. If you train a neural network to recognize, say, bananas, and then ask it to find a banana in an image of random pixels, it will find one. (This was a real experiment.) When software sees false patterns, it isn't apophenia; Google named the phenomenon Inceptionism.
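
For the curious, the gist of that experiment can be reproduced in a few lines. The sketch below is my own illustration, assuming PyTorch, torchvision, and an off-the-shelf ImageNet classifier (954 is the ImageNet index for "banana"). Google's original Inceptionism work added smoothness constraints to make the hallucinations pretty, but the core move is the same: start from random pixels and nudge them until the network insists it sees a banana.

```python
import torch
from torchvision import models

# An off-the-shelf ImageNet classifier; any pretrained model illustrates the point.
model = models.resnet18(weights="IMAGENET1K_V1").eval()

BANANA = 954  # index of the "banana" class in the ImageNet-1k label set

# Start from pure random noise -- there is no banana in this image.
image = torch.randn(1, 3, 224, 224, requires_grad=True)
optimizer = torch.optim.Adam([image], lr=0.05)

# Gradient ascent on the "banana" score: adjust the pixels, not the model,
# until the classifier reports a banana with high confidence.
for _ in range(200):
    optimizer.zero_grad()
    score = model(image)[0, BANANA]
    (-score).backward()  # minimizing the negative score maximizes the score
    optimizer.step()

with torch.no_grad():
    prob = torch.softmax(model(image), dim=1)[0, BANANA].item()
print(f"'banana' probability in an image of optimized noise: {prob:.2f}")
```

The network has not discovered anything; it has been pushed into confirming exactly what we asked it to confirm, which is what an unexamined query can do to us as well.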

Data Don’t Lie Unless We Insist

Whether the false guide is apophenia, confirmation bias, the availability heuristic, or any of numerous others, we are at our worst when trying to make decisions based solely on data and our perception of patterns. Modern culture says that data don't lie. But neither do they tell the truth. Instead, data contain atomic particles of information, some of which may lead to truth and most of which are fodder for imagination.

We see what suits us.
