Statistics is just formalized common sense.
I'm sure I have the exact quote wrong, and I don't even remember the source, but the gist has stuck with me for a while. Statistics is fundamentally quite simple: it's common sense written down. Distributions are a way of representing what might happen and how likely each outcome is, and everything derived from that is just logical common sense.
Simple.
When working with statistics, however, it's easy to lose sight of the true goal. Much like a speedometer on a car, statistics can provide data and insight about our world, but it's important to know when to look at the speedometer and when to look at the road.
When we decided to take a seed check from GSR Ventures, we chose them not because they were objectively the best deal, but because they felt like the right people to work with. We used information and historical data to make sure we reached a deal we were happy with; everything beyond that was intuition and trust.
I'd like to think that everything can be characterized with statistics, but the truth is that as we make decisions, we take in far more information than we know how to formalize. Sometimes, decisions need to be made on intuition.
Why does this happen? Because people aren't rational agents. Economics has found time and time again that our models can't yet effectively characterize all of human behavior. Even as we push the frontiers of behavioral economics, modeling subtle nuances of behavior with statistics, a grand theory of human behavior still eludes us. So as much as we'd love to model everything and make perfectly informed decisions, how our choices affect outcomes will ultimately be driven by a lot of this uncharacterized behavior.
Effectively unpredictable.
Sometimes I see people trying to optimize and inform decisions by collecting every data point they can. But collecting data isn't free. At some point, the marginal return of each additional data point falls far below the cost of collecting it. Engineers fall into this trap a lot, trying to fix bugs before they even happen. When was the last time you tried to optimize something that wasn't even a problem yet?
I have to wonder why we ended up this way. I think having all this information has scared us a little. Counterintuitively, knowing more about the world has made us more afraid of it. Certainly humans were not always this risk-averse.
If you're reading this, you're probably in the top 1% of the world, lucky enough to have a massive social safety net underneath you. The days of daily life and death are gone, replaced with the fear of being "unsuccessful", which somehow seems to scare people more than the fear of being eaten by a lion. But how many people in that top 1% couldn't recover from failure? Probably fewer than the number who were eaten by lions.
Take that leap of faith. Enjoy the ride. It'll be okay. Statistically speaking, of course.