Content is king, but it seems that data is ruling the world. We are exposed to loads of content every day, including pieces of data designed to convince us some thesis is correct and drive us to action. Opinions, so it seems, are no longer enough. Why should anyone take you seriously if you can’t back up your ideas with data?
This by itself is not bad, but as we all know, fake data is as common as real data (if not more widespread). Identifying accurate data and dismissing fake facts and figures is essential but far from trivial.
Some issues are relevant to us even though we don’t have any personal or professional experience to help us make sense of them. In one of the essays in his book Escape into Meaning, Evan Puschak describes his thoughts on issues such as climate change or economic dilemmas. When he reads some information or thesis on such topics, he has no choice but to cross-check the information against other resources, ask difficult questions, and hope to find validating or refuting answers from different domain experts. Issues such as climate change impact all of us, but unless you are a climate scientist, you cannot evaluate one piece of information against another. You can do nothing beyond consulting more experts and deciding whom you trust.
But there are domains where we do have some personal or professional experience. In these cases, it is still imperative to challenge the data, either by validating the credibility of its source or by looking for supporting or contradicting data. But it is equally important to ask ourselves whether this piece of data is relevant to us. Being credible is not enough for a piece of data to change how we operate.
When I write about workplace communication, I often say that most knowledge workers (employees and managers alike) are frustrated with the ineffectiveness of communication. The collective experience suggests that emails are distracting and meetings (as they are typically managed) are a waste of time. I quote such numbers because I believe them to be true. I didn’t invent them, and numerous resources reach the same conclusion, even if with slightly different numbers. You, the reader, don’t have to take my word for it. As Evan Puschak suggests, you can do your own research, cross-check the numbers with other resources, and form an opinion on the validity of the statistics I quote. But unlike climate change or global economics, workplace communication is an excellent example of a topic that is part of your personal experience. In such cases, your experience becomes an essential source of data you must consider.
Not every universal problem is also your problem. However statistically solid an argument is, no statistical fact is valid for 100% of cases. When I have personal experience with the issue at stake, I don’t just challenge the validity of the universal data; before deciding what to do with it, I apply the relevance challenge.
Does it Resonate with My Experience?
The first question I ask is whether the data resonates with my experience. Say 80% of knowledge workers find their meetings ineffective. That’s impressive, but what do I feel based on my experience? There might be problems I’m not aware of, and reading about a shared experience that somehow eluded me is a good trigger for reflection. This reflection, however, can go both ways. I might conclude I know exactly what the surveyed workers were talking about because I have a similar experience. But I might end up thinking that despite that common problem, the meetings I conduct and participate in are, in fact, quite effective.
Personal experience does not invalidate statistical insights, but it is crucial when I consider whether there is something I should do with them.
Does it Resonate with the Experience of Others?
I could be wrong, though. My experience might not represent the shared experience of my team and colleagues. The data in question might actually be part of their daily experience. So, after doing my personal relevance check, I ask around. I share the information I’ve read and ask people if it resonates with them.
This is not an act designed to invalidate the data but merely to verify that it is relevant to the broader group of people I work with. If my team is frustrated with our meetings, it doesn’t do much good that I personally find them effective, and vice versa. Simply put, I am trying to take the conclusion derived from a global dataset and see if it applies to the relevant dataset: my colleagues, my team, and myself.
Can I Benefit from the Ideas?
Challenging the relevance of the data can, by itself, make us miss opportunities, though. What is an essential solution for the general population can still yield a significant improvement for us, even if we don’t seem to experience the same concrete problems.
The effectiveness of workplace communication is a great example. You don’t have to be deeply frustrated to benefit from ideas designed to improve communication. Even if you can honestly say you and your team are communicating effectively, ideas intended to solve a more acute problem can help you take your communication to the next level. The bar for implementing these ideas will undoubtedly be higher than if you were experiencing the more radical problems presented in the statistical data. You should consider this when asking yourself what you can gain from implementing the ideas.
The relevance challenge is still a challenge, and there will be cases where, after considering these three questions, you will realize that neither the data nor the ideas derived from it apply to you. The relevance challenge is, first and foremost, a filtering mechanism: you cannot afford to implement every good idea out there, nor should you. At the same time, it is a way to avoid missing opportunities.