As data analysts, we all read articles stressing the importance of data quality, and it absolutely is crucial. But there’s another side to the story: achieving perfect data can be expensive and time-consuming. If your data is already 99% accurate, is it worth the extra effort to squeeze out that last 1% for your business? I think the key is finding the “good enough” point for your data quality target.
I’ve learned over the course of my career that aiming for extremely high data accuracy can sometimes be a waste of time. While it’s important to aim for high-quality data, you also need to consider the cost in resources and the opportunity missed by not acting on the data we already have. The industry you’re in also plays a role. In sectors like healthcare or finance, accuracy is critical, while in others like tech or marketing, a degree of imperfection is probably acceptable. Understanding your industry’s data needs is key to making informed decisions about data quality initiatives. The same applies when choosing tools and techniques for the analysis: we should not forget that the whole point is to bring actual value to the business, not to test out something cutting-edge or do something cool.

For a data analyst, being able to define the “good enough” point, understanding what quality level actually delivers value for the business and what quality level is genuinely a problem, without falling into the trap of chasing what’s cool, is an important skill. Sometimes good-enough data can still yield valuable insights. The point is, we should not get hung up on the impossible dream of perfect data; instead, we should focus on finding the “good enough” zone that meets the business’s needs.
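To make this concrete, here is a minimal sketch of what a “good enough” quality gate could look like in Python, assuming the data lives in a pandas DataFrame. The thresholds, column name, and sanity check are all hypothetical placeholders for whatever your business actually agrees on.

```python
import pandas as pd

# Hypothetical thresholds: in practice these should come from the business,
# not from the analyst's gut feeling.
GOOD_ENOUGH_COMPLETENESS = 0.99  # minimum share of non-null values
GOOD_ENOUGH_VALIDITY = 0.98      # minimum share of rows passing sanity checks

def is_good_enough(df: pd.DataFrame) -> bool:
    """Return True when the data already meets the agreed quality bar,
    i.e. further cleaning effort is unlikely to pay off."""
    # Average non-null rate across all columns.
    completeness = df.notna().mean().mean()
    # Example sanity check (hypothetical): order amounts must be positive.
    validity = (df["amount"] > 0).mean()
    return (completeness >= GOOD_ENOUGH_COMPLETENESS
            and validity >= GOOD_ENOUGH_VALIDITY)

orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": [19.9, 35.0, 12.5, 7.2],
})

if is_good_enough(orders):
    print("Ship the analysis: more cleaning is unlikely to change the conclusions.")
else:
    print("Quality is below the bar: cleaning effort is still justified.")
```

The exact checks matter far less than the habit: agree on the bar with the business up front, measure against it, and stop cleaning once you have cleared it.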