Five mistakes interpreting data

We’re living in the age of data, and everybody wants to make data-driven decisions by interpreting data.

As a data scientist, I often see people making the same mistakes again and again. Here are the five most common ones.

Failing to relate the numbers

A data analysis shows that one of the KPIs is 65%. The newspaper reports a reduction of carbon emissions of five tonnes per year. Your dieting app says you have consumed 200,000 joules of energy in your last meal.

Surprisingly, many people would conclude that the KPI is quite solid, that carbon emissions were hugely reduced, and that they’ve overeaten during their last meal. This happens because they relate these numbers to some average, implicit baseline, which may or may not be correct.

We often see percentages when we speak about revenue, interest rates or growth, and we are used to revenue and interest rates running somewhere between 0% and 20%, with growth of 20% often considered very good.

We usually weigh below 100 kg and our car weighs 1.5 tonnes, so when we hear about 5 tonnes, it has to be a lot, right?

Whenever we see numbers like 200,000, we think about something really expensive, like a luxury car or a small piece of real estate. So 200,000 joules has to be a lot.

But if your performance indicator ran between 90% and 120% over the last ten years, 65% is quite bad news.

Worldwide yearly carbon emissions are 38,000,000,000 tonnes, so 5 tonnes are merely 0.000000013% of the total.

And 200,000 joules are just about 50 kilocalories (dietary calories), something like one small apple.
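Both conversions can be checked in a few lines, using the figures quoted above (1 kilocalorie = 4,184 joules):

```python
# Relate 5 tonnes of saved CO2 to worldwide yearly emissions.
world_emissions_t = 38_000_000_000
saved_t = 5
share_pct = saved_t / world_emissions_t * 100
print(f"{share_pct:.10f}%")  # a vanishingly small share of the total

# Convert 200,000 joules into dietary calories (kilocalories).
meal_joules = 200_000
meal_kcal = meal_joules / 4184  # 1 kcal = 4184 J
print(round(meal_kcal))  # roughly 48 kcal, about one small apple
```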

Establish a meaningful baseline and compare any analytics result against it.

The most common baselines are values from other time periods (the KPI example), some global total (carbon emissions), or a conversion from a rarely used unit like joules to a more intuitive one like kilocalories.
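The first kind of baseline can be made mechanical. Here is a hypothetical helper (the function name and the historical values are illustrative, taken from the KPI example above) that judges a value against its own history instead of against gut feeling:

```python
# A hypothetical helper: flag a value against its own historical range
# rather than against an implicit "percentages near 100% are good" intuition.
def against_baseline(value, history):
    lo, hi = min(history), max(history)
    if value < lo:
        return "below historical range"
    if value > hi:
        return "above historical range"
    return "within historical range"

kpi_history = [90, 95, 100, 110, 120]  # last years' values, as in the example
print(against_baseline(65, kpi_history))   # -> below historical range
print(against_baseline(105, kpi_history))  # -> within historical range
```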

Failing to account for costs or consequences

Everything has advantages and disadvantages; everything good comes at a cost. Looking at only one chart in an analytics report can lead to wrong decisions.

Yes, the KPI has fallen from 100% to 65%. But what if it was achieved with only 10% of the costs?

Yes, five tonnes of CO2 per year is very little. But if the change has negative costs (selling the car and using public transportation), then 100 million people can be motivated to do it, and together they would contribute a 1.3% reduction in carbon emissions.
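The 1.3% figure follows directly from the numbers used earlier, as a quick check shows:

```python
# Scale the individual saving up to 100 million people and relate it
# to worldwide yearly emissions (figures from the article).
people = 100_000_000
saved_per_person_t = 5
world_emissions_t = 38_000_000_000

reduction_pct = people * saved_per_person_t / world_emissions_t * 100
print(f"{reduction_pct:.1f}%")  # -> 1.3%
```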

Yes, consuming just 50 kilocalories for a meal looks good if we want to lose weight, but eating just one apple for lunch might induce food cravings and so increase the risk of abandoning the diet.

Relate the results of an analytics report to their costs and consequences before labeling the numbers “good” or “bad”.

Failing to take data quality into account

Just because an analytical report looks authoritative, with its tables of numbers and charts, there is no guarantee that the numbers have any meaning at all.

Was our KPI really 65%? Or were some new processes and recent reorganizations in our company simply overlooked and not included in the KPI calculation? Is this year’s KPI comparable with last year’s at all?

How did they measure the 5 saved tonnes of carbon emissions? Did they attach a measurement device to the exhaust of the car? Or did they just take the carbon contained in the gasoline? Did they account for the emissions from producing and scrapping the car? Can public transport be considered emission-free?

How did we measure the 200,000 joules of energy? Did we burn the food and measure the released heat? Or did we just point our smartphone camera at it and hope that the app can tell an apple from a brownie with the shape and color of an apple?

Understand how the data was produced or measured and take its limitations into account. Improve data quality if you are about to make an important decision; otherwise, make your decision tentatively and be ready to change it if contradictory data emerges.

Failing to consider causality or confounders

Consider these two maps of the USA:

On the left is the frequency of UFO sightings [1], on the right is the population density [2].

The red dots on the left correlate with the red dots on the right, and there are at least three possible explanations for that:

1) More people means a higher chance that at any given second at least one of them is looking at the sky, so UFOs have a higher chance of being spotted.

2) Aliens are secretly living on Earth and use their secret political influence to support policies that lead to higher population density (agglomeration of attractive employers, zoning rules, etc.), because they feed on the stress that humans emit when living in overpopulated areas.

If we want to depict these ideas, we can use causal graphs like this:

This is also widely known as “Correlation is not Causation”. From the data alone we cannot determine the direction of the arrow.

Confounders are less widely known, and they go like this:

3) There is an unknown field, distributed everywhere in the world with varying strength. Physicists don’t know about this field yet; let’s call it the Force. Some parts of Earth emanate the Force more strongly than others. The Force makes people more optimistic and happy, so people tend to unconsciously settle around sources of the Force. Aliens fly by to mine the Force, because they use it as a currency.

The Force is a confounder for the correlation between UFO sightings and population density:

So, which explanation is the correct one? From this data alone, it is impossible to tell.
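A minimal simulation can make explanation 3 concrete. All numbers here are made up for illustration: a hidden confounder (the Force) drives both population density and UFO sightings, and the two end up strongly correlated even though neither causes the other:

```python
import random

random.seed(42)

# Hidden confounder: the Force, varying from place to place.
force = [random.random() for _ in range(1000)]
# People settle where the Force is strong; aliens visit where it is strong.
# Coefficients and noise levels are arbitrary illustration values.
density = [10 * f + random.gauss(0, 0.5) for f in force]
sightings = [3 * f + random.gauss(0, 0.5) for f in force]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Strongly positive, despite no direct causal link between the two.
print(pearson(density, sightings))
```

The correlation between density and sightings comes entirely from their shared dependence on the Force; remove the Force from the model and it disappears.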

Perform additional analysis to establish causality and control for confounders; otherwise, avoid making causal statements about the data report.

Failing to be really data-driven

I have a strong opinion that Explanation 1 (more people lead to more UFO sightings) is the correct one. This opinion is based on my personal experience and common sense. If I were especially well trained in sociology and UFO data, you could call it “my expert opinion”.

Expert opinions are not data-driven. They are based on experience, common sense and intuition.

Another reason for not being fully data-driven is the pressure of responsibility, typically found at the C-level of management or in politics. There are so many political factors and soft factors (marketing, influence, group dynamics, psychology, company culture) and so much big money that discourage the search for truth. Data is only accepted when it confirms the VIP opinion.

Don’t hesitate to state that you made one or another decision in a non-data-driven manner. Data is one way to make great decisions, but not the only one.

Do not pretend to be data-driven while making some or all of these five mistakes, just because being data-driven is “in” and gives your decision a scientific touch. Data scientists are watching you, and they will know the truth.

Data sources


