Check check double check!

Have you ever doubted the numbers in a news article? That is not crazy at all: the results of a study or a large data file are easily misinterpreted. As a consequence, the news article built on them will spread false assertions. In some cases the mistake is obvious enough that you will doubt the information, but most of the time readers will assume the claims are true. To reduce such false assertions, the Dutch newspaper NRC Next created a new column a few years ago. In this column, called ‘Next checkt’, editors check whether particular newsworthy assertions are true or not.

They are not the only ones with this kind of column; in America, entire organisations have already been set up to fact-check the media. In this blog we will look at some of the mistakes that can be made when you turn numbers into a news article. At the end we will look at an example of a news article where it went wrong.

The risks of making data interpretations
In data journalism there are five important W questions: Who?, What?, When?, Where? and Why? (Rogers). First of all, it is important to answer the question ‘Where did the data come from?’. Is the source trustworthy? The more reliable the source, the greater the chance that the data is correct. Besides that, transparency about your source matters too: if readers can see where you got the data, they will trust you more. Even at this first step you can make a mistake, by believing a seemingly trustworthy source immediately. So even with a reliable source, it is necessary to question all the numbers in a dataset. The second step is to turn the numbers into a clear story that readers can follow easily. It is tempting to write a good story when you have amazing numbers, but stay close to the real figures. The third question deals with how old the collected data is: the more recent the data, the more up to date your story. The last two questions are about whether you can find other data to combine into a new story, and whether you can correlate the different data collections.

The Dove self-esteem research
In November 2012 the Dutch newspaper Telegraaf published an article with the title ‘Low self-esteem young girls’. This assertion was based on a self-esteem study carried out by Panelwizard, commissioned by Dove. Dove claimed that merely one out of ten girls found themselves beautiful. If we can believe the editors of the column ‘Next checkt’, this claim paints a more negative picture of the situation than the Panelwizard study itself suggests (click here to read the ‘Next checkt’ article on this topic). The online questionnaire literally asked the girls to assess themselves. 11.9 per cent of the 503 Dutch girls between 10 and 17 years old described themselves as ‘beautiful’. That is in line with the claim, but the editors point out that the girls could also select other positive words besides ‘beautiful’: attractive (3.5 per cent), fun (33.9 per cent), hot/flashing (1.3 per cent), natural (6.3 per cent) and sexy (0.6 per cent). Only a small group (3.7 per cent) agreed with negative words (unattractive, ugly and unremarkable). The editors suggest that if you add up all these percentages, you could say that 57.5 per cent of the 503 Dutch girls are positive about themselves. This is not entirely accurate either, because some girls selected more words than others, so the percentages overlap. Still, you certainly cannot say that the girls’ self-esteem is very low. Furthermore, the editors note that the study never defined the term self-esteem: what is self-esteem, is it only about looks? Finally, you can ask yourself whether the results Dove presented were objective. A look at Dove’s website shows that they offer a self-esteem programme to improve your self-esteem, which you can join as a parent, but also as a teacher or school.
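The editors’ addition is easy to reproduce. A quick sketch, using the percentages as quoted above (the caveat that girls could pick more than one word is repeated in the comments):

```python
# Percentages of girls who selected each positive word in the
# Panelwizard questionnaire, as quoted by the 'Next checkt' editors.
positive = {
    "beautiful": 11.9,
    "attractive": 3.5,
    "fun": 33.9,
    "hot/flashing": 1.3,
    "natural": 6.3,
    "sexy": 0.6,
}

total = round(sum(positive.values()), 1)
print(total)  # 57.5 -- note: girls could select multiple words,
              # so this sum double-counts and is only an upper bound.
```

The rounding simply keeps the result at one decimal place, matching how the figures were reported.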


So… What went wrong with the W questions?
In this example the journalist already went wrong at the first question, ‘Who?’. The source was not reliable enough, because Dove also had a commercial interest in the results. The journalist did not check other sources, which, according to the editors of ‘Next checkt’, would have shown the opposite of Dove’s results. Had he checked other sources, he would also have been in a position to verify the numbers, so that the reported figures stayed closer to the real ones. On the other hand, those results would have been less fascinating than the ones Dove presented. In that case you can ask yourself whether the numbers would still have been newsworthy enough, or whether the article would have appeared at all.

5 thoughts on “Check check double check!”

  1. It’s entirely true that the wrong source was trusted in this case, if the aim was to give an objective view of the situation. The question is whether that was indeed the writer’s intention. Here the writer decided to trust a source that is known to be commercial, and as you can read in my blog, commercial organisations often have hidden goals. As already discussed during your presentation, it is possible that the writer was asked by Dove to write this article as a sort of advertisement for their self-esteem programme.


  2. After your presentation and your blog, it also seems to me that the writer might have deliberately written the article this way. He intentionally left out the other information to reach the conclusion of low self-esteem for commercial reasons. This is a telling example of negative data manipulation, or of carelessness on the part of some writers. A very good solution, as you say: if you are interested in a topic, check the data.


  3. Clear story, but no clear interpretation of the data. I have a similar example in my blog, so I totally agree with you that journalists should be more careful in spreading data. I noticed (in both your example and mine) that journalists simply fail to explain the relations in the data. When they see a story in the information, they focus only on that story and are no longer able to look critically at the data. In most cases readers do the same, because they spend only a couple of minutes (or less) on an article.


  4. I think your application of the 5 W’s to the Dove case is very clear. It is also nice that you first give an overview of the theory. Journalists are definitely responsible for the data they spread, so a thorough check is very important. That this does not always happen is pointed out by the example of the Dove campaign. I still wonder whether the fault made in this campaign is due to carelessness, or to the fact that large and shocking numbers make for a good story…


  5. I really like your example of the Dove case! It clearly shows that journalists sometimes just don’t check the data they are using, or just don’t know how to interpret it. The question remains whether it is perhaps none of the above and the article you found is simply paid content. Maybe the journalist was contacted by Dove to write an article on this subject. The 5 W’s are a clear set of ways to prevent these kinds of mistakes.

