In the wake of its annual Bratislava Global Security Forum at the end of May, Slovak think tank GLOBSEC released a report offering what it called a “comprehensive analysis of public opinion surveys” conducted in seven central and eastern European (CEE) countries.

Compared to ugly Word reports I’ve spit out in my time, this one’s got no shortage of big bold headlines, shiny graphics and sexy graphs. But as someone who writes a lot about pro-Kremlin disinformation in Europe, it’s probably no surprise which page caught my eye.

[Image: “fake news” page from the report]
You’ve got my attention.

“Almost 10% of people in the CEE trust online disinformation outlets as relevant sources of information on world affairs,” they say.

Well. I’m hooked.

The next page was even better…but it’s also where I started asking questions.

[Image: media trust charts from the report]
Wait…what?

According to their surveys, anywhere from 1% of people in Croatia to 31% (?!) in Romania “consider online disinformation websites as relevant sources of information.”

This is where I started realizing how many pieces of the puzzle are missing here.

1) How exactly did you ask this question? You can’t just ask someone “do you consider online disinformation websites as relevant sources of information?”

So how did you ask it? Was it a proxy question, like the one the International Republican Institute (IRI) used in its recent Visegrad surveys (i.e., “Do you watch or read media outlets that often have a different point of view than the major media outlets?”)? Was it a series of questions, or what? Without the actual question wording here it’s hard for me to take this seriously.

2) How many people were asked the question? This is so ridiculously basic and yet it’s nowhere to be found.

The methodology “section” is up at the front of the report and it’s about as long as a calm, decidedly-non-mega Twitter thread:

“The outcomes and findings of this report are based on public opinion surveys carried out in the form of personal interviews using stratified multistage random sampling from February to April 2017 on a representative sample of the population in seven EU and NATO member states…

For all countries, the profiles of the respondents are representative of the country by sex, age, education, place of residence and size of settlement. „Do not know“ responses were not included in data visualizations.”

So we don’t know how many people were asked the question – and we don’t know how many people responded “don’t know,” so we have absolutely no idea how large or small the bases are for the numbers they’ve graphed up for us here. And we don’t even know exactly what questions were asked. Weak.
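To put some numbers on why the base matters, here’s a quick back-of-envelope sketch in Python. The base sizes below are made up, since the report doesn’t give us the real ones – the point is just how much the uncertainty around a figure like “almost 10%” grows as the base shrinks.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p on a base of n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical base sizes -- the report never tells us the real ones.
for n in (1000, 500, 250, 100):
    moe = margin_of_error(0.10, n) * 100
    print(f"base of {n}: 10% +/- {moe:.1f} points")
```

Without the base, a reader has no way to tell whether those country-level percentages are solid or could be swinging by several points in either direction.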

3) What the hell is going on with Romania’s number? Look, in the dozens upon dozens of surveys I’ve run in my life, if I see six of seven figures on the low end and then one of them almost three times higher I’m going to ask questions. Sometimes there’s an obvious, easy explanation. Sometimes it’s a more complicated explanation. Sometimes you don’t have one. And, sadly, sometimes it’s because you screwed something up running the numbers or, worse, the whole lot of you muffed something up administering the survey.

Doesn’t look like anyone’s asking questions here. We get no explanation of why Romania’s figure should be that much higher. No explanation of why, apparently, almost a third of Romanians “consider online disinformation websites as relevant sources of information” when only 1% of Croatians (i.e., basically no one) do. Surely that merits at least some attempt at an explanation.

4) How exactly did you arrive at the conveniently round “10 million” figure? OK, part of this is obvious – you took the percentage in each country who said (“said”) they trusted disinformation websites and applied it to each country’s population.

But what population? 18+? 16+? Official population figures? Registered voters? Transparency is a lovely thing, especially with survey numbers that you use to make bold, attention-grabbing claims.
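For the record, here’s roughly what that kind of extrapolation looks like. Everything below is an illustrative placeholder, not the report’s actual inputs or method; the point is simply that the choice of denominator moves a headline figure like “10 million” by a lot.

```python
# Illustrative placeholders only -- the report doesn't publish the shares
# or the population bases behind its "10 million" headline.
share_trusting = {   # share who "consider disinformation websites relevant"
    "Country A": 0.01,
    "Country B": 0.31,
}
population_bases = {  # candidate denominators -- which one did they use?
    "total population": {"Country A": 4_100_000, "Country B": 19_600_000},
    "adults 18+":       {"Country A": 3_400_000, "Country B": 15_900_000},
}

for base_name, pops in population_bases.items():
    total = sum(share_trusting[c] * pops[c] for c in share_trusting)
    print(f"using {base_name}: ~{total / 1e6:.1f} million people")
```

Two hypothetical countries and two plausible denominators already shift the total by more than a million people – which is exactly why the denominator belongs in the methodology section.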

This “10 million people in CEE trust fake news and disinformation websites” headline has already buzzed around CEE/disinfo-busting social media for a week now. It’s since made it into the East Stratcom Task Force’s Disinfo Review, and surely it’s going to find its way into a few more articles and probably even a few speeches and still more conference panels.

It shouldn’t. It’s a questionable claim based on a completely non-transparent survey analysis, delivered as part of a think tank’s glossy PR exercise. Bullshit’s no way to win the (dis)information war, guys.
