Recent Czech survey data + elections = a disinformation site’s dream

The most recent round of Eurobarometer stats just came out, and they’re bad news for pretty much anyone in Czech politics right now.

Only 18% of Czechs trust their government right now – a drop of 10 percentage points from autumn 2016 and by far the sharpest decline in the EU – and only Greeks, Italians and Spaniards distrust their government as much as Czechs do.

[Chart: Eurobarometer, trust in national government by EU country]
Oy.

Still, this isn’t nearly as ugly as the table for the question on trust in parliament…

[Chart: Eurobarometer, trust in national parliament by EU country]
Double oy.

Nobody in the EU distrusts their country’s parliament more than Czechs do right now, all thanks to the Czech government’s farcical three-part comedy act/political crisis over the past few months (yeah I linked to a Wikipedia article I don’t care).

This level of (dis)trust shows up in recent Czech Public Opinion Research Centre (CVVM) survey data too – their numbers also show that trust in President Miloš Zeman, the government (“Vláda”) and the Chamber of Deputies (“Poslanecká sněmovna,” the lower house of the Czech parliament) has completely tanked.

[Table: CVVM, May 2017]
“Table 1a: Population’s confidence in constitutional institutions (%) – comparison over time”

Worse, look at the way Czech satisfaction with the current political situation has driven right off a cliff after a slow recovery from 2013’s scandals.

[Graph: CVVM, satisfaction with the political situation]
“Graph 4: Satisfaction with the current political situation from 2011-2017 (Satisfaction Index 0-100)”

And, as if you needed another graph to show how bad it is, look at the drop for both president and government here (the blue and red lines, respectively).

[Graph: CVVM, confidence in institutions over time]
“Graph 3: Confidence in institutions 2011-2017 (confidence index)”

Numbers like this should be worrisome for a country at any time, but remember the Czechs are going to the polls in just under four months to elect a new parliament – and at the polls again not long after to vote for president.

If I were, say, part of an unnamed country’s efforts to interfere and meddle in other countries’ elections, I’d be all over the Czech Republic this summer.

…to that end, another set of numbers I’ve had kicking around for a few weeks from two previous Eurobarometer surveys (both autumn 2016) shows just how said unnamed country’s efforts could actually work.

One, Czechs seem to trust social media more than most other Europeans. While it’s still a minority (40% disagreeing that “information on political affairs from online social networks cannot be trusted,” which is a mouthful of a double negative, but yeah), it’s also a higher share than in any other EU country.

[Chart: share disagreeing that “information on political affairs from online social networks cannot be trusted,” by EU country]

Eurobarometer’s data is free to download for losers like me, so I took a more detailed look at who exactly in the Czech Republic thinks information on politics from social media can be trusted – to the extent the data can tell me: the sample size is ~1,000, so it can’t be sliced all that finely, and Eurobarometer IMO doesn’t have the best questions about education level and essentially nothing useful on income or a proxy for income.

As shouldn’t be any surprise, it’s the young: 60% of Czechs aged 15 to 24 disagreed that “information on political affairs from online social networks cannot be trusted” compared to 48% of those 25 to 39, 44% of those 40 to 54 and 27% of those 55 and older. Also interesting are the “don’t knows” – only 5% of 15 to 24 year olds compared to 19% of those 40 to 54 and 39% of those 55 and older.
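For anyone who wants to poke at the microdata themselves, that age breakdown is a few lines of pandas. A minimal sketch only – the filename and the column names (country, age, trust_socnet) are placeholders, since the real GESIS/Eurobarometer SPSS export uses its own variable codes:

```python
import pandas as pd

# Placeholder filename and column names -- map these to the actual
# variable codes in the GESIS/Eurobarometer SPSS export.
df = pd.read_spss("eurobarometer_autumn2016.sav")

# Bucket respondents into the age bands used above.
df["age_band"] = pd.cut(df["age"], bins=[15, 25, 40, 55, 120],
                        labels=["15-24", "25-39", "40-54", "55+"],
                        right=False)

czechs = df[df["country"] == "CZ"]  # "country" is also a placeholder

# Share per age band disagreeing that political info from online social
# networks "cannot be trusted" -- i.e., the people who DO trust it.
trusting = (
    czechs.groupby("age_band")["trust_socnet"]
          .apply(lambda s: (s == "Disagree").mean() * 100)
          .round(0)
)
print(trusting)
```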

The same trends show up in different questions about social media – in this one below, for example, Czechs are among the most likely in the EU to think social media is reliable.

[Chart: perceived reliability of social media, by EU country]

Interestingly, 49% of those 15 to 24 years old, 50% of those 25 to 39 and 46% of those 40 to 54 think social media is reliable – in other words, a similar if not identical proportion – but only 31% of those 55+ think social media’s reliable.

One of these Eurobarometer surveys, coincidentally, happened to ask people their attitudes about various countries, including Russia (my mention of Russia is, of course, purely hypothetical and definitely, definitely not related to the “unnamed country” above).

Run what Czechs think about social media (i.e., the agree/disagree question on whether it’s reliable) against what they think of Russia and the results are pretty interesting – Czechs who think social media is reliable also tend to be more positive towards Russia.

                           Total “positive”    Total “negative”
                           towards Russia      towards Russia
 Social media reliable           49%                 49%
 Social media unreliable         35%                 64%
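That cross-tab, for the record, is basically one call in pandas – again a sketch, with placeholder column names (socmedia_reliable, russia_view) standing in for the real Eurobarometer variable codes:

```python
import pandas as pd

df = pd.read_spss("eurobarometer_autumn2016.sav")  # placeholder filename

# Row percentages: of those who find social media reliable/unreliable,
# what share is positive vs. negative towards Russia?
xtab = pd.crosstab(
    df["socmedia_reliable"],   # placeholder: "Reliable" / "Unreliable"
    df["russia_view"],         # placeholder: "Positive" / "Negative"
    normalize="index",
) * 100
print(xtab.round(0))
```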

Caveat, though. This question about being positive/negative towards Russia isn’t necessarily a proxy for what they think about the Kremlin or, for that matter, anyone else in the world. It doesn’t necessarily mean the respondent is some sort of zombie “radicalized by Russian propaganda” or even necessarily positive towards the Kremlin or Russia’s foreign policy, etc. Also, some respondents may well have interpreted the question as being positive/negative towards Russian people in general. Still, it’s interesting that the data falls out this way – and falls out this way across many other EU countries – and merits a hell of a lot more study than it’s getting.

There’s a ton more numbers I haven’t mentioned here (e.g., Czechs get more news from websites and trust the Internet more than most other Europeans) that, in all, paint a potentially very ugly picture – a population that increasingly distrusts its politicians and tends to trust social media and the web more than most other people. It’s a disinformation site’s dream.

A lesson on how not to fight disinformation

In the wake of its annual Bratislava Global Security Forum at the end of May, Slovak think tank GLOBSEC released a report containing what it called a “comprehensive analysis of public opinion surveys” from seven central and eastern European (CEE) countries.

Compared to ugly Word reports I’ve spit out in my time, this one’s got no shortage of big bold headlines, shiny graphics and sexy graphs. But as someone who writes a lot about pro-Kremlin disinformation in Europe, it’s probably no surprise which page caught my eye.

[GLOBSEC report page on trust in online disinformation outlets]
You’ve got my attention.

“Almost 10% of people in the CEE trust online disinformation outlets as relevant sources of information on world affairs,” they say.

Well. I’m hooked.

The next page was even better…but it’s also where I started asking questions.

[GLOBSEC report chart: share considering online disinformation websites relevant sources of information, by country]
Wait…what?

According to their surveys, anywhere from 1% of people in Croatia to 31% (?!) in Romania “consider online disinformation websites as relevant sources of information.”

This is where I started realizing how many pieces of the puzzle are missing here.

1) How exactly did you ask this question? You can’t just ask someone “do you consider online disinformation websites as relevant sources of information?”

So how did you ask it? Was it a proxy question, like the one the International Republican Institute (IRI) used in its recent Visegrad surveys (i.e., “Do you watch or read media outlets that often have a different point of view than the major media outlets?”)? Was it a series of questions, or what? Without the actual question wording, it’s hard for me to take this seriously.

2) How many people were asked the question? This is so ridiculously basic and yet it’s nowhere to be found.

The methodology “section” is up at the front of the report and it’s about as long as a calm, decidedly-non-mega Twitter thread:

“The outcomes and findings of this report are based on public opinion surveys carried out in the form of personal interviews using stratified multistage random sampling from February to April 2017 on a representative sample of the population in seven EU and NATO member states…

For all countries, the profiles of the respondents are representative of the country by sex, age, education, place of residence and size of settlement. „Do not know“ responses were not included in data visualizations.”

So we don’t know how many people were asked the question – and we don’t know how many people responded “don’t know,” so we have absolutely no idea how large or small the base is for the numbers they’ve graphed up for us here. And we don’t even know exactly what questions were asked. Weak.

3) What the hell is going on with Romania’s number? Look, in the dozens upon dozens of surveys I’ve run in my life, if I see six of seven figures on the low end and then one of them almost three times higher I’m going to ask questions. Sometimes there’s an obvious, easy explanation. Sometimes it’s a more complicated explanation. Sometimes you don’t have one. And, sadly, sometimes it’s because you screwed something up running the numbers or, worse, the whole lot of you muffed something up administering the survey.

Doesn’t look like anyone’s asking questions here. We get no explanation of why Romania’s figure should be that much higher – no explanation of why, apparently, almost a third of Romanians “consider online disinformation websites as relevant sources of information” when only 1% of Croatians (i.e., basically no one) do. Surely that merits at least some attempt at an explanation.

4) How exactly did you arrive at the conveniently round “10 million” figure? OK, part of this is obvious – you took the percentage in each country who said (“said”) they trusted disinformation websites and applied it to that country’s population.

But what population? 18+? 16+? Official population figures? Registered voters? Transparency is a lovely thing, especially with survey numbers that you use to make bold, attention-grabbing claims.
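To make the ambiguity concrete, here’s the back-of-envelope arithmetic the headline presumably rests on, sketched with the only two country shares the report pages show and rough 2017 total-population figures – which is exactly the assumption (total population? 18+?) they never spell out:

```python
# Share of respondents "trusting" disinformation sites x population, summed.
# Croatia (1%) and Romania (31%) are the two figures shown in the report;
# the populations are rough 2017 totals, and the other five countries
# would be added the same way.
countries = {
    "Croatia": (0.01, 4_150_000),
    "Romania": (0.31, 19_600_000),
}

implied = sum(share * pop for share, pop in countries.values())
print(f"{implied:,.0f}")  # ~6,100,000 from these two countries alone
```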

This “10 million people in CEE trust fake news and disinformation websites” headline has already buzzed around CEE/disinfo-busting social media for a week now. It’s since made it into the East Stratcom Task Force’s Disinfo Review, and surely it’s going to find its way into a few more articles and probably even a few speeches and still more conference panels.

It shouldn’t. It’s a questionable claim based on a completely non-transparent survey analysis, delivered as part of a think tank’s glossy PR exercise. Bullshit’s no way to win the (dis)information war, guys.

The story behind that “terror attacks” map you keep seeing

You’ve seen this map somewhere on social media the last few weeks, haven’t you?

Here it is from Poland’s deputy justice minister, because somehow this is how you show solidarity with the citizens of a country millions of your own people live and work in.

[Map of “terror attacks” as tweeted by Poland’s deputy justice minister]

I’ve also seen it from Polish MEP and unsuccessful candidate for the presidency of the European Council Jacek Saryusz-Wolski, though my favourite version is the one tweeted out by that one guy who managed to get fired from The Rebel.

[Another version of the same map]
OOOOH this one has different colours!!

And last, a version I saw on Instagram this week.

[Instagram version of the map]
Cool.

So where the hell is this data even from? Turns out, as someone from the right-wing Polish Twitterati told me, it’s data from the reputable Global Terrorism Database (GTD) at the University of Maryland, which is an “open-source database including information on terrorist events around the world from 1970 through 2015.” (Notice right now that says “events,” not “attacks.” This will be important). I was further informed that, for some reason, the data on this map that’s been making the rounds is only from 2001 on. OK.

So I took a look through the GTD data on some of the countries (including Poland) on this map. There are certainly no terror “incidents” (note that word, “incidents”) listed in Poland from 2001 on. OK, so that seems (seems) accurate.

But what about terror in other countries? I’m particularly interested in these apparent incidents in Iceland, which shows up in some versions of the map and, having been there, doesn’t exactly strike me as a terror hotbed.

Since 2001, there have apparently been two terror incidents in Iceland that explain the two Icelandic dots on the map:

  • In 2012, “An explosive device detonated near government offices in Reykjavik city, Reykjavik North Constituency, Iceland. The explosive device was partially detonated by a robot meant to deactivate it. No group claimed responsibility for the incident.” Property damage was listed as unknown.
  • In 2014, “Assailants attempted to set a Lutheran Church on fire in Akureyri city, Northeast constituency, Iceland. No one was injured in the attack; however, the building was damaged. No group claimed responsibility for the incident.” Property damage was listed as “minor.”

No one was killed or injured in these two incidents.

Again, “incidents” is the key word. These two big red Icelandic points, and many others on the map, don’t represent terror attacks at all. Many of them, including these two in Iceland, represent vague criminal acts that may not actually have anything to do with terrorism (let alone jihadist terrorism), that have barely caused any property damage and, more importantly, haven’t killed or injured anyone.
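If you want to run the same sanity check, it’s a short pandas exercise against the GTD’s public export – a sketch assuming the standard download and its usual column names (iyear, country_txt, nkill, nwound); the filename here is a guess at whatever vintage you’ve downloaded:

```python
import pandas as pd

# Filename (and encoding) vary by GTD vintage -- adjust to your download.
gtd = pd.read_csv("globalterrorismdb.csv", encoding="latin-1",
                  low_memory=False)

iceland = gtd[(gtd["iyear"] >= 2001) & (gtd["country_txt"] == "Iceland")]

# nkill/nwound are GTD's killed/wounded counts (NaN when unknown).
casualties = iceland["nkill"].fillna(0) + iceland["nwound"].fillna(0)
print(len(iceland), "Icelandic events since 2001,",
      int((casualties == 0).sum()), "of them with zero reported casualties")
```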

Why no Polish incidents in the GTD since 2001? Surely there’s been at least one shitty attempt at something like a pipe bomb in a car that never went off (there was one in the GTD’s Czech Republic data, as I discovered) that would merit a mention in this database – though presumably something like that will make it into 2017’s list for Poland, given the criteria for inclusion.

So the next time you see this map, you’ve got a few options. If it’s got no legend or title, you can always tell whoever shared it that the points represent vague definitions of criminal acts that don’t always seem to be reported consistently. If it says something about “terror attacks,” tell them they’re completely, 100% wrong, and tell them there’s more than enough data on the GTD website for them to make a proper map of actual terror attacks that isn’t just a cute meme for people who don’t like Muslims.

Another quick look at Bulgaria’s “Gallup International” (aka Part II)

I’ll forgive you if you don’t remember Bulgaria’s Gallup International (no, not that Gallup – the Gallup International “being sued by Gallup Inc. for using its name without authorization”), who very briefly popped into western headlines in March thanks to a Wall Street Journal article.

Published only days before the election, the article alleged that the pro-Kremlin Bulgarian Socialist Party (BSP) received a secret strategy document proposing, among other things, that they “promote exaggerated polling data” to help build momentum for the party and get them elected. Gallup International was specifically named in the article, and referred to by Bulgaria’s former ambassador to Moscow Ilian Vassilev as one of Moscow’s “Bulgarian proxies”; Vassilev, talking about a poll Gallup International published on an apparent lack of NATO support in Bulgaria, accused the polling agency of making a “wrapped-in-secrecy poll [that] had no details on methodology nor funding sources.”

Of course, the WSJ article wasn’t without its critics, myself included. One journalist in Bulgaria said the WSJ article “completely ignored all the internal political factors that led to the landslide victory of [President] Rumen Radev and focused only on Russian influence in elections,” which ultimately, in his opinion “backfired, instead of there being alarms for (legitimate) concerns about Russian involvement in Bulgaria.” (To say nothing of the role of the “crackpot outfit” at RISI apparently being involved in producing the document.)

Debates about the scope and scale of Russian influence in Bulgaria aside, I decided to take a look at polling data in Bulgaria leading up to the parliamentary election on March 26. Is there anything to support this idea that Gallup International and potentially other Bulgarian pollsters have used and promoted “exaggerated polling data” to benefit the BSP?

There might be.

Here’s a table of all the polls I’ve been able to find in 2017 leading up to the election, with levels of party support noted and the final actual election results at the bottom.

[Table: Bulgarian pre-election polls, 2017, by polling firm]

(larger pdf of same chart here: Bulgaria polls)

Almost all polls from polling firms like Trend Research (3 of 3), Alpha Research (3 of 3, and, full disclosure, whose managing director I interviewed as part of my April piece for Coda Story) and Estat (2 of 3) showed Boyko Borisov’s GERB in the lead.

To break it down further (remembering that GERB ended up winning by 5.6%):

  • Trend Research: average GERB lead of 1.93%
  • Alpha Research: average GERB lead of 2.77%
  • Estat: average GERB lead of 3.47%

But the story’s pretty different with Gallup International, which one Bulgarian source described to me as “the main political/sociological advisor of the [BSP] at election time.” They published four polls during the campaign, only one of which showed GERB in the lead – their final poll, about a week before the vote, which put GERB up by just 0.6%. Across the campaign, Gallup International’s polls showed an average BSP lead of 0.7%.

Again, GERB ended up winning by 5.6%.

The numbers are even stranger for AFIS, a pollster run by Yuriy Aslanov, a sociologist who has sat on the BSP board. Both its polls showed the BSP in the lead by an average of 1.3%.

Again, GERB ended up winning by 5.6%.
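Put all five firms side by side against the actual result and the pattern is hard to miss – quick arithmetic on the averages quoted above:

```python
# Each firm's average campaign-period lead (GERB minus BSP, in points,
# from the averages quoted above) versus the actual 5.6-point GERB win.
actual = 5.6
avg_leads = {
    "Trend Research": 1.93,
    "Alpha Research": 2.77,
    "Estat": 3.47,
    "Gallup International": -0.7,  # i.e., average BSP lead of 0.7%
    "AFIS": -1.3,                  # i.e., average BSP lead of 1.3%
}

for firm, lead in avg_leads.items():
    print(f"{firm}: missed the final margin by {actual - lead:+.2f} points")
```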

In sum – a pollster that’s been accused of being part of an effort to “promote exaggerated polling data” on behalf of the party it’s linked to consistently showed results that played up support for said party, contradicting most other polling firms’ results throughout the campaign. As a western observer and polling nerd who’s worked for multiple firms in Canada and the UK, I feel I’m in a position to confidently say this isn’t normal.

Of course, this all needs a few caveats. There’s absolutely no evidence of anyone cooking up numbers or anything like that – I am accusing no one of that. I’m also well aware of how statistics work and am well aware of the (however unlikely) possibility that these figures from Gallup International and AFIS are all down to random survey errors or even differences in methodology. Still, something’s off here.
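For a rough sense of how (un)likely “random survey errors” are here, a back-of-envelope sketch – assuming a simple random sample of about 1,000 (the actual sample sizes aren’t something I’ve been able to confirm) and two parties polling around 30%:

```python
from math import sqrt

# Standard error of the estimated lead between two parties under a
# simple random sample. n = 1000 is an assumption, not a published figure.
n = 1000
p1, p2 = 0.30, 0.30  # two parties near 30%, roughly this race

# Var(p1_hat - p2_hat) = (p1 + p2 - (p1 - p2)**2) / n for multinomial shares
se_lead = sqrt((p1 + p2 - (p1 - p2) ** 2) / n)
print(f"SE of the lead: about {se_lead * 100:.1f} points")  # ~2.4

# Gallup International's average lead was ~6.3 points off the final margin:
# roughly 6.3 / 2.4 ~ 2.6 standard errors. Possible as a one-off, but not
# something you'd expect poll after poll.
```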

Above all though, I’d stress this message to other journalists who haven’t had the pleasure of drowning a considerable part of their adult lives in SPSS – if some polling numbers look consistently very different from what other polling numbers are saying, ask why. Sometimes it’s an easy answer – a different, new methodology that may or may not pan out, a rogue poll (i.e., the one out of 20) or the polling firm just really sucks. But sometimes it’s not that simple.