Recent Czech survey data + elections = a disinformation site’s dream

The most recent round of Eurobarometer stats just came out, and they’re bad news for pretty much anyone in Czech politics right now.

Only 18% of Czechs trust their government right now – a decline of 10 percentage points from autumn 2016, by far the sharpest drop in the EU – and only Greeks, Italians and Spaniards distrust their governments as much as Czechs do.

[Chart: Eurobarometer – trust in national government, by EU country]
Oy.

Still, this isn’t nearly as ugly as the table for the question on trust in parliament…

[Chart: Eurobarometer – trust in national parliament, by EU country]
Double oy.

Nobody in the EU distrusts their country’s parliament more than Czechs do right now, all thanks to the Czech government’s farcical three-part comedy act/political crisis over the past few months (yeah, I linked to a Wikipedia article, I don’t care).

This level of (dis)trust shows up in recent Czech Public Opinion Research Centre (CVVM) survey data too – their numbers also show that trust in President Miloš Zeman, the government (“Vláda”) and the Chamber of Deputies (“Poslanecká sněmovna,” the lower house of the Czech parliament) has completely tanked.

[CVVM chart, May 2017 – “Table 1a: Population’s confidence in constitutional institutions (%) – comparison over time”]

Worse, look at the way Czech satisfaction with the current political situation has driven right off a cliff after a slow recovery from 2013’s scandals.

[CVVM chart – “Graph 4: Satisfaction with the current political situation from 2011-2017 (Satisfaction Index 0-100)”]

And, as if you needed another graph to show how bad it is, look at the drop for both president and government here (the blue and red lines, respectively).

[CVVM chart – “Graph 3: Confidence in institutions 2011-2017 (confidence index)”]

Numbers like this should be worrisome for a country at any time, but remember the Czechs are going to the polls in just under four months to elect a new parliament – and at the polls again not long after to vote for president.

If I were, say, part of an unnamed country’s government effort to interfere and meddle in other countries’ elections, I’d be all over the Czech Republic this summer.

…to that end, another set of numbers I’ve had kicking around for a few weeks, from two previous Eurobarometer surveys (both autumn 2016), shows just how said unnamed country’s efforts could actually work.

One, Czechs seem to trust social media more than most other Europeans. While it’s still a minority (40% disagreeing that “information on political affairs from online social networks cannot be trusted,” which is a mouthful of a double negative, but yeah), it’s also a higher share than in any other EU country.

[Chart: Eurobarometer – % disagreeing that information on political affairs from online social networks cannot be trusted, by EU country]

Eurobarometer’s data is free to download for losers like me, so I took a more detailed look at who exactly in the Czech Republic thinks information on politics from social media can be trusted (to the extent the data can tell me – the sample size is ~1,000, so it can’t be parsed all that finely, and Eurobarometer IMO doesn’t have the best questions on education level and offers essentially nothing useful on income or a proxy for it).

As shouldn’t be any surprise, it’s the young: 60% of Czechs aged 15 to 24 disagreed that “information on political affairs from online social networks cannot be trusted” compared to 48% of those 25 to 39, 44% of those 40 to 54 and 27% of those 55 and older. Also interesting are the “don’t knows” – only 5% of 15 to 24 year olds compared to 19% of those 40 to 54 and 39% of those 55 and older.
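
For anyone who wants to poke at this themselves, here’s a minimal sketch of the kind of cross-tab I mean – assuming a hypothetical CSV extract of the Czech respondents with made-up column names (“age”, “sm_trust”); the actual Eurobarometer variable names in the GESIS files are different:

```python
# A minimal sketch, assuming a hypothetical extract of Czech respondents with
# invented column names ("age", "sm_trust") -- not the real variable names.
import pandas as pd

df = pd.read_csv("eurobarometer_cz.csv")

# Recreate the age brackets used above
df["age_group"] = pd.cut(
    df["age"],
    bins=[15, 25, 40, 55, 120],
    labels=["15-24", "25-39", "40-54", "55+"],
    right=False,
)

# % of each bracket agreeing/disagreeing that political info from social
# networks "cannot be trusted" (disagreeing = the trusting ones), plus DKs
table = (pd.crosstab(df["age_group"], df["sm_trust"], normalize="index") * 100).round(1)
print(table)
```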

The same trends show up in different questions about social media – in this one below, for example, Czechs are among the most likely in the EU to think social media is reliable.

[Chart: Eurobarometer – perceived reliability of social media, by EU country]

Interestingly, 49% of those 15 to 24 years old, 50% of those 25 to 39 and 46% of those 40 to 54 think social media is reliable – in other words, a similar if not identical proportion – but only 31% of those 55+ think social media’s reliable.

One of these Eurobarometer surveys, coincidentally, happened to ask people their attitudes about various countries, including Russia (my mention of Russia is, of course, purely hypothetical and definitely, definitely not related to the “unnamed country” above).

Run what Czechs think about social media (i.e., the agree/disagree question on whether it’s reliable) against what they think of Russia and the results are pretty interesting – Czechs who think social media is reliable also tend to be more positive towards Russia.

                           Total “positive” towards Russia   Total “negative” towards Russia
Social media reliable?                   49%                              49%
Social media unreliable?                 35%                              64%
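
(For the curious: a sketch of how a crosstab like the one above comes together, again with invented column names standing in for the real Eurobarometer variables.)

```python
# Same hypothetical extract as before; "sm_reliable" and "russia_view" are
# made-up column names, not the actual Eurobarometer variable names.
import pandas as pd

df = pd.read_csv("eurobarometer_cz.csv")

xt = (pd.crosstab(df["sm_reliable"], df["russia_view"], normalize="index") * 100).round(0)
print(xt)
# Per the table above, the output should look roughly like:
#                 positive  negative
# reliable            49.0      49.0
# unreliable          35.0      64.0
```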

One caveat, though. This question about being positive/negative towards Russia isn’t necessarily a proxy for what respondents think about the Kremlin or Russian foreign policy. It doesn’t mean a respondent is some sort of zombie “radicalized by Russian propaganda,” and some respondents may well have interpreted the question as asking about Russian people in general. Still, it’s interesting that the data falls out this way – and falls out this way across many other EU countries – and it merits a hell of a lot more study than it’s getting.

There are a ton more numbers I haven’t mentioned here (e.g., Czechs get more news from websites and trust the Internet more than most other Europeans) that, taken together, paint a potentially very ugly picture – a population that increasingly distrusts its politicians and tends to trust social media and the web more than most other people do. It’s a disinformation site’s dream.

Another quick look at Bulgaria’s “Gallup International” (aka Part II)

I’ll forgive you if you don’t remember Bulgaria’s Gallup International (no, not that Gallup – the Gallup International “being sued by Gallup Inc. for using its name without authorization”), who very briefly popped into western headlines in March thanks to a Wall Street Journal article.

Published only days before the election, the article alleged that the pro-Kremlin Bulgarian Socialist Party (BSP) received a secret strategy document proposing, among other things, that they “promote exaggerated polling data” to help build momentum for the party and get them elected. Gallup International was specifically named in the article, and referred to by Bulgaria’s former ambassador to Moscow Ilian Vassilev as one of Moscow’s “Bulgarian proxies”; Vassilev, talking about a poll Gallup International published on an apparent lack of NATO support in Bulgaria, accused the polling agency of making a “wrapped-in-secrecy poll [that] had no details on methodology nor funding sources.”

Of course, the WSJ article wasn’t without its critics, myself included. One journalist in Bulgaria said the WSJ article “completely ignored all the internal political factors that led to the landslide victory of [President] Rumen Radev and focused only on Russian influence in elections,” which ultimately, in his opinion, “backfired, instead of there being alarms for (legitimate) concerns about Russian involvement in Bulgaria.” (To say nothing of the role of the “crackpot outfit” at RISI apparently being involved in producing the document.)

Debates about the scope and scale of Russian influence in Bulgaria aside, I decided to take a look at polling data in Bulgaria leading up to the parliamentary election on March 26. Is there anything to support this idea that Gallup International and potentially other Bulgarian pollsters have used and promoted “exaggerated polling data” to benefit the BSP?

There might be.

Here’s a table of all the polls I’ve been able to find in 2017 leading up to the election, with levels of party support noted and the final actual election results at the bottom.

[Table: Bulgarian pre-election polls, 2017 – party support by polling firm, with final election results at the bottom]

(larger pdf of same chart here: Bulgaria polls)

Almost all polls from polling firms like Trend Research (3 of 3), Alpha Research (3 of 3 – and, full disclosure, whose managing director I interviewed as part of my April piece for Coda Story) and Estat (2 of 3) showed Boyko Borisov’s GERB in the lead.

To break it down further (remembering that GERB ended up winning by 5.6%; the averaging itself is sketched in code after this list):

  • Trend Research: average GERB lead of 1.93%
  • Alpha Research: average GERB lead of 2.77%
  • Estat: average GERB lead of 3.47%
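
The averaging here is nothing fancy – each poll’s lead is just GERB minus BSP, then a simple mean per firm. A toy sketch with placeholder poll numbers (the margins are reconstructed to match the averages quoted in this post, but these are not the actual published polls):

```python
# Toy sketch of the averaging. Poll percentages are placeholders whose margins
# match the firm averages quoted above; not the real 2017 Bulgarian polls.
polls = {
    "Trend Research": [(31.0, 29.1), (30.5, 28.5), (30.2, 28.3)],  # (GERB %, BSP %)
    "Gallup International": [(27.5, 29.0), (27.8, 29.0), (28.3, 29.0), (29.6, 29.0)],
}

for firm, results in polls.items():
    leads = [gerb - bsp for gerb, bsp in results]
    print(f"{firm}: average GERB lead of {sum(leads) / len(leads):+.2f} points")
# Trend Research: +1.93 on average; Gallup International: -0.70 (i.e., a BSP lead)
```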

But the story’s pretty different with Gallup International, who one Bulgarian source described to me as “the main political/sociological advisor of the [BSP] at election time.” They published four polls during the campaign, only one of which showed GERB in the lead – their final poll, about a week before the vote, which had GERB up by just 0.6%. Across the campaign, Gallup International’s polls showed an average BSP lead of 0.7%.

Again, GERB ended up winning by 5.6%.

The numbers are even stranger for AFIS, a pollster run by Yuriy Aslanov, a sociologist who has sat on the BSP board. Both its polls showed the BSP in the lead by an average of 1.3%.

Again, GERB ended up winning by 5.6%.

In sum – a pollster that’s been accused of being part of an effort to “promote exaggerated polling data” on behalf of the party it’s linked to consistently showed results that played up support for said party, contradicting most other polling firms’ results throughout the campaign. As a western observer and polling nerd who’s worked for multiple firms in Canada and the UK, I feel I’m in a position to confidently say this isn’t normal.

Of course, this all needs a few caveats. There’s absolutely no evidence of anyone cooking up numbers or anything like that – I am accusing no one of that. I’m also well aware of how statistics work, and of the (however unlikely) possibility that these figures from Gallup International and AFIS are all down to random survey error or even differences in methodology. Still, something’s off here.

Above all, though, I’d stress this message to other journalists who haven’t had the pleasure of drowning a considerable part of their adult lives in SPSS – if some polling numbers look consistently very different from what other polling numbers are saying, ask why. Sometimes it’s an easy answer – a different, new methodology that may or may not pan out, a rogue poll (i.e., the one out of 20) or a polling firm that just really sucks. But sometimes it’s not that simple.
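
If you want a rough, programmatic version of that gut check, here’s a sketch: compare each firm’s average margin against the consensus of all the other firms and eyeball who sticks out. The per-poll margins below are placeholders matched to the averages quoted above, and this is emphatically not a formal statistical test:

```python
# Rough sketch of the "why does this pollster look different?" check.
# Margins are GERB minus BSP in points, reconstructed from the firm averages
# quoted in this post -- placeholders, not the actual published polls.
from statistics import mean

firm_margins = {
    "Trend Research": [1.8, 2.0, 2.0],
    "Alpha Research": [2.5, 2.9, 2.9],
    "Estat": [3.2, 3.5, 3.7],
    "Gallup International": [-1.5, -1.2, -0.7, 0.6],
    "AFIS": [-1.2, -1.4],
}

for firm, margins in firm_margins.items():
    # consensus of everyone *except* this firm (leave-one-out)
    others = [m for f, ms in firm_margins.items() if f != firm for m in ms]
    gap = mean(margins) - mean(others)
    print(f"{firm}: avg margin {mean(margins):+.2f}, gap to others {gap:+.2f}")
# Firms whose polls sit on the opposite side of zero from everyone else's
# consensus (here, Gallup International and AFIS) are the ones to ask about.
```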

On “First the journalists, then tanks and bombs”

OK, I’d seen this article and graph kicking around Twitter for a day or two before I finally looked at it, and I’m both glad and not glad I did.

[Graph: Semantic Visions – negative Russian-language coverage of Ukraine over time]
This impressive-looking graph. You’ve seen it, right?

For anyone who hasn’t already seen it or (like I had) has given it only a cursory weekend glance,  the graph is based on an analysis done by Semantic Visions, “a risk assessment company based in Prague” who “conduct…big data (meaning non-structured, large data requiring serious calculations) analyses with the aid of open source intelligence, on the foundation of which they try to identify trends or risk factors.” They also use a “private Open Source Intelligence system, which is unique in its category and enables solutions to a new class of tasks to include geo-political analyses based on Big Data from the Internet.”

OK, cool.

The gist in this case: Semantic Visions had algorithms read hundreds of thousands of online sources, including 22,000 Russian ones, searching for different trends.

OK…though as someone who, for some reason, chose to suffer through a media content analysis as a thesis, I have a number of methodology-related questions I don’t want to harp on too much (e.g., how is the algorithm actually designed to determine positive/negative stories versus a human coder? how were the online sources chosen? etc.). A little transparency here would go a long way, proprietary nature of the algorithms notwithstanding.

What gets me is the conclusion they’ve drawn based on the data they’ve gathered and present here in this article.

The article says “the number of Russian articles with a negative tone on Ukraine [from February 2012] started to show a gradual and trend-like increase – while no similar trend can be found in English-language media.”

Yes, your data does show that. Got no problem there.

But it’s this (my emphasis in bold):

“Therefore, based on hundreds of millions of articles the possibility that the actual events in Ukraine could themselves be the reason for the increasing combativeness of Russian-language articles can be excluded. Moreover, the strongly pro-Russian President Yanukovych was still in government at the time and the similarly Eastern-oriented Party of Regions was in power. The explanation is something else: the Putin administration was consciously preparing for military intervention and the Kremlin’s information war against Ukraine started two years before the annexation of Crimea to turn Russian public opinion against Ukrainians…”

How can someone possibly draw that conclusion based solely on the numbers presented here?? Are you privy to other data or pieces of analysis that aren’t public? Because, based on the data that’s presented here, I see absolutely no justification for the conclusion that the Kremlin “was consciously preparing for military intervention.”

Consider:

  • A big part of the explanation for any apparent increase in negative coverage would be the EU Association Agreement being initialed in March 2012, right?
  • Why start the analysis at June 2011? I’d want to see the tone of coverage compared to the last bit of Yushchenko’s presidency through the beginning of Yanukovych’s – maybe the increase over 2012-2013 isn’t so much an increase as a return to “normal” negative coverage of Ukraine.
  • (OK, I lied about no more methodology questions.) What about positive stories? Were negative stories about Ukraine taking up a greater share of overall coverage, or did the overall number of articles itself increase? Not being transparent on methodological nerdish issues like this really, really doesn’t help, guys – see the sketch right after this list.
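
To illustrate that last point with invented numbers: a rising count of negative articles can be pure volume growth, and it’s the negative share of coverage that actually signals a change in tone.

```python
# Invented counts to make the share-vs-volume point; not Semantic Visions data.
monthly = [  # (month, negative articles on Ukraine, total articles on Ukraine)
    ("2012-02", 400, 4000),
    ("2013-02", 600, 6000),
    ("2014-02", 900, 6000),
]

for month, neg, total in monthly:
    print(f"{month}: {neg} negative articles = {neg / total:.0%} of Ukraine coverage")
# The raw count rises 50% from 2012 to 2013, but the negative *share* is flat
# at 10%; only the 2014 row reflects an actual shift in tone (15%).
```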

Please – no more divining of Kremlinological intentions from incomplete, unclear sets of numbers.

Comparing Ukrainian & Russian attitudes toward each other (KIIS/Levada Centre data)

Comparing Ukrainian & Russian attitudes toward each other (KIIS/Levada Centre data)

KIIS and Levada released results this week from their regular surveys of Russians and Ukrainians and their attitudes towards each other (link to KIIS in Ukrainian, Levada’s link in Russian).

tl;dr: Ukrainians tend to have more positive attitudes towards Russia than vice versa.

  • Attitudes of Ukrainians towards Russia:
    • 40% of Ukrainians in September 2016 said their attitudes towards Russia were ‘good’ or ‘very good’ (a statistically insignificant change from May 2016)
    • 46% of Ukrainians in September 2016 said their attitudes towards Russia were ‘bad’ or ‘very bad,’ a significant increase from 43% in May 2016
  • Attitudes of Russians towards Ukraine:
    • One in four Russians (26%) said their attitudes towards Ukraine were ‘good’ or ‘very good’, a significant drop from 39% in May 2016 – which was itself a significant increase from 27% in February 2016. Some zigzaggin’ goin’ on here.
    • 56% of Russians said their attitudes towards Ukraine were ‘bad’ or ‘very bad,’ a significant increase from 47% in May.

The data over time since 2008 is pretty interesting, so interesting I decided to make a barely readable graph. Ukrainians’ attitudes to Russia = blue. Russians’ attitudes to Ukraine = yellow/…mustard?

[Graph: Ukrainians’ attitudes towards Russia (blue) and Russians’ attitudes towards Ukraine (yellow), 2008-2016, KIIS/Levada data]
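
(If you’d like to rebuild a less headache-inducing version yourself, here’s a sketch using only the handful of point estimates quoted in this post – the full KIIS/Levada series runs back to 2008.)

```python
# Sketch of the graph using only the point estimates quoted above; the May
# 2016 Ukrainian figure is approximated (the post says the September change
# from May was statistically insignificant).
import matplotlib.pyplot as plt

dates = ["Feb 2016", "May 2016", "Sep 2016"]
nan = float("nan")  # no Feb 2016 Ukrainian figure is quoted in the post
ua_good = [nan, 40, 40]  # % Ukrainians 'good'/'very good' towards Russia
ru_good = [27, 39, 26]   # % Russians 'good'/'very good' towards Ukraine

plt.plot(dates, ua_good, marker="o", color="tab:blue", label="Ukrainians on Russia")
plt.plot(dates, ru_good, marker="o", color="goldenrod", label="Russians on Ukraine")
plt.ylabel("% 'good' or 'very good'")
plt.legend()
plt.show()
```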

A few observations, if you haven’t got a headache yet from having to squint at this thing:

  1. At no point are Ukrainians’ attitudes towards Russia worse than Russians’ attitudes towards Ukraine, even in the aftermath of the annexation of Crimea and the start of war in Donbas by May 2014. At every single data point Ukrainians have more positive and less negative feelings about Russia than Russians have for Ukraine.
  2. Russians’ attitudes towards Ukraine got really damn low in late 2008/early 2009. A function of Yushchenko’s presidency and the gas disputes?
  3. Once Yanukovych got elected in February 2010, Russians’ attitudes tend to even out (keeping in mind the gaps in actual survey dates in 2011).
  4. Ukrainians’ attitudes towards Russia have improved a bit recently but, not surprisingly, are still far below pre-Maidan levels.
  5. I can’t explain the zigzagging with Russians’ attitudes over 2015/2016. If you can, great.

Russian Duma elections: just how bad was voter turnout?

According to Vladimir Putin, voter turnout in Sunday’s Duma elections – estimated at 39% as I write this – was “not the greatest, but high.” Was it?

I took a look at IDEA’s Voter Turnout Database, which has data on all parliamentary, presidential and European Parliament elections across the world since 1945. Where does a 39% turnout in a national parliamentary/legislative election rank?

Well, for starters:

  • The lowest turnout in an American congressional election came in 2014, at 42.5%. Yes, that’s pretty close to 39% and might make easy fodder for the quick-to-false-equivalence crowd, but keep in mind that:

    1. Americans vote in congressional elections every two years (all House of Representatives seats plus one-third of the Senate), unlike the rest of us who go every four, five or six years. Voter fatigue much?

    2. 2014 was a midterm election (i.e., no presidential vote at the same time), and midterms always have markedly lower turnouts than presidential years. Case in point: 48.6% in 2010 vs. 42.5% in 2014, against 64.4% in both 2008 and 2012, the years Obama was elected and re-elected.

  • The lowest turnout in recent Canadian history was in 2008 (59.5%), if anyone other than me cares about Canada as a reference point. We’d had another election less than three years earlier, both producing Stephen Harper (Conservative) minorities – ‘hung parliaments’ for the more British among you.
  • As for the UK, the lowest was 59.4% turning out in 2001 for Tony Blair and Labour’s second straight victory.
  • France’s lowest was 55.4% in 2012. Since 2002, legislative elections in France have fallen right after presidential elections (i.e., a month after you vote for president).
  • Next door in Ukraine, the October 2014 Rada elections had a turnout of 52.4%.

It gets worse when you look at the entire data set for parliamentary elections (excluding countries like Australia that have compulsory voting, and leaving out two outlier elections with [!!!] turnouts of 2.3% and 100.3%): out of more than 1,400 national parliamentary elections worldwide, only 4% had voter turnouts of 40% or less, and only 11% had turnouts of 50% or less.
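
The calculation behind those percentages is trivial – something like this sketch, assuming a hypothetical CSV export of IDEA’s database with column names I made up:

```python
# Hypothetical export of IDEA's Voter Turnout Database with invented column
# names ("election_type", "turnout_pct"). A real run would also need to filter
# out compulsory-voting countries, which needs a country list I'm omitting.
import pandas as pd

df = pd.read_csv("idea_turnout.csv")
parl = df.loc[df["election_type"] == "parliamentary", "turnout_pct"]
parl = parl[(parl > 2.3) & (parl < 100.3)]  # drop the two outlier elections

print(f"Turnout of 40% or less: {(parl <= 40).mean():.0%} of elections")
print(f"Turnout of 50% or less: {(parl <= 50).mean():.0%} of elections")
```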

I guess it all depends on what your definition of высокой (“high”) is.

Taking issue with the OSCE SMM’s report on IDPs in Ukraine

Last Friday the OSCE Special Monitoring Mission (SMM) released a report on internal displacement in Ukraine and it seems like they want it to be read by as few people as possible.

The OSCE SMM didn’t just release this report on the last Friday in August – they buried it late in the afternoon on the last Friday in August (17.30 Ukraine time – 10.30 am Eastern in Canada/US). OK?

And looking at when the focus groups and interviews were actually done for the report, it’s not like they didn’t have time to release it when people might actually be paying attention:

“Focus group discussions and individual interviews were conducted between August and November 2015 in 19 regions across Ukraine” [emphasis mine]

Listen, I know there were a lot of focus groups and interviews – 161 groups and 39 individual interviews, to be precise, so more than 1,600 people in total. I’ve been that guy having to organize transcriptions and analysis of piles of focus group and interview findings. It takes time. But you mean to tell me it’s taken no less than nine months to do all this?

If it really did, the quality of the report is pretty disappointing. This thing rambles on, with barely a signpost for the reader to know what the most important findings are. We don’t get any stand-alone block quotes from IDPs themselves to help contextualize and understand how they’re coping in new communities. We’re treated to vague discussions of IDP-community relations that could leave a reader thinking they’re far worse than they actually are. We get a conclusion (“Concluding Remarks”) that reads like it was pieced together the morning of (I know, cuz I’ve done it), a flimsy set of remarks that summarizes almost nothing of substance. If I ever handed a draft report like this to one of my old bosses, I’d have had it handed back to me pretty quickly.

Read this report, then take a look at the UNHCR’s report from a few months ago about IDPs and host communities in Ukraine, as well as one of the IOM’s regular reports, which come out every few months. I challenge you to reach a different conclusion than mine: that this report’s a watered-down stream of paragraphs that doesn’t really help us understand IDPs any better.

Should we be surprised? Probably not. According to one former OSCE observer:

“According to established OSCE practice, reports should not provoke major controversies. Instead, they should be politically acceptable to all member states, with the emphasis on ‘balance’ rather than ‘objectivity’. In addition to this approach, I also quickly learned that I was only one of several links in the chain of report preparation. Information provided by OSCE monitoring teams had often already been ‘sterilized’ by the time it reached me. As a result, the reports posted on the OSCE website were often far removed from what I personally wished to include, and what should have been included.”

I think we can add this report to that list.

Russia’s HIV epidemic dismissed as part of Western ‘information war’

(latest in Sydney Morning Herald)

Even as a UN conference began last week in New York, taking up the subject of ending AIDS, a Kremlin-backed research institute claimed the West is using HIV and AIDS as part of an “information war” against Russia.

The Russian Institute for Strategic Research (RISR) presented a report to the Moscow City Council last week on HIV in Russia where, unlike almost everywhere else in the world, rates of HIV infection are on the rise.

According to RISR deputy director Tatyana Guzenkova and her colleagues, the real goal of the West’s fight against HIV “is the implementation of the economic and political interests of US-led global structures, relying on an extensive network of international and quasi-NGOs.”

But none of this comes as a shock to Anya Sarang, head of the Andrey Rylkov Foundation.

“We’re not surprised at these kinds of statements anymore,” she tells me from Moscow, where she works with drug users and sees the scale of Russia’s growing HIV epidemic firsthand.

“It’s the usual thing in Russia now to discard science for ideology.”

There are two models for fighting HIV, according to the RISR report. The Western model, it says, is made of “neoliberal ideological content, insensitivity towards national sensitivities and over-focus of certain at-risk groups such as drug addicts and LGBT people.”

The Russian model, on the other hand, “takes into account the cultural, historical, and psychological characteristics of the Russian population, and is based on a conservative ideology and traditional values.”

The Russian model

The science says the Russian model isn’t working.

Approximately 93,000 new HIV cases were reported in Russia in 2015 – a per-capita rate almost fifteen times that of Australia. A million people in Russia have HIV, including almost one per cent of all pregnant women – the threshold for a generalised epidemic.

Dr Vadim Pokrovsky heads up Russia’s federal AIDS research centre and is a longtime critic of the Kremlin’s HIV policies.

“The last five years of the conservative approach have led to the doubling of the number of HIV-infected people,” Pokrovsky told AFP last year.

The arch-conservative head of Moscow City Council’s health committee, Lyudmila Stebenkova, described Dr Pokrovsky last year as a “typical agent working against the national interests of Russia.”

Not surprisingly, Pokrovsky doesn’t think much of RISR’s report.

“They use some questionable sources of information and incorrectly interpret the data they present,” he tells me from Moscow.

“Their arguments are not convincing.”

The Western contraceptive industry

According to Igor Beloborodov, one of the RISR report’s co-authors, attitudes to condom use are among the main factors behind Russia’s HIV epidemic.

“The [Western] contraceptive industry is interested in selling their products and encouraging underage people to engage in sex,” he told Moscow City Council.

He and his RISR colleagues argue in the report that condoms “create the illusion of the safety of sex” and should not be “gratuitously distributed” in Russian schools. The solution, they say, is to completely abstain from sex outside of (heterosexual) marriage.

Dr Pokrovsky is dismissive of arguments like these.

“This is traditional rhetoric that was used thirty years ago and still used in conservative circles throughout the world. It is not particularly troubling evidence.”

Pokrovsky also takes issue with the report’s argument that “risk elimination” – in other words, completely giving up drugs and extramarital sex – is superior to harm reduction approaches.

He stresses that “risk elimination” is actually impossible. “The experts who understand this know that people cannot just give up extramarital sex,” he says, “and people dependent on drugs cannot just stop using them.”

‘AIDS and drugs will solve each other’

Even though 60 per cent of HIV-positive Russians are injection drug users, methadone replacement therapy remains illegal in Russia despite clear evidence of its effectiveness.

It’s an approach that has led some commentators to question whether the Kremlin is even that interested in the fight against HIV at all.

“I have had conversations with Russian government officials who have said things like ‘AIDS and drugs will solve each other,'” Daniel Wolfe, director of the International Harm Reduction Program at the Open Society Foundations (an organisation funded by frequent Kremlin target George Soros), told The Verge.

“So I think there’s some question about whether or not Russia is actually committed to protecting the lives of everyone, or whether drug users fall into the category of ‘socially unproductive citizens’ the state might just as well do without.”

In Moscow, Anya Sarang worries about the impact the “information war” rhetoric from the Kremlin and its allies might have on Russians with HIV.

“It sends the completely wrong message,” she says of the RISR report. “Basically this report is saying that the HIV problem is something made up by western media.”

“The Russian population is really psyched up now with this whole anti-Western ideology and discourse,” she warns. “People might take this seriously and think ‘oh, HIV isn’t a Russian problem – it’s just a part of the information war.'”