Why isn’t Andrej Babis in Evropské hodnoty/European Values’ crosshairs?

The Czech Republic’s most active think-tank has barely mentioned, let alone criticized, the country’s likely future Prime Minister Andrej Babis – a man who isn’t exactly a shining example of western liberal democracy in action.

Remember, Babis is someone who’s:

  • The second-richest man in the country, with his Agrofert conglomerate having its hands in everything from fertilizers and farm equipment to two of the largest Czech newspapers and the country’s most popular radio station.
  • Been accused of having been a Communist-era Czechoslovak secret police agent (though an appellate court in Slovakia “affirmed Mr. Babis was not an agent of the secret police”).
  • Been caught on tape earlier this year coordinating coverage of his political opponents with a journalist at one of the purportedly independent newspapers he owns.
  • Been accused of numerous conflicts of interest, and recently had his parliamentary immunity stripped over fraud allegations.
  • Been recently described to me as “Trump, Berlusconi and Orban all in one.”
[Image]
Babis’ paper: “Accept the Euro, fast!” Non-Babis paper: “Juncker: I don’t dictate anything to Czechia” (from https://twitter.com/FilipZajicek/status/908226594979958784)

Let’s also not forget some of the Russia-related allegations that have been thrown at Babis.

  • He’s called EU and US sanctions on Russia “nonsense” and said they’re against the country’s economic interests – a line I’ve personally heard from some Kremlin-friendly figures across Europe.
  • He’s dodged questions on whether Putin bore the blame for annexing Crimea, and has said NATO “cannot stay on this idea that Russia is the biggest problem.”
  • Under his watch the Czech finance ministry (more accurately, the Czech Export Guarantee Agency (EGAP)), underwrote a loan guarantee to PhosAgro, a Russian company co-owned by Putin pal Vladimir Litvinenko.
  • In 2007 Babis’ Agrofert tried to negotiate a gas deal with the Czech subsidiary of Gazprom instead of its then-current German supplier.

These aren’t necessarily super-Kremlin smoking guns, but I’d think a group of people who are dedicated to ferreting out Kremlin interference in their country and beyond would at least be asking a few questions about the guy who’s about to run the show.

Sure, Babis is intimidating and is the kind of guy who likes to go after people who talk shit about him – I mean, look at all the corrections Foreign Policy had to add under this 2015 article when Babis went full Babis on them.

[Image]
Full Babis.

I get why you’d want to be on his good side, but European Values isn’t exactly afraid to go after some other Czech and European political figures with less-than-subtle language: the German SPD and Sigmar Gabriel (Social Democrats), who want to “please the Kremlin;” the Czech Communists, guilty of “treason” for their broken-record anti-NATO stance; and, not least, Czech president (“rezident”) Milos Zeman, the “Kremlin’s Trojan horse.”

With elections/Babis’ coronation just over a month away I’m surprised European Values doesn’t have anything critical to say about Babis – or, really, anything about him at all.

Recent Czech survey data + elections = a disinformation site’s dream

The most recent round of Eurobarometer stats just came out, and they’re bad news for pretty much anyone in Czech politics right now.

Only 18% of Czechs trust their government right now, a drop of 10 points since autumn 2016 – by far the sharpest decline in the EU – and only Greeks, Italians and Spaniards distrust their government as much as Czechs do.

[Image]
Oy.

Still, this isn’t nearly as ugly as the table for the question on trust in parliament…

[Image]
Double oy.

Nobody in the EU distrusts their country’s parliament more than Czechs do right now, all thanks to the Czech government’s farcical three-part comedy act/political crisis over the past few months (yeah I linked to a Wikipedia article I don’t care).

This level of (dis)trust shows up in recent Czech Public Opinion Research Centre (CVVM) survey data too – their numbers also show that trust in President Miloš Zeman, the government (“Vláda”) and the Chamber of Deputies (“Poslanecká sněmovna,” the lower house of the Czech parliament) has completely tanked.

[Image]
“Table 1a: Population’s confidence in constitutional institutions (%) – comparison over time”

Worse, look at the way Czech satisfaction with the current political situation has driven right off the cliff after a slow recovery from 2013’s scandals.

[Image]
“Graph 4: Satisfaction with the current political situation from 2011-2017 (Satisfaction Index 0-100)”

And, as if you needed another graph to show how bad it is, look at the drop for both president and government here (the blue and red lines, respectively).

[Image]
“Graph 3: Confidence in institutions 2011-2017 (confidence index)”

Numbers like this should be worrisome for a country at any time, but remember the Czechs are going to the polls in just under four months to elect a new parliament – and at the polls again not long after to vote for president.

If I were, say, part of a government of an unnamed country’s efforts to interfere and meddle in other countries’ elections, I’d be all over the Czech Republic this summer.

…to that end, another set of numbers I’ve had kicking around for a few weeks, from two previous Eurobarometer surveys (both autumn 2016), shows just how such an effort could actually work.

One, Czechs seem to trust social media more than most other Europeans. While it’s still a minority view (40% disagreeing that “information on political affairs from online social networks cannot be trusted,” which is a mouthful of a double negative, but yeah), it’s also a higher share than in any other EU country.

[Chart: trust in political news from online social networks, by EU country]

Eurobarometer’s data is free to download for losers like me, so I took a more detailed look at who exactly in the Czech Republic thinks information on politics from social media can be trusted (to the extent the data can tell me – the sample size is ~1,000, so it can’t be sliced all that finely, and Eurobarometer IMO doesn’t have the best questions on education level and essentially nothing useful on income or a proxy for income).

As shouldn’t be any surprise, it’s the young: 60% of Czechs aged 15 to 24 disagreed that “information on political affairs from online social networks cannot be trusted” compared to 48% of those 25 to 39, 44% of those 40 to 54 and 27% of those 55 and older. Also interesting are the “don’t knows” – only 5% of 15 to 24 year olds compared to 19% of those 40 to 54 and 39% of those 55 and older.
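
For fellow data nerds, here’s a minimal sketch of how that age breakdown could be pulled from the microdata – with the big caveat that the file name and column names below are hypothetical stand-ins, not the actual Eurobarometer variable codes:

```python
import pandas as pd

# Hypothetical file and variable names – the real Eurobarometer microdata
# uses coded variables, and you'd want to apply the survey weights too.
df = pd.read_csv("eurobarometer_autumn2016_cz.csv")  # one row per respondent

# Bucket respondents into the age groups used above.
df["age_group"] = pd.cut(df["age"], bins=[14, 24, 39, 54, 120],
                         labels=["15-24", "25-39", "40-54", "55+"])

# 'socmed_trust' stands in for the agree/disagree item on whether "information
# on political affairs from online social networks cannot be trusted".
shares = (df.groupby("age_group")["socmed_trust"]
            .value_counts(normalize=True)
            .unstack(fill_value=0) * 100)

print(shares.round(0))  # rows: age groups; columns: agree / disagree / don't know
```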

The same trends show up in different questions about social media – in this one below, for example, Czechs are among the most likely in the EU to think social media is reliable.

[Chart: perceived reliability of social media, by EU country]

Interestingly, 49% of those 15 to 24 years old, 50% of those 25 to 39 and 46% of those 40 to 54 think social media is reliable – in other words, a similar if not identical proportion – but only 31% of those 55+ think social media’s reliable.

One of these Eurobarometer surveys, coincidentally, happened to ask people their attitudes about various countries, including Russia (my mention of Russia is, of course, purely hypothetical and definitely, definitely not related to the “unnamed country” above).

Run what Czechs think about social media (i.e., the agree/disagree question on whether it’s reliable) against what they think of Russia and the results are pretty interesting – Czechs who think social media is reliable also tend to be more positive towards Russia.

                             Total “positive”     Total “negative”
                             towards Russia       towards Russia
 Social media reliable?           49%                  49%
 Social media unreliable?         35%                  64%

Caveat, though. This question about being positive/negative towards Russia isn’t necessarily a proxy for what they think about the Kremlin or, for that matter, anyone else in the world. It doesn’t necessarily mean the respondent is some sort of zombie “radicalized by Russian propaganda” or even necessarily positive towards the Kremlin or Russia’s foreign policy, etc. Also, some respondents may well have interpreted the question as being positive/negative towards Russian people in general. Still, it’s interesting that the data falls out this way – and falls out this way across many other EU countries – and merits a hell of a lot more study than it’s getting.
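
For anyone replicating the table above: it’s just a row-percentage crosstab, something like the sketch below – same caveat as before that the file and column names are hypothetical stand-ins:

```python
import pandas as pd

# Same hypothetical respondent-level file and stand-in column names as before.
df = pd.read_csv("eurobarometer_autumn2016_cz.csv")

# Row percentages: among those who call social media reliable vs unreliable,
# what share is positive vs negative towards Russia?
table = pd.crosstab(df["socmed_reliable"], df["russia_attitude"],
                    normalize="index") * 100
print(table.round(0))
```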

There’s a ton more numbers I haven’t mentioned here (e.g., Czechs get more news from websites and trust the Internet more than most other Europeans) that, in all, paint a potentially very ugly picture – a population that increasingly distrusts its politicians and tends to trust social media and the web more than most other people. It’s a disinformation site’s dream.

Another quick look at Bulgaria’s “Gallup International” (aka Part II)

I’ll forgive you if you don’t remember Bulgaria’s Gallup International (no, not that Gallup – the Gallup International “being sued by Gallup Inc. for using its name without authorization”), who very briefly popped into western headlines in March thanks to a Wall Street Journal article.

Published only days before the election, the article alleged that the pro-Kremlin Bulgarian Socialist Party (BSP) received a secret strategy document proposing, among other things, that they “promote exaggerated polling data” to help build momentum for the party and get them elected. Gallup International was specifically named in the article, and referred to by Bulgaria’s former ambassador to Moscow Ilian Vassilev as one of Moscow’s “Bulgarian proxies”; Vassilev, talking about a poll Gallup International published on an apparent lack of NATO support in Bulgaria, accused the polling agency of making a “wrapped-in-secrecy poll [that] had no details on methodology nor funding sources.”

Of course, the WSJ article wasn’t without its critics, myself included. One journalist in Bulgaria said the WSJ article “completely ignored all the internal political factors that led to the landslide victory of [President] Rumen Radev and focused only on Russian influence in elections,” which ultimately, in his opinion, “backfired, instead of there being alarms for (legitimate) concerns about Russian involvement in Bulgaria.” (To say nothing of the role of the “crackpot outfit” at RISI apparently being involved in producing the document.)

Debates about the scope and scale of Russian influence in Bulgaria aside, I decided to take a look at polling data in Bulgaria leading up to the parliamentary election on March 26. Is there anything to support this idea that Gallup International and potentially other Bulgarian pollsters have used and promoted “exaggerated polling data” to benefit the BSP?

There might be.

Here’s a table of all the polls I’ve been able to find in 2017 leading up to the election, with levels of party support noted and the final actual election results at the bottom.

[Table: 2017 Bulgarian pre-election polls by firm, with the final election results at the bottom]

(larger pdf of same chart here: Bulgaria polls)

Almost all polls from polling firms like Trend Research (3 of 3), Alpha Research (3 of 3, and, full disclosure, whose managing director I interviewed as part of my April piece for Coda Story) and Estat (2 of 3) showed Boyko Borisov’s GERB having a lead.

To break it down further (remembering that GERB ended up winning by 5.6%):

  • Trend Research: average GERB lead of 1.93%
  • Alpha Research: average GERB lead of 2.77%
  • Estat: average GERB lead of 3.47%

But the story’s pretty different with Gallup International, which one Bulgarian source described to me as “the main political/sociological advisor of the [BSP] at election time.” They published four polls during the campaign, only one of which showed GERB in the lead – their final poll, about a week before the vote, which gave GERB just a 0.6% edge. Across the campaign, Gallup International’s polls showed the BSP ahead by an average of 0.7%.

Again, GERB ended up winning by 5.6%.

The numbers are even stranger for AFIS, a pollster run by Yuriy Aslanov, a sociologist who has sat on the BSP board. Both its polls showed the BSP in the lead by an average of 1.3%.

Again, GERB ended up winning by 5.6%.
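
To put those numbers side by side, here’s a quick sketch of the arithmetic – the per-firm average leads are the ones quoted above (negative meaning the BSP ahead), measured against GERB’s actual 5.6-point margin:

```python
# Per-firm average leads quoted above, in percentage points
# (positive = GERB ahead, negative = BSP ahead), vs the actual result.
avg_lead = {
    "Trend Research": 1.93,
    "Alpha Research": 2.77,
    "Estat": 3.47,
    "Gallup International": -0.7,
    "AFIS": -1.3,
}
ACTUAL_MARGIN = 5.6  # GERB's final margin of victory, in points

for firm, lead in avg_lead.items():
    miss = lead - ACTUAL_MARGIN
    print(f"{firm:22s} avg lead {lead:+5.2f}  miss vs result {miss:+5.2f}")
```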

In sum – a pollster that’s been accused of being part of an effort to “promote exaggerated polling data” on behalf of the party it’s linked to consistently showed results that played up support for said party, contradicting most other polling firms’ results throughout the campaign. As a western observer and polling nerd who’s worked for multiple firms in Canada and the UK, I feel I’m in a position to confidently say this isn’t normal.

Of course, this all needs a few caveats. There’s absolutely no evidence of anyone cooking up numbers or anything like that – I am accusing no one of that. I’m also well aware of how statistics work and am well aware of the (however unlikely) possibility that these figures from Gallup International and AFIS are all down to random survey errors or even differences in methodology. Still, something’s off here.

Above all though, I’d stress this message to other journalists who haven’t had the pleasure of drowning a considerable part of their adult lives in SPSS – if some polling numbers look consistently very different from what other polling numbers are saying, ask why. Sometimes it’s an easy answer – a different, new methodology that may or may not pan out, a rogue poll (i.e., the one out of 20) or the polling firm just really sucks. But sometimes it’s not that simple.

On “First the journalists, then tanks and bombs”

OK, I’d seen this article and graph kicking around Twitter for a day or two before I finally looked at it, and I’m both glad and not glad I did.

[Image]
This impressive-looking graph. You’ve seen it, right?

For anyone who hasn’t already seen it or (like I had) has given it only a cursory weekend glance, the graph is based on an analysis done by Semantic Visions, “a risk assessment company based in Prague” who “conduct…big data (meaning non-structured, large data requiring serious calculations) analyses with the aid of open source intelligence, on the foundation of which they try to identify trends or risk factors.” They also use a “private Open Source Intelligence system, which is unique in its category and enables solutions to a new class of tasks to include geo-political analyses based on Big Data from the Internet.”

OK, cool.

The gist in this case: Semantic Visions had algorithms read hundreds of thousands of online sources, including 22,000 Russian ones, searching for different trends.

OK…though as someone who, for some reason, chose to suffer through a media content analysis as a thesis, I have a number of methodology-related questions I’ll try not to harp on too much (e.g., how is the algorithm actually designed to determine positive/negative stories vis-à-vis a human? how were the online sources chosen? etc.). A little transparency here would go a long way, proprietary nature of the algorithms notwithstanding.

What gets me is the conclusion they’ve drawn based on the data they’ve gathered and present here in this article.

The article says “the number of Russian articles with a negative tone on Ukraine [from February 2012] started to show a gradual and trend-like increase – while no similar trend can be found in English-language media.”

Yes, your data does show that. Got no problem there.

But it’s this (my emphasis in bold):

“Therefore, based on hundreds of millions of articles the possibility that the actual events in Ukraine could themselves be the reason for the increasing combativeness of Russian-language articles can be excluded. Moreover, the strongly pro-Russian President Yanukovych was still in government at the time and the similarly Eastern-oriented Party of Regions was in power. The explanation is something else: the Putin administration was consciously preparing for military intervention and the Kremlin’s information war against Ukraine started two years before the annexation of Crimea to turn Russian public opinion against Ukrainians…”

How can someone possibly draw that conclusion based solely on the numbers presented here?? Are you privy to other data or pieces of analysis that aren’t public? Because, based on the data that’s presented here, I see absolutely no justification for the conclusion that the Kremlin “was consciously preparing for military intervention.”

Consider:

  • A big part of the explanation for any apparent increase in negative coverage would be the EU Association Agreement being initialed in March 2012, right?
  • Why start the analysis in June 2011? I’d want to see the tone of coverage compared to the last bit of Yushchenko’s presidency through the beginning of Yanukovych’s – maybe the increase over 2012-2013 isn’t so much an increase as a return to “normal” negative coverage of Ukraine.
  • (OK, I lied about no more methodology questions) What about positive stories? Were negative stories about Ukraine taking up a greater share of overall coverage, or did the overall number of articles itself increase? Not being transparent on methodological nerdish issues like this really, really doesn’t help, guys. (A tiny sketch of the share-vs-count difference follows this list.)
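
Here’s that sketch – the numbers below are invented purely to show the mechanics: when overall coverage grows, the raw count of negative stories can show a “trend-like increase” even while the negative share of coverage stays perfectly flat:

```python
# Toy numbers, invented purely to illustrate the mechanics – not real data.
periods = ["2012-H1", "2012-H2", "2013-H1"]
total_articles = [10_000, 15_000, 20_000]
negative_articles = [3_000, 4_500, 6_000]  # always 30% of coverage

for period, total, neg in zip(periods, total_articles, negative_articles):
    print(f"{period}: {neg:5d} negative articles ({neg / total:.0%} of coverage)")
# Raw negative counts show a "trend-like increase" (3,000 -> 6,000),
# while the negative *share* of coverage is flat at 30% throughout.
```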

Please – no more divining of Kremlinological intentions from incomplete, unclear sets of numbers.

Comparing Ukrainian & Russian attitudes toward each other (KIIS/Levada Centre data)

KIIS and Levada released results this week from their regular surveys of Russians and Ukrainians and their attitudes towards each other (link to KIIS in Ukrainian, Levada’s link in Russian).

tl;dr: Ukrainians tend to have more positive attitudes towards Russia than vice versa.

  • Attitudes of Ukrainians towards Russia:
    • 40% of Ukrainians in September 2016 said their attitudes towards Russia were ‘good’ or ‘very good’ (a statistically insignificant change from May 2016)
    • 46% of Ukrainians in September 2016 said their attitudes towards Russia were ‘bad’ or ‘very bad,’ a significant increase from 43% in May 2016
  • Attitudes of Russians towards Ukraine:
    • One in four Russians (26%) said their attitudes towards Ukraine were ‘good’ or ‘very good’, a significant drop from 39% in May 2016 – which was itself a significant increase from 27% in February 2016. Some zigzaggin’ goin’ on here.
    • 56% of Russians said their attitudes towards Ukraine were ‘bad’ or ‘very bad,’ a significant increase from 47% in May.

The data over time since 2008 is pretty interesting, so interesting I decided to make a barely readable graph. Ukrainians’ attitudes to Russia = blue. Russians’ attitudes to Ukraine = yellow/…mustard?

[Graph: attitudes over time, 2008-2016 – Ukrainians towards Russia in blue, Russians towards Ukraine in yellow]

A few observations, if you haven’t got a headache yet from having to squint at this thing:

  1. At no point are Ukrainians’ attitudes towards Russia worse than Russians’ attitudes towards Ukraine, even in the aftermath of the annexation of Crimea and the start of war in Donbas by May 2014. At every single data point Ukrainians have more positive and less negative feelings about Russia than Russians have for Ukraine.
  2. Russians’ attitudes towards Ukraine got really damn low in late 2008/early 2009. A function of Yushchenko’s presidency and the gas disputes?
  3. Once Yanukovych got elected in February 2010, Russians’ attitudes tend to even out (keeping in mind the gaps in actual survey dates in 2011).
  4. Ukrainians’ attitudes have got a bit better towards Russia recently but, not surprisingly, are still far below pre-Maidan levels.
  5. I can’t explain the zigzagging with Russians’ attitudes over 2015/2016. If you can, great.

Russian Duma elections: just how bad was voter turnout?

According to Vladimir Putin, voter turnout in Sunday’s Duma elections – estimated at 39% as I write this – was “not the greatest, but high.” Was it?

I took a look at IDEA’s Voter Turnout Database, which has data on all parliamentary, presidential and European Parliament elections across the world since 1945. Where does a 39% turnout in a national parliamentary/legislative election rank?

Well, for starters:

  • The lowest turnout in an American congressional election was in 2014, at 42.5%. Yes, that’s pretty close to 39% and might make easy fodder for the quick-to-false-equivalence crowd, but keep in mind that:

1. Americans vote in congressional elections every two years (all House of Representatives seats plus 1/3 of the Senate), unlike the rest of us who go every four, five or six years. Voter fatigue much?

2. 2014 was a midterm election (i.e., not voting for a president at the same time), and midterms always have markedly lower turnouts than presidential years. Case in point: 48.6% in 2010 and 42.5% in 2014, versus 64.4% in both 2008 and 2012, the years Obama was elected and re-elected.

  • The lowest turnout in recent Canadian history was in 2008 (59.5%), if anyone other than me cares about Canada as a reference point. We’d had an election less than three years earlier, both producing Stephen Harper (Conservative) minorities – ‘hung parliaments’ for the more British among you.
  • As for the UK, the lowest was 59.4% turning out in 2001 for Tony Blair and Labour’s second straight victory.
  • France’s lowest was 55.4%, in 2012. Since 2002, legislative elections in France fall right after presidential elections (i.e., a month after you vote for president).
  • Next door in Ukraine, the October 2014 Rada elections had a turnout of 52.4%.

It gets worse when you look at the entire data set for parliamentary elections (excluding countries like Australia that have compulsory voting, and leaving out two outlier elections that had [!!!] 2.3% and 100.3% turnout): out of more than 1,400 national parliamentary elections worldwide, only 4% had voter turnouts of 40% or less, and only 11% had turnouts of 50% or less.
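
If you want to reproduce that kind of check, here’s a rough sketch against a CSV export of IDEA’s database – the file name and column labels are guesses standing in for whatever the actual download contains:

```python
import pandas as pd

# Hypothetical CSV export of IDEA's Voter Turnout Database – the file name
# and column labels are stand-ins for whatever the actual download uses.
df = pd.read_csv("idea_turnout.csv")

parl = df[(df["election_type"] == "Parliamentary")
          & (df["compulsory_voting"] == "No")]

# Drop the two outliers mentioned above (2.3% and 100.3% turnout).
parl = parl[parl["turnout"].between(5, 100)]

print(f"{len(parl)} elections in the data set")
print(f"Turnout <= 40%: {(parl['turnout'] <= 40).mean():.0%}")
print(f"Turnout <= 50%: {(parl['turnout'] <= 50).mean():.0%}")
```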

I guess it all depends on what your definition of высокой (“high”) is.

Taking issue with the OSCE SMM’s report on IDPs in Ukraine

Last Friday the OSCE Special Monitoring Mission (SMM) released a report on internal displacement in Ukraine and it seems like they want it to be read by as few people as possible.

The OSCE SMM didn’t just release this report on the last Friday in August – they buried it late in the afternoon on the last Friday in August (17.30 Ukraine time – 10.30 am Eastern in Canada/US). OK?

And looking at when the focus groups and interviews were actually done for the report, it’s not like they didn’t have time to release it when people might actually be paying attention:

“Focus group discussions and individual interviews were conducted between August and November 2015 in 19 regions across Ukraine” [emphasis mine]

Listen, I know there were a lot of focus groups and interviews – 161 groups and 39 individual interviews, to be precise, so more than 1,600 people in total. I’ve been that guy having to organize transcriptions and analysis of piles of focus group and interview findings. It takes time. But you mean to tell me it’s taken no less than nine months to do all this?

If it has, the quality of the report is pretty disappointing. This thing rambles on, with barely a signpost to tell the reader what the most important findings are. We don’t get any stand-alone block quotes from IDPs themselves to help contextualize and understand how they’re coping in new communities. We’re treated to vague discussions of IDP-community relations that could leave a reader thinking they’re far worse than they actually are. We get a conclusion (“Concluding Remarks”) that reads like it was pieced together the morning of (I know, cuz I’ve done it) – a flimsy set of remarks that summarizes almost nothing of substance. If I ever handed a draft report like this to one of my old bosses, I’d have had it handed back to me pretty quickly.

Read this report, then take a look at the UNHCR’s report from a few months ago about IDPs and host communities in Ukraine, and at one of the IOM’s regular reports (they come out every few months). I challenge you to reach a different conclusion than mine: that this report’s a watered-down stream of paragraphs that doesn’t really help us understand IDPs any better.

Should we be surprised? Probably not. According to one former OSCE observer:

“According to established OSCE practice, reports should not provoke major controversies. Instead, they should be politically acceptable to all member states, with the emphasis on ‘balance’ rather than ‘objectivity’. In addition to this approach, I also quickly learned that I was only one of several links in the chain of report preparation. Information provided by OSCE monitoring teams had often already been ‘sterilized’ by the time it reached me. As a result, the reports posted on the OSCE website were often far removed from what I personally wished to include, and what should have been included.”

I think we can add this report to that list.