The original CNN story referred to in that piece is here (“Stoking Islamophobia and secession in Texas — from an office in Russia”); a link to a local Houston TV station’s video from the protest is here.
I read the original story last night (and the Twitterverse’s reaction to it), read it again this morning, and also took a look at the original CNN story and the video.
That’s when I thought something was off.
First, the lede to the ThinkProgress story:
Last May, nearly 100 demonstrators gathered around the Islamic Da’wah Center in downtown Houston, squaring off against one another in competing camps.
I can’t be the only person who quickly glanced at this story and immediately thought something like “wow, ‘the Russians’ got 100 armed white supremacists out via one of these dumbass Facebook groups – that’s better than a dozen.” Actually, in chatting with a few other journos, I know I’m not.
But take a look at the lede to the original CNN story (emphasis mine):
On May 21 2016, a handful of people turned out to protest the opening of a library at an Islamic Center in Houston, Texas. Two held up a banner proclaiming #WhiteLivesMatter. A counter-protest began across the street; video shows a noisy but non-violent confrontation.
OK. Now take a look at the actual video from the protest. You can see the white supremacist protesters at around 0:12 and 0:13 in front of the brick building; the counter-protesters are the larger group across the street.
That’s a handful, not a hundred. And phrasing your lede in a way that many readers will interpret as ‘a hundred’ armed white supremacist protesters is… yeah.
Listen, the Kremlin (or ‘the Russians,’ however we want to obsequiously phrase it) meddled in the 2016 election, just as they have and continue to do in other countries. But there’s this wave of coverage coming out of the US these days that’s hard to stomach, a wave that either assumes every single little thing they did has had a real impact – or worse, conveniently never bothers to ask the question in the first place. We can sex up our stories to ride the Russia wave, but it’s not going to make any of us look cooler once we get back to shore.
The Czech Republic’s most active think-tank has barely criticized, let alone mentioned, its future Prime Minister Andrej Babis – a man who isn’t exactly a shining example of western liberal democracy in action, and who has no shortage of Kremlin-friendly statements under his belt.
He’s the second-richest man in the country, with his Agrofert conglomerate having its hands in everything from fertilizers and farm equipment to two of the largest Czech newspapers and the country’s most popular radio station.
He was recently described to me as “Trump, Berlusconi and Orban all in one.”
Let’s also not forget some of the Russia-related allegations that have been thrown at Babis.
He’s called EU and US sanctions on Russia “nonsense” and said they’re against the country’s economic interests – a line I’ve personally heard from some Kremlin-friendly figures across Europe.
He’s dodged questions on whether Putin bore the blame for annexing Crimea, and has said NATO “cannot stay on this idea that Russia is the biggest problem.”
Under his watch the Czech finance ministry (more accurately, the Czech Export Guarantee Agency (EGAP)), underwrote a loan guarantee to PhosAgro, a Russian company co-owned by Putin pal Vladimir Litvinenko.
In 2007 Babis’ Agrofert tried to negotiate a gas deal with the Czech subsidiary of Gazprom instead of its then-current German supplier.
These aren’t necessarily super-Kremlin smoking guns, but I’d think a group of people who are dedicated to ferreting out Kremlin interference in their country and beyond would at least be asking a few questions about the guy who’s about to run the show.
Sure, Babis is intimidating and is the kind of guy who likes to go after people who talk shit about him – I mean, look at all the corrections Foreign Policy had to add under this 2015 article when Babis went full Babis on them.
I get why you’d want to be on his good side, but European Values isn’t exactly afraid to go after other Czech and European political figures with less-than-subtle language: the German SPD and Sigmar Gabriel (Social Democrats), who want to “please the Kremlin;” the Czech Communists, guilty of “treason” for their broken-record anti-NATO stance; and, not least, Czech president (“rezident”) Milos Zeman, the “Kremlin’s Trojan horse.”
With elections/Babis’ coronation just over a month away I’m surprised European Values doesn’t have anything critical to say about Babis – or, really, anything about him at all.
The most recent round of Eurobarometer stats just came out, and they’re bad news for pretty much anyone in Czech politics right now.
Only 18% of Czechs trust their government right now, a drop of 10 percentage points from autumn 2016 – by far the sharpest decline in the EU – and only Greeks, Italians and Spaniards distrust their government as much as Czechs do.
Still, this isn’t nearly as ugly as the table for the question on trust in parliament…
Nobody in the EU distrusts their country’s parliament more than Czechs do right now, all thanks to the Czech government’s farcical three-part comedy act/political crisis over the past few months (yeah I linked to a Wikipedia article I don’t care).
This level of (dis)trust shows up in recent Czech Public Opinion Research Centre (CVVM) survey data too – their numbers also show that trust in President Miloš Zeman, the government (“Vláda”) and the Chamber of Deputies (“Poslanecká sněmovna,” the lower house of the Czech parliament) has completely tanked.
Worse, look at the way Czech satisfaction with the current political situation has driven right off the cliff after a slow recovery from 2013’s scandals.
And, as if you needed another graph to show how bad it is, look at the drop for both president and government here (the blue and red lines, respectively).
Numbers like this should be worrisome for a country at any time, but remember the Czechs are going to the polls in just under four months to elect a new parliament – and at the polls again not long after to vote for president.
If I were, say, part of an unnamed country’s government efforts to meddle in other countries’ elections, I’d be all over the Czech Republic this summer.
…to that end, another set of numbers I’ve had kicking around for a few weeks from two previous Eurobarometer surveys (both autumn 2016) show just how said unnamed country’s efforts could actually work.
One, Czechs seem to trust social media more than most other Europeans. While it’s still a minority (40% disagreeing that “information on political affairs from online social networks cannot be trusted,” which is a mouthful of a double negative but yeah), it’s also more than any other EU country.
Eurobarometer’s data is free to download for losers like me, so I took a more detailed look at who exactly in the Czech Republic thinks information on politics from social media can be trusted (to the extent the data can tell me – the sample size is ~1,000, so it can’t be parsed all that much, and Eurobarometer IMO doesn’t ask the best questions about education level and offers essentially nothing useful on income or a proxy for it).
As shouldn’t be any surprise, it’s the young: 60% of Czechs aged 15 to 24 disagreed that “information on political affairs from online social networks cannot be trusted” compared to 48% of those 25 to 39, 44% of those 40 to 54 and 27% of those 55 and older. Also interesting are the “don’t knows” – only 5% of 15 to 24 year olds compared to 19% of those 40 to 54 and 39% of those 55 and older.
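The age-band tally above is simple to reproduce once you have the microdata. Here’s a rough sketch run on made-up respondent records rather than the actual Eurobarometer file (the record layout and answer codes are my assumptions, not the real codebook):

```python
# Tally answers to the "cannot be trusted" item by Eurobarometer-style
# age band. Records are invented for illustration.
from collections import Counter

# Each record: (age, answer to "information on political affairs from
# online social networks cannot be trusted")
respondents = [
    (19, "disagree"), (22, "disagree"), (31, "agree"), (37, "disagree"),
    (45, "agree"), (52, "dont_know"), (61, "agree"), (67, "dont_know"),
]

def age_band(age):
    """Bucket ages into the bands used in the post."""
    if age <= 24:
        return "15-24"
    if age <= 39:
        return "25-39"
    if age <= 54:
        return "40-54"
    return "55+"

tally = Counter((age_band(age), answer) for age, answer in respondents)

for band in ("15-24", "25-39", "40-54", "55+"):
    total = sum(n for (b, _), n in tally.items() if b == band)
    disagree = tally[(band, "disagree")]
    print(f"{band}: {disagree}/{total} disagree (i.e., trust social media)")
```

With the real file you’d also apply the survey weights before percentaging, which this sketch skips.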
The same trends show up in different questions about social media – in this one below, for example, Czechs are among the most likely in the EU to think social media is reliable.
Interestingly, 49% of those 15 to 24 years old, 50% of those 25 to 39 and 46% of those 40 to 54 think social media is reliable – in other words, a similar if not identical proportion – but only 31% of those 55+ think social media’s reliable.
One of these Eurobarometer surveys, coincidentally, happened to ask people their attitudes about various countries, including Russia (my mention of Russia is, of course, purely hypothetical and definitely, definitely not related to the “unnamed country” above).
Run what Czechs think about social media (i.e., the agree/disagree question on whether it’s reliable) against what they think of Russia and the results are pretty interesting – Czechs who think social media is reliable also tend to be more positive towards Russia.
[Cross-tab not reproduced here: total “positive” vs. total “negative” attitudes towards Russia, broken down by whether respondents find social media reliable or unreliable.]
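The cross-tab itself is the same kind of two-way count; a minimal sketch, using invented records (the real exercise runs on the Eurobarometer microdata):

```python
# Cross-tabulate social media reliability against attitude toward
# Russia, then percentage within each row. Records are invented.
from collections import Counter

# Each record: (thinks social media is reliable?, attitude toward Russia)
records = [
    (True, "positive"), (True, "positive"), (True, "negative"),
    (False, "positive"), (False, "negative"), (False, "negative"),
]

crosstab = Counter(records)

for reliable in (True, False):
    positive = crosstab[(reliable, "positive")]
    row_total = positive + crosstab[(reliable, "negative")]
    label = "reliable" if reliable else "unreliable"
    print(f"social media {label}: "
          f"{100 * positive / row_total:.0f}% positive toward Russia")
```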
Caveat, though. This question about being positive/negative towards Russia isn’t necessarily a proxy for what they think about the Kremlin or, for that matter, anyone else in the world. It doesn’t necessarily mean the respondent is some sort of zombie “radicalized by Russian propaganda” or even necessarily positive towards the Kremlin or Russia’s foreign policy, etc. Also, some respondents may well have interpreted the question as being positive/negative towards Russian people in general. Still, it’s interesting that the data falls out this way – and falls out this way across many other EU countries – and merits a hell of a lot more study than it’s getting.
There’s a ton more numbers I haven’t mentioned here (e.g., Czechs get more news from websites and trust the Internet more than most other Europeans) that, in all, paint a potentially very ugly picture – a population that increasingly distrusts its politicians and tends to trust social media and the web more than most other people. It’s a disinformation site’s dream.
I’ll forgive you if you don’t remember Bulgaria’s Gallup International (no, not that Gallup – the Gallup International “being sued by Gallup Inc. for using its name without authorization”), who very briefly popped into western headlines in March thanks to a Wall Street Journal article.
Published only days before the election, the article alleged that the pro-Kremlin Bulgarian Socialist Party (BSP) received a secret strategy document proposing, among other things, that they “promote exaggerated polling data” to help build momentum for the party and get them elected. Gallup International was specifically named in the article, and referred to by Bulgaria’s former ambassador to Moscow Ilian Vassilev as one of Moscow’s “Bulgarian proxies”; Vassilev, talking about a poll Gallup International published on an apparent lack of NATO support in Bulgaria, accused the polling agency of making a “wrapped-in-secrecy poll [that] had no details on methodology nor funding sources.”
Of course, the WSJ article wasn’t without its critics, myself included. One journalist in Bulgaria said the WSJ article “completely ignored all the internal political factors that led to the landslide victory of [President] Rumen Radev and focused only on Russian influence in elections,” which ultimately, in his opinion “backfired, instead of there being alarms for (legitimate) concerns about Russian involvement in Bulgaria.” (To say nothing of the role of the “crackpot outfit” at RISI apparently being involved in producing the document.)
Debates about the scope and scale of Russian influence in Bulgaria aside, I decided to take a look at polling data in Bulgaria leading up to the parliamentary election on March 26. Is there anything to support this idea that Gallup International and potentially other Bulgarian pollsters have used and promoted “exaggerated polling data” to benefit the BSP?
There might be.
Here’s a table of all the polls I’ve been able to find in 2017 leading up to the election, with levels of party support noted and the final actual election results at the bottom.
Almost all polls from polling firms like Trend Research (3 of 3), Alpha Research (3 of 3, and, full disclosure, whose managing director I interviewed as part of my April piece for Coda Story) and Estat (2 of 3) showed Boyko Borisov’s GERB having a lead.
To break it down further (remembering that GERB ended up winning by 5.6%):
Trend Research: average GERB lead of 1.93%
Alpha Research: average GERB lead of 2.77%
Estat: average GERB lead of 3.47%
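The per-firm figures above are just the mean of each pollster’s GERB-minus-BSP margin across its published polls. A quick sketch with placeholder margins (not the real 2017 Bulgarian numbers):

```python
# Average each firm's lead; positive margin = GERB lead, negative = BSP.
# The margins below are illustrative placeholders.
from statistics import mean

polls = {
    "Firm A": [2.0, 1.5, 3.1],    # GERB lead, percentage points
    "Firm B": [-0.9, -1.2, 0.6],  # negative = BSP lead
}

for firm, margins in polls.items():
    avg = mean(margins)
    leader = "GERB" if avg > 0 else "BSP"
    print(f"{firm}: average {leader} lead of {abs(avg):.2f} points")
```

Same arithmetic, which is why a firm whose average sits on the other side of zero from everyone else’s stands out so starkly.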
But the story’s pretty different with Gallup International, who one Bulgarian source described to me as “the main political/sociological advisor of the [BSP] at election time.” They published four polls during the campaign, only one of which showed GERB in the lead – their final poll, about a week before the vote, which showed GERB with only a 0.6% lead. Across the campaign, Gallup International’s polls showed an average BSP lead of 0.7%.
Again, GERB ended up winning by 5.6%.
The numbers are even stranger for AFIS, a pollster run by Yuriy Aslanov, a sociologist who has sat on the BSP board. Both its polls showed the BSP in the lead by an average of 1.3%.
Again, GERB ended up winning by 5.6%.
In sum – a pollster that’s been accused of being part of an effort to “promote exaggerated polling data” on behalf of the party it’s linked to consistently showed results that played up support for said party, contradicting most other polling firms’ results throughout the campaign. As a western observer and polling nerd who’s worked for multiple firms in Canada and the UK, I feel I’m in a position to confidently say this isn’t normal.
Of course, this all needs a few caveats. There’s absolutely no evidence of anyone cooking up numbers or anything like that – I am accusing no one of that. I’m also well aware of how statistics work and am well aware of the (however unlikely) possibility that these figures from Gallup International and AFIS are all down to random survey errors or even differences in methodology. Still, something’s off here.
Above all though, I’d stress this message to other journalists who haven’t had the pleasure of drowning a considerable part of their adult lives in SPSS – if some polling numbers look consistently very different from what other polling numbers are saying, ask why. Sometimes it’s an easy answer – a different, new methodology that may or may not pan out, a rogue poll (i.e., the one out of 20) or the polling firm just really sucks. But sometimes it’s not that simple.
OK, I’d seen this article and graph kicking around Twitter for a day or two before I finally looked at it, and I’m both glad and not glad I did.
For anyone who hasn’t already seen it or (like I had) has given it only a cursory weekend glance, the graph is based on an analysis done by Semantic Visions, “a risk assessment company based in Prague” who “conduct…big data (meaning non-structured, large data requiring serious calculations) analyses with the aid of open source intelligence, on the foundation of which they try to identify trends or risk factors.” They also use a “private Open Source Intelligence system, which is unique in its category and enables solutions to a new class of tasks to include geo-political analyses based on Big Data from the Internet.”
The gist in this case: Semantic Visions had algorithms read hundreds of thousands of online sources, including 22,000 Russian ones, searching for different trends.
OK…though as someone who, for some reason, chose to suffer through a media content analysis as a thesis, I have a number of methodology-related questions I don’t want to harp on too much (e.g., how is the algorithm actually designed to determine positive/negative stories vis-à-vis a human? how were the online sources chosen? etc.). A little transparency here would go a long way, proprietary nature of the algorithms notwithstanding.
What gets me is the conclusion they’ve drawn based on the data they’ve gathered and present here in this article.
The article says “the number of Russian articles with a negative tone on Ukraine [from February 2012] started to show a gradual and trend-like increase – while no similar trend can be found in English-language media.”
Yes, your data does show that. Got no problem there.
But then there’s this:
“Therefore, based on hundreds of millions of articles the possibility that the actual events in Ukraine could themselves be the reason for the increasing combativeness of Russian-language articles can be excluded. Moreover, the strongly pro-Russian President Yanukovych was still in government at the time and the similarly Eastern-oriented Party of Regions was in power. The explanation is something else: the Putin administration was consciously preparing for military intervention and the Kremlin’s information war against Ukraine started two years before the annexation of Crimea to turn Russian public opinion against Ukrainians…”
How can someone possibly draw that conclusion based solely on the numbers presented here?? Are you privy to other data or pieces of analyses that aren’t public? Because, based on the data that’s presented here, I see absolutely no justification for the conclusion that the Kremlin “was consciously preparing for military intervention.”
A big part of the explanation for any apparent increase in negative coverage would be the EU Association Agreement being initialed in March 2012, right?
Why start the analysis at June 2011? I’d want to see the tone of coverage compared to the last bit of Yushchenko’s presidency through the beginning of Yanukovych’s – maybe the increase over 2012-2013 isn’t so much an increase as a return to “normal” negative coverage of Ukraine.
(OK, I lied about no more methodology questions) What about positive stories? Were negative stories about Ukraine taking up a greater share of overall coverage, or did the overall number of articles itself increase? Not being transparent on methodological nerdish issues like this really, really doesn’t help, guys.
Please – no more divining of Kremlinological intentions from incomplete, unclear sets of numbers.
KIIS and Levada released results this week from their regular surveys of Russians and Ukrainians and their attitudes towards each other (link to KIIS in Ukrainian, Levada’s link in Russian).
tl;dr: Ukrainians tend to have more positive attitudes towards Russia than vice versa.
Attitudes of Ukrainians towards Russia:
40% of Ukrainians in September 2016 said their attitudes towards Russia were ‘good’ or ‘very good’ (a statistically insignificant change from May 2016)
46% of Ukrainians in September 2016 said their attitudes towards Russia were ‘bad’ or ‘very bad,’ a significant increase from 43% in May 2016
Attitudes of Russians towards Ukraine:
One in four Russians (26%) said their attitudes towards Ukraine were ‘good’ or ‘very good’, a significant drop from 39% in May 2016 – which was itself a significant increase from 27% in February 2016. Some zigzaggin’ goin’ on here.
56% of Russians said their attitudes towards Ukraine were ‘bad’ or ‘very bad,’ a significant increase from 47% in May.
The data over time since 2008 is pretty interesting, so interesting I decided to make a barely readable graph. Ukrainians’ attitudes to Russia = blue. Russians’ attitudes to Ukraine = yellow/…mustard?
A few observations, if you haven’t got a headache yet from having to squint at this thing:
At no point are Ukrainians’ attitudes towards Russia worse than Russians’ attitudes towards Ukraine, even in the aftermath of the annexation of Crimea and the start of war in Donbas by May 2014. At every single data point Ukrainians have more positive and less negative feelings about Russia than Russians have for Ukraine.
Russians’ attitudes towards Ukraine got really damn low in late 2008/early 2009. A function of Yushchenko’s presidency and the gas disputes?
Once Yanukovych got elected in February 2010, Russians’ attitudes tend to even out (keeping in mind the gaps in actual survey dates in 2011).
Ukrainians’ attitudes have got a bit better towards Russia recently but, not surprisingly, are still far below pre-Maidan levels.
I can’t explain the zigzagging with Russians’ attitudes over 2015/2016. If you can, great.
According to Vladimir Putin, voter turnout in Sunday’s Duma elections – estimated at 39% as I write this – was “not the greatest, but high.” Was it?
I took a look at IDEA’s Voter Turnout Database, which has data on all parliamentary, presidential and European Parliament elections across the world since 1945. Where does a 39% voter turnout in a national parliamentary/legislative election rank?
Well, for starters:
The lowest turnout in an American congressional election came in 2014, at 42.5%. Yes, that’s pretty close to 39% and might make easy fodder for the quick-to-false-equivalence crowd, but keep in mind that:
1. Americans vote in congressional elections every two years (all House of Representatives seats plus 1/3 of the Senate), unlike the rest of us who go every four, five or six years. Voter fatigue much?
2. 2014 was a midterm election (i.e., no presidential vote at the same time), and midterms always have markedly lower turnouts than presidential years. Case in point: 48.6% in 2010 and 42.5% in 2014, versus 64.4% in both 2008 and 2012, the years Obama was elected and re-elected.
The lowest turnout in recent Canadian history was 2008 (59.5%), if anyone other than me cares about Canada as a reference point. We’d had an election less than three years before, both producing Stephen Harper (Conservative) minorities – ‘hung parliaments’ for the more British among you.
As for the UK, the lowest was 59.4% turning out in 2001 for Tony Blair and Labour’s second straight victory.
France’s lowest was 55.4% in 2012. Legislative elections, since 2002, fall right after presidential elections in France (i.e., a month after you vote for president).
Next door in Ukraine, the October 2014 Rada elections had a turnout of 52.4%.
It gets worse when you look at the entire data set for parliamentary elections (excluding countries like Australia that have compulsory voting, and leaving out two outlier elections that had [!!!] 2.3% and 100.3% turnout)… out of more than 1,400 national parliamentary elections worldwide, only 4% had voter turnouts of 40% or less. Only 11% even had voter turnouts of 50% or less.
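That distribution check is easy to replicate against the IDEA export. A sketch with stand-in turnout figures (the real database has its own column layout, which I’m not assuming here):

```python
# What share of elections fall at or below a turnout threshold?
# The turnout figures below are illustrative stand-ins, not IDEA data.
turnouts = [39.0, 42.5, 52.4, 55.4, 59.4, 59.5, 64.4, 71.0, 78.2, 85.6]

def share_at_or_below(values, threshold):
    """Percentage of elections with turnout <= threshold."""
    hits = sum(1 for v in values if v <= threshold)
    return 100 * hits / len(values)

print(f"<= 40% turnout: {share_at_or_below(turnouts, 40):.0f}% of elections")
print(f"<= 50% turnout: {share_at_or_below(turnouts, 50):.0f}% of elections")
```

On the real data you’d first filter out compulsory-voting countries and the two outlier rows, as described above.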
I guess it all depends on what your definition of высокой (“high”) is.