Another quick look at Bulgaria’s “Gallup International” (aka Part II)


I’ll forgive you if you don’t remember Bulgaria’s Gallup International (no, not that Gallup – the Gallup International “being sued by Gallup Inc. for using its name without authorization”), who very briefly popped into western headlines in March thanks to a Wall Street Journal article.

Published only days before the election, the article alleged that the pro-Kremlin Bulgarian Socialist Party (BSP) received a secret strategy document proposing, among other things, that they “promote exaggerated polling data” to help build momentum for the party and get them elected. Gallup International was specifically named in the article, and referred to by Bulgaria’s former ambassador to Moscow Ilian Vassilev as one of Moscow’s “Bulgarian proxies”; Vassilev, talking about a poll Gallup International published on an apparent lack of NATO support in Bulgaria, accused the polling agency of making a “wrapped-in-secrecy poll [that] had no details on methodology nor funding sources.”

Of course, the WSJ article wasn’t without its critics, myself included. One journalist in Bulgaria said the WSJ article “completely ignored all the internal political factors that led to the landslide victory of [President] Rumen Radev and focused only on Russian influence in elections,” which ultimately, in his opinion “backfired, instead of there being alarms for (legitimate) concerns about Russian involvement in Bulgaria.” (To say nothing of the role of the “crackpot outfit” at RISI apparently being involved in producing the document.)

Debates about the scope and scale of Russian influence in Bulgaria aside, I decided to take a look at polling data in Bulgaria leading up to the parliamentary election on March 26. Is there anything to support this idea that Gallup International and potentially other Bulgarian pollsters have used and promoted “exaggerated polling data” to benefit the BSP?

There might be.

Here’s a table of all the polls I’ve been able to find in 2017 leading up to the election, with levels of party support noted and the final actual election results at the bottom.

[Chart: Bulgarian polls ahead of the March 26 election]

(larger pdf of same chart here: Bulgaria polls)

Almost all polls from polling firms like Trend Research (3 of 3), Alpha Research (3 of 3, and, full disclosure, whose managing director I interviewed as part of my April piece for Coda Story) and Estat (2 of 3) showed Boyko Borisov’s GERB having a lead.

To break it down further (remembering that GERB ended up winning by 5.6%):

  • Trend Research: average GERB lead of 1.93%
  • Alpha Research: average GERB lead of 2.77%
  • Estat: average GERB lead of 3.47%

But the story’s pretty different with Gallup International, who one Bulgarian source described to me as “the main political/sociological advisor of the [BSP] at election time.” They published four polls during the campaign, only one of which showed GERB in the lead – their final poll, about a week before the vote, which gave GERB a mere 0.6% edge. Across the campaign, Gallup International’s polls showed an average BSP lead of 0.7%.

Again, GERB ended up winning by 5.6%.

The numbers are even stranger for AFIS, a pollster run by Yuriy Aslanov, a sociologist who has sat on the BSP board. Both its polls showed the BSP in the lead by an average of 1.3%.

Again, GERB ended up winning by 5.6%.
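To put the gap in perspective, here’s a quick sketch comparing each firm’s average lead, as listed above, against the actual 5.6-point GERB margin:

```python
# Average GERB-over-BSP leads from the polls above (negative = BSP lead),
# compared against the final result: GERB won by 5.6 points.
FINAL_GERB_MARGIN = 5.6

avg_gerb_lead = {
    "Trend Research": 1.93,
    "Alpha Research": 2.77,
    "Estat": 3.47,
    "Gallup International": -0.7,  # average BSP lead of 0.7
    "AFIS": -1.3,                  # average BSP lead of 1.3
}

# How far each firm's average lead was from the actual margin, in points
miss = {firm: round(FINAL_GERB_MARGIN - lead, 2)
        for firm, lead in avg_gerb_lead.items()}

for firm, pts in sorted(miss.items(), key=lambda kv: kv[1]):
    print(f"{firm}: off by {pts} points")
```

The two BSP-linked pollsters missed the final margin by roughly two to three times as much as the other firms – exactly the pattern described above.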

In sum – a pollster that’s been accused of being part of an effort to “promote exaggerated polling data” on behalf of the party it’s linked to consistently showed results that played up support for said party, contradicting most other polling firms’ results throughout the campaign. As a western observer and polling nerd who’s worked for multiple firms in Canada and the UK, I feel I’m in a position to confidently say this isn’t normal.

Of course, this all needs a few caveats. There’s absolutely no evidence of anyone cooking up numbers or anything like that – I am accusing no one of that. I’m also well aware of how statistics work and am well aware of the (however unlikely) possibility that these figures from Gallup International and AFIS are all down to random survey errors or even differences in methodology. Still, something’s off here.

Above all though, I’d stress this message to other journalists who haven’t had the pleasure of drowning a considerable part of their adult lives in SPSS – if some polling numbers look consistently very different from what other polling numbers are saying, ask why. Sometimes it’s an easy answer – a different, new methodology that may or may not pan out, a rogue poll (i.e., the one out of 20) or the polling firm just really sucks. But sometimes it’s not that simple.
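For the non-SPSS-drowned: the “one out of 20” refers to the 95% confidence level most pollsters use – about one poll in twenty will miss outside its stated margin of error by chance alone. A rough sketch of that margin for a typical ~1,000-person sample:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a single proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A party polling at 30% in a 1,000-person survey:
moe = margin_of_error(0.30, 1000)
print(f"±{moe * 100:.1f} points")  # roughly ±2.8 points

# The margin on the *lead* between two parties is larger still
# (roughly double, since both estimates carry error), so one poll
# showing a 1-2 point lead is close to a statistical tie -- but a
# whole series of polls leaning the same way is another story.
```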

Surveying some surveys: Czechs & refugees, immigrants and Islam


I’ve been spurred on by what I guess we can call some, um, “colourful” comments on Coda Story’s recent animation of my January story on Islamophobia in the Czech Republic to take a look at some recent public opinion data.

“Unsympathetic” towards Arabs

The Czech Public Opinion Research Centre (CVVM) asked a few questions in their March 2017 survey of ~1,000 Czechs about attitudes towards people from different nationalities/ethnic groups, including Arabs (who I think we can agree in most Czech minds means “Muslims”). They’re right at the bottom.


The numbers for Arabs look even worse over time…

Mean scores 1-5, where 1 is “very sympathetic” and 5 is “very unsympathetic.”

No other group has seen anything like this; as the CVVM’s summary report points out, the percentage of those saying they’re “very unsympathetic” (i.e., 5 on the 5-point scale) towards Arabs has gone up by 18 percentage points since 2014. Fortunately (?) that increase seems to have flatlined since 2016.

There’s also a few demographic differences of note in the CVVM’s summary report: 41% of those who declared a good standard of living said they were “very unsympathetic” towards Arabs compared to 55% who declared a poor standard of living; 36% of those with a higher level of education said they were “very unsympathetic” compared to 47% who had an apprenticeship. Still, it’s clear that a lack of sympathy towards Arabs is pretty strong among all parts of the Czech population.

Unfortunately the raw data set isn’t yet publicly available for me to screw around with, so I took a look at the raw data from last year’s survey (March 2016) to see if there were any other differences of note that might (or might not) be seen in the 2017 data. There weren’t many:

  • Men had a slightly more negative score on average than women (4.26 compared to 4.15), and were more likely to say they were “very unsympathetic” towards Arabs (49% compared to 44%) [both p < 0.1, which means it’s barely worth mentioning IMO, but I’ve still done it so deal with it.]
  • Czechs aged 15-29 (41%) were less likely than those 45-59 (50%) or 60+ (50%) to say they were “very unsympathetic” towards Arabs [p < 0.05]
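Those p-values come from comparing two proportions; below is a minimal version of the kind of two-proportion z-test involved. The exact subgroup sizes from the CVVM data set aren’t reproduced here, so the sample sizes in the example are illustrative only:

```python
import math

def two_prop_z_test(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Men vs women saying "very unsympathetic" (49% vs 44%); the subgroup
# ns of ~480/~520 are illustrative, not the survey's exact splits
z, p = two_prop_z_test(0.49, 480, 0.44, 520)
print(f"z = {z:.2f}, p = {p:.3f}")  # p lands near the 0.1 "barely worth mentioning" zone
```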

Fear of immigrants

CVVM also released some analysis yesterday from the March 2017 survey on attitudes towards foreigners in general – 64% of Czechs feel that newly-arrived immigrants are a problem for the Czech Republic as a whole. This figure’s shot up since last year, though it had dipped after 2015 following a slow rise from 2011.

[Chart: share of Czechs who see newly-arrived immigrants as a problem for the country, 2011-2017]

CVVM also asked a few specific questions about the impact people think immigrants have on their country, and the results over time here have seen a drastic change. The belief that immigrants contribute to unemployment has dropped by 12% since 2016 (not that surprising in a country with low unemployment) and, as you can see below, the belief that immigrants are a threat to the Czech way of life has increased.

[Chart: perceived impact of immigrants on the Czech way of life over time]

A reason for those “Refugees not welcome” stickers I’ve seen

The most recent round of the Eurobarometer surveys (November 2016) asked ~1,000 Czechs whether they think their country should help refugees. Czechs were the second most likely of any EU country, behind Bulgaria, to say their country shouldn’t help refugees – only 23% agreed it should help versus 72% who disagreed (EU average: 66% agree, 28% disagree).

Here, as with the CVVM surveys, there’s a few demographic breakdowns of note that I analyzed using the raw data:

  • Czechs who finished full-time education between the ages of 16 and 19 were less likely to agree the Czech Republic should help refugees (20%) compared to those who finished full-time education at 20 years old or older (31%)[p<0.01]
  • Czechs in rural areas (18%) were less likely than those in towns and suburbs (24%) and cities (28%) to agree the Czech Republic should help refugees [p<0.05]

Again, despite these differences, Czechs across all social divides tend not to think their country should help refugees…

Czech and Islam by the numbers, Parts 1 and 2

Last fall I analyzed European Social Survey (ESS) data from 2014 on Czech attitudes towards Muslims living in their country. Part 1, and Part 2.

If you’ve been following along, nothing here will surprise you. Who doesn’t want any Muslims to come live in the Czech Republic (i.e., who’s less likely to want them)? Those who:

  • Feel unsafe after dark
  • Have the least contact with different races or ethnic groups
  • Feel the government treats new immigrants better than them
  • Distrust social/political institutions
  • Feel they have less ability to influence politics and have a say

Conversely, Czechs who had friends of different races and/or ethnic groups were more likely to be supportive of Muslims coming and living in the country.

A quick look at Bulgaria’s “Gallup International”


Something in this WSJ story about Russian meddling in Bulgaria caught my eye – the bits about “exaggerated polling data” and this rather curiously-named polling firm:

“A Bulgarian polling company, Gallup International, which isn’t related to U.S. pollster Gallup Inc., accurately predicted Mr. Radev’s victory. The company, which is being sued by Gallup Inc. for using its name without authorization, also co-published a February poll that said citizens of four NATO members, including Bulgaria, would choose Russia, rather than NATO, to defend them if they were attacked. Those results were at odds with a similar poll by Gallup Inc., published a few days earlier, showing that most NATO members in Eastern Europe, including Bulgaria, see the alliance as protection.

Gallup International didn’t respond to requests for comment.

“This wrapped-in-secrecy poll had no details on methodology nor funding sources,” said Ilian Vassilev, Bulgaria’s former ambassador to Moscow. “Russian media strategists and their Bulgarian proxies used the Western name to fool people about its credibility and spread their message.””

So I decided to take a quick look at Gallup International’s website, here. Without digging into the actual polls they do themselves, there’s not much to comment on.

Things are a bit more interesting on the affiliations page.

[Screenshot of Gallup International’s affiliations page, March 24, 2017]

First on their list of affiliations is WIN/Gallup International, “made up of the 80 largest independent market research and polling firms in their respective countries.” It’s an affiliation I’ve never come across, and it has several members I’ve either never heard of or who are minor players in their respective countries (not counting Leger, who’s well established in Canada and, coincidentally, also part of ESOMAR), which I guess underscores the “independent” part of the description.

But it’s the ESOMAR bit which interests me most. ESOMAR is a large, well-respected international network of market research firms who abide by a code and guidelines on market research. Not everyone’s a member of ESOMAR (in fact, none of the big firms I worked for were members) but it’s a reassuring thing to see on a polling firm’s site if you know nothing about them.

But this is what you get when you do a search for Gallup International in the ESOMAR directory.

[Screenshot of the ESOMAR directory search for Gallup International, March 24, 2017]
Hmm.

I double-checked the list of Bulgarian members to make sure I didn’t miss them. There are eight Bulgarian members of ESOMAR, according to the directory. Gallup International isn’t one of them.



Ten things you should know about TB in Ukraine


Did you know Friday is World Tuberculosis Day? You do now.

Ukrainian? In Ukraine? A Ukraine-watcher, whatever that means? Here’s a list of ten things you should know about TB in Ukraine:

  1. The TB incidence rate in Ukraine in 2016 was 67.6 per 100,000 persons – which, for perspective, is anywhere from ten to twenty times the rate in countries like the US, the UK or Canada (to say nothing of the absurdly high rates among First Nations and Inuit in Canada, but I digress).
  2. Fewer Ukrainians were diagnosed with TB in 2016 than in 2015 – a 4.3% decrease in the number of new diagnoses. Good.
  3. Ukraine has, alongside Russia, a spot on the World Health Organization’s (WHO) list of 20 countries with the highest estimated burdens of multidrug-resistant tuberculosis (MDR-TB). There are more than 8,000 new cases of MDR-TB registered in Ukraine every year, and the number is increasing. That’s bad.
  4. Anyone can get TB in Ukraine, including children – especially if you don’t vaccinate them. OMFG BCG vaccine pls FFS.
  5. TB’s still a disease concentrated in at-risk groups in Ukraine. According to stats from Ukraine’s Public Health Center (thankfully renamed from the unwieldy “Ukrainian Center for Social Disease Control of the Ministry of Healthcare of Ukraine”), around 70% of new TB cases in 2014 were in so-called “socially vulnerable groups” like unemployed people of working age and drug/alcohol abusers. (NB these are the most recent breakdowns they seem to have, but I don’t see any reason why they would’ve changed at all over 2015/16.)
  6. One of the major groups of people at risk of TB, particularly MDR-TB, is people with HIV/AIDS. As I wrote about earlier this week, more than half (52%) of deaths from AIDS-related causes in Ukraine last year were from TB – much higher than the one-third of deaths globally from TB in people with AIDS.
  7. HIV/TB co-infection is increasing in Ukraine – a “noticeable increase” according to the Public Health Center, rising year-on-year from 2013. All this “[reflects] the increasing burden of HIV infection in the country.”
  8. There aren’t any numbers on TB, HIV or anything coming out of the non-Ukrainian-government-controlled parts of Donetsk and Luhansk oblasts (“DNR”/”LNR”), but everyone assumes the TB situation there is pretty bad. One senior international official I spoke to last month told me “we hear about used needles, terrible conditions there” with the at-risk population in the east – largely in and around Donetsk, which has long been an HIV hotspot in Ukraine. “I’d say of course HIV is growing there, TB is growing there, because the conditions in which they are spending time in is terrible,” this official told me.
  9. As Oksana Grytsenko reported in the Kyiv Post a few days ago, Ukraine struggles to provide effective TB treatment. Read her piece. No point in me rehashing it here, other than to add this quote from the Public Health Center: “Especially dangerous is the untimely addresses for medical assistance, late TB diagnostics, and HIV/TB co-infection, which causes a high level of mortality due to TB and results from the lack of a comprehensive approach to the combination of preventive and treatment programs at the national and regional levels into a single system of counteraction”
  10. There’s cause for some cautious optimism, I think. To plug again what I wrote about HIV earlier this week, state funding for TB treatment is being increased in 2017, and activists I spoke to seemed confident that the Ministry of Health and the government as a whole are (re)recognizing HIV/TB as a priority. Still, we’ll see.

Seven statistics for Ukraine-watchers


As Ukrainians remember the bloodiest days of the revolution three years ago, I’ve gone back into the last few months of poll/survey data and pulled out a few numbers that I think are worth keeping in mind, particularly for westerners and outsiders like me who are desperately trying to understand: what do Ukrainians think?

1. Barely anyone thinks life’s got better since Euromaidan

Some discomfiting numbers from a Sofia poll in November – 82% of Ukrainians think their lives have gotten worse since Euromaidan (29% ‘a little worse’; 53% (!!) ‘much worse’). Only 5% think life has improved.

2. Most Ukrainians think the country’s going in the wrong direction

From the same Sofia poll – 73% of Ukrainians think the country’s going in the wrong direction (30% ‘generally in the wrong direction’; 43% ‘definitely in the wrong direction’).

3. Barely anyone trusts the President, Rada, political parties or any politician at all, for that matter

And it’s gotten worse. As I wrote in December about a Razumkov Centre and the Ilko Kucheriv Democratic Initiatives Foundation year-end poll:

“…trust in the president (49% in 2014 to 24% in 2016) and in the Rada (31% to 12%) has tanked while trust in political parties (11% in 2016) is even lower. I haven’t graphed it out here but there’s obviously also been a corresponding increase in those who say they distrust the President (44% 2014: 69% 2016), the Rada (57% 2014: 81% 2016) and political parties (71% 2014: 78% in 2016). Keep in mind too that not a single individual Ukrainian politician is more trusted than distrusted (pages 5 and 6, question 7), so, ouch.”

4. Barely anyone’s satisfied with the President, Rada, etc.

In the aforementioned Sofia poll in November, 75% of Ukrainians disapproved of the job Poroshenko’s doing, and in a Rating poll from December 82% of Ukrainians surveyed said they were dissatisfied with him. The numbers for Prime Minister Volodymyr Hroisman and Speaker of the Rada Andriy Parubiy aren’t much better – 78% and 82% dissatisfied, respectively.

5. Some Ukrainians still say the Euromaidan was ‘an illegal armed coup’, though most disagree

This was a fascinating survey by KIIS for Detektor Media trying to unpack the influence of Russian propaganda in Ukraine. One of the tropes we’re all familiar with is that Euromaidan was totally some kind of Nazi-fascist-Junta-Banderite-Victoria Nuland’s cookies-Soros-Obama-NATO-CIA-drugged tea-EU coup (take your pick), and a good number of Ukrainians, it seems, buy it: 34% of Ukrainians across the country agreed with the statement that ‘the events of 2014 in Kyiv were an illegal armed coup’, with numbers higher in the south (51%) and east (57%).

On the other hand, most Ukrainians (56%) agreed that ‘the events of 2014 in Kyiv were a peoples’ revolution’, with numbers highest in the west (81%) and centre (61%) of the country.

Weirdest, though, are the 9% of people who said ‘the events in Kyiv’ were both an ‘illegal armed coup’ and ‘a peoples’ revolution’. Yeah, I don’t get that.

6. Ukrainians don’t feel all that comfortable with their personal/family financial situations

A more recent poll from Rating showed that “half of…respondents considered their family’s financial status to be unsatisfactory whilst only 15% deemed that they had satisfactory finances for life, and one-third declared themselves to be at poverty level. The highest number of poor people being recorded in the East, among older people and those with a low education level.” [my bold]

7. Are there any silver linings here at all or just a list of depressing statistics?

Here’s an attempt to find a relevant silver lining from the Razumkov Centre and the Ilko Kucheriv Democratic Initiatives Foundation‘s year-end poll – the new patrol police are more trusted than mistrusted (46% trust, 41% mistrust), and the old militsiia are a bit less mistrusted than they used to be (23% trust in 2016, 11% in 2015 and 16% in 2014).

Feel free to look through the polls I’ve linked to here and tell me what you think I’ve missed.



On “First the journalists, then tanks and bombs”


OK, I’d seen this article and graph kicking around Twitter for a day or two before I finally looked at it, and I’m both glad and not glad I did.

This impressive-looking graph. You’ve seen it, right?

For anyone who hasn’t already seen it or (like I had) has given it only a cursory weekend glance, the graph is based on an analysis done by Semantic Visions, “a risk assessment company based in Prague” who “conduct…big data (meaning non-structured, large data requiring serious calculations) analyses with the aid of open source intelligence, on the foundation of which they try to identify trends or risk factors.” They also use a “private Open Source Intelligence system, which is unique in its category and enables solutions to a new class of tasks to include geo-political analyses based on Big Data from the Internet.”

OK, cool.

The gist in this case: Semantic Visions had algorithms read hundreds of thousands of online sources, including 22,000 Russian ones, searching for different trends.

OK…though as someone who chose to suffer through a media content analysis as a thesis for some reason I have a number of methodology-related questions I don’t want to harp too much on (e.g., how is the algorithm actually designed to determine positive/negative stories vis-à-vis a human? how were the online sources chosen? etc.). A little transparency here would go a long way, proprietary nature of the algorithms notwithstanding.

What gets me is the conclusion they’ve drawn based on the data they’ve gathered and present here in this article.

The article says “the number of Russian articles with a negative tone on Ukraine [from February 2012] started to show a gradual and trend-like increase – while no similar trend can be found in English-language media.”

Yes, your data does show that. Got no problem there.

But it’s this (my emphasis in bold):

“Therefore, based on hundreds of millions of articles the possibility that the actual events in Ukraine could themselves be the reason for the increasing combativeness of Russian-language articles can be excluded. Moreover, the strongly pro-Russian President Yanukovych was still in government at the time and the similarly Eastern-oriented Party of Regions was in power. The explanation is something else: the Putin administration was consciously preparing for military intervention and the Kremlin’s information war against Ukraine started two years before the annexation of Crimea to turn Russian public opinion against Ukrainians…”

How can someone possibly draw that conclusion based solely on the numbers presented here? Are you privy to other data or pieces of analysis that aren’t public? Because, based on the data that’s presented here, I see absolutely no justification for the conclusion that the Kremlin “was consciously preparing for military intervention.”


  • A big part of the explanation for any apparent increase in negative coverage would be the EU Association Agreement being initialed in March 2012, right?
  • Why start the analysis at June 2011? I’d want to see the tone of coverage compared to the last bit of Yushchenko’s presidency through the beginning of Yanukovych’s – maybe the increase over 2012-2013 isn’t so much an increase as a return to “normal” negative coverage of Ukraine.
  • (OK, I lied about no more methodology questions) What about positive stories? Were negative stories about Ukraine taking up a greater share of overall coverage, or did the overall number of articles itself increase? Not being transparent on methodological nerdish issues like this really, really doesn’t help, guys.
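That last point matters more than it sounds: a rising count of negative articles can simply reflect more coverage overall, not a more negative tone. A toy illustration of the difference (the counts are made up for illustration, not Semantic Visions’ data):

```python
# Hypothetical monthly article counts -- illustrative only
months = {
    # month: (negative articles, total articles)
    "2012-02": (200, 1000),
    "2013-02": (400, 2200),
}

for month, (neg, total) in months.items():
    share = neg / total
    print(f"{month}: {neg} negative articles, {share:.0%} negative share")

# The raw negative count doubled (200 -> 400), which looks like an
# escalating campaign -- but the negative *share* actually fell
# (20% -> ~18%), because total coverage more than doubled. Without
# knowing which of the two the graph plots, you can't tell these apart.
```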

Please – no more divining of Kremlinological intentions from incomplete, unclear sets of numbers.

Radical Party vs. far-right party support in Ukraine: how similar is it?

Radical Party vs. far-right party support in Ukraine: how similar is it?

(the title of this post explains it. there’s hockey on so I can’t be arsed with a preamble)

I did a bit of analysis using the most recent publicly available Ukraine data set I have – the May 2016 edition of the KIIS Omnibus survey of 2,000 Ukrainians in government-controlled parts of the country (as always, no Crimea, no unrecognized “DNR”/”LNR” statelets). The September 2016 data set will be available in February, they tell me.

I grouped together supporters of the three far-right parties that showed up in the May survey (Svoboda, Pravyi Sektor and Yarosh’s Governmental Initiative) and compared them to Radical Party supporters to see how similar they are.

The answer? Not very.

Age: Radical Party supporters tend to be older (average age 52.2 years old) than far-right party supporters (45.8 years old) [p≤.01].

Settlement size: Radical Party supporters are more likely to live in communities of less than 100,000 people (16.9%) than in cities with more than 100,000 people (10.3%) [p≤.01] – but there’s no significant difference for far-right party supporters.

Urban/rural: Obviously related to settlement size, Radical Party supporters were more likely to be from a rural area (18.6% compared to 11.1% urban; p≤.01). Again, there’s no significant difference for far-right party supporters, even if the numbers appear to slightly skew rural.

Attitudes towards Russia: Shocking no one, far-right party supporters are more likely to have bad/very bad views towards Russia (16.9% compared to 3.7% ‘good/very good’)[p≤.01]. Not so for Radical Party supporters – there’s no significant difference.

Will Ukraine be better/worse?: Far-right party supporters are more likely to think that the situation in Ukraine will be better (14.6%) or the same (15.7%) in a year’s time, compared to 6.9% ‘worse’ [p≤.01]. There’s no significant difference for Radical Party supporters.

Ukraine’s leaders moving country in right/wrong direction?: While Radical Party supporters are more likely to think the country’s going in the wrong direction (15.8% compared to 6.5% ‘right direction’; [p≤.01]), there’s no significant difference for far-right party supporters.

Perceived income: I’ve split the perceived income question in two (there’s five categories, with almost no one picking the fifth, ‘richest’ category) – think of it like ‘perceived lower income’ versus ‘perceived higher income’.

With that in mind, Radical Party supporters were more likely to describe themselves as part of that lower-income group; their household situations tend to be “lacking money for food” or “enough money for food but not for clothes” compared to having enough money for clothes or to buy expensive things (16.6% compared to 7.9%)[p≤.01]. As for far-right party supporters, there were no significant differences here.
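For the curious, that five-into-two recode is a one-liner in practice. A sketch (the category labels are paraphrased, not the KIIS questionnaire’s exact wording):

```python
# Paraphrased five-point perceived-income scale -> two broad groups
RECODE = {
    "not enough money for food": "lower income",
    "enough for food, not for clothes": "lower income",
    "enough for clothes, not expensive goods": "higher income",
    "can afford expensive goods": "higher income",
    "can afford anything": "higher income",  # almost nobody picks this
}

responses = ["enough for food, not for clothes", "can afford expensive goods"]
print([RECODE[r] for r in responses])  # ['lower income', 'higher income']
```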

Reported income: Radical Party supporters were more likely to report they earned less than 3000 UAH a month (16.6% compared to 10.1% more than 3,000 UAH/month)[p≤.05], while there was no significant difference for far-right party supporters. This question, FWIW, doesn’t appear on all KIIS surveys.

Region: Far-right support tends to come from western Ukraine (21.7%), compared to 7.0% in central Ukraine, 5.3% in southern Ukraine and 3.1% in eastern Ukraine [p≤.01]. Radical Party support, on the other hand, is actually pretty even across western, central and southern Ukraine (14%-16%).

Education: Like with perceived income I’ve had to split education into two broad categories – ‘lower-educated’ and ‘higher-educated’. Using those categories Radical Party supporters tend to be lower-educated (19.3% compared to 11.8%)[p≤.01], while no significant difference appears for far-right party supporters.

Gender: Far-right party supporters are more likely to be men (14.7% compared to 7.2%) – no such significant difference appears for Radical Party supporters.


Based on this data, far-right party supporters and Radical Party supporters don’t look too much alike.

  • Far-right party supporters, relatively speaking, are young, predominantly male and concentrated in western Ukraine and have much more negative attitudes towards Russia.
    • Are they more concentrated in rural areas? They may well be, but the stats from this survey alone don’t allow me to draw that conclusion. If they are I suspect it’s a weaker relationship than for Radical Party supporters.
  • Radical Party supporters, on the other hand, tend to be older Ukrainians who live predominantly in rural areas across different regions of Ukraine, have lower levels of income and education and feel more pessimistic about where Ukraine’s headed.


1. This is one poll, taken at one point in time more than seven months ago. I want to repeat this with more recent data to see if these trends hold or whether new ones emerge (or with different data if someone wants to give it to me).

2. The small sample size of decided voters (less than half of the original sample of 2,000 Ukrainians) really inhibits the amount of analysis I can run – thus why you see some of these oversimplified ‘higher/lower’ categories. This means some of the possible nuances between the cracks don’t get captured (e.g., between four levels of perceived income or education, etc.). It also means that some differences that weren’t statistically significant here could show up as significant in different, larger surveys.

3. (2a?) The sample size isn’t remotely big enough to try and do more complex analysis (e.g., logistic regression) to determine what variables (e.g., gender, age, etc.) make the biggest impact on far-right or Radical Party support. Bah.
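On the logistic-regression point: the common rule of thumb is roughly 10+ “events” (here, supporters of the party being modelled) per predictor variable. A quick sanity-check sketch (the supporter count below is a hypothetical, not the survey’s actual figure):

```python
def max_predictors(n_events: int, epv: int = 10) -> int:
    """Rule-of-thumb cap on predictors for a logistic regression:
    roughly `epv` events per predictor (10 is the usual floor)."""
    return n_events // epv

# Hypothetical: ~90 far-right supporters among the decided voters
n_far_right = 90
print(max_predictors(n_far_right))  # prints 9
# ...and interaction terms or multi-category dummies (region, education)
# eat into that budget fast, which is why "Bah."
```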

Thoughts welcome, errata mine.