Recent Czech survey data + elections = a disinformation site’s dream

The most recent round of Eurobarometer stats just came out, and they’re bad news for pretty much anyone in Czech politics right now.

Only 18% of Czechs trust their government right now, a decline of 10% from autumn 2016 – by far the sharpest decline in the EU – and only Greeks, Italians and Spaniards distrust their government as much as Czechs do.

[Chart: Eurobarometer, trust in national governments by country]
Oy.

Still, this isn’t nearly as ugly as the table for the question on trust in parliament…

[Chart: Eurobarometer, trust in national parliaments by country]
Double oy.

Nobody in the EU distrusts their country’s parliament more than Czechs do right now, all thanks to the Czech government’s farcical three-part comedy act/political crisis over the past few months (yeah I linked to a Wikipedia article I don’t care).

This level of (dis)trust shows up in recent Czech Public Opinion Research Centre (CVVM) survey data too – their numbers also show that trust in President Miloš Zeman, the government (“Vláda”) and the Chamber of Deputies (“Poslanecká sněmovna,” the lower house of the Czech parliament) has completely tanked.

[Table: CVVM, May 2017]
“Table 1a: Population’s confidence in constitutional institutions (%) – comparison over time”

Worse, look at the way Czech satisfaction with the current political situation has driven right off the cliff after a slow recovery from 2013’s scandals.

[Graph: CVVM]
“Graph 4: Satisfaction with the current political situation from 2011-2017 (Satisfaction Index 0-100)”

And, as if you needed another graph to show how bad it is, look at the drop for both president and government here (the blue and red lines, respectively).

[Graph: CVVM]
“Graph 3: Confidence in institutions 2011-2017 (confidence index)”

Numbers like this should be worrisome for a country at any time, but remember the Czechs are going to the polls in just under four months to elect a new parliament – and at the polls again not long after to vote for president.

If I were, say, part of an unnamed government’s efforts to interfere and meddle in other countries’ elections, I’d be all over the Czech Republic this summer.

…to that end, another set of numbers I’ve had kicking around for a few weeks, from two previous Eurobarometer surveys (both autumn 2016), shows just how said unnamed country’s efforts could actually work.

One, Czechs seem to trust social media more than most other Europeans. While it’s still a minority (40% disagreeing that “information on political affairs from online social networks cannot be trusted,” which is a mouthful of a double negative, but yeah), it’s also a higher share than in any other EU country.

[Chart: Eurobarometer, whether information on political affairs from online social networks can be trusted, by country]

Eurobarometer’s data is free to download for losers like me, so I took a more detailed look at who exactly in the Czech Republic thinks information on politics from social media can be trusted (to the extent the data can tell me – the sample size is ~1,000, so it can’t be sliced all that finely, and Eurobarometer IMO doesn’t have the best questions about education level and essentially nothing useful on income or a proxy for it).

As shouldn’t be any surprise, it’s the young: 60% of Czechs aged 15 to 24 disagreed that “information on political affairs from online social networks cannot be trusted” compared to 48% of those 25 to 39, 44% of those 40 to 54 and 27% of those 55 and older. Also interesting are the “don’t knows” – only 5% of 15 to 24 year olds compared to 19% of those 40 to 54 and 39% of those 55 and older.
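For the similarly data-inclined, breakdowns like this take only a few lines of pandas once you’ve downloaded the microdata. A minimal sketch – the column names are placeholders, not the actual Eurobarometer variable codes, and a proper run would apply the survey weights rather than raw counts:

```python
import pandas as pd

# Placeholder column names – rename to the actual Eurobarometer variable codes
# after downloading the microdata (Czech respondents only, ~1,000 rows).
df = pd.read_csv("eurobarometer_cz.csv")

# Bucket respondents into the age bands used above.
bins = [15, 25, 40, 55, 200]
labels = ["15-24", "25-39", "40-54", "55+"]
df["age_band"] = pd.cut(df["age"], bins=bins, labels=labels, right=False)

# Share per age band agreeing/disagreeing that "information on political
# affairs from online social networks cannot be trusted", keeping the
# "don't know" answers visible rather than silently dropping them.
breakdown = (
    df.groupby("age_band")["trust_socnet_political"]
      .value_counts(normalize=True)
      .mul(100)
      .round(1)
      .unstack()
)
print(breakdown)
```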

The same trends show up in different questions about social media – in this one below, for example, Czechs are among the most likely in the EU to think social media is reliable.

[Chart: Eurobarometer, perceived reliability of social media, by country]

Interestingly, 49% of those 15 to 24 years old, 50% of those 25 to 39 and 46% of those 40 to 54 think social media is reliable – in other words, a similar if not identical proportion – but only 31% of those 55+ think social media’s reliable.

One of these Eurobarometer surveys, coincidentally, happened to ask people their attitudes about various countries, including Russia (my mention of Russia is, of course, purely hypothetical and definitely, definitely not related to the “unnamed country” above).

Run what Czechs think about social media (i.e., the agree/disagree question on whether it’s reliable) against what they think of Russia and the results are pretty interesting – Czechs who think social media is reliable also tend to be more positive towards Russia.

                                 Total “positive” towards Russia   Total “negative” towards Russia
Social media reliable?                         49%                               49%
Social media unreliable?                       35%                               64%
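For the curious, a cross-tab like that is a one-liner once the microdata is loaded. Another minimal sketch with placeholder column names (the real Eurobarometer variables have their own codes, and again a proper run would be weighted):

```python
import pandas as pd

df = pd.read_csv("eurobarometer_cz.csv")  # placeholder export of the microdata

# Row percentages: of those who think social media is (un)reliable,
# what share is positive vs. negative towards Russia?
xtab = (
    pd.crosstab(df["social_media_reliable"], df["russia_attitude"], normalize="index")
      .mul(100)
      .round(1)
)
print(xtab)
```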

Caveat, though. This question about being positive/negative towards Russia isn’t necessarily a proxy for what they think about the Kremlin or, for that matter, anyone else in the world. It doesn’t necessarily mean the respondent is some sort of zombie “radicalized by Russian propaganda” or even necessarily positive towards the Kremlin or Russia’s foreign policy, etc. Also, some respondents may well have interpreted the question as being positive/negative towards Russian people in general. Still, it’s interesting that the data falls out this way – and falls out this way across many other EU countries – and merits a hell of a lot more study than it’s getting.

There’s a ton more numbers I haven’t mentioned here (e.g., Czechs get more news from websites and trust the Internet more than most other Europeans) that, in all, paint a potentially very ugly picture – a population that increasingly distrusts its politicians and tends to trust social media and the web more than most other people. It’s a disinformation site’s dream.

A lesson on how not to fight disinformation

In the wake of its annual Bratislava Global Security Forum at the end of May, Slovak think tank GLOBSEC released a report of what it called a “comprehensive analysis of public opinion surveys” covering seven central and eastern European (CEE) countries.

Compared to ugly Word reports I’ve spit out in my time, this one’s got no shortage of big bold headlines, shiny graphics and sexy graphs. But as someone who writes a lot about pro-Kremlin disinformation in Europe, it’s probably no surprise which page caught my eye.

[Image: GLOBSEC report page on trust in disinformation outlets]
You’ve got my attention.

“Almost 10% of people in the CEE trust online disinformation outlets as relevant sources of information on world affairs,” they say.

Well. I’m hooked.

The next page was even better…but it’s also where I started asking questions.

[Image: GLOBSEC report page on disinformation websites as sources of information, by country]
Wait…what?

According to their surveys, anywhere from 1% of people in Croatia to 31% (?!) in Romania “consider online disinformation websites as relevant sources of information.”

This is where I started realizing how many pieces of the puzzle are missing here.

1) How exactly did you ask this question? You can’t just ask someone “do you consider online disinformation websites as relevant sources of information?”

So how did you ask it? Was it a proxy question, like the one the International Republican Institute (IRI) asked in their recent Visegrad surveys (i.e., “Do you watch or read media outlets that often have a different point of view than the major media outlets?”)? Was it a series of questions, or what? Without the actual question wording it’s hard for me to take this seriously.

2) How many people were asked the question? This is so ridiculously basic and yet it’s nowhere to be found.

The methodology “section” is up at the front of the report and it’s about as long as a calm, decidedly-non-mega Twitter thread:

“The outcomes and findings of this report are based on public opinion surveys carried out in the form of personal interviews using stratified multistage random sampling from February to April 2017 on a representative sample of the population in seven EU and NATO member states…

For all countries, the profiles of the respondents are representative of the country by sex, age, education, place of residence and size of settlement. „Do not know“ responses were not included in data visualizations.”

So we don’t know how many people were asked the question – and we don’t know how many people responded “don’t know,” so we have absolutely no idea how large or small the bases are for the numbers they’ve graphed up for us here. And we don’t even know exactly what questions were asked. Weak.
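To make it concrete why the excluded “don’t knows” matter, here’s a toy illustration – the numbers below are invented for the example, not GLOBSEC’s:

```python
# Toy numbers only – illustrating how dropping "don't know" changes the picture.
n_sampled = 1000          # respondents asked the question
n_trust = 90              # say they trust disinformation outlets
n_dont_know = 400         # answered "don't know"

share_of_everyone = n_trust / n_sampled                   # 9.0% of all respondents
share_excluding_dk = n_trust / (n_sampled - n_dont_know)  # 15.0% once DKs are dropped

print(f"{share_of_everyone:.1%} of all respondents")
print(f"{share_excluding_dk:.1%} of those who gave an answer")
```

Same survey, same answers – but the headline number jumps depending on which base you quote, which is exactly why hiding the “don’t knows” (and the sample sizes) matters.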

3) What the hell is going on with Romania’s number? Look, in the dozens upon dozens of surveys I’ve run in my life, if I see six of seven figures on the low end and then one of them almost three times higher I’m going to ask questions. Sometimes there’s an obvious, easy explanation. Sometimes it’s a more complicated explanation. Sometimes you don’t have one. And, sadly, sometimes it’s because you screwed something up running the numbers or, worse, the whole lot of you muffed something up administering the survey.

Doesn’t look like anyone’s asking questions here. We get no explanation of why Romania’s figure should be that much higher. No explanation of why, apparently, almost a third of Romanians “consider online disinformation websites as relevant sources of information” when only 1% of Croatians (i.e., basically no one) do. Surely that merits at least some attempt at an explanation.

4) How exactly did you arrive at the conveniently round “10 million” figure? OK, part of this is obvious – you took the percentages in each country of who said (“said”) they trusted disinformation websites and applied them to the population of each country.

But what population? 18+? 16+? Official population figures? Registered voters? Transparency is a lovely thing, especially with survey numbers you use to make bold, attention-grabbing claims.
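And here’s how much the extrapolated headline depends on which population base you pick – again with invented numbers, purely to show the spread:

```python
# Toy numbers – the survey share and population figures are invented;
# the point is how much the "X million people" headline moves.
share_trusting = 0.10  # 10% "trust disinformation websites"

bases = {
    "total population":  64_000_000,  # hypothetical, all ages
    "population 18+":    52_000_000,  # hypothetical adult population
    "registered voters": 45_000_000,  # hypothetical electorate
}

for name, base in bases.items():
    print(f"{name}: {share_trusting * base / 1e6:.1f} million people")
```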

This “10 million people in CEE trust fake news and disinformation websites” headline has already buzzed around CEE/disinfo-busting social media for a week now. It’s since made it into the East Stratcom Task Force’s Disinfo Review, and surely it’s going to find its way into a few more articles and probably even a few speeches and still more conference panels.

It shouldn’t. It’s a questionable claim based on a completely non-transparent survey analysis, delivered as part of a think tank’s glossy PR exercise. Bullshit’s no way to win the (dis)information war, guys.

The story behind that “terror attacks” map you keep seeing

You’ve seen this map somewhere on social media the last few weeks, haven’t you?

Here it is from Poland’s deputy justice minister, because somehow this is how you show solidarity with the citizens of a country millions of your own people live and work in.

[Image: the map, as tweeted by Poland’s deputy justice minister]

I’ve also seen it from Polish MEP and unsuccessful candidate for the presidency of the European Council Jacek Saryusz-Wolski, though my favourite version is the one tweeted out by that one guy who managed to get fired from The Rebel.

[Image: another version of the map]
OOOOH this one has different colours!!

And last, a version I saw on Instagram this week.

[Image: the Instagram version of the map]
Cool.

So where the hell is this data even from? Turns out, as someone from the right-wing Polish Twitterati told me, it’s data from the reputable Global Terrorism Database (GTD) at the University of Maryland, which is an “open-source database including information on terrorist events around the world from 1970 through 2015.” (Notice right now that says “events,” not “attacks.” This will be important). I was further informed that, for some reason, the data on this map that’s been making the rounds is only from 2001 on. OK.

So I took a look through the GTD data on some of the countries (including Poland) on this map. There are certainly no terror incidents (note the word: “incidents”) listed in Poland from 2001 on. OK, so that seems (seems) accurate.

But what about terror in other countries? I’m particularly interested in these apparent incidents in Iceland, which shows up in some versions of the map and, having been there, doesn’t exactly strike me as a terror hotbed.

Since 2001, there have apparently been two terror incidents in Iceland that explain the two Icelandic dots on the map:

  • In 2012, “An explosive device detonated near government offices in Reykjavik city, Reykjavik North Constituency, Iceland. The explosive device was partially detonated by a robot meant to deactivate it. No group claimed responsibility for the incident.” Property damage was listed as unknown.
  • In 2014, “Assailants attempted to set a Lutheran Church on fire in Akureyri city, Northeast constituency, Iceland. No one was injured in the attack; however, the building was damaged. No group claimed responsibility for the incident.” Property damage is listed as “minor.”

No one was killed or injured in these two incidents.

Again, “incidents” is the key word. These two big red Icelandic points, and many others on the map, don’t represent terror attacks at all. Many of them, including these two in Iceland, represent vague criminal acts that may not actually have anything to do with terrorism (let alone jihadist terrorism), that caused barely any property damage and, more importantly, didn’t kill or injure anyone.

Why no Polish incidents in the GTD since 2001? Surely there’s been at least one shitty attempt at something like a pipe bomb in a car that never went off (there was one in the Czech Republic database, as I discovered) that would merit a mention in this database, though presumably this will make it into 2017’s list for Poland, given the criteria for inclusion.

So the next time you see this map, you’ve got a few options. If it’s got no legend or title, you can always tell whoever shared it that the points represent vague definitions of criminal acts that don’t always seem to be reported consistently. If it says something about “terror attacks,” tell them they’re completely, 100% wrong, and tell them there’s more than enough data on the GTD website for them to make a proper map of actual terror attacks that isn’t just a cute meme for people who don’t like Muslims.
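If you want a starting point for that proper map, here’s a minimal sketch of the filtering on the GTD download – I believe the relevant columns are iyear, country_txt, nkill and nwound, but check the GTD codebook rather than my memory:

```python
import pandas as pd

# The GTD export is a large CSV; low_memory=False avoids mixed-dtype warnings.
gtd = pd.read_csv("globalterrorismdb.csv", low_memory=False)

# Incidents from 2001 onwards, as on the map making the rounds.
recent = gtd[gtd["iyear"] >= 2001]

# "Incidents" vs. attacks that actually killed or injured someone.
harmful = recent[(recent["nkill"].fillna(0) > 0) | (recent["nwound"].fillna(0) > 0)]

print("Incidents per country since 2001:")
print(recent["country_txt"].value_counts().head(20))

print("\nIncidents with deaths or injuries since 2001:")
print(harmful["country_txt"].value_counts().head(20))
```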

Another quick look at Bulgaria’s “Gallup International” (aka Part II)

I’ll forgive you if you don’t remember Bulgaria’s Gallup International (no, not that Gallup – the Gallup International “being sued by Gallup Inc. for using its name without authorization”), who very briefly popped into western headlines in March thanks to a Wall Street Journal article.

Published only days before the election, the article alleged that the pro-Kremlin Bulgarian Socialist Party (BSP) received a secret strategy document proposing, among other things, that they “promote exaggerated polling data” to help build momentum for the party and get them elected. Gallup International was specifically named in the article, and referred to by Bulgaria’s former ambassador to Moscow Ilian Vassilev as one of Moscow’s “Bulgarian proxies”; Vassilev, talking about a poll Gallup International published on an apparent lack of NATO support in Bulgaria, accused the polling agency of making a “wrapped-in-secrecy poll [that] had no details on methodology nor funding sources.”

Of course, the WSJ article wasn’t without its critics, myself included. One journalist in Bulgaria said the WSJ article “completely ignored all the internal political factors that led to the landslide victory of [President] Rumen Radev and focused only on Russian influence in elections,” which ultimately, in his opinion “backfired, instead of there being alarms for (legitimate) concerns about Russian involvement in Bulgaria.” (To say nothing of the role of the “crackpot outfit” at RISI apparently being involved in producing the document.)

Debates about the scope and scale of Russian influence in Bulgaria aside, I decided to take a look at polling data in Bulgaria leading up to the parliamentary election on March 26. Is there anything to support this idea that Gallup International and potentially other Bulgarian pollsters have used and promoted “exaggerated polling data” to benefit the BSP?

There might be.

Here’s a table of all the polls I’ve been able to find in 2017 leading up to the election, with levels of party support noted and the final actual election results at the bottom.

bulgaria poll jpg

(larger pdf of same chart here: Bulgaria polls)

Almost all polls from polling firms like Trend Research (3 of 3), Alpha Research (3 of 3, and, full disclosure, whose managing director I interviewed as part of my April piece for Coda Story) and Estat (2 of 3) showed Boyko Borisov’s GERB in the lead.

To break it down further (remembering that GERB ended up winning by 5.6%):

  • Trend Research: average GERB lead of 1.93%
  • Alpha Research: average GERB lead of 2.77%
  • Estat: average GERB lead of 3.47%

But the story’s pretty different with Gallup International, who one Bulgarian source described to me as “the main political/sociological advisor of the [BSP] at election time.” They published four polls during the campaign, only one of which showed GERB in the lead – their final poll, about a week before the vote, which gave GERB only a 0.6% lead. Across the campaign as a whole, Gallup International’s polls showed an average BSP lead of 0.7%.

Again, GERB ended up winning by 5.6%.

The numbers are even stranger for AFIS, a pollster run by Yuriy Aslanov, a sociologist who has sat on the BSP board. Both its polls showed the BSP in the lead by an average of 1.3%.

Again, GERB ended up winning by 5.6%.
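For anyone who wants to sanity-check per-pollster averages like these themselves, here’s roughly how I’d set it up – the poll-level figures below are placeholders, so swap in the real GERB and BSP numbers from the table above:

```python
import pandas as pd

# Placeholder poll-level numbers – replace with the actual figures from the table.
polls = pd.DataFrame({
    "pollster": ["Trend", "Trend", "Alpha", "Alpha", "Gallup Intl", "Gallup Intl", "AFIS"],
    "gerb":     [30.0,    31.0,    31.5,    32.0,    27.0,          28.0,          27.5],
    "bsp":      [28.5,    29.0,    29.0,    29.5,    28.0,          28.5,          29.0],
})

final_gerb_lead = 5.6  # GERB's actual winning margin, in points

polls["gerb_lead"] = polls["gerb"] - polls["bsp"]
summary = polls.groupby("pollster")["gerb_lead"].mean().round(2).to_frame("avg_gerb_lead")
summary["error_vs_result"] = (summary["avg_gerb_lead"] - final_gerb_lead).round(2)
print(summary)
```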

In sum – a pollster that’s been accused of being part of an effort to “promote exaggerated polling data” on behalf of the party it’s linked to consistently showed results that played up support for said party, contradicting most other polling firms’ results throughout the campaign. As a western observer and polling nerd who’s worked for multiple firms in Canada and the UK, I feel I’m in a position to confidently say this isn’t normal.

Of course, this all needs a few caveats. There’s absolutely no evidence of anyone cooking up numbers or anything like that – I am accusing no one of that. I’m also well aware of how statistics work and am well aware of the (however unlikely) possibility that these figures from Gallup International and AFIS are all down to random survey errors or even differences in methodology. Still, something’s off here.

Above all though, I’d stress this message to other journalists who haven’t had the pleasure of drowning a considerable part of their adult lives in SPSS – if some polling numbers look consistently very different from what other polling numbers are saying, ask why. Sometimes it’s an easy answer – a different, new methodology that may or may not pan out, a rogue poll (i.e., the one out of 20) or the polling firm just really sucks. But sometimes it’s not that simple.

Surveying some surveys: Czechs & refugees, immigrants and Islam

I’ve been spurred on by what I guess we can call some, um, “colourful” comments on Coda Story’s recent animation of my January story on Islamophobia in the Czech Republic to take a look at some recent public opinion data.

“Unsympathetic” towards Arabs

The Czech Public Opinion Research Centre (CVVM) asked a few questions in their March 2017 survey of ~1,000 Czechs about attitudes towards people from different nationalities/ethnic groups, including Arabs (who I think we can agree in most Czech minds means “Muslims”). They’re right at the bottom.

[Chart: CVVM, sympathy towards different nationalities/ethnic groups]

The numbers for Arabs look even worse over time…

[Chart: CVVM, mean sympathy score towards Arabs over time]
Mean scores 1-5, where 1 is “very sympathetic” and 5 is “very unsympathetic.”

No other group has seen anything like this; as the CVVM’s summary report points out, the percentage of those saying they’re “very unsympathetic” (i.e., 5 on the 5-point scale) towards Arabs has gone up by 18 percentage points since 2014. Fortunately (?) that increase seems to have flatlined since 2016.

There’s also a few demographic differences of note in the CVVM’s summary report: 41% of those who declared a good standard of living said they were “very unsympathetic” towards Arabs compared to 55% who declared a poor standard of living; 36% of those with a higher level of education said they were “very unsympathetic” compared to 47% who had an apprenticeship. Still, it’s clear that a lack of sympathy towards Arabs is pretty strong among all parts of the Czech population.

Unfortunately the raw data set isn’t yet publicly available for me to screw around with so I took a look at the raw data from last year’s survey (March 2016) to see if there were any other differences of note that might (or might not) be seen in the 2017 data. There weren’t many:

  • Men had a slightly more negative score on average than women (4.26 compared to 4.15), and were more likely to say they were “very unsympathetic” towards Arabs (49% compared to 44%) [both p < 0.1, which means it’s barely worth mentioning IMO but I’ve still done it so deal with it.]
  • Czechs aged 15-29 (41%) were less likely than those 45-59 (50%) or 60+ (50%) to say they were “very unsympathetic” towards Arabs [p < 0.05]
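If you want to run the same sort of significance check on splits like these, here’s a minimal sketch of a two-proportion z-test with statsmodels – the counts are illustrative placeholders rather than the actual CVVM cell sizes, and I’m not claiming this is exactly the test behind the p-values above:

```python
from statsmodels.stats.proportion import proportions_ztest

# Illustrative counts, not the real CVVM cell sizes: the number saying
# "very unsympathetic" and the number asked, for men vs. women.
counts = [245, 220]  # "very unsympathetic" among men, women
nobs = [500, 500]    # respondents in each group

stat, p_value = proportions_ztest(counts, nobs)
print(f"z = {stat:.2f}, p = {p_value:.3f}")
```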

Fear of immigrants

CVVM also released some analysis yesterday from the March 2017 survey on attitudes towards foreigners in general – 64% of Czechs feel that newly-arrived immigrants are a problem for the Czech Republic as a whole. This figure has shot up since last year, after dipping from a 2015 high that followed a slow rise from 2011.

[Graph: CVVM, share who see newly-arrived immigrants as a problem, over time]

CVVM also asked a few specific questions about the impact people think immigrants have on their country, and the results over time here have seen a drastic change. The belief that immigrants contribute to unemployment has dropped by 12% since 2016 (not that surprising in a country with low unemployment) and, as you can see below, the belief that immigrants are a threat to the Czech way of life has increased.

[Chart: CVVM, perceived impact of immigrants over time]

A reason for those “Refugees not welcome” stickers I’ve seen

The most recent round of the Eurobarometer surveys (November 2016) asked ~1,000 Czechs whether they think their country should help refugees. Czechs were the second most likely, behind Bulgaria, of any EU country to say their country shouldn’t help refugees (23% of Czechs agreed their country should help versus 72% who disagreed; the EU average was 66% agree versus 28% disagree).

Here, as with the CVVM surveys, there’s a few demographic breakdowns of note that I analyzed using the raw data:

  • Czechs who finished full-time education between the ages of 16 and 19 were less likely to agree the Czech Republic should help refugees (20%) compared to those who finished full-time education at 20 years old or older (31%) [p < 0.01]
  • Czechs in rural areas (18%) were less likely than those in towns and suburbs (24%) and cities (28%) to agree the Czech Republic should help refugees [p < 0.05]

Again, despite these differences, Czechs across all social divides tend not to think their country should help refugees…

Czech and Islam by the numbers, Parts 1 and 2

Last fall I analyzed European Social Survey (ESS) data from 2014 on Czech attitudes towards Muslims living in their country. Part 1, and Part 2.

If you’ve been following along nothing here will surprise you. Who doesn’t want any Muslims to come live in the Czech Republic (i.e., who’s less likely to want them)? Those who:

  • Feel unsafe after dark
  • Have the least contact with different races or ethnic groups
  • Feel the government treats new immigrants better than them
  • Distrust social/political institutions
  • Feel they have less ability to influence politics and have a say

Conversely, Czechs who had friends of different races and/or ethnic groups were more likely to be supportive of Muslims coming and living in the country.
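If you’d rather look at predictors like these jointly instead of one crosstab at a time, a logistic regression on the ESS microdata is the obvious next step. A minimal sketch – the variable names are placeholders for the actual ESS codes, the coding assumption is spelled out in the comments, and a proper run would use the ESS design weights:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder column names – map these onto the real ESS 2014 variable codes
# after downloading the Czech subset of the data.
ess = pd.read_csv("ess2014_cz.csv")

# Assume the "allow Muslims to come and live here" item is coded
# 1 = allow many ... 4 = allow none; 1 below means "allow few or none".
ess["oppose_muslims"] = (ess["allow_muslims"] >= 3).astype(int)

model = smf.logit(
    "oppose_muslims ~ feels_unsafe_after_dark + contact_other_ethnic"
    " + trust_parliament + can_influence_politics",
    data=ess,
).fit()
print(model.summary())
```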

A quick look at Bulgaria’s “Gallup International”

Something in this WSJ story about Russian meddling in Bulgaria caught my eye – the bits about “exaggerated polling data” and this rather curiously-named polling firm:

“A Bulgarian polling company, Gallup International, which isn’t related to U.S. pollster Gallup Inc., accurately predicted Mr. Radev’s victory. The company, which is being sued by Gallup Inc. for using its name without authorization, also co-published a February poll that said citizens of four NATO members, including Bulgaria, would choose Russia, rather than NATO, to defend them if they were attacked. Those results were at odds with a similar poll by Gallup Inc., published a few days earlier, showing that most NATO members in Eastern Europe, including Bulgaria, see the alliance as protection.

Gallup International didn’t respond to requests for comment.

“This wrapped-in-secrecy poll had no details on methodology nor funding sources,” said Ilian Vassilev, Bulgaria’s former ambassador to Moscow. “Russian media strategists and their Bulgarian proxies used the Western name to fool people about its credibility and spread their message.””

So I decided to take a quick look at Gallup International’s website, here. Without digging into the actual polls they do themselves, there’s not much to comment on.

Things are a bit more interesting on the affiliations page.

[Screenshot: Gallup International’s affiliations page]
Screenshot, March 24, 2017

First on their list of affiliations is WIN/Gallup International, “made up of the 80 largest independent market research and polling firms in their respective countries.” It’s an affiliation I’ve never come across, and it has several members I’ve either never heard of or who are minor players in their respective countries (not including Leger, which is well established in Canada and, coincidentally, also part of ESOMAR), which I guess underscores the “independent” part of the description.

But it’s the ESOMAR bit which interests me most. ESOMAR is a large, well-respected international network of market research firms who abide by a code and guidelines on market research. Not everyone’s a member of ESOMAR (in fact, none of the big firms I worked for were members) but it’s a reassuring thing to see on a polling firm’s site if you know nothing about them.

But this is what you get when you do a search for Gallup International in the ESOMAR directory.

[Screenshot: ESOMAR directory search for Gallup International]
Hmm. (Screenshot, March 24, 2017)

I double-checked the list of Bulgarian members to make sure I didn’t miss them. There are eight Bulgarian members of ESOMAR, according to the directory. Gallup International isn’t one of them.

Hmm.

 

Ten things you should know about TB in Ukraine

Did you know Friday is World Tuberculosis Day? You do now.

Ukrainian? In Ukraine? A Ukraine-watcher, whatever that means? Here’s a list of ten things you should know about TB in Ukraine:

  1. The TB incidence rate in Ukraine in 2016 was 67.6 per 100,000 persons – which, for perspective, is anywhere from ten to twenty times the rate in countries like the US, the UK or Canada (to say nothing of the absurdly high rates among First Nations and Inuit in Canada, but I digress).
  2. Fewer Ukrainians were diagnosed with TB in 2016 than in 2015 – a 4.3% decrease in the number of new diagnoses. Good.
  3. Ukraine has, alongside Russia, a spot on the World Health Organization’s (WHO) list of 20 countries with the highest estimated burdens of multidrug-resistant tuberculosis (MDR-TB). There are more than 8,000 new cases of MDR-TB registered in Ukraine every year, and the number is increasing. That’s bad.
  4. Anyone can get TB in Ukraine, including children – especially if you don’t vaccinate them. OMFG BCG vaccine pls FFS.
  5. TB’s still a disease concentrated in at-risk groups in Ukraine. According to stats from Ukraine’s Public Health Center (thankfully renamed from the unwieldy “Ukrainian Center for Social Disease Control of the Ministry of Healthcare of Ukraine”), around 70% of new TB cases in 2014 were in so-called “socially vulnerable groups” like unemployed people of working age and drug/alcohol abusers. (NB. these are the most recent breakdowns they seem to have, but I don’t see any reason why they would’ve changed much over 2015/16.)
  6. One of the major groups of people at risk of TB, particularly MDR-TB, is people with HIV/AIDS. As I wrote about earlier this week, more than half (52%) of deaths from AIDS-related causes in Ukraine last year were from TB – much higher than the one-third of deaths globally from TB among people with AIDS.
  7. HIV/TB co-infection is increasing in Ukraine – a “noticeable increase” according to the Public Health Center, rising year-on-year from 2013. All this “[reflects] the increasing burden of HIV infection in the country.”
  8. There aren’t any numbers on TB, HIV or anything else coming out of the non-Ukrainian-government-controlled parts of Donetsk and Luhansk oblasts (“DNR”/”LNR”), but everyone assumes the TB situation there is pretty bad. One senior international official I spoke to last month told me “we hear about used needles, terrible conditions there” with the at-risk population in the east – largely in and around Donetsk, which has long been an HIV hotspot in Ukraine. “I’d say of course HIV is growing there, TB is growing there, because the conditions in which they are spending time in is terrible,” this official told me.
  9. As Oksana Grytsenko reported in the Kyiv Post a few days ago, Ukraine struggles to provide effective TB treatment. Read her piece. No point in me rehashing it here, other than to add this quote from the Public Health Center: “Especially dangerous is the untimely addresses for medical assistance, late TB diagnostics, and HIV/TB co-infection, which causes a high level of mortality due to TB and results from the lack of a comprehensive approach to the combination of preventive and treatment programs at the national and regional levels into a single system of counteraction”
  10. There’s cause for some cautious optimism, I think. To plug again what I wrote about HIV earlier this week, state funding for TB treatment is being increased in 2017, and activists I spoke to seemed confident that the Ministry of Health and the government as a whole are (re)recognizing HIV/TB as a priority. Still, we’ll see.