Belief is stronger than analysis. This is well-known to those who study cognitive biases (Kahneman, Slovic, Ariely, etc.), but it is interesting to see it confirmed with an experiment involving politics and math: moderately challenging calculations are more likely to be "done wrong" if the results go against the subject's political beliefs than if the results are politically neutral. And the more "numerate" the subject, the stronger the effect.
I think this is an important part of the answer for those of us who look at the political situation (anywhere...) and are amazed at how "sane" people robustly maintain their belief system, impervious to "evidence", and at how important patently flimsy echo-chamber talking points are for anchoring those beliefs.
The underlying research is at http://static1.1.sqspcdn.com/…/138…/wp_draft_1.5_9_14_13.pdf The math task was Bayesian inversion of a contingency table, something humans are notoriously bad at (e.g., judging the effectiveness of health treatments). I'm not 100% convinced the hypothesis "people do math wrong" (a claim about the quality of Kahneman's thinking-slow system) is proven versus an alternative hypothesis: "people don't bother to engage thinking-slow at all if political beliefs provide a salient thinking-fast answer".
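To make the task concrete, here is a minimal sketch of the kind of 2x2 contingency-table question involved. The counts below are illustrative, not necessarily the paper's actual stimuli; the point is that the correct answer requires comparing *rates* across rows, while the tempting thinking-fast shortcut compares raw counts.

```python
# Hypothetical 2x2 contingency table: did a treatment help a condition?
# Rows: treatment vs. control; columns: improved vs. worsened.
# (Counts are illustrative, chosen so the fast heuristic misleads.)
table = {
    "treatment": {"improved": 223, "worsened": 75},
    "control":   {"improved": 107, "worsened": 21},
}

def improvement_rate(row):
    """P(improved | condition) -- the conditional probability
    subjects must actually compute to answer correctly."""
    total = row["improved"] + row["worsened"]
    return row["improved"] / total

p_treat = improvement_rate(table["treatment"])
p_ctrl = improvement_rate(table["control"])

# Fast-but-wrong heuristic: 223 improved with treatment vs. 107 without,
# so the treatment "obviously" works.
# Slow-but-right comparison: compare the rates instead.
print(f"treatment: {p_treat:.3f}, control: {p_ctrl:.3f}")
print("treatment helps" if p_treat > p_ctrl else "treatment does not help")
```

Here 223/298 is about 0.748 while 107/128 is about 0.836, so the condition improved *more often without* the treatment, despite the larger raw count in the treatment row. That inversion between the raw-count answer and the rate answer is what makes this task a good probe of whether subjects engage the slow system.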