What is it with the general religious fucknuttery in the US? Do these people genuinely still think Santa Claus is real?
Check out this thing. Fascinating read, though it doesn’t speculate on the causes.
Might be because the people in America (as a colony, way back) felt insulated from Europe and the Middle East, so they needed some religion that's a bit closer to home? That was always my take on Mormonism, especially the part where Jesus lived in the USA.
What just happened here? o.0