My own thought is that it's all about whiteness. And specifically whiteness deriving from a northern European genetic heritage. Which, for similar reasons, they don't want to be right out front with.
Tan won't do, and Marco "my first AND last names end in a vowel" Rubio may need to watch his back.
I'd say glib is sufficiently pejorative, and captures the idea you are describing here.
My question about all of this is "what is this Western civilization you speak of?".
Did "the West" begin with the Romans? Or the Greeks? Would they have thought of themselves as being "the West"?
Does it begin with Europe's early and growing awareness of itself as an entity that *wasn't* Rome, or some descendant of Rome? Like, maybe 11th and 12th C. Europe?
Are we meant to preserve the concept of nation states that emerged from the centuries of non-stop warfare over religious issues and competing wanna-be empires?
Do we get to include the Enlightenment in all of this, or do we need to, a la Rod Dreher, throw all of that away?
Is it capitalism? Christianity? If Christianity, is it just the Western traditions - Roman Catholicism and the Protestant movements that emerged from that? Do the various Eastern traditions get included? African Christianity? South American evangelicalism?
Is it just being white? Who gets to be white?
Oddly, to me, all of this blather comes in the context of the US basically telling Europe to fuck off. Which seems... inconsistent with an emphasis on "preserving our Western identity".
But which Christianity? There are a lot of them.