As a child in public schools and now as a graduate student in history, I have learned one thing to be true about the United States: It is a White country. The founders were White, White men established its core principles and political system, and White men and women built the nation into what it is today.
Even before I became aware of the importance of race, I never thought this was “racist” or unfair; it was simply a fact. America has always been a White country and always should be. Why, then, … are White people giving away their country?
Immigration and assimilation
My maternal great-great grandfather had a saying in Spanish about the immigrants who began coming to the United States after the 1960s. In English it would be:…