Anonymous

How is the USA not a white Western country?

Serious answers, please. Also, please refrain from racist answers or name-calling. This question is being asked not with racist intent but to hear different views, because I am genuinely curious. Also, I'm specifically talking about the USA as a country, not the land (which Europeans won through war). Everyone understands that Native Americans inhabited the land in various tribal nations before white people came here and built the USA nation on the land.

The USA is historically and culturally a white country. The legal system is based on English common law. The dominant religions are Protestant Christian sects from Europe. The language is English, which is European. The government structure, such as a bicameral legislature, derives from Europe. The philosophies, such as the concept of inalienable rights to freedom, are European. The literary, artistic, and philosophical movements are European. Whites have historically been the most populous people of the USA, and initially USA citizenship was limited to white people (until the 14th Amendment). The Naturalization Act of 1790 allowed only free white people to become citizens of the USA.

So, why do a lot of people ignore the above and disagree with the assertion that the USA (at least from a historical perspective) has been a white country from the beginning?

11 Answers

  • ?
    Lv 7
    2 months ago

    'Western countries' in everyday conversation is generally taken to exclude those countries which, while they are in the West geographically, are part of what we used to call the 'third world'. Third-world countries these days are taken to mean those lacking such things as basic infrastructure and the framework of a modern state: for example, a functioning political system, a health service, and so on.

  • Anonymous
    2 months ago

    Because the majority of her land isn't covered with snow.

  • 2 months ago

    So you think the USA is a white Western country. So what? How does either a YES or NO comment help or hurt you in any way?

    Your language in your question is subtly racist, even though you asked us to avoid any racist intent. OK, I'm serious. To me it is a Zen attitude. It is what it is. What did you intend to do about it?

    To deny a racist intent but then to dwell on the history that the U.S.A. started out with more restrictive views than it has now is to ignore the fact that people change over time. As we learn more and analyze more, we learn to respond better to things that we finally can see as a violation of the spirit of "All Men are created equal."

    So I repeat... what's it to you?

  • 2 months ago

    The founders did not believe in freedom, as evidenced by their many, many slaves.

  • 2 months ago

    The US is a majority-white country and always will be.

  • Anonymous
    2 months ago

    It's the result of the current anti-white propaganda. Our kids aren't being taught history. Yes, the USA was created as a white nation. The land it currently occupies was multicultural and multiracial, but not the country itself. However, with the amendment that made non-white people citizens, it's not really exclusively a white country today. But yes, it still has an Anglo culture and European history.

  • Anonymous
    2 months ago

    I think it has to do with people confusing the country with the land. The Amerindians were here first, so a lot of people assume the current country must be Native American in origin and therefore not white. Also, a lot of people deny that the USA is a white nation because they don't want to offend non-whites and don't want non-whites to feel like they don't belong.

  • Anonymous
    2 months ago

    @Stephen Weinstein - you are a troll. The Asian population was just 0.5 percent when the 1965 Immigration Act was passed. No country can be 100 percent one racial group, not even Japan. The point is that America was majority white until very recently.

  • 2 months ago

    Only guessing... I think it boils down to simple political correctness, with people's focus being on a desired diverse future. In such cases there is a natural tendency to ignore what is or was, in order to make the transition easier.

    Please note this is, to a large degree, ignoring the prominent role of African Americans, who played a dominant, but unfortunate, role in forming the pre-industrial economy. That contribution was real and highly significant, even if not formally acknowledged through most of the USA's racist history. It may even be that the success of 'white' America would not have been possible without this 'black' effort.

  • 2 months ago

    Most people never doubted that until the 1965 immigration law. Yes, America had always been a nation of immigrants. But they were European immigrants. It was a melting pot of European immigrants.
