Anonymous asked in Sports › Football (Soccer) › Other - Soccer · 2 months ago

Why do Americans think soccer is feminine?

6 Answers

  • 2 months ago
    Favorite Answer

    It’s typically played by girls and young kids in America; boys play American football and baseball. At least that’s traditionally been the case, though soccer has become more popular in the States. It could one day be a Big 4 sport, although it’ll never overtake the NFL.

  • 2 months ago

    Because Americans are stupid.

  • 2 months ago

    I don't think so. To me, soccer is the most popular sport in the world.

  • 2 months ago

    I don't think that's necessarily the case anymore, but there is still a lingering bias bred during our primary school years.

    During that time the best male athletes are usually guided towards football, basketball, and baseball. Many boys, therefore, grow up with the attitude that soccer is something to do to keep in shape during their off-seasons and that "only girls take it seriously."

    This seems to have changed greatly over the past 20-30 years, especially here in Southern California, thanks to the influence of the Hispanic community, where boys choose soccer as their primary sport from childhood through high school.

  • Not American, but when I see soccer players feign injury and roll around like they've been shot when someone comes within a few inches of contacting them, it's not 'gamesmanship'; it's embarrassing. I grew up playing collision sports (hockey), where you NEVER feign injury. You get hit, you get up, and you get on with it. You don't roll around like you're dying. That's why. Soccer is a great game, but if referees started handing out straight red cards to players who do this, it would make the game a lot better.

  • Anonymous
    2 months ago

    Do they? It's a fast-growing sport in America, and they have a national team.
