Been seeing a lot about how the government passes shitty laws, lots of mass shootings, and expensive asf health care. I come from a developing nation and we were always told how America is great and whatnot. Are all states in America bad?
If you think I’m denying racism, you read that comment incredibly wrong. I’m saying in comparison to other countries.
Your point was that we’re better because we talk about it.
All over the country, legislatures are banning books and curricula that even mention racism. It isn’t an isolated incident either.
I agree with you on the exaggeration of racism.
I don’t think you bring valid points as to why the United States is better than portrayed, but I nonetheless believe that myself.