This is not meant to create controversy; I just want to know people's opinions on why, when the word "Americans" or "America" is mentioned, everyone thinks right away of the USA or people from the USA, when in fact everyone from this continent is American. (This includes news all over the world; Google, Wikipedia, and in fact anything you research with "Americans" or "America" will give you the same result: USA.)
And like I said, this thread is just to know people's opinions, nothing else.