A lot of people are simply replying that it's their right to do this. Well, obviously, but what is your opinion about it? People should be speaking out against this kind of thing. I'm not saying it should be stopped, but people should be letting them know that it's not cool and that they don't represent all Americans. I agree that the media played a big role in blowing this out of proportion, but the fact remains that the world now knows about it, and how other Americans decide to speak up about it (or not) will reflect on the nation.
Simply saying that it's their right to do it seems rather apathetic to me. Do you care that this kind of hate exists in your country, or not?