[citation][nom]azxcvbnm321[/nom]Hate to inform those in foreign countries, but there are certain groups and people who try to play up racism here in America. Why? Because it's their job to do so. Let me explain.

Decades ago there certainly was a need for groups that would fight against racism, especially government-sponsored racism through unequal laws. But since then, the laws have been cleaned up, and a new generation of Americans who never experienced institutional racism grew up. Let's say you've been fighting racism for 10, 20, or 30 years, and let's just say you've won; there isn't really any racism left that can stop a determined person from being successful. Well, what the hell are you going to do? Your job pays pretty well and gets lots of donations and funding. If you admit that there's nothing much more to be done and declare victory, well, you're going to be unemployed. And since you've spent the past few decades fighting racism, there aren't many other fields you can go into. Maybe PR, but you'd have to start at an entry-level salary. It's like a vampire hunter who's killed all the vampires: what's he going to do with the rest of his life?

A lot of this might be subconscious; it's hard to change your ways. But wouldn't you, if you were in that position, just continue on and try to fight racism wherever you can? And since there isn't large-scale institutional racism anymore, you have to dig deep and find petty cases; maybe you have to dig so hard that you end up seeing racism that's not really there. But all that's because you've won: racism has been defeated. In the movies, the story would end right there with you standing tall and proud. But reality doesn't fit neatly into a perfect script. Yeah, you've won, but your life continues, and you need money and a job to live. Ah, reality, so inconvenient.[/citation]
Send 'em to Australia; racism is unfortunately still a social issue here, not a historical one.